Sunday, April 27, 2008
A recent article in the NEW YORK TIMES, "Study Suggests Math Teachers Scrap Balls and Slices," by the all-too-credulous reporter Kenneth Chang, uncritically touts the results of a research study done by Jennifer A. Kaminski, a research scientist at the Center for Cognitive Science at Ohio State, and her colleagues, Vladimir M. Sloutsky and Andrew F. Heckler. Why is this of relevance to mathematics educators? Well, the study employed the so-called gold standard of research: a randomized, controlled experiment. And the educational conservatives, always hot for "data-based" evidence (as long as it's evidence that supports their politics), were quick to notice.
Well, now we know how many holes it takes to fill the Albert Hall, but I doubt we know anything useful about teaching and learning math from this study. It has only slightly more holes than the above-mentioned concert venue.
First, as mentioned in the article, the researchers studied only college students and extrapolated boldly to the entire rest of the population, particularly K-12 students. But aside from age/developmental issues, they are comparing K-12 kids with students at Ohio State University, a group that is not representative of a cross-section of K-12 students, many of whom would never gain entrance to that institution. Are these inferences really valid? Any mildly skeptical person should wonder at the enthusiasm coming from the anti-reformers.
Second, compare what the OSU students were asked to do with what K-12 students are asked to do in math class. Do you agree that K-12 mathematics is a decontextualized set of arbitrary rules that can be taught PURELY through abstract symbols? Try that in K-5, for starters. Just introduce the notation of arithmetic of whole numbers, then move on to integers, then to fractions. No context AT ALL. I'm not talking about "real-world applications" so much as any sort of correlation whatsoever with prior experience and knowledge. Have fun.
This sort of experiment, which of course is impressive to those looking for confirmatory evidence of what they already believe, is always problematic. It creates a nonsensical, decontextualized task — one of the few kinds of tasks where it's possible to control for a lot of variables and noise, but unfortunately not necessarily like anything outside of the lab — and then uses the results to judge things that are at best VAGUELY like the controlled situation.
I would bet a year's salary that if someone were to do a similar study that "proved" the opposite thesis, reform critics would bury it and dismiss the results if someone brought it to their attention.
What seems to be the conclusion of this study is that K-14 math should be taught like upper-division college and graduate school math. What an idiotic idea. If anything, we need a bit more of K-12 math informing some of how college and graduate school math below the doctoral level is taught (I omit doctoral level courses on the assumption that anyone who makes it that far has likely bought pretty deeply into the status quo and is in any event capable of surviving bad math teaching regardless of whether s/he accepts it as good math teaching).
I don't doubt, of course, that concrete examples of meaningless, arbitrary rules are no guarantee that one will do well at learning those meaningless, arbitrary rules. But if our goal were to have students learn such things, we wouldn't be teaching them mathematics. Most of mathematics is neither meaningless nor arbitrary. Aside from trappings like notation and terminology, and from choices like order of operations, what to make an axiom, or what terms to accept as undefined — matters more of judgment, taste, or style — I have operated under the assumption that mathematical ideas make sense. I don't think I'd evaluate the usefulness of manipulatives, applications, or the like based primarily on this experiment.
As for the anti-reformers who praise this study, well, they'll believe anything. As long as it supports their biases, of course.