## Wednesday, December 5, 2007

### Finding The LCD, or Why Does My Math Teacher Insist On Making Me Hate Math?

Recently, there's been an on-going conversation/argument on math-teach@mathforum.org, unusual in that it frequently borders on civility and even some degree of agreement among antagonists, regarding the role of teaching/learning/using the Lowest Common Denominator (LCD) or Least Common Multiple (LCM), which come to the same thing here (the LCD of a set of fractions is simply the LCM of their denominators), when working with, say, addition and subtraction of fractions. It should be noted that similar issues arise later in working with rational algebraic expressions, particularly when there are polynomials in one or more of the numerators or denominators that can be factored in some of the standard ways students have worked with previously. But my concern is with something else: specifically, the arbitrary way in which many K-5 teachers insist that to add/subtract fractions one MUST find the LCD. Failure to do so often results in all sorts of negative consequences, from stern glances to loss of credit on classwork and tests. In my view, this is a classic example of abuse of teacher authority, as well as an indication, in many instances, of mathematical ignorance on the part of some teachers.

It is false to state that in order to add or subtract fractions (or rational algebraic expressions, for that matter) one must first find the LCD. What is needed is only to find A common denominator and to rewrite the numbers or expressions one wishes to add or subtract as equivalent fractions over that common denominator. The rest is fairly trivial.
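To make that concrete with a worked example of my own (not one from the math-teach thread): using the product of the denominators as the common denominator,

$$\frac{a}{b} + \frac{c}{d} = \frac{ad}{bd} + \frac{cb}{bd} = \frac{ad + cb}{bd}.$$

So, for instance, 1/6 + 1/4 = 4/24 + 6/24 = 10/24 = 5/12, which is exactly what the "official" LCD of 12 produces; the only difference is one extra reduction at the end.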

Since one can ALWAYS find a common denominator by simply multiplying together the denominators of any two (or more) fractions with unlike denominators, the issue becomes a matter of taste to some extent, but also of convenience, speed, and perhaps elegance or aesthetics. But it suffices to find ANY common denominator if one so chooses. If the goal is to find the sum or difference, we just need equivalent fractions with the same denominator.
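For anyone who wants to see that claim in executable form, here is a minimal sketch in Python (my own illustration; nothing like it appears in the thread). It adds two fractions using nothing fancier than the product of the denominators as the common denominator, reducing once at the end:

```python
from math import gcd

def add_fractions(a, b, c, d):
    """Return a/b + c/d in lowest terms, using the product b*d as the
    common denominator; no LCD required."""
    num = a * d + c * b   # both numerators rescaled to the denominator b*d
    den = b * d           # the product of the denominators is always a common denominator
    g = gcd(num, den)     # a single reduction at the end gives lowest terms
    return num // g, den // g

# 7/12 + 5/18: the product denominator is 216, the LCD is 36;
# both routes reduce to the same answer, 31/36.
print(add_fractions(7, 12, 5, 18))   # -> (31, 36)
```

Whether you start from the LCD or from the product of the denominators, the reduced answer is the same; the LCD just keeps the intermediate numbers smaller.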

Some people seem to believe, however, that there is a huge need to put students through the grind of doing lots of examples in which they factor a bunch of denominators and possibly numerators (not just with numbers, but in rational algebraic expressions). They argue that this is essential stuff for algebra and above. But of course that is true only to the extent that textbook authors join in the conspiracy to present cooked problems that always submit to just the techniques of factoring that we make the heart and soul of Algebra 1 courses. It's hard to deny that if you rig the game you can guarantee the outcome. If we knew that in the real world most problems people have to solve were going to have convenient factorizations in them, then by all means it would be worthwhile to obsess over this sort of thing. But that's not really the case, no matter what math textbook authors might mislead us into believing. And so all this focus on finding the simplifications, while mathematically nice, is not necessarily what people will want or need to do outside of math class. They're going to be more interested in getting the numbers crunched efficiently and accurately and then doing what actually matters: making sense of the results. Elegance and "oh, isn't that slick" mathematical aesthetics are indulgences for those who have the time and inclination. Nothing wrong with that at all, but do we really want to mislead students into thinking that such will be the situation most of the time when they need to deal with fractions or rational algebraic expressions? Do we really want them to think that most people don't grab a calculator, possibly one with a CAS, or sit down at their computer? Please. Pull the other leg, it's shorter.

Yet I read people arguing with very straight faces that we have to teach this and teach it thoroughly, and that it is essential for algebraic readiness (so failing to make finding LCDs a MAJOR emphasis is to deprive students of their futures. Seriously. You can't make this stuff up).

When I and others retort that the main goal should be to understand what fractions are and how their basic arithmetic operations work; that methods and techniques, while worth exploring, are not in and of themselves valuable without understanding; and, further, that it's possible to learn to do an algorithm by following steps while having little or even no understanding of the symbolic manipulations one is performing, I'm told that being able to repeat steps IS an indication of understanding, though maybe not of a deep and thorough kind. I find this truly bizarre.

My preference would be to say that when students make choices about how and what to do, and can intelligently explain the reasons for the choices they've made, they show understanding. Following cookbook steps to solve cooked-up, convenient problems may be little or nothing more than donkey-like behavior, what Douglas Hofstadter has described as "sphexishness."

I'd prefer not to focus specifically on procedures/algorithms, but not to exclude them from consideration, either. That's because as students move through elementary mathematics, there are deep levels of understanding, relative to their degree of mathematical maturity and experience, that can easily be forgotten, overlooked, or consciously ignored by over-focusing on procedures. Not only does making algorithms for calculation THE primary focus of early mathematics instruction (not that they aren't part of what should be attended to) do a disservice to the students, it can readily hide and distort important information about who actually has a very good mathematical head on his/her shoulders AND which students may have significant difficulties despite seeming success at following steps they don't really understand at all.