Explain this 'new math'

I always just added it up in my head. My younger son ran into trouble when he was told to estimate, because he could get the exact answer in his head. Older son is a math whiz, having scored a 5 on the Calculus BC AP exam as a junior. I got a D- in Algebra 2 with tutoring. I suspect poor teaching was partly to blame in my case, but I have a hard time when I can’t BS my way through it. I’ve heard people say that anyone who can’t successfully complete second-year Algebra isn’t college material, but I graduated with honors and also earned a Master’s.

I see the numbers in my mind, just like I would on a piece of paper. That, for me, just seems needlessly complicated.

Magiver, you do realize, do you not, that Indistinguishable is a professional mathematician, and that it is hence extremely unlikely that he’d be dismissive of math skills? If it appears to you that he is, then the most likely explanation is that you have in some way misunderstood what he said.

Thanks. I didn’t have the time to determine the actual grade, and math is not really my field.

Do you think the grading levels of the Common Core for math are realistic?

That’s a question better asked of a developmental psychologist, not of a mathematician.

By the way, there’s an excellent, excellent analysis of the Common Core State Standards for Mathematics (CCSS-M) on the Mathematics Educators stack exchange:

I won’t quote it because it’s very long, and the mods would get ornery (though all Stack Exchange content is licensed under Creative Commons - Attribution, so technically it’s legally okay), but if you’re at all interested in the perspective on CCSS-M from actual K-12 math educators, it’s a good answer. It’s good enough that it goes over opposing views and potential arguments from both sides; it’s not merely one person’s opinion.

Your first paragraph above is a utility-based argument against the business need for even basic numeracy for most employees in most modern retail environments. Then in your second paragraph you suggest that even waiters and sales clerks should have learned in school about some highfalutin abstraction which you call “conceptual understanding.” There is a bit of tension between the two positions, but I guess you draw them together somewhat by stipulating that no one need ever demonstrate that they can do stuff like add or subtract, and God forbid they should ever be required to practice any damn thing.

Your approach is guaranteed to leave us with the same half-innumerate nation we are stuck with now. So the machines designed by the other half of the nation will do most or nearly all of the routine calculating? Well of course a fully rather than 50% numerate nation might be expected to invent more and better machines, but hell, we’re doing so well now that there is no need to improve, aren’t we?

You were commenting on this statement by another member:

“Okay, that is one pack of gum for 50 cents and $3.95 for the jerky. That’ll be $25.07, please.”

The phrase “That’ll be” identifies the speaker not as an innumerate customer, but as an innumerate business employee.

It’s worth noting that the Common Core standards were created by looking at commonalities between education systems, both in the States and internationally, that consistently achieve excellent results.

In some states, they aren’t much of a change at all. In others, the changes are substantial because states figured out they could inflate their “success” by having extremely low targets.

I do think they are realistic; millions of kids achieve similar standards routinely. But it won’t be an automatic pass for all vaguely normal kids, and states that have been slacking will have a tough transition.

I think we differ on expectations of basic math skills.

This.

It isn’t a “get off my lawn” argument. I’ve dealt with teenagers in good school systems who couldn’t calculate miles-per-gallon or cost-per-ounce. They are virtually paralyzed doing anything useful for themselves without explicit instructions in front of them.
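And for anyone keeping score, both of those are a single division. A quick sketch, with the fill-up and shelf-price numbers made up purely for illustration:

```python
# Both calculations mentioned above are one division each.
# These specific numbers are hypothetical.
miles, gallons = 312.0, 11.4
price, ounces = 4.29, 16.0
print(miles / gallons)   # miles per gallon, about 27.4
print(price / ounces)    # cost per ounce, about $0.27
```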

I see the confusion between us now. I was taking OffByOne’s example’s intent to be “A salesclerk could deliberately swindle you as a customer, if you did not sufficiently drill on your arithmetic algorithms to counter-check the cash register”. You were taking the example’s intent (possibly more correctly!) to be “A customer could complain that you as a salesclerk accidentally swindled them, if you did not sufficiently drill on your arithmetic algorithms to counter-check the cash register”.

Either way, though, it is “number sense” that is being called for here, and not the memorization of and training in quick manual execution of arithmetic algorithms, in that in the world we live in, neither the salesclerk nor the customer in a typical transaction manually keeps track of a to-the-penny running total of prices at transaction time. The very purpose of having the cash register run this calculation is so that others do not have to run it, but rather can simply rely on it with loose “number sense” sanity checks. The skill the salesclerk needs is to be able to effectively use their calculator, not to effectively operate in a counterfactual world where they lack their calculator.
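To make the “loose sanity check” concrete, here is a minimal sketch, using the gum and jerky prices from the quoted example:

```python
# A loose "number sense" sanity check: round each price to the nearest
# dollar and compare the rough total against what the register claims.
prices = [0.50, 3.95]                      # gum and jerky from the quote
rough = sum(int(p + 0.5) for p in prices)  # 1 + 4 = 5
print(rough)  # a claimed total of $25.07 is nowhere near $5, so something is off
```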

While I therefore do not think manual execution of decimal notation arithmetic algorithms is illustrated by this example, let me also speak to a broader class of arguments: it is the case that certain skills may be anomalously useful for particular jobs, which are not nearly so useful more widely. It is not necessary to bring everyone up ahead of time to the level of fluency with all skills that all jobs may require, nor do we in most contexts even attempt this; rather, more reasonably, one may obtain job-specific training in the relevant niche skills at such point as one takes on or wishes to pursue a particular niche occupation.

This is the exact reasoning people give for why the new math is good. The argument is that the old math way is to have the students memorize a set of explicit instructions for doing arithmetic, so they can get the answer without understanding what they’re doing. So they’re unable to deal with any situation that isn’t covered by their explicit instructions.

Richard Feynman makes a similar argument about algebra, and surely more articulately than I would.

First off, the term “new math” is a misnomer. It’s been around at least since I was in school. No teacher would waste time trying to teach 8 different ways of doing the same thing. That’s a recipe for confusion and a massive waste of time.

Nobody is claiming that the techniques used through most of the 20th century are the best. What people are claiming is that current generations of children appear to lack rudimentary math skills.

All basic math involves explicit instructions of one kind or another so I don’t understand your statement at all.

I just square 70. [4900]

Then I subtract 70 X 3 [4900 - 210 = 4690] (this is 70 X 67)

Then I subtract 67 X 3 [4690 - 201 = 4489].
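Spelled out, that trick is just the identity 67 × 67 = (70 − 3) × 67 = 70 × 67 − 3 × 67, with 70 × 67 itself computed as 70 × 70 − 3 × 70. A quick check:

```python
# Verifying the mental shortcut: n^2 = r^2 - (r - n)*r - (r - n)*n,
# where r is a convenient round base near n.
n, r = 67, 70
print(r * r - (r - n) * r - (r - n) * n)  # 4900 - 210 - 201 = 4489
print(n * n)                              # 4489, the same answer
```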

I had a crazy, out there idea.

There’s a computer game called “SpaceChem”. It provides a way to program a series of logical operations by setting up nodes on a grid; little operators called “waldos” travel between the nodes, following the instructions laid out on the grid.

To me, it’s an incredibly easy programming language to learn. It has been deliberately limited for the purposes of the game - there are a small number of operations and the grid is very small - but this limit could be relaxed. No “program” written in the language can crash or corrupt memory, and you can observe the entire state of the program at any moment in time by looking at the position of the waldo and the nodes on the grid that can change state.

Anyways, you could write a visual programming language like this. And the actual operators on the grid would each be primitive enough that someone could construct an ALU capable of implementing them directly. First graders would learn math by programming their tablets to do all but the most primitive of operations… (and so on, up the grade levels)
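Here’s a toy sketch of what I mean. This is not SpaceChem’s actual mechanics, just a hypothetical grid language in the same spirit:

```python
# A "waldo" walks a grid of cells, each holding one primitive instruction.
# The entire machine state is visible at all times: (position, accumulator, grid).
GRID = {
    (0, 0): ("LOAD", 6),      # put 6 in the accumulator
    (1, 0): ("ADD", 7),       # add 7 to it
    (2, 0): ("PRINT", None),  # show the result
    (3, 0): ("HALT", None),   # stop
}

def run(grid):
    x, y, acc = 0, 0, 0
    while True:
        op, arg = grid[(x, y)]
        if op == "LOAD":
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "PRINT":
            print(acc)   # prints 13
        elif op == "HALT":
            return
        x += 1           # the waldo steps one cell to the right

run(GRID)
```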

You don’t really understand a math operation until you have programmed a computer to do it.
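Case in point: here’s grade-school column addition as a program. Writing it forces you to say exactly what the pencil-and-paper ritual is doing with its carries:

```python
def column_add(a: str, b: str) -> str:
    """Add two nonnegative integers given as decimal strings, right to left."""
    da, db = a[::-1], b[::-1]   # reverse so index 0 is the ones column
    out, carry = [], 0
    for i in range(max(len(da), len(db))):
        x = int(da[i]) if i < len(da) else 0
        y = int(db[i]) if i < len(db) else 0
        carry, digit = divmod(x + y + carry, 10)
        out.append(str(digit))
    if carry:
        out.append(str(carry))
    return "".join(reversed(out))

print(column_add("467", "58"))  # 525, same as 467 + 58
```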

This is essentially how The Diamond Age by Neal Stephenson described the ideal schooling: teach a person the fundamentals in the context of using those fundamentals on a real, state-of-the-art system, not some ancient method. Once they know the fundamentals, optionally give them an education in how to understand the rest of the system that implements the learning environment. (The Primer itself, a nanotech book, gave the student books on how to build one at the end of the story.)

So the actual tablets you teach students how to program on run an operating system that is both open source and simple enough to understand. The actual program rendering the game GUI and grid is also written to be clean and understandable and modular. The robots that make these tablets are another topic that is well understood. And so on and so forth.

The thing is, mathematicians have a bias towards not thinking that arithmetic is important, since it was usually easy for them to learn and has little to do with the math they wind up doing later. When you get to advanced math, that’s when arithmetic starts seeming less important. But you can’t learn a math sense without learning arithmetic.

And, yeah, there are multiple ways to learn your arithmetic. But that doesn’t mean this current standard of forcing you to do all of them all the time makes any sense. Once you find a method or two that works for you, you don’t really need to keep drilling on the other methods. Yet that’s what happens in all those worksheets used to show “common core” sucks. They’re forcing the kids who have already learned how to do it to do a bunch of extra work that will ultimately only discourage them from using math–the same way having too many homework problems always has.

That’s what bothers me. Math is all about learning in steps. You work until you get one thing down, then you use the shortcut that makes doing the next steps easier. You use a calculator because you already know how to do it by hand or in your head, but the calculator gets you to the more important math faster. Eventually, you have your calculator do your differential equations–you understand the concept, but don’t want to waste the time when it’s just a part of the process.
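For instance, once the concept of exponential decay is understood, the grinding can be delegated. A minimal sketch, assuming SciPy is available:

```python
from scipy.integrate import solve_ivp

# dy/dt = -2y with y(0) = 1: conceptually, plain exponential decay.
# The machine does the numerics; we already know what the answer should look like.
sol = solve_ivp(lambda t, y: -2 * y, t_span=(0, 1), y0=[1.0])
print(sol.y[0][-1])  # close to exp(-2), about 0.135
```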

I say this as someone who thinks memorization is the wrong way to teach math–or anything else, for that matter. I was taught the Montessori method, which basically boils down to using objects to teach why the old methods (multiply each digit with carry, plus distribution) work. Once we got the concept, we were introduced to a times table, but I never had to actively memorize it.

But I sure as hell think my foundation in arithmetic is what put me ahead of the curve in all subsequent math classes. Sure, maybe learning to do square roots by hand was pointless, and starting with algebra would have been better, but the stuff I had to learn to be able to do that was quite helpful.

Especially when, for most people, the old way was the One True Way to fail to learn math.

You misunderstand. Arithmetic is important… but it’s not math.

“Learning arithmetic” can mean different things. As I said, everyone needs to learn the basic conceptual understanding of arithmetic; that is, what addition and multiplication and so on mean and how they behave, how to interpret decimal notation, etc. And, sure, you would not count as fully having become fluent with these things if you could not, given sufficient time, figure your way through “What (in decimal notation) is 36 * 19?” and such things, by algorithmic or non-algorithmic reasoning.
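For instance, one perfectly good non-algorithmic route to that question: 36 × 19 = 36 × 20 − 36 = 720 − 36 = 684. Or, letting the machine confirm it:

```python
# A non-algorithmic route: round 19 up to 20, then subtract one 36.
print(36 * 20 - 36)  # 720 - 36 = 684
print(36 * 19)       # 684, agreed
```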

But, following up on your own provided example:

The average student is never drilled on calculating square roots by hand, and is rather permitted use of a calculator should decimal expansion of a square root be called upon, yet we feel satisfied that they understand the concept of square roots.

Nor are they made to memorize algorithms for computation of logarithms. Again, recourse to a calculator is permitted, if only because in earlier times recourse to a slide rule could serve the same purpose.

Nor do we use trigonometry classes to drill them on algorithmic calculation of sine, cosine, etc. The concepts are introduced, a “number sense” is built up, the individual rules which would aid computation were computation the goal are in many cases studied, but actually practicing computation of these things is saved for those students who are interested in such things to learn in an optional computational methods course. For everyone else… there’s the calculator.

Statisticians do not train themselves to reckon the Gaussian error function, but rather to look up its values in a pre-printed table… or, in the modern era, in the particularly convenient printed-on-demand “table” provided by an electronic calculator.
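And for all four examples above, the “calculator” in question now ships in every standard library; in Python, say:

```python
from math import sqrt, log10, sin, erf

print(sqrt(2))      # about 1.41421, no hand algorithm required
print(log10(1000))  # 3.0, no log tables
print(sin(0.5))     # about 0.47943, no series summed by hand
print(erf(1.0))     # about 0.84270, no pre-printed Gaussian table
```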

In all these areas, we do not demand students memorize and practice algorithms for efficient computation of a function in order to demonstrate conceptual understanding of said function.

Addition and multiplication (and subtraction, and division, oh, god, yes, division) are no different. In a world without ubiquitous access to calculators, it may have been useful for the general population to be drilled on providing such service manually (though, again, I will point out that the purposes people seem to think most exemplify the need for this manual calculation (e.g., shopping) are ones humanity was able to get by for thousands of years carrying out without even being aware of decimal notation, much less standardizing decimal notation computation algorithms for all children to learn). We no longer live in that world.

If anything, we should be training people to use calculators effectively. Those who wish to furthermore train themselves into expert manual long dividers or what have you can do so outside the universal curriculum, just as those who wish to train themselves into expert manual arcsine calculators do now.

As an aside, on the college physics homework I just graded, one of the students actually did work out a calculation via long division. I gave credit for it (the calculation was correct), but also left a note on the paper that we’re allowed to use calculators.