The age-old (well, it’s older than me) question: At what level and to what extent should students be able to use calculators?
I recognize the potential for *The Feeling of Power*. But, so what? In almost every situation where solving a math problem matters, I would want a calculator at least to check my math. Using a calculator may result in children being unable to add or multiply without one, true. Again, so what? On some level, there is a look-up table embedded in your brain that tells you what 7*6 is. What difference does it make if that table is external?
In high-school math? I don’t think the students should be allowed to have calculators at all, except in a few cases. Math problems testing all manner of complicated crap can be tailored such that calculators are unnecessary, as long as the teacher accepts answers like “2sqrt(5) + e” and such. When I was in high school, we were allowed to have any kind of calculator we wanted. What this resulted in - and I’m not exaggerating here - is that kids would use calculators to add two 1-digit numbers. It was really pathetic. They became incapable of performing even the simplest arithmetic without a computer.
Now, if the assignment is some sort of real-world word problem, where the answer pertains to reality somewhat, calculators are fine. Students shouldn’t have to waste their time multiplying 4-digit numbers together, and an actual number gives the student an idea of whether their answer is right. (“Bob is 3.4 inches tall? WTF?”) And in science classes, of course, calculators are a must. But there are precious few math ideas that can’t be taught and practiced without calculators.
Agreed that complex problems can be solved and simplified while giving an exact answer (as compared to a decimal answer). I just did an assignment where I had to show that p and d orbitals, when either half or completely filled, were not affected by the angle when calculating the probability of finding an electron. Basically, this nasty-looking equation using all sorts of trig simplified down to the radial part of the equation times 1, though it wasn’t always easy to get the simplification.
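For the curious, the kind of simplification described above can also be checked with a computer algebra system. Here’s a sketch using sympy. I’m assuming the assignment involved something like the sum of the angular probability densities of the three p orbitals (a standard Unsöld-theorem exercise); the actual equation from the assignment may have differed:

```python
# Sketch: the angular parts of the three filled p orbitals sum to a
# constant, so the probability depends only on the radial part.
# (The specific equation from the assignment is an assumption.)
import sympy as sp

theta, phi = sp.symbols('theta phi', real=True)

# Angular probability densities |Y|^2 of p_z, p_x, p_y,
# dropping the common 3/(4*pi) normalization factor:
pz = sp.cos(theta)**2
px = sp.sin(theta)**2 * sp.cos(phi)**2
py = sp.sin(theta)**2 * sp.sin(phi)**2

total = sp.simplify(pz + px + py)
print(total)  # -> 1, i.e. no angular dependence at all
```

Of course, having the software collapse it to 1 instantly is exactly the kind of grunt work being debated here; doing it by hand is where you see *why* the trig cancels.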
Now, if I’m doing a real-world chemistry problem (or physics or math), I’m using my calculator. Just like a slide rule, calculators and computers speed up and simplify calculations. I couldn’t do logs with a mantissa table to save my life, but I know how to use logs for all sorts of things while letting my calculator do the grunt work.
For what it’s worth, I’ve been using a TI-86 since high school. I still haven’t figured out how to use all the functions–it’s generally easier to just do the calculations than to try to enter them correctly to compute a derivative or something. I’ve yet to find a use for the test functions, have yet to figure out how to program the thing (the language is awful, in my opinion), and so on. If I’m doing something massive, I’m putting it into Excel or Maple.
And? Provided they get the answer, what is the problem?
With specialty software like Matlab out there, there are precious few math ideas that can’t be taught and practiced with calculators. Can you defend the presupposition that math without a calculator > math with a calculator?
It’s pretty obvious from this statement that you haven’t got very far in math. Once you get into almost any of the active research areas, most of what you’re doing would benefit very little from the various mathematical software packages. Those do have their place, but there’s a lot that they can’t do.
As far as calculators go, I think they impede the development of number sense. People get the attitude that whatever the calculator says is right, and don’t stop to check. If it says the probability is -583, then the probability is -583. I’ve also noticed that people who use calculators a lot are terrible at estimating.
I do agree that it’s more important to know how to solve a problem than to do the arithmetic yourself, but that doesn’t mean that being able to do arithmetic isn’t important.
Seriously, if people know what an answer should be, then they will catch an error whether they goofed in the formulas or on the keys. If they don’t know what should be happening, then working the problem by hand won’t reveal any more insight than plugging in numbers.
I am challenging this assertion. You’re a smart person, but I really would prefer cites or arguments.
*Originally posted by robertliguori* 2nd semester college calculus.
Right. Not far at all, no matter how it might seem. If you’d like to get a feeling for higher math, try out an introductory abstract algebra class. You’ll see how far a calculator or Matlab can get you there.
Actually, Matlab or Mathematica could do a bit for you, depending on what types of problem you have. But the calculator? Useless.
Just thinking, out of the 21 math or mathematically-themed classes I took in college, there was some use for a calculator in 3 of them. By “some use”, I mean “I used it at some point”. That’s 18 classes out of 21 where a calculator was of no use at all.
Right, but a lot of people who use a calculator for almost everything don’t develop any sense of what the answer should be. Doing problems by hand makes you more familiar with what types of output you can expect from given inputs, and why. That, I think, is the strongest argument I can offer.
For those who missed it due to the quote chopping, the assertion is “Being able to do arithmetic is important”. There’s a lot of hemming and hawing on this issue, but not a whole lot of facts (AFAIK). See what I said above for the strongest argument I have.
I’m a freshman engineering major (for now anyway). The school of engineering here requires four calculus classes. We aren’t allowed to use any calculators at all. I’m not sure about elective math classes. We are allowed to use calculators/Matlab for stuff like [url=http://ea2.mccormick.northwestern.edu/homework/EA2DPW03.pdf]this[/url] in physics and chemistry.
Then you wouldn’t mind people carrying around dictionaries for referencing any word larger than two syllables.
The aim of “No calculators for arithmetic” isn’t just so students remember what 9x9 is.
It’s so that they can figure it out using their brain alone. Math(ematics) isn’t a bunch of rules applied to numbers. It’s a science based on a consistent set of rules and functions. Learning to internalize application of these rules can only help foster rational thought.
I get a lot of flak over this, but I think that students should be given calculators from the beginning and shown how to use them. In parallel with that, a few problems should be worked longhand regularly, and the time saved on computational practice should be spent on learning how to get order-of-magnitude estimates so they can spot calculator answers that are way off.
My justification is that I went to school when we drilled constantly on arithmetic, and my recollection is that we didn’t turn out to be arithmetic whizzes. Those who have a flair and liking for math will go into that field if they are encouraged to do so. No amount of drudgery will improve those who don’t have the knack, and drudgery will turn off those who need math and should be able to use it. A calculator allows them, i.e. most people, to make use of mathematics.
Again, I disagree. If I learned the reason why a certain formula gives answers within a certain range, then that knowledge applies whether or not I am solving problems by hand. Conversely, if I was taught (as I was in many classes) to memorize formulas and use them, then I’m not actually losing anything by not memorizing them. Actually, I could make a case that using calculators makes you more familiar with meta-rules, because you can, if you had a mind to, evaluate functions far faster than you could by hand and note that when solving problem type A, a negative means you buggered up, f’rinstance.
If the dictionary was so easy to use that it was contestably faster than remembering words, hell yes.
Okay. Hypothetical: let’s distinguish learning about math from learning how to do math. Let’s learn (and be tested on) why the integral is the area under the curve, and have it assumed that we can apply x^(n+1)/(n+1) or use the calculator, as we please. I ask again, is there any practical difference between plugging numbers into memorized formulas and plugging them into a calculator?
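To put a concrete face on that hypothetical: for a definite integral, the memorized power rule and the machine give identical answers. A sketch with sympy, using an integrand of my own choosing just for illustration:

```python
# "Memorized formula" vs. "let the machine do it" for a definite integral.
# The integrand x**2 over [0, 3] is just an illustrative example.
import sympy as sp

x = sp.symbols('x')

# By hand, via the power rule x**(n+1)/(n+1):
# antiderivative of x**2 is x**3/3, evaluated from 0 to 3.
by_hand = sp.Rational(3)**3 / 3 - sp.Rational(0)**3 / 3

# By machine:
by_machine = sp.integrate(x**2, (x, 0, 3))

print(by_hand, by_machine)  # -> 9 9
```

If the test is on *why* the integral is the area under the curve, both routes to the number 9 are equally mechanical.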
Er, no. That was why I clicked the little “new thread” button and typed stuff and hit the “submit” button.
I’m talking about arithmetic. Doing 345+2414 without a calculator. I know students who can’t do that much, since they never felt the need to. ElJeffe talks about students using a calculator to add single-digit numbers. To me, that’s absurd.
Memorizing formulas isn’t the only alternative to using calculators. The important thing in math is to understand where those formulas come from - how they are derived, and what assumptions they are based on. It’s possible to design exams which test your understanding of those equations and formulas instead of merely finding out whether you’ve memorized them or not.
In any case, if any math class beyond elementary school is using problems that can be helped by calculators, I don’t think they are teaching it correctly. Physics, chemistry, and accounting are different; those classes should teach you how to make the best use of calculators.
If students are incapable of doing routine arithmetic by hand, clearly they lack some fundamental understanding of what basic arithmetic even is. I don’t have a problem with capable students using calculators as a tool, but I find it difficult to believe that incapable students with a calculator could ever amount to anything worthwhile–without even a basic understanding of arithmetic, how are they going to know how to use the calculator in a useful way in the first place? These are the types of students who use a calculator to conclude that one gallon of 40[sup]o[/sup] water plus one gallon of 40[sup]o[/sup] water creates two gallons of 80[sup]o[/sup] water (and believe me, these students do exist).
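That blunder is a failure of the mental model, not of the keystrokes: mixing water averages the temperatures (weighted by amount), it doesn’t add them. A minimal sketch, assuming equal densities and no heat loss:

```python
# Temperature of mixed water is a volume-weighted average, not a sum.
# Assumes equal densities and no heat loss to the surroundings.
def mix_temp(v1, t1, v2, t2):
    return (v1 * t1 + v2 * t2) / (v1 + v2)

# One gallon of 40-degree water plus one gallon of 40-degree water:
print(mix_temp(1, 40, 1, 40))  # -> 40.0, not 80
```

No calculator would have stopped that student; only knowing what the formula means would.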
Actually, I’ve found some exams are actually easier if you don’t have the option of using a calculator.
As an example, I took the CPA exam in the early 90’s in one of the last years when calculators were forbidden. The problems were set up in such a way that they required only basic arithmetic. So, if you were working through a problem and found you were, say, multiplying 3.25874 x 4157.88936, you knew you’d made a mistake somewhere. Knowing that definitely helped me on test day, as I realized on more than one occasion that the arithmetic was getting too complex and I had better reexamine what I was doing.
(That only works when you have a good test designer, but I did have similar situations in grad school.)
Personally, I don’t mind the use of calculators as early as in high school classes. But I do think students need to flex their brain muscles on basic arithmetic at some point.
You’re not always going to have a calculator with you. Granted, you’ll rarely have to multiply 3.25874 x 4157.88936 without one. But I’ve been in several meetings where (just as an example) someone has suddenly asked whether a flat contract for $2000/month is less costly than one with a flat fee of $500/month + $17 per sale. It helps if you can do the math right there and not send someone running for a calculator!
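For the record, the back-of-the-envelope version of that meeting question (using the numbers from the example above) is a one-line break-even calculation:

```python
# Break-even for the two contracts in the example above:
# flat $2000/month vs. $500/month + $17 per sale.
flat = 2000
base, per_sale = 500, 17

# Sales volume at which the two contracts cost the same:
breakeven = (flat - base) / per_sale
print(round(breakeven, 1))  # -> 88.2 sales/month; above that, the flat fee wins
```

Which is exactly the point: 1500/17 is "a bit under 90" to anyone with number sense, and that’s usually all the meeting needs.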
I too have only gone through 2nd-semester calc, where we were not allowed to use calculators, which I understood. We were never tested on anything that a calculator would help with. But now I’m in physics, where we are not allowed to use calculators either. The reasoning in physics is that if the professor allows one calculator, he has to allow all of them, and some (like mine) can store all our equations. But our tests consist of long division and multiplying six-decimal-place numbers (to test our knowledge of sig figs at the same time). Drives me nuts. Instead of checking my logic on the test problems, I’m redoing my elementary math.
btw, I thought that anything above college algebra was higher math. Maybe not here on this message board; I’ve been kind of a longtime lurker, and there seem to be a lot of professional people. However, in my experience, having had any calculus is pretty impressive.