How is a square root (and, indeed, any other root) determined without a calculator, and without resorting to guessing?
I’m sure there must be a clever way, but I’ve never come across it.
When I was a tyke before calculators were commonly available there were tables that you could use. If you needed a number between two values in a table you would have to interpolate. There were similar tables for trig functions.
Haj
I was taught a way of determining a square root. The process looked a little like division. That was before the days of even good adding machines. (Well…almost.) But I don’t remember the specifics. I think one of my family members might know. I will check.
If you follow the link on that page to “longhand square roots”, you find this:
http://mathforum.org/library/drmath/view/52610.html
That technique actually used to be taught along with other arithmetic skills. Even before calculators had become commonplace, it had been dropped from most curricula, I think because it was seen as so much rote busywork, since students seldom grasped why “longhand” computation of square roots worked, or why they should want to do such a thing. At one time they even taught torturous hand techniques for cube roots.
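For anyone curious what that longhand procedure looks like mechanized, here is a minimal sketch of the digit-by-digit method for integers (the same "bring down a pair of digits, find the largest fitting digit" routine described at the link; the function name is just for illustration):

```python
def longhand_isqrt(n):
    """Integer square root by the 'longhand' digit-by-digit method.

    Digits of n are processed in pairs from the left; at each step we
    find the largest digit d such that (20*root + d) * d fits in the
    remainder, exactly as in the paper-and-pencil algorithm.
    """
    s = str(n)
    if len(s) % 2:                 # pad so digits group into pairs
        s = "0" + s
    root, remainder = 0, 0
    for i in range(0, len(s), 2):
        remainder = remainder * 100 + int(s[i:i+2])  # bring down a pair
        d = 9
        while (20 * root + d) * d > remainder:       # largest fitting digit
            d -= 1
        remainder -= (20 * root + d) * d
        root = root * 10 + d
    return root

print(longhand_isqrt(150))    # 12
print(longhand_isqrt(144))    # 12
```

Extending it past the decimal point is just a matter of continuing with pairs of zeros.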
Here’s a very efficient way (especially if you know some nearby squares).
Let’s say you want to know the square root of N:
Well, N[sup]1/2[/sup] ~ (E[sup]2[/sup] + N)/(2E), where E is your estimate for the correct figure for the square root
Example:
If you want to know the square root of 150:
Let the estimate be E = 12
So, 150[sup]1/2[/sup] ~ (12[sup]2[/sup] + 150)/(2*12)
i.e. 150[sup]1/2[/sup] ~ (144 + 150)/24 = 294/24 = 12.25, versus the true value of about 12.2474
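Iterating that formula, feeding each result back in as the new estimate E, converges very quickly. A quick Python sketch of the idea:

```python
def sqrt_estimate(n, e, iterations=3):
    """Refine an estimate e of sqrt(n) by repeating e -> (e*e + n) / (2*e)."""
    for _ in range(iterations):
        e = (e * e + n) / (2 * e)
    return e

# Starting from E = 12, one pass gives 12.25; three passes are already
# accurate to machine precision.
print(sqrt_estimate(150, 12))
```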
Someone once posted a link to a site that had a nice geometric explanation for this, but I can’t seem to find it right now.
I was taught the tortuous (did you mean this, or really torturous?) way of finding square roots longhand. This was in tenth or eleventh grade, around 1976. Calculators were becoming common, but were often viewed as cheating in class. We still had slide rule competitions.
I got nothing out of the technique.
KarlGauss’s method is exactly the same as hajario’s. It just has steps 2 and 3 combined into a single one.
And technically, KarlGauss’s method is actually Newton’s method.
I’ll take a guess that the OP was asking how a person would routinely calculate nth degree roots.
Answer: Look up the logarithm of the number in a published book of log tables, divide by n, then look up the antilog in the same book.
Alternatively: Use a slide rule to determine the log, do the division and determine the antilog.
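The table-and-logarithm procedure is easy to mimic in code: take the log, divide by n, take the antilog. A sketch, with Python's math library standing in for the printed table (the function name is just for illustration):

```python
import math

def nth_root_via_logs(x, n):
    """Compute the nth root of x the log-table way: log, divide by n, antilog."""
    return 10 ** (math.log10(x) / n)

print(nth_root_via_logs(150, 2))   # square root of 150
print(nth_root_via_logs(27, 3))    # cube root of 27, approximately 3.0
```

With a real printed table you would only get four or five significant figures, and interpolation between table entries supplies the rest.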
Either adjective works - the method is tortuous - winding and twisting - and applying it is torturous as far as the student is concerned.
As already observed, the technique presented by KarlGauss is a special case of Newton’s method for this particular problem, and if used with a reasonable estimate, will converge very rapidly.
Background:
In general, Newton’s method is brushed over in calculus classes, and brought up again almost immediately should you study numerical analysis. It’s a good technique, with the drawback that it has pathological cases, and can shoot off into the wild blue yonder if not provided with a good enough initial estimate.
Newton’s method is a technique for iteratively finding zeros of a function by calculating the new estimate as:
e[sub]i+1[/sub] = e[sub]i[/sub] - f(e[sub]i[/sub])/f’(e[sub]i[/sub])
f’ indicating the derivative of f. The general idea is that f’ gives you the slope of the tangent line to the function at a given point. That expression is what you wind up with if you then take the zero of the tangent line. If your guess was pretty good, and the curve doesn’t change much between your guess and the zero, it will get very close.
Plug f(x) = x^2 - n (f’(x) = 2x) into the above, and KarlGauss’s method pops out.
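To make the connection concrete, here is a generic Newton iteration, with the square-root case falling out when you plug in f(x) = x^2 - n (a minimal sketch, not production code):

```python
def newton(f, fprime, e, iterations=6):
    """Refine an estimate e of a zero of f via e -> e - f(e)/f'(e)."""
    for _ in range(iterations):
        e = e - f(e) / fprime(e)
    return e

n = 150
# f(x) = x^2 - n, f'(x) = 2x: the update becomes (e*e + n)/(2e),
# which is exactly KarlGauss's formula.
root = newton(lambda x: x * x - n, lambda x: 2 * x, e=12)
print(root)
```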
Excuse me, I should have said “should you study numerical METHODS”.
I was in Honors Algebra in 1978-9 and our teacher spent one class teaching us this. He said right up front this is really hard and confusing and something you’ll never need to use even in math class so don’t sweat it too much. But given the universal use of calculators by that time he felt we should at least be shown that it was possible and the method for doing it.
This was an advanced 8[sup]th[/sup] grade class taking 9[sup]th[/sup] grade Regents Algebra so everyone liked math, but everyone hated trying to do this!
My high school math teacher showed us a trick for use with a calculator without a square root key. It had to have a memory-in key (whatever number was on the display would be put directly into memory, replacing whatever was there). It had to do with adding the approximate root to the previous guess and dividing by two. Worked well enough.
At the very least, the people who write the square root algorithms for computers have to know the hand method. On paper, it can be described as “guess the next digit” and test. In a computer, it is “guess the next bit”. You try 1 and if that is too large, it is 0. Very efficient when doing multiword square roots. The trouble with Newton’s method is that it requires a lot of long divisions, and that is much slower than doing it one bit at a time; I once tested this.
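The bit-at-a-time version can be sketched like this for non-negative integers: try setting each bit of the candidate root from the top down, and keep the bit only if the square still fits. (Real multiword implementations update a running remainder instead of squaring at each step, but this toy version shows the guess-the-bit idea; the function name is just for illustration.)

```python
def bit_isqrt(n):
    """Integer square root, one bit at a time - no division needed."""
    root = 0
    bit = 1 << (n.bit_length() // 2 + 1)   # highest candidate bit
    while bit:
        trial = root | bit
        if trial * trial <= n:             # keep this bit if the square fits
            root = trial
        bit >>= 1
    return root

print(bit_isqrt(150))          # 12
print(bit_isqrt(10 ** 20))     # 10 ** 10
```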
It is also a good example of an algorithm. I guess I learned it in around 7th grade and have not forgotten it, nor why it works.
Newton’s method is far more efficient than the pseudo-long-division method. It converges quite quickly, doubling the number of sig. bits on each iteration. Yes, it does do division but that doesn’t mean “long division”. For very large numbers, division is done using methods based on FFT that are far faster than long division.*
For a special square root only function for a small number of bits (<=64) on a computer, the actual method is to use interpolating polynomials instead.
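Real library routines use carefully fitted minimax polynomial coefficients, but a toy version - a precomputed table plus linear interpolation (the degree-1 case), with range reduction so the table only has to cover [1, 4) - shows the shape of the approach. Everything here is an illustrative assumption, not an actual library implementation:

```python
import math

SAMPLES = 64
# Precomputed sqrt values at evenly spaced points on [1, 4].
TABLE = [math.sqrt(1 + 3 * i / SAMPLES) for i in range(SAMPLES + 1)]

def table_sqrt(x):
    """sqrt via range reduction to [1, 4) plus table interpolation."""
    assert x > 0
    e = 0
    while x >= 4:          # sqrt(4y) = 2*sqrt(y), so pull out factors of 4
        x /= 4
        e += 1
    while x < 1:
        x *= 4
        e -= 1
    t = (x - 1) * SAMPLES / 3        # position within the table
    i = min(int(t), SAMPLES - 1)
    frac = t - i
    y = TABLE[i] * (1 - frac) + TABLE[i + 1] * frac
    return y * (2 ** e)

print(table_sqrt(150))    # close to 12.2474
```

With only 65 table entries this is already good to a few parts in 100,000; a higher-degree polynomial on each interval tightens that dramatically, which is why it wins for fixed-width hardware floats.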
*Almost all the complex arithmetic operations you learn in elem. and high schools are the slow way of doing things.
Hari Seldon’s method requires one divide and one add per iteration, and gives one significant digit per iteration.
Newton’s method requires one divide, one add and one multiply per iteration, but it doubles the number of significant digits per iteration. This is called quadratic convergence.
Just how well did you test this, Hari Seldon, to conclude that (1) was faster than (2)?
Sorry, Hari Seldon. I now realise that your method involves a multiply rather than a divide.