How are square roots of non-perfect squares determined?

Welcome to the boards Freezair SilverEye.

Nothing to add to what others have said. Just three things to say really.

  1. I never learned how to extract square roots by hand in school either. As has been noted, it’s nowhere near as fundamental as the four basic operations.

b) You mentioned the TI-85! Yay, that’s the one I’ve had for about ten years :cool: ! They have indeed moved on. I doubt to better things, but there ya go.

and

Pi) Circles are actually really interesting. :smiley:

Just for completeness, here’s a Wikipedia article that outlines several different methods of computing square roots. Some are useful for computation by hand; others are not.

I agree, having been in high school during the last years they taught the slide rule. It reinforced the use of scientific notation and forced you to at least be aware of the right order of magnitude for your answer.

Being quick at mental arithmetic is very helpful for students when they’re doing algebra, or things like factoring, or figuring out a quadratic equation. Lots of kids can answer “what is 6 × 8?” easily, but when they see 48 in a problem they don’t immediately think “That’s 6 × 8.”

David, I don’t think it’s changed all that much since, well, since EVER. Pythagoras died before even you were born. :wink: I still use a “Machinery’s Handbook” from 1969 because Unified Threads haven’t changed in the past 38 years.

Had a prof who wanted us to learn how to use logs in 1977. “What’ll you do if you need to perform a calculation and your calculator’s battery has died?” Fine, considering the life of a calculator battery in 1977 (as short as an hour), but what if we didn’t have our log tables, either?

“You should ALWAYS carry your log and root tables.”

The old x87 FPU did have a sqrt instruction, FSQRT. I rather expect it was implemented in microcode, not hardware, though.

[Holds Up Hand]
Yep, wouldn’t be without it although it is currently out on loan.
Also have a slide rule here from way back when. The batteries never seem to run out.
No calculator though, they never seem to last or the kids swipe them.
[/HUH]

Also may I say welcome Freezair SilverEye. Excellent first question, certainly received some good responses.

That was one single minded prof. I have a slide rule with carrying case that I can hang on my belt. It contains a complete set of log tables, to the accuracy I need, and in a handy form.

I actually managed to figure that out myself five or six years ago, and was quite disappointed to find out that it was old news.

It’s even older than you probably thought, in fact. It’s frequently known as “Heron’s square-root formula” after the ancient Greek mathematician of that name, and was probably familiar to the Babylonians in the early second millennium BCE.

It might be worth mentioning that “Newton-Raphson” is a general numerical method for finding the zeros of functions. The instructions provided above are what the method turns into for the special case f(x) = x^2 - n.

Conceptually, it’s fairly easy to see how the general case works, if you’ve had an introduction to calculus - the derivative of the function gives you the slope of a tangent line to the function. Simply calculate the zero of a line of slope f’(x0) passing through (x0,f(x0)) where x0 is your guess. That zero is your new guess. Once you are close enough that the function slope is not changing rapidly as it approaches the zero, it converges VERY rapidly. The technique usually has stability problems as it can shoot off into the wild blue yonder if your initial guess isn’t good enough, but for a parabola with its vertex on the x axis, it will be stable, although slow to converge if you start with a poor guess.
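The tangent-line iteration described above can be sketched in a few lines of Python. For f(x) = x^2 - n the slope is f'(x) = 2x, and the Newton update simplifies to Heron’s formula. (The function name, starting guess, and tolerance here are my own illustrative choices.)

```python
def newton_sqrt(n, x0=1.0, tol=1e-12, max_iter=100):
    """Newton-Raphson for f(x) = x**2 - n.

    The tangent line at x has slope f'(x) = 2*x, so the update
    x - f(x)/f'(x) simplifies to (x + n/x) / 2, which is
    Heron's square-root formula mentioned earlier in the thread.
    """
    x = x0
    for _ in range(max_iter):
        x_new = (x + n / x) / 2.0  # zero of the tangent line at x
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

print(newton_sqrt(2))  # ~1.4142135623730951
```

Starting from x0 = 1, the iterates are 1.5, 1.41666…, 1.4142156…, and it’s already correct to a dozen digits by the fourth step — the “VERY rapid” convergence mentioned above.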

I might also add that, in general, Newton-Raphson can get dicey when choosing a stopping rule as you program it, particularly for cases where the zero is a turning point, like here. As you get close to the zero, your slope becomes very near zero, although f(x0) does too, and round-off error can have drastic effects on your final iterations.

Bleh, forget the last remark. I’m not thinking straight. The turning point is always 0 in this case.

Another method which is less efficient than Newton-Raphson, but which is even more universally applicable (i.e., a variation of the method is useful for many problems other than square roots), is the method of bisection. Again, using the square root of 2 as an example:

I know that 1 is too small to be sqrt(2), since 1^2 = 1, and I know that 2 is too big, since 2^2 = 4. So try something halfway in between: 1.5. Well, that’s still too big, since 1.5^2 = 2.25. So now I try something between 1 and 1.5. 1.25 is too small, since 1.25^2 = 1.5625. So now I try between 1.25 and 1.5, and so on.
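The steps above can be sketched in Python — repeatedly halve a bracket [lo, hi] known to contain the root (names and tolerance are mine):

```python
def bisect_sqrt(n, lo, hi, tol=1e-12):
    """Bisection for sqrt(n), given lo**2 < n < hi**2."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if mid * mid > n:   # mid is too big, keep the left half
            hi = mid
        else:               # mid is too small (or exact), keep the right half
            lo = mid
    return (lo + hi) / 2.0

print(bisect_sqrt(2, 1.0, 2.0))
```

The first midpoints are exactly the ones worked through above — 1.5 (too big), 1.25 (too small), 1.375, and so on, with the bracket halving each time. That halving is why bisection is slower than Newton-Raphson: you gain only about one bit of accuracy per step.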

I think my daughter swiped my well worn copy for the table of integrals.

When I was in college, doing integration by computer was a current AI problem. Now it’s standard. I was born way too early.

It can be more efficient, actually.

For my thesis, I wrote several programs (which now come with Excel, but weren’t available then, and mine actually gave the user more control) to calculate probabilities (Gaussian, t, F and chi-squared).

You could calculate the probability to the right or the left of a number, or, given the probability, get the number. The second case required iteration; I proved that starting with bisection and switching over to N-R once your interval was small enough was the best of both worlds. (My advisor was looking at me real funny, since his notion had been to prove that N-R was always faster; well, it wasn’t, and in some cases bisecting was faster, and my method always found these special cases before switching over.) Sometimes N-R would get itself into a loop, where it tried value A, then B, then C, then D, then A… so I also put in some triggers to detect loops or “this is taking too long” and go back to bisecting.

Not only is bisecting so simple it hurts, it doesn’t produce loops.
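A rough sketch of the hybrid approach described above: bisect until the bracket is narrow, then finish with Newton-Raphson. (The switch-over width, the generic Newton step, and all the names here are my own illustrative choices — this is not the thesis code, and it omits the loop-detection triggers mentioned above.)

```python
def hybrid_root(f, df, lo, hi, switch_width=1e-3, tol=1e-12, max_newton=50):
    """Find a root of f in [lo, hi], assuming f(lo) and f(hi) differ in sign.

    Phase 1: bisection, which is robust and cannot loop,
             until the bracket is narrower than switch_width.
    Phase 2: Newton-Raphson from the bracket midpoint, for the
             fast final convergence.
    """
    f_lo = f(lo)
    while hi - lo > switch_width:
        mid = (lo + hi) / 2.0
        if f_lo * f(mid) <= 0:   # sign change in [lo, mid]: root is there
            hi = mid
        else:                    # otherwise the root is in [mid, hi]
            lo, f_lo = mid, f(mid)
    x = (lo + hi) / 2.0
    for _ in range(max_newton):
        step = f(x) / df(x)
        if abs(step) < tol:
            return x - step
        x -= step
    return x

# Example: sqrt(2) as the positive root of f(x) = x**2 - 2
print(hybrid_root(lambda x: x * x - 2, lambda x: 2 * x, 1.0, 2.0))
```

By the time Newton-Raphson takes over, the starting guess is already within switch_width of the root, so the wild-blue-yonder failure mode described earlier can’t get going.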