Complex calculations before computers

It is not a question of breaking. It is a question of accuracy. If you are trying to track the position of something over a path the length of the universe to the accuracy of the width of a proton, you need more than 10 digits for all but the most trivial scenarios.

Engineers liked to fly the geek flag.

Scientists used circular slide rules that fit in your pocket.

Kepler spent decades doing nothing but one complicated calculation after another to come up with his famous laws.

A related question: How many trig functions are there? Most of you probably know of three — sine, cosine, and tangent — and some of you might say zero, but, historically, the answer has ranged to upwards of a dozen, including such Seussian names as haversine and exsecant.

Why? Tables.

(The ‘zero’ answer is because trigonometric functions are all just complex exponentiation in drag.)
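To make that "in drag" point concrete, here is a quick Python sketch (the function names are mine): sine recovered from complex exponentials via Euler's formula, and the obsolete haversine defined the way the old tables did it.

```python
import cmath
import math

def sin_via_exp(x: float) -> float:
    # Euler's formula in reverse: sin(x) = (e^{ix} - e^{-ix}) / (2i)
    return ((cmath.exp(1j * x) - cmath.exp(-1j * x)) / 2j).real

def haversine(x: float) -> float:
    # One of the table-era functions: hav(x) = sin^2(x/2) = (1 - cos(x)) / 2
    return math.sin(x / 2) ** 2

x = 1.2345
print(abs(sin_via_exp(x) - math.sin(x)))          # agrees to floating-point precision
print(abs(haversine(x) - (1 - math.cos(x)) / 2))  # the two haversine identities match
```

The haversine earned its own table column because it shows up directly in the great-circle distance formula navigators had to grind through by hand.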

My favorite is hacovercosine.

There’s a cool diagram on Wikipedia that shows a lot of the obsolete ones, like versine and exsecant.

Of course, part of the reason it took them so long is that they didn’t really have much experience with such problems, weren’t sure what methods to use, and probably had to look up some relevant formulae (which in those days meant a trip to the library, not just a click on Google). I could probably do the same calculation in an hour, and someone who actually does orbit calculations professionally might well do it in ten minutes.

I think it was in Gleick’s biography of him (Genius) that I read how one of the first true computers was received at Los Alamos - in pieces, mind you, literally in a bunch of boxes, with no particular instructions and no particular pattern to what was packed with what. Feynman sat down, thought about it for a while, and proceeded to put the damn thing together. And, of course, it worked!

Think about that for a minute.

He had plenty of characters that used slide rules, but I don’t recall any of them trying to calculate a ballistic trajectory with one. Even his earliest stories had complex mechanical calculators, but by the late forties they were electronic.

The ballistic computer in The Rolling Stones (1952) was a triplex model that strongly reminds me of the computer used on the Space Shuttle, including being made by IBM.

Heinlein would have been familiar with the mechanical fire-control systems in use when he was in the Navy in the thirties.

So that’s why it’s called the tangent function. :smack: Seriously, I just learned it as opposite/adjacent or sin/cos and never gave it a second thought. I should turn in my engineering degree.

I recall years ago reading a book about the advent of the PC. It started off with older computational devices and worked its way through calculators. The author stated that what really aided each advance in computing was a killer app. The first for the PC was the spreadsheet. It had never occurred to me that the things one does in a spreadsheet used to take place on sheets of paper affixed to walls. You would change something in one place and be instructed to make another change somewhere else on the other side of the room.

We’ve already had one xkcd ref upthread, but the OP just demands this one too: xkcd: Los Alamos

What does “precision” mean in this context? Why exactly does adding 1 to that number yield the same number rather than the number plus 1?

If you read Turing’s early papers on effective computability they only really make sense when you remember that computers in the 1930s were generally women with maths and physics degrees.

Perhaps off-topic, but ENIAC has a pretty tortuous claim to the “first computer”. If Turing completeness is the qualifier, then Zuse’s computers were the first. Otherwise, if influence over all following computing hardware is the qualifier (which the author in that article seems to use to disqualify Zuse’s computers) then the Manchester Baby is the first, in that it was the first computer that was actually commercialised, and the direct ancestor of all mainstream computing hardware today, in the Ferranti Mk 1. To describe the ENIAC as the first computer means you strangely promote some aspects of its design to undue prominence (like the fact it was electronic) whilst ignoring the fact it didn’t have the ability to store programs (unlike, again, the Manchester Baby).

My father-in-law was a surveyor before calculators. He eventually became a landscape architect and taught at one of the best faculties in Canada. Among other things, he swore that his experience surveying made the mathematics much easier - visualizing the formula as a surveying field allowed him to have a range of expectation, greatly reducing his errors.

He was famous for insisting on ‘no calculators, show your work’ in the first couple of exams. In his day, it was done with a pencil, paper, slide rule and maybe a book of tables.

When he was the surveyor for an archaeological expedition to Spain in '68, he broke down and bought his first electronic calculator. It was huge, expensive and delicate, but it could do square roots!!

He has found the same thing that others have mentioned - errors that are due to having no concept of the scale of the problem, stemming from a blind acceptance of what the calculator says.

I should also mention - in the ‘Horatio Hornblower’ series by C. S. Forester, and also in the Aubrey/Maturin series by Patrick O’Brian, much is made of the heroes’ mathematical ability. Hornblower would seem to be naturally gifted, whereas Jack Aubrey earned his sense of math through sweat and tears, and through the help of Queenie, his governess, if I remember correctly.

Not Heinlein, but EE “Doc” Smith definitely had a scene in the Lensman series in which a crewmember whipped out his slide rule to plot a course to the next galaxy.

I’m pretty sure the novel that people are sort of remembering is “Starman Jones”.
A decent story, but badly dated due to the main plot device involving manual calculations and math tables.

IIRC they used human astrogators who calculated with triple redundancy, with a vote-out-the-odd-result error-catching mechanism. Unfortunately for them, two officers made the same mistake on the jump outward and outvoted the correct answer.

The titular character needed to calculate an inverse for the previously botched jump calculation without any of the original paperwork, and also without their 10-digit log/trig tables which had been destroyed.
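That vote-out-the-odd-result scheme is simple to sketch (hypothetical Python, not anything from the novel), and the sketch makes the failure mode obvious: two identical wrong answers happily outvote one right one.

```python
def vote(a: float, b: float, c: float, tol: float = 1e-9) -> float:
    """Accept a value only when at least two of three independent
    calculations agree to within tol; otherwise force a recalculation."""
    for x, y in ((a, b), (a, c), (b, c)):
        if abs(x - y) <= tol:
            return x
    raise ValueError("no two results agree; redo the calculation")

print(vote(42.0, 42.0, 41.0))  # the odd result, 41.0, is voted out
# But vote(41.0, 41.0, 42.0) returns 41.0 even when 42.0 was the
# correct answer -- two officers making the same mistake wins the vote.
```

The scheme catches independent random errors well; what it cannot catch is a correlated error, which is exactly the plot point.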

But if we’re trying to track something across the length of the universe to the width of a proton, we’re going to have a limit in the precision of our instruments. Yes, we can have arbitrary precision in our value of pi. But we only have so many significant figures in our telescopes and other apparatus. So having more precision in our value of pi gets us nowhere.

So, if someone asks us the circumference of a circle that’s 1 mile in radius, how many digits of pi do you need? Well, how accurate was that 1 mile radius? If you’re only given 1 significant figure–1 mile–then the most accurate you can be is to say that the circumference is 3 miles. If they tell you it was 1.0 miles, then you can say 3.1 miles. If they say 1.00 miles, you can say 3.14 miles.

So how accurate is your measurement of the radius of the known universe? It has at most 1 or 2 significant figures.

And this is the reason more than a couple of digits of pi don’t help. Your calculation is only as good as your worst measurement, and increasing precision on the other measurements doesn’t increase the precision of your calculation.

When the Hunt-class destroyers began coming out of the yard in 1940 their stability proved to be much less satisfactory than the calculations had predicted. The calculations in those days were long and tedious, and (as was normal) a check set had been performed independently, with the two answers only being compared at the end. Investigation showed that both sets contained an identical error, and it became clear that one man had cribbed his figures from the other.

Floating point numbers are inherently stored in something resembling scientific notation, with a fixed number of digits. So 1.000000e7 + 1.000000e0 should be something like 1.0000001e7. But the computer doesn’t store that many digits, so it rounds it to 1.000000e7, which, after all, is almost the same thing, right? But it’s the same number as we had before we did the addition.
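You can watch this happen in any language with IEEE doubles; here's the Python version:

```python
import sys

x = 1.0e16                     # past the ~16 significant digits a double holds
print(x + 1 == x)              # True: the added 1 is rounded away
print(2.0**53, 2.0**53 + 1)    # both print 9007199254740992.0
print(sys.float_info.epsilon)  # ~2.22e-16: relative spacing of doubles near 1.0
```

2**53 is the exact threshold: below it every integer is representable in a double, and above it the spacing between representable values is 2 or more, so adding 1 can be rounded clean away.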

Kelly Johnson and teams of hundreds sitting behind desks with slide rules and log tables and no computers created the greatest aircraft known to man, the YF-12, which went on to become the SR-71, still the fastest aircraft in the sky.