Computer Speed (4th July 2003)

Back in 1994, I remember seeing a programme celebrating the 25th anniversary of the moon landings. It went through the history of the Apollo space programme, and there was one great fact I remember.

The computers used in the Apollo spacecraft that went (or attempted to go) to the moon were less powerful than the computers in our cars today. (And don't forget that claim was made almost 10 years ago.)

Can this really be true?

Yes. Computers are so commonplace now that people have a distorted notion of just what the onboard computers were required to do on Apollo. You think of running Windows and Internet Explorer and Microsoft Word, and wonder how to get by on less than 100Meg of RAM.

But see, the Apollo onboard computers were very simple machines. They did simple computations for preprogrammed tasks.

For an overview of this topic, I refer you to Clavius, a web site devoted to addressing myths about Apollo and combating the Apollo-was-a-hoax crowd. Further information can be gleaned by coming over to http://www.badastronomy.com/phpBB/index.php , where you can ask away and get directed to lots of information. The Clavius webmaster hangs out there, along with a bunch of other informed people.

From memory, I think they were 4K machines - there were 5 of them. Neil Armstrong made a comment once along the lines of
“… they cost the budget of a small nation, they had more security than the president, (some other thing I don’t remember) and we were proud of those little suckers.”

The average desktop PC can be linked with hundreds or even thousands of similar machines to produce results that exceed those of today's supercomputers. SETI@home produces over 59 teraFLOPS using extra cycles on users' machines to process data from Arecibo. The Indian government used a few hundred PCs running in parallel to do the modelling for their atomic bomb.
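The trick that makes this kind of thing work is that the data splits into completely independent work units, so the machines never have to wait on one another. Here's a minimal Python sketch of the idea; the analyze() function and the fake work units are invented purely for illustration (real projects ship out chunks of telescope data instead):

```python
# Minimal sketch of "embarrassingly parallel" number crunching, the idea
# behind SETI@home-style distributed computing: split the data into
# independent work units, let each worker crunch its own units, and just
# collect the results at the end. analyze() and the fake work units are
# stand-ins invented for this example.
from multiprocessing import Pool

def analyze(work_unit):
    """Stand-in for real signal processing: here, just a sum of squares."""
    return sum(x * x for x in work_unit)

if __name__ == "__main__":
    # Pretend each work unit is a chunk of recorded samples.
    work_units = [[float(i + j) for j in range(1000)] for i in range(100)]

    with Pool() as pool:                     # one worker process per CPU core
        results = pool.map(analyze, work_units)

    print(f"processed {len(results)} work units, total = {sum(results):.3e}")
```

Throughput scales with the number of machines because no unit depends on any other; the only overhead is shipping the work units and results around.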

I notice, however, that while Cecil listed the clock speeds for the Cray 1 and 2, he didn’t list the speed of the new “Earth Simulator parallel vector” computer. Does anyone have the clock speed specs for that one?

We built the A-bomb and sent men to the moon largely using slide rules. I was thinking just yesterday about when our biggest fear in life was global thermonuclear war and not suicide bombers. I spent a lot of time learning to use a slide rule so that I could still be an engineer when the EMP fried my trusty HP calculator.

Ranchoth

The Master did list the clock speed for the Earth Simulator:

(underlining mine)

Oh, and a link to the article is always appreciated:
http://www.straightdope.com/columns/030704.html

This lists the speed of each CPU at a measly 500 MHz. There are, however, 5,120 of them, and they can talk to each other at 16 GB/s. Thus, node to node, it's nearly four times as fast as any Pentium can theoretically talk to its own memory (4.2 GB/s, dual-channel evil PC1066 RDRAM). There are probably other reasons to go with the lower-frequency chips as well, namely that they probably have shorter pipelines and wider vector units, meaning they are less dependent on the previous instructions seen and operate on more data per clock.
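To put rough numbers on that, take the commonly published figure of about 8 GFLOPS peak per Earth Simulator processor (that figure is my assumption here, not something from the column); the rest is just arithmetic:

```python
# Back-of-envelope check on why a "measly" 500 MHz clock doesn't matter much
# when the vector units are wide and there are thousands of processors.
# The ~8 GFLOPS-per-processor peak is the commonly published Earth Simulator
# figure (an assumption here); everything else is simple arithmetic.

clock_hz     = 500e6     # 500 MHz per processor
peak_per_cpu = 8e9       # ~8 GFLOPS peak per vector processor (published spec)
num_cpus     = 5120

flops_per_cycle = peak_per_cpu / clock_hz    # ~16 FLOPs every clock tick
system_peak     = peak_per_cpu * num_cpus    # ~41 TFLOPS for the whole machine

print(f"FLOPs per cycle per CPU: {flops_per_cycle:.0f}")
print(f"System peak:             {system_peak / 1e12:.1f} TFLOPS")
print(f"Node links vs. PC RAM:   {16 / 4.2:.1f}x (16 GB/s vs. 4.2 GB/s)")
```

In other words, roughly 16 floating-point operations per clock per processor and around 41 TFLOPS peak for the whole machine, which is why the clock speed by itself tells you very little.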

When the Apollo 11 LEM was landing on the moon, the computer froze because it was overloaded with data; luckily Armstrong and Aldrin realized what the problem was, otherwise they might have had to abort the landing.

Comparisons to 1970s computers are somewhat problematic as well, because most business and commercial computing then was done on IBM mainframes that were designed for very large-scale "batch" processing. They did not really operate the same way that a PC operates. For certain types of tasks in which a whole bunch of data has to be run through a series of programs and acted on (such as in a large payroll system), even well into the 1990s, mainframe computers did a considerably better job than "client-server" systems did, because of the inherent limitations of moving data back and forth.

The Cray series of supercomputers, from a design standpoint, were aimed at increasing the number of operations per second in the processor. But as any PC user who does a lot of graphics-intensive work knows, processor speed is only part of the challenge: if you have painfully slow hard drive access, you're still going to crawl at a snail's pace.

So whether or not direct comparisons can be made comes down to what type of data you are trying to process (real-time graphics/animation, astronomical calculations, payroll checks). Unquestionably, the computing power in the average PC right now is far more advanced than it was 10 years ago. But you still wouldn't want to try printing 10,000 payroll checks every week on an HP DeskJet printer, or even a LaserJet for that matter.
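The slow-disk point is easy to see with a toy bottleneck model: throughput is capped by the slowest stage, so a faster processor buys you nothing once I/O is the limit. All the rates below are made up purely to show the shape of the problem:

```python
# Toy bottleneck model: a batch job runs at the speed of its slowest stage,
# so a faster CPU buys nothing once the disk (or the network, in a
# client-server setup) is the limiting factor. All numbers are invented
# for illustration.

def batch_time(records, cpu_recs_per_sec, io_recs_per_sec):
    """Seconds to push `records` through a job whose stages overlap;
    the whole thing runs at the rate of the slower stage."""
    bottleneck = min(cpu_recs_per_sec, io_recs_per_sec)
    return records / bottleneck

payroll = 10_000  # say, 10,000 payroll records a week

slow_cpu_fast_disk = batch_time(payroll, cpu_recs_per_sec=2_000, io_recs_per_sec=5_000)
fast_cpu_slow_disk = batch_time(payroll, cpu_recs_per_sec=50_000, io_recs_per_sec=500)

print(f"slow CPU, fast disk: {slow_cpu_fast_disk:.0f} s")  # CPU-bound: 5 s
print(f"fast CPU, slow disk: {fast_cpu_slow_disk:.0f} s")  # I/O-bound: 20 s
```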

Actually, the guy at MIT Lincoln Labs who wrote the software figured out the alarm was not critical, and told Houston that they didn’t have to abort. He was quite the hero at the time.

True. Computer design today involves understanding the applications you are targeting and designing to eliminate the bottlenecks that keep you from getting good results on standard benchmarks, which educated buyers care about far more than clock speed. Inside the processor, the amount of cache has a big impact on performance; outside it, the interconnect does. Old-style supercomputers were almost easier to build, since they did small amounts of I/O. Designers could focus on parallelism and clever instruction scheduling, both of which are used in your standard microprocessors today. They also ran one thing at a time - your standard PC runs a lot more instruction streams than a Cray ever did. The wave of the future is building multiple CPUs on a chip, each able to handle multiple threads of instructions. That, and adding big caches, is a far easier way to spend all the transistors we have now than adding features.
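The cache point can be made concrete with the textbook average-memory-access-time formula, AMAT = hit time + miss rate × miss penalty. The cycle counts below are ballpark illustrations, not specs for any real chip:

```python
# Average memory access time (AMAT): the textbook way to see why cache size
# (via hit rate) matters so much. AMAT = hit_time + miss_rate * miss_penalty.
# The cycle counts are rough, illustrative numbers, not any real CPU's specs.

def amat(hit_time_cycles, miss_rate, miss_penalty_cycles):
    return hit_time_cycles + miss_rate * miss_penalty_cycles

hit_time     = 2      # cycles for a cache hit
miss_penalty = 200    # cycles to go all the way out to main memory

for miss_rate in (0.10, 0.05, 0.01):
    print(f"miss rate {miss_rate:4.0%}: AMAT = {amat(hit_time, miss_rate, miss_penalty):5.1f} cycles")
```

Cutting the miss rate from 10% to 1% drops the average access from 22 cycles to 4, a far bigger win than a modest clock bump, which is why spending transistors on cache pays off.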

For anyone interested, here's the transcript of the Apollo 11 landing, with debrief commentary from various people (including Armstrong and Aldrin).

http://www.hq.nasa.gov/office/pao/History/alsj/a11/a11.landing.html

The program alarm kicks off at 102:38:26. Note there’s also discussion about the concerns over fuel quantity.

More on Apollo computers:
Apollo Lunar Landing Guidance and Control
http://apollo.spaceborn.dk/

MIT History page on the Apollo Guidance Computers
http://hrst.mit.edu/hrs/apollo/public/index.htm

Apollo Saturn Reference pages:
http://www.apollosaturn.com/asnr/tablecon1.htm
This might also prove informative, especially the Guidance and Navigation page.