I once read somewhere that the first Apollo mission of NASA used as much computing power as a standard 486 (which I think was a 486/66 at the time). Science geeks! Is this feasible? I mean, could an entire Mission Control from the mid-60's be put into a Compaq Presario at one time?
No. The Apollo computers were less powerful than a Palm Pilot: http://www.jimthompson.net/pilot/power.htm
Of course, as this link points out… that's not the whole picture… there was specialized software and instrumentation… so it might be comparing apples to oranges.
“The latest technology – like a computer that can fit into a single room and hold millions of pieces of information.”
– Tom Hanks as Jim Lovell in Apollo 13
I'm with beagledave … it's difficult to compare. A lot of the equipment on the space missions was designed to do one thing only. Reprogramming would be accomplished by replacing circuitry.
It’s sort of like comparing a modern word processor to a typewriter - in the olden days, you could write a whole page of text with zero processing power. If you wanted to change something, you’d use white-out.
That’s one of the things that’s weird about analog computers. They were vastly less flexible than a digital computer, but they could get by with a lot less “thinking” than a modern computer. That’s why in the old movies, the computer has tons of knobs and levers where today we could get by with a single keyboard - the people were adjusting potentiometers, opening and closing electrical circuits, and dialing in numbers.
Well, if you're just comparing mathematical computing power, a cheap pocket calculator is more powerful than the Apollo computers. If you factor in the "Apples and Oranges Effect", anyway.
Around the time of the Apollo missions, the smallest home calculator weighed around 2 pounds, ran off 110-volt current, and had no LCD display but used tape. I don't think Pong, the first commercial video game, was even around then. There were no home computers. I recall having a TV with tubes in it and, when it acted up, pulling the ones that looked bad, going to the 7-11, where there was a tube tester, checking them out and replacing them.
Transistors were around, but I don’t recall if microchips were.
Those of you raised in the era of home PCs are used to having massive amounts of computing power in a palm calculator, while those of us who are older recall cranking the handle of a mechanical adding machine and doing the old 'tic - tic - tic - ka-ching' of a cash register. There was, or is, a massive 'thing' of a computer somewhere, in a college I think, made up only of rods, gears, cams and electric motors that took days to set up to do one math problem.
It was considered state of the art. It takes up most of a room and is impressive as heck, but the calculator in your wrist watch is 100 times better and at least 1000 times faster. Basically, we got to the moon using pencil and paper to do figures on in comparison to today.
I got a Commodore 64 as my first computer. It had no hard drive, only twin disk drives and big floppies. There was no Microsoft and few programs available. No Internet. No modems. My first printer was an Okidata, a tape-run model that was slow. Reeeeaaaaalllll slow. The color cartridge was a tape cassette as big as my hand with 3 colors on it, one after another, and it printed by laying one color over the other. That process took nearly forever.
Only one program at a time could be run, not like today on my current system. Plus, if you needed help, you couldn’t just go to the nearest computer shop and ask. You had to call the program maker.
kids.
When I went to school, I had to walk 5 miles…uphill…both ways…in the snow…with no shoes
Unless I’m mistaken, the most powerful analog computer is the humble slide rule. Think about it for a moment. A slide rule (in the hands of a competent person who can keep track of where the decimal point is) is capable of multiplying any two rational numbers. ANY. Calculators can’t do that, if the decimal point is too far from zero. But a slide rule can. On the other hand, the slide rule will start rounding after a few significant digits. But at least it will always give some sort of approximation. The calculator is usually ok, but will poop out with very large or very small numbers and fail to give any answer at all.
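For the curious, the trick is logarithms: the scales are laid out so that sliding one distance against another adds log(a) + log(b), which is why the size of the numbers barely matters but the precision tops out at a few digits. A tiny sketch of just that underlying math (the function name and the 3-digit rounding are illustrative assumptions, not how any particular rule is calibrated):

```python
import math

def slide_rule_multiply(a, b, sig_digits=3):
    """Multiply two positive numbers the way a slide rule does:
    add their logarithms, then read the product back off the scale.
    Rounding to a few significant digits mimics reading a physical scale."""
    log_sum = math.log10(a) + math.log10(b)    # sliding one scale along the other adds the logs
    product = 10 ** log_sum                    # reading the answer off the result scale
    return float(f"{product:.{sig_digits}g}")  # only a few digits survive

print(slide_rule_multiply(3.14, 2.72))         # ~8.54, good to about 3 digits
print(slide_rule_multiply(6.02e23, 1.6e-19))   # ~9.63e4: huge and tiny numbers still work
```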
HIJACK ALERT!
I’ve long had this “thing” about analog vs. digital in many many areas. Analog has a bum rap, a very bad and old-fashioned reputation, when in fact it comes out ahead in many ways that we don’t see.
For example, a digital watch with a seconds display seems to be very accurate, but actually, it is correct only one instant each second. But an analog watch with a sweep second hand is always exact. (We’re presuming both were calibrated correctly at the beginning, of course.) Similarly, an electronic thermometer will round the temperature to the nearest degree, or tenth of a degree, or whatever, but an analog mercury thermometer is dead-on exact.
Yes, I’ll admit that while the analog devices are exact in theory, in practice it is difficult to figure out what the device is saying. There may not be a digit on the thermometer at the exact level of the mercury.
That’s the trade-off. Analog is more accurate, but it’s slippery and hard to get a fix on. Digital may not be exact, but it can be designed to whatever degree of accuracy you can pay for, and then it will round off the figure and plop it in your hand in an easy-to-use format.
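To put the digital side of that trade-off in concrete terms: a digital readout just rounds the underlying quantity to however many digits the device was built (or paid) for, and everything finer is simply discarded. A toy illustration (the temperature value is made up):

```python
def digital_readout(true_value, decimals):
    """Round a continuous 'analog' quantity to a fixed number of decimals,
    the way a digital display does. Finer detail is thrown away."""
    return round(true_value, decimals)

temperature = 98.63217   # the 'true' analog value the mercury actually sits at
for d in (0, 1, 2):
    print(f"{d}-decimal display: {digital_readout(temperature, d)}")
# 0-decimal display: 99.0
# 1-decimal display: 98.6
# 2-decimal display: 98.63
```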
In 1969 the entire defense department had less computing power than half the posters on this board are using right now. The ENIAC computer at the Pentagon caught fire in the sixties because the tubes in its memory "banks" got too hot for the cooling system. There were "thousands" of them, all linked in racks on metal shelves. The result was a live digital memory of 16K. Later, upgrades were installed, raising that to 64K. The upgrades were what caused the fire. That was considered huge at the time, and a great improvement over the 2K predecessors it replaced. These were the giants of the time.
Tape storage was "thrashed" in and out of live memory in huge blocks of sixty-four bits or so per operation. The sheer speed of it all was mind-boggling, achieving hundreds of floating point operations in a single second.
Double that every 18 months, up to the present (Moore's Law), and you get a figure of 16-64 terabytes for the projected computing power of the Defense Department today (live memory). Not too far off, in all likelihood. I would imagine that the NSA is a bit larger. NASA has some serious computing power as well, of course. But the same general ballparks are probably reasonable.
Tris
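Tris's back-of-the-envelope is easy to check: 18-month doublings from 1969 to the early 2000s come to roughly 22 doublings, a factor of about four million, so whatever the Defense Department had in megabytes back then lands in the tens of terabytes now. A quick sketch of that arithmetic (the start year and doubling period come from the post above; the 16 MB base is an assumed example, not measured data):

```python
# Moore's Law back-of-the-envelope, using the 1969 start year and 18-month
# doubling period from the post above; the 16 MB base is an assumed example.
start_year, end_year = 1969, 2002
doublings = (end_year - start_year) * 12 / 18     # one doubling every 18 months
growth = 2 ** doublings                           # roughly a 4-million-fold increase
base_bytes = 16 * 2**20                           # assume ~16 MB of 1969 "live memory"
projected = base_bytes * growth
print(f"{doublings:.0f} doublings, growth x{growth:,.0f}, ~{projected / 2**40:.0f} TB today")
```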
Hey, it is good enough for Russian MiGs, right?
You are right about “real time” issues – analog comes out way ahead.
A friend of mine left a technical advisory job shortly after starting on a project to update a gun targeting system on an aircraft. The system had to be able to account for the speed of the plane relative to the other plane, the distance between them, the effect of gravity on the ammo, etc. The old system was analog, and my friend told the “kids” – you have to use analog.
He checked back a few months later and found out the project had been a complete failure. No one had understood that a digital computer, no matter how fast, couldn’t solve this equation in real time. They tried to do it digitally despite his advice.
Computers have come a long way, but you guys are going a little overboard in how puny you think computers were in the 70’s.
Skribbler Said:
Integrated Circuits had been around for some time, and I think the first modern microprocessor (the Intel 4004?) was available by the late 1960’s.
This was true… in the 1930’s. The first modern digital computer was built in the 1940’s. By the 1970’s, computers as found in colleges had video display terminals, transistor RAM (often in 2K integrated circuits), etc. Our local little university had an HP2000 minicomputer, which had banks of 64K RAM, 128 video terminals attached to it, remote connections via Gandalf modems, etc. This was 1974.
By the time Apollo came along, minicomputers were already into 16- and 32-bit processing. The IBM System/370 was available, which was actually quite a powerful computer.
This is not true. Your typical wristwatch calculator probably uses a 4-bit processor with a hundred bytes of RAM or so, and is of considerably less power than even a small computer in the 1970’s time frame.
If you get into the high-powered programmable and graphing calculators of today you could make a case. I recall reading that the HP41C had roughly the same computing power as the LEM computer in Apollo. If you don’t remember the HP41C, it was actually a pretty powerful little beast.
The first 'home' computer was the Altair 8800, and in its base configuration I think it had 4K of RAM expandable to 64K, an 8-bit processor, and a bunch of I/O. That would have been around 1975. The Apple II and TRS-80 Model I came out not too much later than this (around 1977 or 1978, if I recall), and they were quite powerful.
Microsoft was already a large, established company when the Commodore 64 came out. I think the language that came loaded in the C64 was Microsoft BASIC. And a modem was an original option on the Commodore 64, and there was a thriving community of BBS’s where people did much the same things that we do on this message board. There was also a large nationwide messaging infrastructure through FidoNet and a few other networks that allowed large-scale message boards.
The Internet also existed at the time of the Commodore 64, although it was mainly available only to academia and the military.
There were multi-tasking operating systems available at that time. I had a Radio Shack Color Computer in 1982 that had 128K of RAM, an OS-9 operating system that looked much like UNIX (I even had a separate terminal connected to it so two of us could work at the same time), and the processor was a 6809E, which did mostly 16-bit operations over an 8-bit data bus. Given the non-bloated software of the time, it was plenty fast.
As for computer shops, service has actually gone down since the Commodore 64 days. ComputerLand was around then, and most computer stores were run by serious hobbyists. I worked in one, and we used to have a coffee pot going all the time, and people would come in with their problems and we’d pour a coffee and work through it, and often go much farther into details. We had free training classes, ‘workshop’ nights where people could bring their computers in, etc. This was around 1982.
I remember on the 30th anniversary specials a couple’a years ago it was stated that a new car has more computing power under the hood than the lunar lander and orbiter did. Kinda scary, if you ask me - the same amount of computing power that kept three men alive to the moon and back causes the “service engine soon” light to come on when you’re overdue for a tune up…
Well, no WWW anyway. The Internet started to be built in the 1960s.
Okay, I think I may understand.
An analog computer differs from a digital computer sort of like a slide rule differs from an abacus. Right?
I have a very general idea of how a digital computer works. Could someone give a brief, not too technical rundown on how an analog computer works?
I can try, although I suppose there are a bunch of ways to make an analog computer, since we essentially live in an analog world.
Let's say you want to build a computer that multiplies by two. You could find two metal rods that you know, within a certain tolerance, expand the more you heat them, and that these two metals expand at a one-to-two ratio. So, to multiply by two, you run equal currents through each and compare the difference in expansion. Rod one expands by x, and rod two expands by 2x. You can also solve x/2 by making the opposite comparison.
That's a simple example - and a binary digital computer could solve 2x (for a whole number x) with a left bit shift (binary "1" = 1 becomes binary "10" = 2). But solving (1/2)x is harder. If you just right-shift the bits, "10" becomes "1" and that is OK. But "11" = 3 also becomes "1", and 3/2 != 1. There are all sorts of nice workarounds for this problem in digital land, but they all have limits.
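To make the bit-shift point concrete, here is the same arithmetic in Python (any language with integer shifts behaves the same way):

```python
x = 3
print(bin(x), x << 1, bin(x << 1))   # 0b11 -> 6 (0b110): left shift doubles exactly
print(bin(x), x >> 1, bin(x >> 1))   # 0b11 -> 1 (0b1): right shift drops the low bit,
                                     # so 3 becomes 1, not 1.5 - the half is simply lost
```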
Let’s say you want to solve x squared. You could find two other metal rods that expand in this ratio. And the same analog computer could of course solve the square root of x.
Doing x squared in digital land requires a lot more steps. And a lot of digital computers sort of have the answers hard-coded as to what the square root of a number is (like a slide rule) and can use this "lookup table" to figure out the answer, but it can be imprecise.
The analog computer can give you your answer almost instantaneously and more accurately.
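Here is a minimal sketch of that "lookup table" approach: precompute square roots at coarse steps and interpolate between entries. Fast, but only as precise as the table spacing (the step size and range here are arbitrary choices for illustration):

```python
# Square-root lookup table with linear interpolation, covering 0..100 in 0.25 steps.
STEP = 0.25
TABLE = {round(i * STEP, 2): (i * STEP) ** 0.5 for i in range(401)}

def sqrt_lookup(x):
    """Approximate sqrt(x) by interpolating between precomputed table entries."""
    lo = round(int(x / STEP) * STEP, 2)
    hi = round(lo + STEP, 2)
    if hi not in TABLE:                       # x sits at the very top of the table
        return TABLE[lo]
    frac = (x - lo) / STEP                    # how far x sits between the two entries
    return TABLE[lo] + frac * (TABLE[hi] - TABLE[lo])

print(sqrt_lookup(2.1), 2.1 ** 0.5)           # close, but the table spacing limits precision
```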
You've got analog computers all over your house. To use jmullaney's example of the two metal rods, look into your fridge. Or up on your wall. That's right.
Thermostats are analog computers, of the exact kind joel brought up. You set the dial to where you want it to be (say, 64 degrees). That sets an on switch a specific distance away from its contact. When the metal inside the thermostat expands to that point, a connection is made, and your radiator turns on.
Before this degenerates into an analog appreciation thread, I'd like to say that the modern level of technology could not have come to be with analog computers instead of digital. Why not? Efficient data storage and transmission can be done digitally faster than it can be done analog, especially if you want to be able to save programs electronically (on a hard disk or the like) and transmit them electronically. In a digital system, you can compensate for lousy transmission conditions (all lines have noise) by having discrete units of data (called packets or, more precisely, octets, in networking). With a discrete unit, the receiving end can ask for a retransmission of a garbled section (if the checksum does not add up, for example) and be given just the garbled part again, without the machine having to try to rewind a tape system and play it over. And I can't imagine trying to do serious word processing on an analog system. Typewriters are antiques for a reason. Limited functionality is one of those reasons. So don't pine too loudly for analog machines of days gone by. The fancy microprocessor in your PC is very much a digital system, and owes its speed and accuracy to that fact.
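A toy sketch of that checksum-and-retransmit idea: the receiver recomputes a checksum over each chunk and, when it does not match, asks for just that chunk again. (Real protocols use more elaborate framing; CRC-32 here is just a convenient stand-in.)

```python
import zlib

def make_frame(chunk: bytes):
    """Sender side: pair each chunk of data with its CRC-32 checksum."""
    return chunk, zlib.crc32(chunk)

def accept(chunk: bytes, checksum: int) -> bool:
    """Receiver side: recompute and compare. False means 'resend this chunk only'."""
    return zlib.crc32(chunk) == checksum

data, crc = make_frame(b"only the garbled part gets resent")
print(accept(data, crc))                       # True: checksum matches, chunk accepted
print(accept(b"only the garbXed part", crc))   # False: ask for a retransmit of this chunk
```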
In which year were the most vacuum tubes manufactured?
a) 1954
b) 1968
c) 1978
d) 2000
Care to hazard a guess?
Whoa, whoa, jmullaney, I know you addressed that to Derleth and not me, but could you make the connection a little clearer? I don't know how the tube vs. transistor thing factors in, and I don't think Derleth was trying to address it in any case.
OK. But no cheating by reading ahead. The answer is d. All non-digital televisions are analog computers. They take an analog signal and, using a few magnets and a specially coated vacuum tube, turn this signal into a color picture.
Of course, he's right that digital is trying to worm its way into one of analog's last major venues, but I wish it luck. I'll gladly accept a little static in my data reception for now. I don't have 4000 bucks to blow on a digital TV.
Anyone know how reception is on a digital TV tuned to a digital signal?