Could 1960s scientists reverse-engineer a modern desktop?

It’s been said that a modern cell phone has more processing power than the computers that helped send man to the moon.

Say I want to help out the Apollo program, so in 1961 I drop off my desktop computer at Cape Canaveral (let’s say it’s a decent one). Would I be doing them a huge favour, or would they have no clue what to do with it? Are there elements that would be completely alien to them?

There’s no way they could duplicate the microprocessor or memory. The entire infrastructure of IC manufacturing is far too primitive. The lithography equipment of the time could only image features 1000x bigger than required. There was no way to do ion implantation, epi layers or low-k dielectrics. Even the raw silicon wafers would not be high enough quality.

It would be like giving King Arthur a 2012 Ford Fusion and expecting him to be able to have it duplicated.

Oooo… now that’s a good one.

The Apollo Guidance Computer used discrete electronics plus thousands of copies of a single, very simple integrated circuit (a three-input NOR gate), so the concept of a chip would have been understood. However, that chip was about as primitive as an IC can be and still be an IC.

The microelectronics in any modern computing device would be very difficult for anyone of that era to analyze. Current technology uses 45-nanometer features for the circuitry (as in Intel’s quad-core Yorkfield). Features that small might not even have been resolvable with the instruments of 1961. They are well within the range of a modern scanning transmission electron microscope, but I don’t know whether that kind of resolution was available back then.

ETA: Yeah… what beowulff said.

How much could they use Excel or a similar program to speed up their calculating?

Assume I dropped off MS Excel for Dummies, a Visual Basic instruction book, and made sure that internet access wasn’t needed to get into the help files.

How would they know it could be trusted?

On a tangent, let’s assume that they realise that the desktop is beyond their ability to reverse-engineer in a cost-effective and timely fashion, and let’s assume that it works with 1960s electricity - could they use it to speed up the Apollo programme? With one single albeit very powerful desktop computer, what would it help with the most? Rocket design, trajectories, simulations? Presumably it would speed up the maths involved in all of this immensely.
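To give a flavour of the kind of number-crunching involved, here’s a toy sketch (modern Python, purely illustrative — nothing Apollo-specific about the numbers) of the simplest possible trajectory calculation: a drag-free ballistic arc stepped forward numerically, then checked against the closed-form answer. This is exactly the sort of grind that rooms full of people with desk calculators used to do by hand.

```python
import math

def simulate_range(v0, angle_deg, dt=1e-4, g=9.81):
    """Integrate a drag-free projectile with simple Euler steps
    and return the horizontal distance travelled."""
    theta = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(theta), v0 * math.sin(theta)
    while True:
        x += vx * dt
        y += vy * dt
        vy -= g * dt
        if y <= 0.0:          # back down to launch height
            return x

# Closed-form range, v0^2 * sin(2*theta) / g, as a sanity check
# on the numerical answer.
v0, angle = 300.0, 45.0
numeric = simulate_range(v0, angle)
analytic = v0**2 * math.sin(math.radians(2 * angle)) / 9.81
```

A modern desktop runs the half-million steps above in a fraction of a second; real mission work (drag, staging, a rotating Earth, three-body gravity) has no closed form at all, which is where the raw speed would pay off.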

And many centuries from now our machine overlords would have a legend of The One, the great computer mother that fell from the Moon to Earth, where it catalysed the fleshy ones to send it back home.

They might be able to figure out what the chips did, in a general way, but just having the chips wouldn’t give them any information at all about how to manufacture them. It might have spurred the invention of the microprocessor earlier, but it still wouldn’t have made it any more powerful than whatever IC manufacturing technology was available at the time.

Giving 1960s NASA a modern computer would at the very least set the Apollo program back, and would probably destroy it.

Why? Because it would siphon valuable resources away from the moon launch and into reverse-engineering the computer. The computer might be deemed so important that the program would be canceled entirely (remember, this was the middle of the Cold War, and any edge over the USSR was valuable).

When I was in the semiconductor industry, a colleague of mine commented that the best way to sabotage your competitor was to leak your process information to them (typically, these are very closely-guarded trade secrets). His reasoning was that they would devote manpower and treasure to analyzing your process without gaining any real benefit - most companies get the same results with different process steps; they just do what works for them. Your competitors would try to figure out why you did what you did, when the real reason is simply that that’s what works for your particular set of equipment.

I have to agree with beowulff; even if they were prescient enough to realize they couldn’t feasibly reverse-engineer it, they’d still sink too much time into figuring out how to use it versus the tools they already had in place. You’re not going to go from hand-entering binary words into registers to writing C# code in Visual Studio overnight.

One of the biggest issues is that the only computer capable of running the tools they’d need to understand modern hardware is the one that they’re trying to take apart.

It can’t be. I’ve been burned more than once by engineers using Excel as a calculation/simulation tool only to find serious calculation and rounding errors. Excel is fine as a bookkeeping tool, but when it comes to performing real engineering and statistical calculations one needs to use the proper tools, e.g. Matlab, NumPy/SciPy, Mathematica, et cetera.
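For what it’s worth, the root cause of many of these surprises isn’t unique to Excel: any tool built on IEEE 754 binary floating point represents most decimal fractions inexactly, and the errors accumulate. A minimal Python illustration (the same arithmetic any spreadsheet does under the hood):

```python
from decimal import Decimal

# 0.1 has no exact binary representation; the stored value is
# slightly off, and repeated addition accumulates the error.
total = 0.0
for _ in range(10):
    total += 0.1

print(total == 1.0)      # False on IEEE 754 doubles
print(Decimal(total))    # shows the exact value actually stored
```

The proper tools don’t escape binary floating point either — they just use numerically stable algorithms and make the error behaviour documented and predictable.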

Engineers in the 'Sixties would have been most impressed with the graphing and visualization functions, though. No more plotting by hand or programming the primitive pen plotters of the day; just highlight or name the data, label the axes, and plot. Wrong scale, need different colors, or want to move the legend? No problem! Adjust the plot command and replot. Ten seconds, no fuss. Now if only they had a LaserJet…


This is a critically important and often-forgotten fact. Excel is not data-processing software. It gets complex calculations wrong. Every time a scientist or engineer uses Excel to present data, a fairy dies.

I suspect it would make little difference. For a number of reasons.

Once someone worked out what it was, it would be whisked away so fast no-one in NASA would ever realise it was there, and vanish into the bowels of the NSA (or its predecessor) to spend the rest of its operational life cracking codes, and maybe helping design nukes.

Even if NASA were able to keep it, it would take years to develop useful software to run on it. Excel isn’t some magical beast; you need to understand serious mathematics and engineering to develop the algorithms. There was incipient understanding at the time - Fortran already existed and there were programmers doing real work in it - but creating software that could actually harness the power of a modern machine would take years no matter what. Indeed, this is the story of modern computing: algorithms - and, in engineering, the validation and testing of implementations - take a lot of effort. For most people, processor power has vastly outstripped their ability to make use of it.

Forget Excel. That isn’t a useful tool. A copy of Matlab or NASTRAN on the other hand would be useful. But the lead time to develop the skills to use them would see you most of the way through the program before you saw any contribution.

Most of NASA’s engineers were aeronautical, mechanical, or electronic - and they were engineers, not scientists. Most of them would have very little useful expertise to throw at the problem of understanding or using the machine. It would go to MIT or somewhere similar, where scientists would investigate it very carefully, and everyone at NASA (and at the contractors, who did the lion’s share of the engineering for Apollo anyway) would get on with their jobs.

But the biggie: about the time someone realised that this device had come from the future, all bets would be off. Someone has a time machine. And they have it only a few decades from now. Whoo hooo!! Brits, Ozzies, and US sci-fi fans will know what the Torchwood Institute was set up for. About ten seconds after you send the machine back in time, the black helicopters will arrive. Then you won’t have a time machine any more.

Once they discovered the partition with the porn files, work would grind to a halt.

Marketing aside, most improvements in computer hardware have been evolutionary rather than revolutionary. Each improvement was achieved by refining and shrinking the prior generation’s design, not by some brilliant new insight.

There may have been some brilliant insights in the manufacturing processes that achieve the efficiency and power of today’s chips, but those wouldn’t help the poor folks back in the ’60s.

I am interested in more information on this subject, and a Google search is failing me. What are the limits of Excel? Any good concrete examples of miscalculations?

google for 850*77.1
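For context, 850 × 77.1 should be exactly 65535, but 77.1 can’t be represented exactly in binary floating point, so the computed double lands just below 65535 — and in Excel 2007 that particular bit pattern famously tripped a display bug that showed the result as 100,000. The underlying arithmetic is easy to see in Python (the display bug itself was Excel’s, not the math’s):

```python
# 77.1 is not exactly representable as a binary double, so the
# product misses 65535 by a few trillionths.
x = 850 * 77.1
print(x == 65535)   # False: the stored double is just below 65535
print(65535 - x)    # a tiny but nonzero difference
```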

Edit: I read this blog post.

That is interesting, but my Excel sheet returns the proper result when I test it. I understand that any software can have occasional bugs, but the sentiment upthread seemed to be that Excel is/was completely unreliable for scientific research. Is that true, or merely an overblown reaction to a few bugs?

Take a look at this presentation for some of the issues with statistics in Excel as of about ten years ago. I haven’t seen a more recent version, but I’m skeptical that it’s gotten significantly better.

Given that the problem was discovered in 2007, I would presume that MS has patched the software in the intervening half decade.

I think the “overblown reaction to a few bugs” is fair, considering the stakes. If you can’t trust an application that does math to do math right 100% of the time, you shouldn’t trust it any of the time when something like, oh, people’s lives are in jeopardy. The response “oh, but they fixed that bug” only raises the questions “yes, but what bugs have we not found yet?” and “yes, but what bugs did the bugfix introduce?”