Could 1960s scientists reverse-engineer a modern desktop?

Amen.

Well, a computer scientist from the 60s would know what a modern computer is, and would likely know exactly why he couldn't take it apart: the problems that have been solved in the last forty years were known problems at the time; e.g., more powerful and smaller processors and memory systems.

Nerdy Nitpick…Yorkfield is pretty old now. Intel has done two die shrinks since then (and four architectures…Nehalem, Westmere, Sandy Bridge and now Ivy Bridge)…Westmere and Sandy Bridge were 32nm and now we’re at 22nm with Ivy Bridge.

To the OP:

Maybe call yourself Gary Seven.

They’ll get that.

Yes and no. Conceptually, I’m sure a savvy 60s engineer or scientist could point to parts of the modern PC and give a pretty good guess at what the subcomponents were, but the reality of an object the size of a modern desktop (or better yet, a notebook) doing 10,000 or 100,000 times more computational work per second than a 60s computer the size of a garage would be pretty fucking magical to them.

Regardless of their imaginations, 60s engineers would have had no idea of the degree to which electronics would become integrated and miniaturized. It’s beyond any science fiction a person could have imagined. I’m 54 and reasonably familiar with PCs on a hardware basis, and modern electronics is still pretty damn magical to me.

So, it seems the people here may be upset to know that a good portion of the power grid in the US is run on Automatic Generation Control. And that the inputs for AGC are calculated (via relatively complex calculations, for some value of complex) in Excel and exported via VBA (of course, the inputs for Excel come from a SQL server, but they’re imported into Excel via VBA too, and were probably loaded into the database from another Excel spreadsheet in the first place). C’est la vie.
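For anyone wondering what that round trip even looks like, here’s a minimal sketch of the shape of it. This is purely illustrative: sqlite3 stands in for the real SQL Server, a plain Python function stands in for the spreadsheet math, and every table, column, and number is made up.

```python
# Rough sketch of the pull-compute-push round trip described above.
# sqlite3 stands in for the actual SQL server; agc_adjustment() stands in
# for the Excel/VBA calculation. All names and values are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE unit_telemetry (unit TEXT, output_mw REAL, setpoint_mw REAL)")
conn.executemany(
    "INSERT INTO unit_telemetry VALUES (?, ?, ?)",
    [("unit_a", 480.0, 500.0), ("unit_b", 312.5, 300.0)],
)

def agc_adjustment(output_mw, setpoint_mw, gain=0.5):
    """Stand-in for the 'relatively complex' spreadsheet calculation."""
    return gain * (setpoint_mw - output_mw)

# Pull from the database, compute, and "export" the result downstream.
for unit, output_mw, setpoint_mw in conn.execute("SELECT * FROM unit_telemetry"):
    print(unit, round(agc_adjustment(output_mw, setpoint_mw), 2))
```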

I’m curious why you think we’d have to just assume that it would work with 1960s electricity. It’s the same voltage, frequency, etc. In many cases it’s the same exact wires, and in many cases the same generating plants. 1960s electronics, etc. will still run off of today’s outlet power and I see no reason to think that the opposite wouldn’t be true. There might be some issues with polarization, but that’s the local building wiring, not the electricity, and as far as I know it’s an issue of safety rather than functionality.

Maybe someone more knowledgeable will tell me I’m wrong.

In general, frequency was considerably less stable back in the day, and computers really don’t like that. In fact, it’s the main reason that frequency controls are much better today (they have to be so that computers function well). But I doubt that it would be that big of an issue.

But is frequency all that perfect even today? I’d think any modern circuitry would be designed not to depend on stable wall socket frequency, wouldn’t it? Doesn’t it all depend on quartz oscillators these days (and in the 60s for that matter)?

Looking at the back of a 5-10 year old desktop computer (it was the easiest to access), it will take 100 to 127 or 200 to 240 volts, 50/60 Hz. I’d be more worried about how clean the power was than about how stable the frequency was. I have no idea how voltage spikes or whatnot compare between now and then, though.

All modern power supplies support frequencies of 50-60 Hz and voltages from around 90 to 240 V. There is no conceivable reason that a computer from today wouldn’t work on power from the 1960s (although you might not be able to plug it in directly - grounded outlets were just being adopted in the 1960s).

This makes no sense at all; my computer’s power supply says 47-63 Hz at 85-264 VAC, and even 20+ year old computers (that I have seen) usually had power supplies that could accept 115/230 volts at 50/60 Hz (usually with a switch; nowadays you only need the right plug for universal-input supplies). Maybe you meant digital clocks, but back then those were synchronous motor clocks, which do need a stable frequency (computers use a quartz crystal for their RTC).

Even the grounded-outlet issue could be bypassed with an adaptor or by just cutting the ground prong off. The case would then be slightly live due to the noise-filtering capacitors connected between the mains and ground, but not dangerously so unless a short occurred; this is why touching some 2-wire equipment may give you a tingle. (It might cause some issues with electrical noise, though.)

If you look more carefully, you will find that the voltage selection requires a manual switch setting (2 positions, maybe more), but the frequency isn’t all that important. Neither is the stability of the waveform; if you are driving the PC from some kinds of UPS, your waveform doesn’t look like a sine wave anyway. There’s a lot of leeway here.

Someone (Analog Magazine?) wrote a similar article about “what if we sent a drone aircraft in 1970 to look at Chinese nuclear tests and it accidentally got sent back to 1939, before WWII?”

He suggested how anyone would try to estimate the year it came from - a ramjet engine is just a tube, so how does that work? It appears to pilot itself, and it attempted to land at a remote navy base in the Philippines. At the time they did not have sufficiently precise measurement tools, so the doped silicon chips would simply look like incredibly pure silicon, but they were obviously electrical, electronic something. Then they’d find the whole thing was radioactive! WTF was that all about?
So, for the OP’s computer question… In 1963…

The hardware would be so many generations of incremental improvement ahead that they would not know how to duplicate it. They would likely recognize the hard disk concept; it was around then. But I remember taking the lids off 3330 cake trays; the concept of a clean room to handle that sort of disk? It would be garbage if they opened it.

Would they be able to record and analyze signals on the wire at 3 GHz? I don’t know. I doubt it. All they could do is figure out the frequency and spend years (wasted time) trying to build tools to analyze it.

I’m trying to think how much detail they could get out of it, if it did not come with a programming language. The Visual Basic embedded in Excel probably provides a faster computer than what most of them would have to work with.

(My dad told me of his early programming experiences. On one computer, the RAM was a drum with several dozen magnetic heads on it. Each instruction included the address of the next instruction. It was hand-coded machine code. Part of the fun puzzle was to look up how many clock cycles the instruction took and find an unused cell on the drum just past that latency; otherwise, your program could not execute the next instruction until a full revolution of the drum brought the next instruction’s cell into position to be read. Ah, minicomputers of the day…)
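Just for fun, here’s a toy sketch of that placement puzzle. None of the numbers correspond to any real machine; it only shows the idea of skipping ahead by the execution latency and grabbing the first free cell so you don’t wait out a full revolution.

```python
# Toy illustration of "optimum" instruction placement on a drum machine:
# put the next instruction in the first free cell the drum will present
# just after the current instruction finishes executing.
# Cell count and timings are made up for illustration.

DRUM_CELLS = 50  # hypothetical number of cells per drum track

def best_next_address(current_addr, exec_word_times, used_cells):
    """Return the free cell reached soonest after execution completes."""
    # The drum keeps rotating while the instruction executes, so the
    # earliest usable cell is exec_word_times past the current one.
    for extra in range(DRUM_CELLS):
        candidate = (current_addr + exec_word_times + extra) % DRUM_CELLS
        if candidate not in used_cells:
            return candidate
    raise RuntimeError("drum track is full")

# Example: instruction at cell 7 takes 4 word-times; cells 11 and 12 are taken.
used = {11, 12}
print(best_next_address(7, 4, used))  # -> 13, instead of waiting a whole revolution
```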

If they realized that the thing included tens of billions of flip-flop circuits, talked in gigahertz frequencies, and had storage of tens of trillions of bytes…

IIRC an IBM 3330 disk held 30 MBytes, and that was a big breakthrough in the late 60s or early 70s. A co-worker of mine from then said he learned to program in assembler because the mainframe he worked on simply could not compile his more complex COBOL programs in 40K of RAM.

Remember, RAM was called “core” because each bit was a tiny magnetic donut threaded by hand in Far East factories (back when we exported the difficult, tedious factory work overseas to Asia).

May I just say that this has been one of the most interesting threads in a while - I’ve enjoyed reading it and learning, as this is an area I know nothing about. Thanks to all who contributed.

Not immediately; while the drive head/media interface would eventually get gummed up by dust, it would take quite a while. A modern hard drive will often work fine for an extended period of time with the top taken off.

Really? The bits I’ve read suggested that even a particle as tiny as a smoke particle could cause a head crash; the layer of air the head floats on is that thin, and has been for more than a decade. The head needs to be closer to the recording surface than the length of 1 bit, obviously, and there’s no point in making the recording surface thicker than half a bit. That makes a very fragile arrangement, unless your room is incredibly clean or you are tolerating quite a bit of data loss before the drive is considered toast.
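To put rough numbers on that, here’s a quick back-of-the-envelope check. Every figure is a ballpark assumption for illustration, not a spec for any particular drive.

```python
# Ballpark sanity check of the "bit length vs. fly height vs. dust" reasoning.
# All numbers are rough assumed figures, not specs for any real drive.

areal_density_bits_per_in2 = 500e9   # assume ~500 Gbit per square inch (circa-2012 ballpark)
in_to_nm = 25.4e6                    # nanometres per inch

# Treat a bit as roughly square to get a crude linear size for one bit.
bit_length_nm = in_to_nm / (areal_density_bits_per_in2 ** 0.5)

fly_height_nm = 10        # assumed head fly height, order of magnitude only
smoke_particle_nm = 300   # a small smoke particle, roughly 0.3 micron

print(f"bit length  ~ {bit_length_nm:.0f} nm")
print(f"fly height  ~ {fly_height_nm} nm")
print(f"smoke speck ~ {smoke_particle_nm} nm")
# The speck is an order of magnitude bigger than the head-to-platter gap,
# which is why an open drive in a dusty room is living on borrowed time.
```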

Of course, most of the core OS would be loaded into RAM, so it would chug away for a while until it finally stopped with too many disk errors; plus, if you load too many programs, the swap-file activity would eventually cause a crash from disk errors.

I doubt that they would ever reverse-engineer it completely (at the point where they finally have the ability to do this, it would be a pointless task), but there’s a great deal that could be learned quite quickly from studying such a device. Probably the microprocessors would have to be considered black boxes for a long time, but just studying the circuit board layout and components on the board would give a huge head start in electronics.

As for helping Apollo, I’m on the side of it wasting rather than saving resources overall. I’ve no doubt that engineers would learn VB or whatever very quickly (if I learned BASIC without a manual, they sure could). But the timeline for Apollo was very tight, and I don’t think the computer would be useful enough to justify the time spent investigating it.

Nobody’s talking about the SOFTWARE on the computer.

Just viewing a 2012 user interface would be a mind-blowingly huge leap in computer/human interaction design, and those benefits could be translated/retrofitted to 1960s computers in only a few years. They couldn’t replicate its functionality, but they could replicate its basic design: on-screen objects, WYSIWYG interfaces, direct manipulation with a mouse or trackball, etc.

Even considering the “high priesthood”-like mentality I’m sure 60s computer experts had, I think that would be hugely significant.