Could 1960s scientists reverse-engineer a modern desktop?

I don’t think so.
There simply wasn’t enough computational power back then to waste on bit-mapped displays. Ivan Sutherland’s Sketchpad showed graphical CAD on a display as early as 1963, but there’s no way the computers of the time could have run a windowed environment for their UI.

That’s the thing about technology - when the time is right, it naturally evolves. There’s a reason the Macintosh, Amiga and Atari ST were all brought to market in the same period - the hardware could finally do what the software guys wanted at a reasonable price.

Make it a laptop, so the size and the flatscreen blow their minds.

As for the interface, IIRC Cyril Kornbluth’s 1950 sci-fi story “The Little Black Bag” described a flexible card that served as a medical ebook in a magic doctor’s bag: flip the card back and forth and the next page displays, or it searches for whatever you told it to find.

Once you fire up regedit and start poking around in the registry, all sorts of interesting things become even more mysterious. The amount of detail would be incredible. Boot it in safe mode or read the Help files. Find out that Seattle is the center of the universe in the far future. Is Google Moon installed? Why do people need to navigate the Moon, and what are they up to in 2012? Too bad this “internet” thingy shows “disconnected”.
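For what it’s worth, a modern reader doesn’t even need regedit to do that poking around. Here’s a minimal sketch using Python’s standard winreg module - Windows only, and the key path below is just one well-known example, not anything special to this scenario:

# Minimal sketch: list the values under one well-known registry key using
# Python's standard winreg module (Windows only; the path is just an example).
import winreg

path = r"SOFTWARE\Microsoft\Windows NT\CurrentVersion"  # holds ProductName, BuildLab, etc.
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path) as key:
    num_subkeys, num_values, _ = winreg.QueryInfoKey(key)
    print(f"{num_subkeys} subkeys, {num_values} values under {path}")
    for i in range(num_values):
        name, value, value_type = winreg.EnumValue(key, i)
        print(f"  {name!r} = {value!r} (registry type {value_type})")

Run that and you get pages of cryptic strings - which, to a 1960s scientist, would be exactly the point.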

One of my favourite rants is about how little of real importance in computing is actually new. If you looked back to 1962 you would be really surprised at the sophistication and imagination of the pioneers. They had a very clear understanding of the potential, and were limited only by the hardware.

Software has not progressed anything like as much as hardware has. We like to say “there is no such thing as Moore’s Law for software,” and there really isn’t. We get generation after generation of new programmers who seem to think that cool new software technology keeps arriving. It doesn’t. There are add-ons, and increased capabilities that become possible because the machines get faster, to the point where things that were infeasible before become commonplace. The number of genuinely new, innovative ideas is depressingly small. What is worse, a lot of really, really good ideas have been forgotten.

If you think they would learn fantastic things about operating system design, remember that the Multics project was begun in 1965. It was more sophisticated than anything you use now, and a lot of the design ideas in modern operating systems were taken from Multics. The hardware design ideas back then were brilliant too. Instead, we are stuck with the bastard offspring of a desktop calculator chip designed in the early 1970s.

Indeed, there is a danger that sending back a PC would cripple innovation in computing, with people somehow thinking that this is the right way to build a computer. Which it isn’t. The modern PC owes more to unfortunate accidents of history than to any sort of coherent design process. If there were time to wipe the slate clean and design a proper computer from scratch with modern technology, it would not look like a PC. OK, looking inside it would be indistinguishable - what I mean is that the system architecture and operating system would be quite different. It would use a lot of those ideas that were lost from the ’60s and ’70s.