Suppose you could go back in time with a modern laptop running a standard Windows and MS Office installation. No other software installed. The range of travel would be the early years of computing, say the 1950s and 60s.
First question:
Going back to the 50s and 60s, how well could they understand the device’s workings? I think the raw concept would be apparent to them, but would they be able to conceptualize the workings once they saw it in action? That is, would it be obvious to them that this was a direct line from what they were doing?
Second Question:
Given a standard Windows and Office install, would they be able to do anything besides use the machine? That is, could they learn how to program it and create new applications? As far as I know a standard install doesn’t include things like compilers and development tools (does it?). So, is there a way that the brightest minds of the generation could actually get in there and program it, or would it be stuck in the state it’s in?
Third Question:
Would it be possible to use the networking capabilities of a modern machine (wifi, ethernet) in any useful way? For instance, could they somehow figure a way to make that machine with its 500gb HD some kind of server, with all their UNIVACs connected to it?
Last Question:
What would be the best use of the device? Would they be better off tearing it apart to learn from it? Or, would things like Excel make it the most powerful computing tool in the world, best used by some government agency as-is?
As a bonus, how far forward would you have to come before the machine would go from inconceivable to just impressive? Certainly a computer scientist from the 80s would see a modern laptop as the natural and obvious progression of where they were then. That is, while impressive, not rising to the level of hard to believe…?
I’d say as a hard latest limit, it would not be more than impressive after the first mass microcomputers came out in the late 70s. Possibly it would cease to be more than impressive before that but I’m not sure of the earliest limit.
Most consumer computers are not shipped with actual compilers, but even the relatively simple scripting languages available would make it far more powerful than anything they had at the time.
Well - the transistor had barely been invented (1947), and practical devices didn’t arrive until the early 1950s, so the idea of a computer chip packed full of them would have been pretty mind-blowing and way beyond any possibility of analysis or reproduction.
Given what the likes of Turing et al had managed with simple Strowger switches and then with thermionic valves, I suggest that it would have been feasible for them to work out some of the programming language. How much they would have been able to do with it is anyone’s guess.
Since all our technology is a result of reverse engineering the stuff we found aboard the crashed UFO in Roswell, at least it would be familiar to some. :rolleyes:
It would have no, as in zero, I/O capabilities. Disk drive? No one back then could read a 3.5" floppy. Nor a CD-ROM. Ethernet? 20 years away, and there is no port anyway. Printer? No compatible ports. Even if you brought a copy of, say, Python, or even Basic, all you could do is run a program and read the output on the screen. Might be useful. But everything would have to be input by hand and output by eye. Not very useful. Even reverse engineering is problematic. When the largest IC has 10 transistors, what can you make of one with a billion transistors? Now, if you knew something about the manufacture of chips, that could be useful.
Excel, people! Built-in VBE.
The OP specifies that the machine has an Office install. That means Excel, which would make a modern computer almost unbelievably powerful.
It’s quite astonishing to many people today what you can do using Excel, and it comes with built-in help files, meaning that anybody who can read English can teach themselves how to write formulas, create macros, use pivot tables and even do VB coding. There’s very little that a skilled user can’t do using Excel in terms of data processing. It may not be the most efficient application by modern standards, and it often requires excessive manual handling, but you can usually find *some* way of getting the right answer using Excel.
So with that stipulation, the amount that could be achieved in the 50s and 60s would be mind boggling. Just think about something basic, like all those calculations for the space program that were done with slide rules. A skilled Excel user could do them all literally overnight. The real question would be which application of the machine would be most world altering, not whether it could be useful. A lot of things that nobody would even consider because it was not feasible to have thousands of people hand calculating (with the associated errors) would become practical.
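To make that concrete, here’s the sort of thing a 1950s engineer could work up after an afternoon with the built-in help (a rough sketch only; the macro name, sheet layout and muzzle velocity are all made up for illustration): a short VBA macro that tabulates projectile range for every launch angle, the kind of table that would otherwise occupy a room full of people with slide rules.

' Rough illustrative sketch (hypothetical layout and values): fill column A with
' launch angles 1-89 degrees and column B with the corresponding range, using
' R = v0^2 * sin(2*theta) / g.
Sub RangeTable()
    Const g As Double = 9.80665      ' gravitational acceleration, m/s^2
    Const v0 As Double = 750#        ' assumed muzzle velocity, m/s (made-up figure)
    Dim deg As Long
    For deg = 1 To 89
        Cells(deg, 1).Value = deg
        Cells(deg, 2).Value = v0 ^ 2 * Sin(2 * deg * WorksheetFunction.Pi() / 180) / g
    Next deg
End Sub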
The other point to consider is that just the concepts behind a modern PC would be ground breaking in their own right.
The very idea of an Excel style spreadsheet would have been revolutionary in the 50s and 60s. People wouldn’t need to be able to reverse engineer the machine itself. Just the realisation that you could use computers to create a cell based spreadsheet, and a glimpse at the algorithms behind that, would likely provide a great boost to computer research. It would be positive proof that computers were actually useful to and usable by the average businessman or researcher, a truly novel idea at the time.
The battery technology would also be revolutionary, and might be the easiest thing to reverse engineer.
I don’t see why it would need to be nearly as dire as that.
At the most basic level, you can simply photograph a screen. No need to output everything by eye. You can readily enough set up Excel to make a noise when an output is ready, and using 1950s technology it’s simple enough to rig up a camera and microphone to take a picture at the sound. So you could convert the entire output to microfiche automatically using 1950s tech. Ultimately somebody will have to eyeball the results, but ultimately somebody has to do that now for 99% of data processing. Windows also comes with a built-in sound recorder that allows you to create limitless wave files. So you can actually program Excel to read its results out loud. These could be readily recorded onto disc or tape using 1950s technology.
So you can automatically output in both visual and audio format.
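Here’s a minimal sketch of that in Excel VBA, assuming the laptop’s copy of Excel has the built-in Speech object (versions from 2002 on do); the macro name, sheet name and range are just placeholders:

Sub AnnounceResults()
    Dim c As Range
    Application.Calculate                     ' run the big recalculation
    Beep                                      ' audible cue for the camera rig
    For Each c In Worksheets("Results").Range("A1:A100")
        Application.Speech.Speak "Cell " & c.Address(False, False) & _
            " equals " & Format(c.Value, "0.0000")    ' read each result aloud
    Next c
End Sub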
If they wanted to get the results into another digital format, I suspect they could do that within a month. Magnetic storage tape has been around since 1950 and I suspect that any competent electrical engineer could rig up a device to convert Excel’s sound into the tape equivalent. I don’t know enough about how magnetic tape works, but if I can write Excel code that makes any sound I want in response to any output, it shouldn’t be any great feat to record those sounds onto tape.
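For what it’s worth, here’s a hedged sketch of that idea, nothing more than an illustration: map each digit of a result to its own tone using the Windows Beep API (which recent versions of Windows route through the sound card), so the headphone output could be fed straight into a 1950s tape recorder and decoded later with analog filters.

#If VBA7 Then
Private Declare PtrSafe Function ApiBeep Lib "kernel32" Alias "Beep" _
    (ByVal dwFreq As Long, ByVal dwDuration As Long) As Long
#Else
Private Declare Function ApiBeep Lib "kernel32" Alias "Beep" _
    (ByVal dwFreq As Long, ByVal dwDuration As Long) As Long
#End If

Sub SoundOutValue(ByVal v As Double)
    Dim s As String, i As Long, ch As String
    s = Format(v, "0.000000")                 ' fixed format keeps decoding simple
    For i = 1 To Len(s)
        ch = Mid$(s, i, 1)
        If ch = "." Then
            ApiBeep 400, 300                  ' marker tone for the decimal point
        ElseIf ch = "-" Then
            ApiBeep 350, 300                  ' marker tone for a minus sign
        Else
            ApiBeep 600 + 100 * CLng(ch), 200 ' digit d maps to (600 + 100*d) Hz
        End If
    Next i
End Sub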
To me the most interesting questions are whether they would understand what they had and what they could do with it.
Even in the 50s, enough of the basic technological ideas existed that all the underlying principles could be explained, though they would be totally blown away by the sheer magnitude of speed and capacity and the ultra-miniaturization. I could see this thing immediately being classified top secret because of many military implications, especially in the Cold War paranoia of that time.
Taking it apart would be a terrible option and I’m sure they wouldn’t do it. There is far too much large-scale integration involving technology far beyond their means to reverse engineer for that to be productive. Essentially all they would see is a bunch of pretty colored objects with interesting electrical properties. But assembled and functional, the laptop would be priceless and provide tremendous opportunity for non-destructive functional analysis of the operation of its hardware and OS.
What they could do with it besides running Office and other standard programs would depend on what documentation they had. Full documentation of the processor architecture and instruction set, machine architecture including BIOS, buses, and ports, and Windows interfaces would allow them to write their own compilers even if the laptop didn’t come with any. They could either build it from scratch on the laptop or, if they reverse-engineered some form of communication, they could start by writing a cross-assembler for it that ran on an existing mainframe of the day like an IBM 709. But if it had a development system already present, even something relatively rudimentary like Visual Basic, their minds would be blown all over again! At a time when high-level languages were just beginning to be developed (the greatest thing since sliced bread around that time would have been FORTRAN II or COBOL) the idea of an interactive object-oriented development platform tightly integrated into the visual paradigm of Windows would have been breathtakingly powerful on many different levels. Even something like Excel would be beyond belief. It’s easy to forget how innovative the basic interactive spreadsheet idea was when it first came out.
There’s no point in building a server in today’s terms because there wouldn’t be any clients for it to “serve”, and they didn’t understand client/server architecture back in the day. But by the standards and needs of the 50s and 60s, Windows would be considered a horribly inefficient way to use this remarkable hardware. This would be a computing resource valued at untold multiples of millions of dollars and basically priceless, and you don’t waste a resource like that by devoting it to a single user and making it display a bunch of colorful icons. We only do that today because the hardware is so cheap.
In the 50s they’d probably be stymied and unable to do much about that. In the 60s the concept of timesharing began to be developed. If they had enough documentation to be able to adapt serial line multiplexers to the machine’s USB ports, they could dump Windows and write a timesharing OS for it.
A modern laptop would be an immensely powerful timesharing computer by the standards of the day, by far more powerful in most ways than anything that was ever built. It could literally provide responsive timesharing services to hundreds of users, giving them each computing power that was unimaginable in the day. If they knew the laptop would continue to run reliably for decades, that would be the absolutely optimum way to use it – not as a Windows curiosity on someone’s desk, but effectively as a mainframe in what would be considered a major computing center, even if the marvel at the center of it all was no larger or heavier than a book.
BTW, I have a PDP-10 simulator on a PC that is more than 10 years old. The simulated PDP-10 runs the TOPS-10 timesharing monitor, and despite running on simulated hardware, on a relatively slow old PC, under Windows, the thing is much, much faster than the original multi-million dollar PDP-10 mainframe and would be able to support much greater user loads.
The biggest problem with this scenario is getting data/programming INTO the computer.
People can only type so fast. There are only so many hours in a day.
The best use of this ONE wonder machine is problems that take some serious computing power but don’t “use” a lot of data or require very complicated/large programs/spreadsheets/whatever.
If I can bring a USB hub, it’s possible to create USB to serial RS232 to ASR33 Teletype interfaces. And at 100 baud for the ASR33, my little laptop can certainly handle multiple simultaneous connections.
You could speed things up a bit using the computer’s own hardware and software.
A typical laptop comes with a camera and Office comes with OCR software. You could have an arbitrary number of people producing data on typewriters, and then enter it into the computer as a “scanned” document. That would enable skilled operators to enter ~1,000 pages/day of thoroughly checked data, as opposed to a hundred or so pages typed directly into the machine that would require further checking.
But you’re right, that’s still going to be the bottleneck. You will still only be entering kilobytes of data into the computer each hour, instead of the gigabytes that we know it’s capable of downloading directly.
Another problem. You gotta balance the time you spend using it against the time you spend putzing around with it, figuring out how it works/what it’s capable of/what it contains/how those I/O ports work…
You can’t utilize anywhere near its potential without putzing with it…but you can’t get shit done if all you do is putz with it.
And I hate to be the person that fucks it up putzing with it.
A hub isn’t necessary. A single USB 2.0 port has ample capacity to multiplex hundreds of 1960s-style dumb terminals, and a laptop will typically have 3 or 4 of them. Since hypothetically we’re writing our own timesharing OS, it could support whatever multiplexing protocol was being used by the physical terminal line multiplexer talking to the outside world.
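Back-of-the-envelope, for anyone who doubts it: 300 terminals all running flat out at 110 baud comes to roughly 33,000 bits per second of aggregate traffic, against USB 2.0’s nominal 480 megabits per second, so the port itself is never the constraint; all the real work is in the multiplexer hardware and the software to drive it.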
The real point though, is not just that the laptop could support that many serial lines, but that each of them could be doing demanding compute-intensive work! When a typical user uses a modern laptop, most of the time virtually ALL the available computer power is just wasted, a small portion going to Windows overhead, and the rest in the idle loop. If it was running an efficient timesharing system, by the standards of the late 60s and 70s it would be an extraordinarily powerful mainframe computer!
Nitpick: the ASR 33 was 10 cps but technically 110 baud, not 100, because of an extra stop bit. But 30 cps terminals were 300 baud.
Now that I think about it, getting data OUT of the computer is a major issue as well.
Not so much processed DATA per se, but stuff like program documentation, how to tutorials and the like.
Somebody is going to have to do a lot of reading to figure this thing out.
So, even if the thing has good documentation, it is going to take a while before anyone gets this thing anywhere near up to the capabilities it has (if ever).
Anybody wanna wager whether it’s a Norton or Windows update that screws over our ancient ancestors?
I disagree that I/O would be a truly fundamental problem for moderate amounts of data by the standards of the day, but that is all premised on the assumption that a great deal of hardware documentation came along with the laptop as previously noted, because without that all anyone could do is sit there and marvel at it, which isn’t terribly interesting.
But assuming this existed, it does get interesting, because as I said, they could build USB-connected serial line multiplexers, write a timesharing OS, and use this thing as the core of a hugely powerful data center unlike anything anyone could even have imagined.
And if they can do that, then they have thousands of active users who would all be entering data through the keyboard and mostly printing it out on their own ASR33s or whatever they happened to have, which is an aggregate total of an awful lot of data through a lot of slow channels. This is, in fact, precisely how timesharing computers of the 60s were mostly used.
Again, given sufficient documentation, there’s no reason this couldn’t be augmented with custom-built interfaces to the relatively fast magnetic tape drives of the day, by means of which data could be transferred to and from other computers, printed on line printers, etc.
This is not as fantastical as it may sound. Back in the days when peripherals like disk storage were extremely expensive, interfaces between multi-generational computers were sometimes actually built.
I think the greater issues are more nuanced, the kinds of things we tend not to think of because in our world the solutions are so easy. How about … backup! The laptop would have a disk drive whose capacity would be almost beyond the bounds of imagination for those folks, and used in the way I imagine it would contain vital data from thousands of users. There would be simply no practical way to back it up. Use would have to be on a caveat emptor basis! The early 7-track 2400-ft magnetic tapes of the day, as impressive as the drives looked, had a tape capacity of about 5 MB. So even a modest laptop with a 1 TB disk would require 200,000 tapes to back it up, and about 23 years of time to do it (figure roughly an hour per reel once mounting, writing, and rewinding are counted, times 200,000 reels)!
If the thing has microphone and headphone sockets, those could be used as general-purpose I/O - I think they could be driven using VBScript (built into Windows).
It’s unfortunate that the OP specified a Windows machine. A more useful option would have been to put Linux on it. A more-or-less standard GNU/Linux installation comes with a crapload of interpreters, compilers, editors, and various data massagers. Plus it’s multiuser straight out of the box.
Making it programmable probably wouldn’t be too hard either, even without Excel or browser javascript workarounds. If you remove an executable’s file extension you can force Windows to open it in Notepad, which just shows the file’s raw bytes rendered as text. Seeing that, you could make the leap that it’s machine code, and from there figuring out how to write programs would be tedious and obnoxious, but easily doable. Giving it all the effort of your greatest minds, you could probably have an assembler bootstrapped pretty quickly, even if you’re missing a lot of quality-of-life instructions (figuring out SSE2 or threading, for instance, may be a stretch).
It would take a little serendipity, and admittedly showing file extensions is buried in a menu in the control panel, but they’d be poking at it so much they’d probably end up doing it eventually.
Thanks for all the really interesting thoughts and discussions. Just to follow up on a couple of things:
I was envisioning the laptop coming with the normal user documentation one gets in the box. So no detailed system documentation. I was also envisioning that it came with a “normal user” who could explain the basics. So someone like me who could show how to use the machine and explain the basic high level concepts. Of course I didn’t say that, and however one wants to speculate is fine with me.
I specified a Windows machine only because of how common they are. I wanted it to be a very typical and common item.
A few have touched on what I would think is the biggest problem - as soon as a registry file gets corrupted, or a driver, I think they may have a brick on their hands. I suppose that if the person with the laptop had some common sense, he’d tell them to use it as long as possible and once it (inevitably) craps out, then dissect it…