Time Travel with a Modern Laptop

Right. Remember that famous quote attributed to the president of IBM, that he thought there was a world market for four, maybe five computers.

If a working teleporter or warp drive were to fall back to our own time, we wouldn’t need to reverse engineer them for there to be a revolution in research. Just the proof that such things are possible would kick development into high gear, as funding becomes almost infinitely easier to get.

Also consider that many of the basic concepts regarding input, output, memory, processors, etc. were laid down by Charles Babbage in the 1800’s and later expanded on in the 1930’s by Alan Turing and Alonzo Church. Understanding what a processor is or why having more RAM is helpful would not be a foreign concept.

Unless, of course, our time traveler is smart enough to go back in time with a Mac.
In which case it will have clang installed, and all the man pages.

If the Mac was a bit older, it would have the gcc suite installed.

For people interested in this type of thing, I stumbled across this video of someone trying to surf the web with a 1965 modem and one of the first web browsers, using an ancient line printer as the display. It works! I realize this isn’t directly related to the question at hand (but it still has a lot of relevance). However, the level of geekery involved is astounding and I have never seen anything quite like it. The basic point is that you can integrate wildly divergent technologies of completely different generations if you have the knowledge and passion to do something so pointless. This guy is a walking piece of clichéd art.

Y’know, on thinking about it more, I think they might be able to figure out the machine code, and hence be able to write their own compilers. But they wouldn’t do it by starting with a complete executable already on the computer. Or at least, they mostly wouldn’t.

Start by opening up an exe file in Notepad or whatever. The program will contain a few literal strings, which should be readable as they are in Notepad (surrounded by a sea of gibberish, of course). String literals are often found in the company of print commands, so they should be able to figure out (with some trial and error) what those commands are. Now, if you’ve got anything that can write bytes to a file, you can probably bung together a crude but functional Hello World in binary. And that program, only a few bytes long, is the one that you then use to find the rest of the machine code: Once you have a functioning program, try wrapping it in all sorts of other codes until you find the ones that act like ifs and gotos and so on.
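That string-hunting step is essentially what the Unix `strings` utility does: scan the raw bytes for long runs of printable ASCII. Here is a minimal sketch in Python (not something our time traveler would have, of course, but it shows the idea; the toy byte blob below is invented for illustration):

```python
# Minimal sketch of the "find the readable strings" step, in the spirit
# of the Unix `strings` utility. The toy blob below is hypothetical.
import string

# Printable ASCII bytes, minus whitespace control characters.
PRINTABLE = set(string.printable.encode()) - set(b"\t\n\r\x0b\x0c")

def find_strings(data, min_len=4):
    """Return (offset, text) for runs of printable bytes >= min_len long."""
    results, run, start = [], bytearray(), 0
    for i, b in enumerate(data):
        if b in PRINTABLE:
            if not run:
                start = i
            run.append(b)
        else:
            if len(run) >= min_len:
                results.append((start, run.decode("ascii")))
            run = bytearray()
    if len(run) >= min_len:
        results.append((start, run.decode("ascii")))
    return results

# A toy "executable": gibberish machine code surrounding one literal.
blob = b"\x55\x48\x89\xe5Hello, world!\x00\xc3\x90\x90"
for offset, text in find_strings(blob):
    print(offset, text)   # prints: 4 Hello, world!
```

The interesting part for our time traveler is the offsets: once you know where the string sits, the bytes immediately around it are good candidates for the instructions that load and print it.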

There’s a similar video of a guy connecting to a text-only web page using a DEC VT100 emulator running on a laptop and a slightly older “Model A” version of that wooden Livermore modem. He had to use an older laptop with a DB9 serial port and even so needed an adapter for the even older DB25 port on the modem.

Nitpick: that isn’t a “line printer”, it’s an ASR33 teletype. Line printers were high-speed output devices, so called because they printed an entire line at a time using a horizontally rotating chain (IBM) or vertically rotating wheels (many other vendors). Some were very fast; line printers and magnetic tape drives were among the most impressive of ancient computer peripherals!

Thanks for the info. I call every large printer from the mid-1980’s backwards a “line printer”, but I never knew the term was more specific than that, because I have never worked with anything older. I like retro computing, but my experience doesn’t go much further back than the late 1970’s or early 1980’s.

I was just really impressed that someone worked out a way to connect to the web in the most antiquated and convoluted way possible. Using the printer as a display was certainly a great touch, but doing it with a circa 1965 acoustic modem, a rotary dial phone, and punch cards was the capper.

That is true geek art.

Acoustic couplers were certainly a significant limitation. When I was referring to really simplistic methods I was referring to things like the Kansas City Standard. The potential in even a cheap cassette tape recorder was vastly greater than the pathetic speeds obtained, and avoiding doing idiotic things like making 0 and 1 harmonically related tones was well understood even then. It really would have been only a small handful of parts, and a few dollars, to make something very much faster and more reliable.
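For reference, the Kansas City Standard encoded a 0 as four cycles of a 1200 Hz tone and a 1 as eight cycles of 2400 Hz at 300 baud, exactly the harmonically related pair being criticized above. A minimal sketch of the encoder, assuming a 9600 Hz sample rate (the rate is my choice for the example, not part of the standard):

```python
# Sketch of a Kansas City Standard encoder. The 9600 Hz sample rate is
# an assumption for illustration; KCS itself only specifies the tones.
import math

SAMPLE_RATE = 9600               # samples per second (assumed)
BAUD = 300                       # KCS ran at 300 bits per second
SAMPLES_PER_BIT = SAMPLE_RATE // BAUD   # 32 samples per bit

def kcs_encode(bits):
    """Encode bits as audio samples:
    0 -> four cycles of 1200 Hz, 1 -> eight cycles of 2400 Hz."""
    samples = []
    for bit in bits:
        freq = 2400 if bit else 1200
        for n in range(SAMPLES_PER_BIT):
            samples.append(math.sin(2 * math.pi * freq * n / SAMPLE_RATE))
    return samples

wave = kcs_encode([0, 1, 1, 0])
print(len(wave))   # 4 bits * 32 samples/bit = 128 samples
```

Because 2400 Hz is exactly the second harmonic of 1200 Hz, any distortion of the 1200 Hz tone bleeds energy right into the 2400 Hz detector, which is the “idiotic” design choice being complained about: shifting the tones apart (or just running more cycles per bit at higher frequencies) would have been a trivial change.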

I predict they would disassemble and study the laptop. This would produce a huge advance in computer science. This would advance our technology by 50 years, enabling them to invent the very time machine you used to travel back in the first place!

Folks keep talking about decoding this, back engineering that, reverse engineering something else, MacGyvering yet another thing, and so on and so on.

All this in an effort to get the I/O past the keyboard-typing/screen-output limitations and up to what the computer is actually capable of I/O-wise, because that is what you would smartly want to do to REALLY utilize/optimize the capabilities of this magical computer from the future.

But this stuff is gonna take time. How many problems are going to need to be solved? How many days for each problem? And again, while you are putzing with it, you aren’t using it. Then throw in some minor problem where ONCE the solution is found it’s obvious in hindsight, but it’s the kind of thing that has you stumped for WAYYY longer than it should (we’ve all had our share of those).

It IS fun to think about what COULD be done, whether it is theoretically possible, and how you might go about really making this computer scream back in the day.

But realistically, I think there is a fair chance that if you went that route, it could be anywhere from months to years before you got all your ducks in a row.

It’s even simpler than that.
The computers of the time were more than capable of solving the problems of the time. My dad was a programmer in that era, and he never complained about the speed of the computers - the problems he was solving (large simulations, mostly) ran at reasonable speed (say, overnight) on an IBM 360.

Having a machine which was 1,000,000x as fast would just mean that more time was spent in meetings, deciding how many new bells and whistles to add to the software…

I would assume years. One or two. This does not diminish the value. It might make it a poor story for a movie however. We are so used to instant gratification that we forget that most important tasks have taken significant time.

Consider the code breakers in WW2. It took literally years from the first intercepts of the Lorenz machine codes to construction of the Colossi, and production breaking of the German high command communications - just in time for D-Day. It was worth every hour spent.

In the 50’s, you get what? A massive boost in the design of nukes, aircraft, and spacecraft, and the ability to break a huge amount of the world’s encrypted traffic. For a country to get hold of such a device could be an insane game changer. If it took a few years to get the device up to a useful level, it would not make a great deal of difference. A modern laptop would be capable of covering the entire world’s electronic computations up until about the end of the 70’s in about a year. That capability in the early 50’s would be ridiculous.
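That “about a year” claim can be sanity-checked with rough numbers. Everything below is an illustrative assumption (installed base, machine speed, laptop throughput), not measured data:

```python
# Back-of-envelope check of the "one year" claim.
# Every figure here is an illustrative assumption, not measured data.
SECONDS_PER_YEAR = 3.15e7

# Assume the world's pre-1980 installed base averages out to roughly
# 2,000 machines at ~1 million instructions per second, running
# around the clock for 30 years.
world_ops = 2_000 * 1e6 * 30 * SECONDS_PER_YEAR

# Assume a modern laptop sustains ~1e11 simple operations per second.
laptop_ops_per_year = 1e11 * SECONDS_PER_YEAR

print(f"world total:  {world_ops:.2e} operations")
print(f"laptop/year:  {laptop_ops_per_year:.2e} operations")
print(f"laptop needs {world_ops / laptop_ops_per_year:.2f} years")
```

Under these assumptions the laptop replays three decades of the world’s computing in well under a year; even if the installed-base estimate is off by an order of magnitude, the answer stays in the “a few years at most” range.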

Oh no doubt the thing would be a boon.

Also, however, keep in mind that all those computations done from the 50’s through today in the non-time-travel (our) timeline still needed one thing done: ALL that code writing to do the calculations and simulations in the first place. And then there are the simulations and calculations that need real world data. Which means testing and measuring stuff.

This supercomputer/laptop won’t help much in that regard. So you still have some serious other real world bottlenecks.

Sure this computer would probably inspire the powers that be to invest even MORE on that stuff and do it faster, but it wasn’t like the DOD and the like were just dabbling in it because the computer speeds of the time weren’t that hot.

Hmmm, just occurred to me.

The most valuable thing about the computer might just be all the code on it.

The ability of THAT computer to run the code and do it fast might be in the noise, comparison-wise.

Well, that and Bill Gates’ ass never gets rich :slight_smile:

You mean the source code for all the software that comes pre-installed on a laptop? Unfortunately, all you have is compiled code. Seeing the resulting apps would certainly be useful somewhere down the line as far as pointing app developers in useful directions, but without source code, you won’t be able to skip all the work of writing code for your own eventual dumbed-down versions of Microsoft Excel, etc.

Well, yeah, I was assuming you might be able to figure out how the code works from the compiled code. Or at least get some insight/ideas.

But I wasn’t necessarily thinking of the actual full-blown programs. I was thinking more of PARTS of the code.

But you are probably right. That is likely wishful thinking.

PS. That’s why I said just occurred to me. I had not thought it through :slight_smile:

And that’s just what it’s really hard to get. If you want the full program, you have it. If you want to break it up into pieces, well, where do you break it? How do you know what piece is what? How do all of the pieces fit together? That’s tough.

PS.

See the PS in my last post :slight_smile:

I also think that there could be a treasure trove of future “spoilers” in the dictionary and thesaurus.

For example, if you run the thesaurus on the word gene, you get “DNA segment” as a synonym. Also, the fact that Obama is a name well known enough to make the dictionary might mean something.