Nah.
Feed it that voltage, and it will be fine.
Apple uses smart power supplies, and laptops that are charged by USB-C are smart, too, but most others just need a source of DC voltage.
OK, I just checked again; the voltage isn’t next to the power port but there is a label on the bottom that says the voltage needed. (BTW, something that will blow the minds of the people in the 1950s who are looking at the device is the MADE IN CHINA sticker.)
Sorry I didn’t specify, but absolutely the power supply is included. I meant it to be a fully functional but in no way special modern laptop.
I think they’d get more benefit from trying to reverse engineer a really early computer like an Osborne 1 or an Altair 8800 - simply because:
[ul]
[li]The printed circuit boards are single layered, so you can actually see what’s wired to what[/li]
[li]More discrete components - modern laptops have very few discrete components; functions are all concentrated into chipsets that would be very hard to understand. At least with an old computer, you would have components like transistors that should be recognisable as such in 1955[/li]
[li]The integrated circuits are simpler - I still think that something as complex as a microprocessor might be hard to reverse engineer in 1955, but some of the logic chips could be understood, tested and reverse engineered, I reckon (see the sketch below)[/li]
[/ul]
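To give a flavour of what “tested and reverse engineered” might look like: here’s a rough sketch in Python (obviously not something they’d have in 1955; it’s just to illustrate the procedure) for characterising an unknown two-input logic chip by brute-forcing its truth table. The apply_inputs/read_output functions are hypothetical stand-ins for whatever bench rig they’d cobble together - voltage sources on the input pins, a meter on the output.

[code]
# Sketch: identify an unknown 2-input logic gate by exhaustively probing it.
# apply_inputs() / read_output() are hypothetical stand-ins for a bench rig
# (voltage sources on the input pins, a meter watching the output pin).
from itertools import product

KNOWN_GATES = {
    "AND":  (0, 0, 0, 1),
    "OR":   (0, 1, 1, 1),
    "NAND": (1, 1, 1, 0),
    "NOR":  (1, 0, 0, 0),
    "XOR":  (0, 1, 1, 0),
}

def probe_gate(apply_inputs, read_output):
    """Drive every input combination and record the output."""
    observed = []
    for a, b in product((0, 1), repeat=2):
        apply_inputs(a, b)
        observed.append(read_output())
    return tuple(observed)

def identify(truth_table):
    for name, table in KNOWN_GATES.items():
        if table == truth_table:
            return name
    return "unknown"

# Dry run with a simulated chip standing in for the real hardware:
if __name__ == "__main__":
    state = {}
    fake_apply = lambda a, b: state.update(a=a, b=b)
    fake_read = lambda: int(state["a"] and state["b"])  # behaves like an AND gate
    print(identify(probe_gate(fake_apply, fake_read)))  # prints: AND
[/code]

Four input combinations, four readings, and you know whether the mystery package behaves like an AND, OR, NAND and so on. That part is entirely doable with 1955 bench gear once you’ve worked out the pinout and supply voltage.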
I think the boffins of the day would complain that you didn’t travel back with a Mac.
You’d be better off going back with an Ubuntu laptop. That’ll have the gcc compiler and also comes with Python. So yes, programming is definitely possible there.
Most standard Ubuntu setups will have LibreOffice, an editor like PyCharm and a database like PostgreSQL as part of the basic install (at least, mine do), which would, in the right hands, be very powerful. All of those are also quite self-documenting.
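To illustrate the “very powerful in the right hands” bit, here’s what even basic record-keeping looks like once you have a query language. This uses sqlite3, which ships with Python itself, purely as a stand-in for the PostgreSQL mentioned above; the table and figures are made up for the example.

[code]
# Illustration only: departmental payroll totals via SQL.
# sqlite3 ships in the Python standard library; it stands in here for
# the PostgreSQL mentioned above. All names and figures are invented.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE payroll (name TEXT, dept TEXT, hours REAL, rate REAL)")
con.executemany(
    "INSERT INTO payroll VALUES (?, ?, ?, ?)",
    [("Jones", "Machining", 42.5, 1.85),
     ("Smith", "Machining", 38.0, 1.90),
     ("Baker", "Assembly",  40.0, 1.75)],
)

# One query replaces an afternoon of punch-card sorting and adding-machine work:
for dept, payout in con.execute(
    "SELECT dept, SUM(hours * rate) FROM payroll GROUP BY dept ORDER BY dept"
):
    print(dept, round(payout, 2))
[/code]

To a 1955 data-processing shop running everything through card sorters and tabulators, that one GROUP BY line would be the revelation.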
Even for a given distro there’s a range of what’s included. The gcc compiler is not included on the more basic ones. And even at the top workstation and server versions then the person installing can opt out of packages like this.
OTOH, Python and PHP should be included on all but the simplest ones since so many scripts require them. You can do a lot of great programming with those. Even a simple bash shell has some features that would make a 50s programmer drool.
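Just to make “drool” concrete, here is the sort of thing a dozen lines of Python gets you: a crude no-air-resistance range table, the kind of tabulation that kept early-’50s machines and their operators busy for hours. (The muzzle velocity is a made-up figure; the physics is just the standard flat-ground projectile formulas.)

[code]
# A trivial illustration: projectile range table, no air resistance.
# R = v^2 * sin(2*theta) / g, flight time = 2 * v * sin(theta) / g
import math

G = 9.81      # m/s^2
V0 = 300.0    # muzzle velocity in m/s (made-up figure)

print(f"{'angle (deg)':>12} {'range (m)':>12} {'flight time (s)':>16}")
for angle_deg in range(5, 90, 5):
    theta = math.radians(angle_deg)
    rng = V0 ** 2 * math.sin(2 * theta) / G
    t_flight = 2 * V0 * math.sin(theta) / G
    print(f"{angle_deg:>12} {rng:>12.1f} {t_flight:>16.2f}")
[/code]

That runs in a blink on the hand-me-down laptop, and the same few lines scale to any table you care to define.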
A top notch programmer can be pretty good at figuring out smaller programs/scripts and seeing how things are done - modifying them for other uses at first, then writing new programs from scratch. The question is whether such a programmer existed at that time, especially since one of the key skills is knowledge of several languages, which greatly speeds up learning a new one.
Let’s see, ALGOL was officially released in ’58, so they missed that. Knowing ALGOL, even if only by reading about it, would be a big boost.
Here’s a link to three earlier threads with a similar theme:
Could the people of 1983 connect to my Macbook Pro?
http://boards.straightdope.com/sdmb/showthread.php?t=456127
Could the 1980s reverse engineer a modern iphone?
http://boards.straightdope.com/sdmb/showthread.php?t=814911
The earliest year modern tech could be replicated (time traveler scenario)
https://boards.straightdope.com/sdmb/showthread.php?t=850957
One thing I was thinking about modern laptops is that they often come with SSDs, which are just memory chips. There wouldn’t be a 2.5" drive with spinning magnetic platters in the laptop. That might actually delay storage advances, as the scientists may think that chip-based or electronic storage is the way to go. I could see research into magnetic platters getting less interest or being abandoned altogether.
Realistically speaking, what’s the likelihood that the scientists in '55 end up breaking the laptop trying to take it apart? Even today, with YouTube videos and such, disassembling a laptop can be a delicate process. Given the manufacturing and assembly techniques of the time, I would guess they would assume the laptop was built similarly and that they could figure out how to take it apart. The amount of curiosity they would have about the internals would be immense. I can’t imagine the laptop would stay in one piece for very long. I’m imagining a “goose that laid the golden egg” scenario, where they destroy the laptop trying to figure out how it works.
Even if the computer has a HDD with spinning disks, they’re not going to discover that without opening it up, which is almost certainly going to destroy it. Actually, I think the slightest poking around is likely to cause damage.
Certainly they’d break it, but they’d know what it was. They had HDDs back in '55, although obviously much larger and much lower density. However, the basic design was the same: both have magnets and spinning magnetic platters. I’m certain they’d instantly recognize that it was a disk drive as soon as they got the cover off and saw the magnets and platters. The drive would be broken at that point, but they would realize that disk technology was the viable storage technology for the future, and research would continue on that path. But if the laptop instead uses memory for storage rather than spinning platters, they might wonder whether HDD technology is viable at all. I wonder if they’d realize that HDDs were a necessary step before we advanced enough for SSDs.
I’m actually pretty optimistic about what they would gain by looking at a modern PC. Remember, they were dealing with very simple concepts in solid-state electronics. Just giving someone the idea of etching complex circuits using certain substrates would be a huge leg up. They would easily be able to identify the types of materials used, even if they had to work on the technology to produce them. IOW, they don’t need to be able to duplicate a CPU (for example), just understand the approach and the fact that it can be done.
I just want to give them Kerbal, and see what happens.
Close. Announced in '56 and delivered in '57. But the people at IBM working on those would have been able to figure out a few things. (There were also drum drives which are a similar concept.)
But note that there is no way some 1955 IBM engineer just by (irreversibly) tearing apart a small laptop drive is going to figure out the vast number of tricks used to store data at the rates we’ve been used to for the last 20 years. They would not have the tech to analyze the way the magnetic domains were applied, etc.
Many people are missing the concept of bootstrapping in tech. Advancing tech to the next level requires a lot of nearly-as-advanced tech. A lot. This laptop might give people ideas earlier, but ideas aren’t the same as the chain of experience, and the laptop doesn’t remotely come close to conveying that.
An old question I vaguely recall: Suppose a circa-1990 cruise missile with a nuke warhead gently landed at Bell Labs in 1935. What could they make of that stuff from 55 years in the future? ICs, radar waveguide plumbing, lasers, and unknown fiendish metals. Would they learn to make better flying bombs sooner?
A circa-2010 WinTel business laptop taken back 55 years to 1955 would be equally mystifying and liable to break, even if it’s brand new, shrink-wrapped, with a Quick-Start Guide enclosed. At least it probably won’t explode.
Sure, but the OP was asking about a standard Windows machine, so the equivalent is not going to be some minimal distro. It’d be the bog-standard Ubuntu Desktop install, which definitely has gcc.
They could. But then they’re going to have a lovely adventurous time searching for precompiled versions when installing some other packages (and some python packages too). Who does that to themselves, other than embedded system wonks? Genuine question.
“Bog standard” is not the same as universal.
I have been using gcc for decades. I use Cygwin on a daily basis. But the number of times I’ve had to update/install a package using gcc is quite rare - here and there, several years apart. And that’s because I’m eclectic and old school. (The software in question is really old stuff that nobody ever put out a Cygwin package for, for example.) Regular folk don’t do that.
For a “desktop” version of a Linux distro, the need for gcc is virtually zero if not zero.
But it should come with other programming tools that would be quite useful.
The question is what does a current consumer version of MS-Windows come with? The only thing I can think of is limited command shell stuff. Is PowerShell now part of a standard install or still an extra? That would be somewhat better. The GUI stuff is important. How would one learn to do that using one such computer?
I’ve done GUI stuff in PHP. Fairly straightforward. And you can put PHP on a MS-Windows box. But it’s not standard, or even all that common.
Must be my job, but stuff needs to compile quite often for me. Mostly python packages but still…
PowerShell is standard with Win 10.
It would cost them more than that. The $4 billion cost of a fab is only possible since you can buy a lot of equipment from vendors. Try to invent that by yourself and the price tag would go way up.
Wouldn’t help them reverse engineer the design. We had to remove some layers to get access to a signal line we wanted to observe, and we messed up half the attempts, and that was with the right equipment and really good engineers.
They might, if they were lucky, figure out the memories though. Maybe.
They wouldn’t know how fast the thing went, unless there were convenient PDFs of articles about the machine. Analog computing was pretty much dead by 1955. While there were plenty of dead ends that people investigated, enough development threads were being pursued in parallel that those dead ends didn’t slow progress much. It would go faster if people knew the details of what worked, but just knowing a fast computer was possible wouldn’t help. It isn’t like nuclear energy, where, say, knowing that economically feasible fusion was possible might encourage more funding, especially if there were hints as to how it was done.
Trying to go too fast in process development would likely slow things down.