There were cell towers. Not cell towers that the phone could use, but there were 850 MHz AMPS towers out there.
I was talking about IC layout and autorouting, not printed circuit layout. E.g., microprocessors were laid out and routed by hand up until (I think) around the early 1980s. Today it requires sophisticated software to implement the floorplan, routing, and post-routing optimization. For decades that has been beyond human ability to do manually.
Even if scientists in the early 1980s had some kind of primitive (by current standards) software layout and routing tools for ICs, it would not work on the 3.3 billion transistor A10 CPU in an iPhone 7.
Even if someone gave them the IC layout, routing, optimization and simulation software, they didn’t have computers with the horsepower to run it. E.g., the 1982 Cray X-MP had 16 megabytes of RAM and could only do 400 megaFLOPS (peak). Today the i7-6700K in an iMac will do 240 gigaFLOPS (600x faster) and has 2000x the RAM. Intel’s latest Xeon E7-8890 v3 will do about 3 teraFLOPS - 7,500x faster than a Cray X-MP.
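Those ratios are easy to sanity-check. A quick back-of-the-envelope in Python, using the peak figures quoted above (the 2000x RAM claim implies a 32 GB iMac, which I’m assuming here):

```python
# Peak compute and memory figures quoted in the post (approximate, peak ratings).
cray_xmp_flops = 400e6   # 400 megaFLOPS (1982 Cray X-MP, peak)
i7_6700k_flops = 240e9   # 240 gigaFLOPS (iMac i7-6700K)
xeon_flops     = 3e12    # ~3 teraFLOPS (Xeon E7-8890 v3)

print(i7_6700k_flops / cray_xmp_flops)  # 600.0  -> "600x faster"
print(xeon_flops / cray_xmp_flops)      # 7500.0 -> "7,500x faster"

cray_ram = 16 * 2**20    # 16 MB
imac_ram = 32 * 2**30    # 32 GB (assumed; the post only says "2000x the RAM")
print(imac_ram / cray_ram)              # 2048.0 -> "~2000x the RAM"
```

These are peak-throughput comparisons, not benchmarks, but the orders of magnitude are the point.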
Scientists in the 1980s might get some tiny advantage from examining an iPhone 7 from the future, but it would be like the Wright brothers examining a Boeing 787. They would recognize a few things (“hey, it uses ailerons”), but they would not be able to understand the manufacturing, materials science, and electronics/software technology, much less reproduce it.
Ha. And double ha. USB is more complicated than some full-blown networking protocols in active use in 1980. USB requires actual processing power in the peripherals if you want a USB mouse, for example. A circa 2016 mouse has comparable processing power to a circa 1980 desktop computer, and a lot of that is tied up in speaking USB to the computer. (The rest is driving that wonderfully convenient optical mouse hardware stuff which works without a gridded mousepad. That technology was a twinkle in someone’s eye in 1980, if that.)
USB is great when transistors are so cheap you can shove a fully-functional computer processor up your nose and sneeze it back out without worrying about the cost of either doctor’s bills or replacement hardware. USB is the protocol of civilizations fairly drenched in cheap computing power and RAM. In 1980, if you somehow coaxed a desktop computer to speak USB, it would spend so much time running the protocol code it wouldn’t be a useful system.
I remember a fact article, by Heinlein I think. He postulated a late 60’s early 70’s drone taking samples of a Chinese nuclear test then returning to its base in the Philippines but going through a time warp and landing in the late 1930’s. From details like the USAF logo (“Huh? The Army Air Wing has its own logo now?”) to what the heck are these silicon chips (technology did not exist to detect the level of doping) to the fact that the whole thing was slightly radioactive and what sort of engine is this turbine… He suggested if they had to guess, they would estimate it was from at least a hundred years in the future, not 30.
By comparison, the iPhone is much more recognizable. Liquid crystal was known and IIRC starting to appear in crappy grey displays by 1980; matrix screens by 1990. IC’s are just more dense, but the concept exists. Lithium batteries? I think that would be new.
Another point to ponder - if they disassembled it, could they operate the components and demonstrate their use? Battery? Easy to substitute. Create their own screen drivers to show something readable on the screen, even with a square yard of discrete logic? Probably not. Would the memory chips respond to much slower logic if they tried to read them offline?
Would they want to destroy the device to analyze it? Could they unsolder the chips and leave them functional? etc.
-
They would not have a charger for it, thus it would run out of electricity quite soon and that would be the end of it. They would have no way of knowing how much electricity to use to charge it nor which pins on the connector to use.
-
If it did not work with any cell phone towers they had at the time? They would not know what it could do or what it should do.
-
Because they don’t include any instructions with these, again they would not have any idea how to operate it.
-
Many people have the same problem in 2016. Without instructions, people have no idea their phone has certain functions!
-
If it did have instructions, something which said “Use the browser to access the net” would be a mystery. What is a “net”? And many other things would not yet exist. So they would not have any idea how to use those things and would not be able to see them working. Like a YouTube app. No YouTube yet! They would not even know you should press on the YouTube app to “turn it on”.
Nah.
They could just remove the battery and connect the phone to a power supply of the same voltage. After all, the voltage is marked right on the battery pack.
I wonder about other side effects. If, in fact, the phone provided direction and jump-starting of a technical path, are there things we’d lose? Are there things that were discovered in the actual research and development that we’d miss because we ‘knew’ where to look for the ‘right’ answers?
I worked in cell phone design from roughly 2004 to 2014. For a variety of reasons, it’s become harder and harder for us to remove and replace specific components. Sometimes we could place a new component instead of putting the original one back, but sometimes the security features made that virtually impossible, too. Of course, they wouldn’t have this option, because they wouldn’t have replacement parts. Putting the original components back after removing them can be really tricky.
“No Copying Allowed” by John W. Campbell, Jr., I think.
I don’t recollect who wrote that – I thought it was John W. Campbell but I can’t find it listed. However the gist was a cruise missile from the 1980s going back only about 40 years to (say) the late 1930s, and the scientists could not adequately examine it. They didn’t have instrumentation to read the microsecond clock rates of the electronics, trying to cut a titanium alloy wing with a cutting torch set it on fire, etc.
Similarly, scientists from the early 1980s did not have the instrumentation to even examine the chips on an iPhone. There were effectively no digital oscilloscopes. The fastest analog scope from Tektronix was about 400 MHz, totally incapable of reading the 2.3 GHz clock of the latest iPhone CPU, which has a rise time measured in picoseconds.
There were no logic analyzers remotely capable of sampling the bus signals from a 2016-era smartphone. Here is a typical logic analyzer from that era, the Gould Biomation K-100D. Note the minimum clock period of 200 ns, which equates to 5 MHz. Its fastest sample rate was 100 MHz. It would get no more valid data from an iPhone 7 than if you connected it to a tree: https://www.youtube.com/watch?v=iFmOOZXm1TA
http://www.recycledgoods.com/media/catalog/product/cache/1/image/0dc2d03fe217f8c83829496872af24a0/g/o/gould-biomation-digital-logic-analyzer-k-100d-2.35.jpg
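To see why a 100 MHz sampler is hopeless against a gigahertz clock, consider aliasing: any tone above half the sample rate folds down to a meaningless low frequency. A toy sketch (this ignores analog bandwidth and rise times entirely, which only make things worse):

```python
def alias_freq(f, fs):
    """Apparent frequency (Hz) when a tone at f Hz is sampled at fs Hz."""
    f_mod = f % fs
    return min(f_mod, fs - f_mod)

# A 2.3 GHz clock sampled at the K-100D's 100 MHz maximum rate
# happens to be an exact multiple, so it folds all the way to DC:
print(alias_freq(2.3e9, 100e6))    # 0.0 -> looks like no signal at all

# Even slightly off, the instrument reads a meaningless low-frequency fold:
print(alias_freq(2.317e9, 100e6))  # 17000000.0 -> "17 MHz", pure garbage
```

Either way, what the analyzer displays bears no relation to the actual bus activity.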
Surface-mount assembly of packages with extremely fine lead pitch (i.e., spacing between the tiny conductors from the chip package) was unknown in the early 1980s. 1980s scientists could probably somehow figure out how to remove the chips from an iPhone 7 PCB, but they would never be able to examine the signals in place or after removal – even if they could somehow repower the components. It would be little different from a Renaissance-era scientist examining a transistor radio from the 1960s using tweezers and a magnifying glass.
I give up. How does a USB port work with only four wires? For example, there is a VGA to USB adapter. How does that work?
You may be right. It sounds like Campbell, come to think of it - and I have two books of his collected letters (very interesting read).
The difference is, the 1930’s scientists would have no idea what they were looking at. Whereas, 1980’s scientists would recognize silicon chips; multilayer circuit boards would make sense; I presume they would at least be able to determine the clock frequency the device operated at, and radio frequencies for telephone and Bluetooth, even if they could not read the signal data. Basically, a jet engine cruise drone using IC’s would be a horse of a different color in the 1930’s, while an iPhone in 1985 is simply a bigger faster horse.
Yep
And here’s a link with some excerpts from the article story identification - Essay in a sci-fi anthology; A missile from (circa) 1968 is found in 1940 - When do scientists think it's from? - Science Fiction & Fantasy Stack Exchange
Power, Ground, and a synced pair of data lines called D+ and D-. By being clever about what you do with the signal on the data lines, you can transmit a crapload of data.
There’s a bunch of protocol stuff that happens, which let you have different bits of data be properly encoded on one end and decoded on the other.
Since the circuitry to do this is pretty small these days, you can embed it in such an adapter.
High-speed serial communications.
USB 2 supports around 50 MByte/second data transfers, so you send the video data out serially, and then convert it to analog VGA in the adapter.
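The bandwidth arithmetic works out, at least for a basic mode. A quick check, assuming 16-bit color (real adapters also typically compress and send only changed screen regions, so they can do better than this raw figure suggests):

```python
# Does uncompressed basic VGA fit in USB 2's ~50 MB/s of usable throughput?
width, height, fps = 640, 480, 60   # classic VGA timing
bytes_per_pixel = 2                 # assumed 16-bit color
frame_bytes = width * height * bytes_per_pixel
stream_mb_s = frame_bytes * fps / 1e6
print(stream_mb_s)                  # 36.864 MB/s -- under the ~50 MB/s budget
```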
USB in some sense only has three wires: Power, Ground, and Data. But the Data is a differential pair because of the speed.
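The differential-pair idea is easy to sketch: the receiver looks only at the difference D+ − D−, so noise that hits both wires equally cancels out, which is what lets the link run fast over a cheap cable. A toy illustration in Python (not actual USB signaling, which adds NRZI encoding and bit stuffing on top; voltages and noise levels here are made up):

```python
import random

def rx(dp, dm):
    """Differential receiver: decide each bit from the sign of D+ - D-."""
    return [1 if p - m > 0 else 0 for p, m in zip(dp, dm)]

bits = [1, 0, 1, 1, 0, 0, 1, 0]
dp = [3.3 if b else 0.0 for b in bits]   # D+ drives high for a 1
dm = [0.0 if b else 3.3 for b in bits]   # D- is driven as the complement

# Common-mode noise hits both wires of the twisted pair identically...
random.seed(1)
noise = [random.uniform(-1.0, 1.0) for _ in bits]
dp_noisy = [v + n for v, n in zip(dp, noise)]
dm_noisy = [v + n for v, n in zip(dm, noise)]

# ...and cancels in the difference, so the bits come back intact.
print(rx(dp_noisy, dm_noisy) == bits)    # True
```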
Here’s another link with more excerpts from the essay, which was reportedly first published in October 1948:
Probably yes (although as noted in another comment, it’s simpler and less risky to take the battery out and supply power directly to the phone itself)
USB has four connectors, and if you look at the end of the cable, two of them are longer than the other two. Those two are power and ground. And the reason that they’re longer is that you really want power and ground to be connected before the data pins, since you can get weird spikes and artifacts if the data pins make contact before power and ground are established.
I’m pretty sure there were combined power/data connectors in the 80s, so this principle is one that an EE would know. So, right there, they have a 50% chance of getting power and ground right.
But they can do better. If you strip off the cable, you’ll see that the wires to those pins are Red and Black. Those are pretty common Power and Ground colors. They might be slightly confused by the fact that one of the data wires is green (also often used for ground), but since it’s got a pin length that indicates it’s not a power line, it should be fine.
They should then be able to attach a bench supply and slowly increase the voltage. Undervoltage shouldn’t hurt anything, and as soon as they get to 5V (probably a bit below that) it’ll start working. The battery symbol looks just like a AA battery, so it should be pretty clear what’s going on.
Of course, it’s possible that they’d do something wrong and break it. But a reasonably-well-trained person with an electronics background ought to be able to get this right.
Another point that folks have danced around, but not addressed fully, is that a lot of what’s in an iPhone isn’t in the iPhone. An awful lot of it is cloudy. This is obvious of, say, the web browser, or the phone communications, but it also applies, for instance, to Siri’s voice interpretation.
That said, even though the engineers of the 80s couldn’t reverse-engineer the whole thing, there are definitely some subsystems they could learn from. They’d probably be able to learn a lot from the batteries, the capacitive touch screen, the accelerometers, and the camera optics, off the top of my head. And a lot of what gave Apple such an edge over their competitors were ergonomics and usability features, which can easily be copied as soon as you see them in action.
Then how was I able to do it in 1976 with only 3 wires? It’s even possible for RS-232 with only 2 wires if you can accept some severe limitations.
And 4-bit microprocessors were in pocket calculators[sup]*[/sup] by the early 1970s (4004), 8-bits by the mid-1970s (8008, 6502, 8080, Z80), not '80s.
* If you had big pockets, that is, both for the bulky units and the $100 a 4-banger would cost.
A lot of what we use iPhones for is in the cloud, but I think most of what an 80s engineer would be excited about is the second category: the hardware itself
The fact that you just touch the glass and it responds so quickly and fluidly that it’s like your finger is moving a physical layer is amazing. And they could figure out pretty quickly it’s capacitive.
Click on the Music app, and holy crap this guy has 700 albums on this thing?! And four full-length movies?
Plus, so many apps that use connectivity will give them tantalizing glimpses at what is possible. Sure, when you click on the phone icon and dial a number, it won’t work. But they’ll still know that this thing is a telephone, and that the radio signals it’s sending out are trying to communicate with something. Try putting your iPhone on airplane mode and nose around to see what kind of information is there, and what conclusions you could draw. WordLens: Magic translation. Venmo: there’s a global electronic payment platform. Twitter: Oh God, what have we done!
The most amazing thing in all would be the “Assembled in China” written on many parts. China?! WTF?