Could the 1980s reverse engineer a modern iPhone?

That’s an interesting point, because one other thing nobody has touched on is what it would tell them about the future of the user interface and how they might implement some of that technology. They may not have the capabilities to pull off all the graphical components, especially the gestures and animations, but it would show them an approach to the interface that works.

Granted, today’s iOS and macOS interfaces are noticeably less intuitive (or discoverable if you prefer) than they were even just five years ago, and that’s a legitimate criticism, as is the excessive thinness and overall washed out appearance due to a fetish for grey on grey. Nevertheless, Apple was a pioneer in user interface design in the 1980s, and if they got a hold of today’s iPhone, even with its flaws, I have no doubt they would be able to extract a lot of valuable lessons from it.

This makes one wonder just how much more touchscreen-focused development would be from that point on, because they wouldn’t have the equivalent modern desktop OS to reference and compare to. I can see it either taking the computer industry in a totally unexpected divergent direction, hopefully for the better, or it could also be an unmitigated disaster trying to shoehorn a bastardized touch interface into products where it makes no sense and doesn’t work.

In the stated scenario, “a 1980s scientist stole my iPhone”, they would not be able to even get past the lock screen of a current iPhone 7. They could not disassemble it and read the memory, since it is encrypted against attack even by today’s technology. The iPhone 7’s Secure Enclave encryption is highly resistant to attack even by the FBI and NSA – today – much less in the 1980s.

Assuming the battery was alive, they could tell by looking at the lock screen it had some kind of advanced touch UI and a high resolution display. But that would not enable them to somehow fabricate one within 10 years – what the OP asked.

A perceptive 1980s examiner might conclude it looks vaguely similar to the Xerox Star GUI (which existed in 1981), but for all he knows the iPhone is a specialized one-off device, not representative of future GUI trends.

If the examination took place in 1983 or later, he would probably know about the Apple Lisa and Mac, then notice the Apple logo and lettering on the iPhone. He would be amazed, like the time-traveling pilot who saw Star Trek computer technology in the fictional episode: Tomorrow Is Yesterday - Wikipedia

But that amazement would not translate into ability to recreate the technology much sooner.

The influential value would not be in the iPhone itself, since it could not be reverse engineered in the 1980s. Rather it would be in the associated trends indicated by the phone – IF that additional information accompanied it. 1980s scientists would have no way of knowing, by mere examination, that this single iPhone represented future ubiquitous trends in technology. IOW they would not know how representative that iPhone was. For all they knew it could be an experimental prototype or a $100,000 device for rich people.

So the value would be in information, which need not even involve the iPhone. Of more benefit would be a single-page paper note assuring them the future would be dominated by touch GUI interfaces and digital Software Defined Radio (SDR) technology which would replace analog RF methods: Software-defined radio - Wikipedia

But even then they would be dependent on layer after layer of technological advancement to achieve that. The ADCs and signal processing to implement SDR in the microwave spectrum would not exist for many years. Those advancements could not be greatly accelerated.

The limitation to making an iPhone in the 1990s is not lack of ideas or imagination. What enables an iPhone 7 is many decades of advancement across a wide range of technical areas: software, semiconductor design, routing, simulation and testing, chip fabrication, packaging, RF engineering, GUI design, etc. All those must simultaneously exist at an advanced level.

That technology is also only possible due to societal and economic changes which harness titanic capital investments over decades. As electronics, computing and communication shifted from a specialized niche to a broad economic segment, this enabled spending the required vast R&D in many areas – because it would be paid back by sales to a gigantically expanded customer base. That is how Intel and TSMC can afford to spend $15 billion each on a fab – they aren’t making a few “Apollo-style” devices for the military or industry. They are making 100s of millions for regular people.

E.g., global smartphone revenue totals $421 billion per year. That is 4x the entire 10-year Apollo moon project – every single year. Without that economic system to support the R&D (which didn’t exist in the 1980s) it would be impossible to fund the technology to make an iPhone in the 1990s. IOW it would have been beyond the economic grasp of a nationalized research project.

The OP asked whether examination of a 2016 iPhone would enable 1980s scientists to build a smartphone in the 1990s. Apple themselves tried to build a mobile device in the 1990s – the Newton. The technology was not then available for a truly effective device.

The people working on the internals would probably recognize many of the company names or logos on the components. Wouldn’t they then use that info to enrich their future selves by making better guesses about which electronics companies to invest in? Knowing Apple and Motorola still exist 30 years in the future, and that the components aren’t from some of the other tech giants of the day, would tell me something.

Just exactly what I was going to say. They may not know how to do it but they’d know who was going to be able to do it. Today they’d be wiping their asses with $20 bills.

Default is still a 4-digit passcode, with no erase after failed logins, so they could brute-force it in a few days at most.
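For scale, a back-of-the-envelope sketch of that brute force. The two-second attempt time is an assumption, and it ignores iOS’s escalating failed-attempt delays:

```python
# Rough worst-case time to brute-force a 4-digit passcode by hand.
# Assumptions: ~2 seconds per attempt, no lockout delays (real iOS
# adds escalating delays after repeated failures).
combinations = 10 ** 4             # 0000-9999
seconds_per_attempt = 2
worst_case_s = combinations * seconds_per_attempt
print(worst_case_s / 3600)         # ~5.6 hours worst case
```

So even by hand, a plain 4-digit code without lockout falls in hours, not days.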

Um, no.

I doubt they could engineer it, but I’ll bet they could get funding for research that would knock off a few years, and the emphasis on design would start earlier.

You do realise that the DB25 was merely a hangover from TTL-level signalling used with, e.g., DEC printers. They used the DB25, and then all sorts of devices used the DB25: printers, TTYs, terminals, keyboards, and so on.

The DB9 could still run an RS-232 modem. It carries more than three lines so that the CPU was relieved of the burden of watching for carrier and disconnect, and of handling flow control.
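For reference, the standard RS-232 assignments on those nine pins, laid out as a Python dict purely for readability:

```python
# Standard DE-9 ("DB9") RS-232 pinout: the full modem control set,
# not just the three data/ground lines.
db9_pins = {
    1: "DCD (data carrier detect)",
    2: "RXD (receive data)",
    3: "TXD (transmit data)",
    4: "DTR (data terminal ready)",
    5: "GND (signal ground)",
    6: "DSR (data set ready)",
    7: "RTS (request to send)",
    8: "CTS (clear to send)",
    9: "RI  (ring indicator)",
}
# DCD and RI let the UART interrupt the CPU on carrier or ring changes,
# and RTS/CTS provide hardware flow control -- the "more than 3" lines.
```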

Anyway, USB works with differential signalling. The receivers are op amps with a high common-mode rejection ratio, which gives the link noise immunity, so far less energy per bit can be used.
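A toy sketch of why that differential trick buys noise immunity. The voltage levels here are illustrative stand-ins, not USB’s actual spec:

```python
# Toy illustration of differential signalling: the same noise hits
# both wires of the pair, so subtracting D- from D+ cancels it.
# (Simplified: real USB receivers are comparators, and the 0.4 V
# half-swing is an illustrative number, not the USB spec.)
import random

bits = [1, 0, 1, 1, 0]
received = []
for b in bits:
    level = 0.4 if b else -0.4         # differential half-swing, volts
    noise = random.uniform(-1.0, 1.0)  # common-mode noise, same on both wires
    d_plus = level + noise
    d_minus = -level + noise
    received.append(1 if (d_plus - d_minus) > 0 else 0)

print(received == bits)  # True: the noise cancels in the difference
```

Even with noise larger than the signal swing itself, the difference D+ − D− is always ±0.8 V, so every bit is recovered.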

Back to the OP…

The iPhone would provide reverse-engineered info for more rapid development to the same technology level. The exact doping and layout (e.g., capacitor size, capacitor locations, and that sort of thing) used for the extremely small node size (10 nanometres?) in the iPhone could be measured. It would also demonstrate that such a tiny circuit would work, and they’d be confident enough to make more aggressive shrinks. (Do chip companies purposely take smaller steps to sell more chips?)

If one company had a monopoly on the market, that would be a smart business move. But with competition, you need to put out the best that you can as fast as you can so that The Other Guy doesn’t beat you to it.

Ooh, capacitors, that’s another thing they could learn from. An iPhone probably doesn’t contain any supercapacitors, but even for ordinary capacitors, the tech is a lot better now than it was in the 80s, and that’s mostly just chemistry.

I stand corrected.

Ok, so we have to hope that there’s no password set, or that it is a very easily-guessable one.

You are assuming the OP has left. xkcd: Security

Could they reverse engineer it?

I am sure they would easily understand what it was and how it works.
I think they could replicate it to a point (no cell network to put it on in 1980).

But I am pretty positive that they would never replicate it in the same form factor as they found it.

Does it still count as an iPhone if it’s now a big metal box sitting on your table,
running off of a big 12 V lead-acid battery?

Back in 1995 you could buy a Silicon Graphics Power Challenge with a RealityEngine graphics system. This came in an XL cabinet, which is the size of a fridge. A reasonably high-spec machine comes close to the basic specifications of an iPhone. These machines came in at about $1 million. (A fully strapped-up machine rated in the top 500 supercomputers – not near the top, but it got in.)

The problem with “replicating” an iPhone is the time scale you give yourself. If you get the iPhone in 1980, when the best VLSI technology put down one transistor for each million in an iPhone, you have a gap that is going to take some time to bridge. If you give yourself 15 years to make a big box that replicates most of the capability, you have not gained anything.

Au contraire

We’ve also pretty well established they couldn’t, without a few *years’ *R&D, even build the instruments to measure the signals it exposes to the outside world, much less the ones it uses internally.
So no; they couldn’t “… understand *how *it works or replicate it to a point …”. Their understanding would stop at the basic “it’s a (magical degree of) integrated-circuit digital computer.” After that it’s PFMM (Pure Fu**ing Molecular-scale Magic) in a world where “insanely tiny” = 0.1 mm on a side.

The UX engineers would learn a lot more than would the EEs.

I look at it this way: it took thousands of years of the smartest thinkers on the planet studying mathematics in order to come up with calculus. Once the first guy figured it out, it was just a few short steps before we were teaching it to 11th graders.

The point is that it can be very difficult to figure out how to do things from scratch. Those same things are relatively easy once you have a blueprint.

There is a ton of hard-won technological knowledge built into an iPhone. Smart scientists with access to money and a lab are gonna get that out. They are going to figure out how to interface with it, even with slow 1980s computers. They are going to get the software for modern wifi, Bluetooth, cell phone management. Give scientists an iPhone in 1980, and I’d wager it will make the next 30 years of computer and semiconductor innovation pass in 5-7 years.

Getting it to something that can sit on a table and run off of a single car battery would be almost all of the way there. But they couldn’t even come close to that. They’d probably have a hard time with an entire lab building full of electronics, hooked up to its own dedicated power substation.

Dear God, what have we done. This is worse than bringing back a Sports Almanac.

It is like that with pure theory, but not with technology. E.g., it took hundreds of years of scientific progress before Einstein’s theory of relativity. Today a young physics student learns it easily, and it doesn’t seem so difficult provided someone already discovered it.

Technology – the application of science – is not like that. It is gradual growth and refinement, each layer of knowledge building on (and dependent on) the previous layer. New tools are first required before the next phase of technology can be handled. Those tools themselves are in turn composed of many advances which are each required to enable those tools.

E.g., an iPhone’s CPU cannot be made without state-of-the-art photolithography machines. Those cost tens of millions of dollars each and are themselves the product of decades of research. That is only the hardware – sophisticated hardware-description languages such as VHDL, and the support tools to design the chip, must also exist: VHDL - Wikipedia

Also the layout, routing, and simulation software (and high-end computers to run it) must also exist, else the chip would never function, even if you could design it. The days are long past when a “breadboard” prototype of a chip is made and tested.

Trying to design the iPhone 7’s 16nm A10 CPU without all these tools and the knowledge to use them would be like trying to build a Saturn V moon rocket from bamboo strips and glue.

First, the iPhone 7 is hardware-encrypted so thoroughly that it’s unclear if the FBI and NSA can break into it – today.

Second, even if it were not encrypted, they had no technology in the 1980s to even read the 2 GHz bus signals from the A10 CPU. The most exotic technology of that era was the gallium arsenide Cray-3, which ran at 474 MHz. There were no oscilloscopes or logic analyzers back then which were fast enough, so Cray had to commission custom-built gallium arsenide instruments. And that was the limit back then. They did not have the ability to sample signals 10 times that fast.
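The arithmetic behind that gap is just the Nyquist criterion. A minimal sketch, with the 2 GHz figure taken from above:

```python
# Minimum sampling rate to capture a signal, per the Nyquist criterion.
# (Bare theoretical floor; real logic analyzers oversample well beyond 2x
# to resolve edges. The 2 GHz bus figure is from the discussion above.)
bus_clock_hz = 2e9                # ~2 GHz A10-era bus signal
nyquist_rate = 2 * bus_clock_hz   # 4 GS/s absolute floor
cray3_clock_hz = 474e6            # fastest exotic logic of the era
print(nyquist_rate / cray3_clock_hz)  # instrument must run ~8.4x the Cray-3 clock
```

And 2x is the bare minimum just to detect the signal; usefully reconstructing the waveform pushes the required rate still higher.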

They wouldn’t get anything, since it’s encrypted and they cannot read it. Even if it were not encrypted, they still couldn’t read it, because the instruments didn’t exist. Besides the CPU and software, on the RF side the iPhone 7 is a software-defined radio: Software-defined radio - Wikipedia. Modulation, demodulation, and filtering take place entirely within the digital domain. There was no technology in the 1980s to directly digitize gigahertz RF signals.

As shown in the SDR article, the theory and concept of purely digital radios had long been envisioned. It would have been no help to tell a scientist from the 1980s that, given a future 4 GHz 12-bit ADC and a 2 GHz CPU, he could build a purely digital radio that could modulate RF at 1.8 GHz. He already knew that, and was dreaming of the future day when technology would make it possible. That did not happen in a consumer package until well into the 2000s, because it was not technically possible before then – not because they didn’t have the idea.
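The all-digital processing an SDR does can be sketched in miniature. A toy example with kHz-scale stand-in numbers (purely illustrative; a real SDR uses proper filters, not a moving average):

```python
# Minimal sketch of the SDR idea: once the RF is digitized, mixing and
# filtering happen entirely in software. All frequencies here are
# scaled-down stand-ins (kHz instead of GHz) for illustration.
import math

fs = 100_000          # sample rate, Hz (assumed)
fc = 10_000           # carrier frequency, Hz (assumed)
n = 2000

# AM signal: a carrier whose amplitude follows a slow 100 Hz message
samples = [(1 + 0.5 * math.sin(2 * math.pi * 100 * t / fs))
           * math.cos(2 * math.pi * fc * t / fs) for t in range(n)]

# Digital downconversion: multiply by a software local oscillator...
baseband = [s * complex(math.cos(2 * math.pi * fc * t / fs),
                        -math.sin(2 * math.pi * fc * t / fs))
            for t, s in enumerate(samples)]

# ...then low-pass filter (a crude moving average) to reject the 2*fc
# image and recover the message envelope
win = 50
envelope = [abs(sum(baseband[i:i + win]) / win) * 2
            for i in range(n - win)]
# envelope now tracks 1 + 0.5*sin(...): the demodulated 100 Hz message
```

Every step after the first list (the “ADC samples”) is pure arithmetic, which is exactly why the whole scheme waits on ADCs and CPUs fast enough to do that arithmetic at gigahertz rates.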

“Stone knives and bear skins” https://www.youtube.com/watch?v=yfJXd0rSCqo