Could the 1980s reverse engineer a modern iPhone?

Whenever I time-travel to the past, I always leave my iPhone at home, to avoid changing the timeline in case scientists of the past find my phone and use it to invent technologies sooner than they should have.
But then I realized: even if a 1980s scientist with a state-of-the-art lab did get my iPhone, he wouldn’t have access to the machines that make the phone, so he wouldn’t be able to replicate it. Most recent computer advances weren’t actually in the iPhone itself, but in the manufacturing processes used to make it. (Transistors are now 14 nm wide and there are over 2 billion of them in an iPhone. Back in the ’80s, transistors were over 1,000 times bigger than they are now, and it took a lot of researchers decades to figure out how to make them that small.) So I’m thinking that even if that 1980s scientist had detailed circuit diagrams of every part of the iPhone, that would be FAR less valuable than the blueprints of the machines that make iPhones. Simply having the phone, without details of the machinery to make it, might not do him much good at all.
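
(Quick back-of-envelope check on that, treating ~1.5 µm as a representative mid-’80s feature size - the exact number varied by year and by fab:)

```python
# Rough scaling check. Assumed numbers: ~1.5 um (1500 nm) features
# in the mid-1980s vs. 14 nm today.
feature_1985_nm = 1500
feature_now_nm = 14

linear_shrink = feature_1985_nm / feature_now_nm   # ~107x smaller linearly
density_gain = linear_shrink ** 2                  # ~11,500x more devices per area

print(f"Linear shrink: ~{linear_shrink:.0f}x")
print(f"Density gain:  ~{density_gain:,.0f}x per unit area")
```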

So the next time I travel back in time, how much damage would it do to the timeline if a 1980s scientist stole my iPhone? Assume he’s backed by a large company with an essentially unlimited budget and a large research team to reverse engineer it; heck, let’s even assume he stole detailed circuit diagrams of the iPhone too. But without knowledge of the manufacturing process, is there any realistic chance he would be able to manufacture his own iPhone within, say, 10 years (the mid-1990s)? Or would he have to wait until the manufacturing technology caught up in the mid-2000s?

He wouldn’t have access to the machines that make the machines that make the machines that make the phone.

Seriously, I think that completely answers the question. The best electronics engineer of the time would say, “Wow, that’s either magic or from the future, and I don’t believe in magic.”

I think you have a pretty good answer there. However, there would be lots of clues to be had from access to the actual iPhone that could allow major changes to the timeline. It is a truism that a very significant part of invention is simply knowing that something is actually possible. In the ’80s, VLSI technology was very limited. As you say, feature sizes were immense in comparison to modern processes. And although Moore’s Law was understood, and there were no foreseen barriers to continuing reduction in feature size for some time, nobody really imagined it could be pushed as far as it has been. You could expect that an iPhone would eventually be dissected and the internals imaged with electron microscopes. That would reveal a lot of highly valuable information: the geometry of very tiny transistors and, just as importantly, RAM cells. That would inform engineers that (1) this was a workable design and (2) such things were actually manufacturable. Even the design rules could be inferred, which would inform efforts for years to come.

It would not be simply a matter of “we have no idea how to build this” but rather “wow, it is viable to make transistors that look like this” - and that would spur innovation in a very targeted manner. Heck, knowing that it was possible, those scientists might actually come up with better ways to do it. They would not be constrained by decades of conventional wisdom and accidents of history railroading their thoughts. They would not be churning out iPhones anytime soon, but the knowledge inside one could easily be expected to cut a decade or so off the development. (Given that the iPhone has an ARM inside, the errant phone might have the happy consequence of killing off the x86 at birth.)

It’s funny, though, that the OP specified “the ’80s.” Even though the technology in 1989 would have come up short, it would still have been many, many times closer than the technology of 1980. Remember, Moore’s Law talked about doubling in density every 18 months. That’s a ~2^6 (roughly 100x) improvement over the decade.
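
(Checking that arithmetic, taking the 18-month doubling as given - the law has been stated with different periods over the years:)

```python
# Doublings over a decade at one doubling per 18 months.
doublings = (10 * 12) / 18        # ~6.7 doublings
factor = 2 ** doublings           # ~100x
print(f"{doublings:.1f} doublings -> ~{factor:.0f}x density improvement")
```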

The microprocessor was invented in the 1980s, so you’d have a very primitive version of one. There were no cell towers, so using it as a phone is out right away. The camera would have been a toy, at best. Flat/touch screens were still a few years away.

You’d be premature on quite a few technologies, but the engineers would easily understand how things were supposed to work, if someone could explain it to them.

ETA: responding to Francis Vaughn’s post.

Sorry, but I don’t buy it.
Knowing that it is possible to make a 14 nm transistor doesn’t tell you anything about how it was made. What lithography was used? I used to work in e-beam lithography (in the ’80s), and that was always seen as the savior, but it’s gone pretty much nowhere - deep UV and phase-shift masks have extended optical lithography far past the point where it was predicted to end back then.

Then there’s all the process technology required - steppers, photoresists, ion implanters, defect density reduction, etc.

Basically, at any given time, the most complex IC manufacturable is governed by the available process technology. Knowing that a more complex IC is possible doesn’t buy you very much - it’s always understood that there are more sophisticated ICs on the horizon.

:smack::smack:

Microprocessor in the 1970s, not 1980s.

But is the limiter the processor? Or even the memory? You couldn’t have a phone that you could stick in your pocket, but (cell towers aside*) I imagine you could make a laboratory version of the phone that was more like an appliance. And it might not have a touch screen, but you could have a mouse.

*And even then, we did have some early versions of cell phones in the ’80s, so some cellular technology was out there in some places.

Not a direct answer, but…

My father had a brilliant but erratic student who went on to work for Intel designing the pin architecture of chips–I’m sure that’s not the correct term–I mean the way the pins connected to the chip. As I understand it, his designs for this were so advanced that they influenced the way the chips themselves were built. So I can imagine that an engineer might look at something beyond her experience and say, we can do better here.

Sadly he flamed out–loved that guy. Burned brightly but fast.

So, was Gene Roddenberry really a nice guy back in the early '60’s? … that was sweet of you to front him this story-line … it worked quite well I think …

There is more to it than the fabrication technology, which by itself would make it impossible. The early 1980s used 1-2 micron (i.e., 1,000-2,000 nanometer) semiconductor fabrication. There were essentially no digital oscilloscopes, and no logic analyzers capable of sampling at the rate required to even examine the iPhone data bus outside each chip package.
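
(To put rough numbers on the sampling problem - both figures here are illustrative assumptions, not published specs:)

```python
# Why a 1980 logic analyzer couldn't watch a modern bus.
# All figures are illustrative assumptions.
bus_mhz = 100               # assumed order-of-magnitude bus clock
nyquist_mhz = 2 * bus_mhz   # bare minimum; real tools want 4-10x oversampling
analyzer_1980_mhz = 50      # a generous figure for a 1980 instrument

shortfall = nyquist_mhz / analyzer_1980_mhz
print(f"Need >= {nyquist_mhz} MHz sampling; a 1980 analyzer at "
      f"~{analyzer_1980_mhz} MHz falls short by {shortfall:.0f}x or more.")
```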

Most of the complexity in the iPhone is not in the chip-to-chip “circuitry”, but inside the chips. In 1980 the very first primitive Programmable Logic Arrays were being used, which could replace a few discrete chips. Today FPGAs can do entire systems. Today, fabrication (whether FPGA, ASIC, microprocessor, or System on a Chip) is useless without supporting suites of test and verification software and tools. So in 1980 they didn’t have the ability to design the chips used in an iPhone, could not test the designs, did not have the semiconductor technology to fabricate them, nor the ability to test those.

I think in 1980 most IC design was done manually. To my knowledge, layout and routing tools were not widely used until later. The sophisticated computer-aided circuit design process and Hardware Description Languages like Verilog did not exist then. When Seymour Cray designed the Cray-1 supercomputer, he used a pencil and pads of yellow grid paper. He solved the Boolean logic equations by hand. That computer shipped in 1977, only three years before 1980.
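
(As a toy illustration of what the later tools automate - this is nothing from a real CAD package, just a brute-force truth-table check of the kind Cray would have worked through on paper:)

```python
from itertools import product

# Two candidate implementations of the same 3-input function.
f = lambda a, b, c: (a and b) or (a and c)   # original expression
g = lambda a, b, c: a and (b or c)           # factored version

# Exhaustively compare truth tables -- trivial for a machine, tedious
# and error-prone across thousands of gates by hand.
equivalent = all(f(*bits) == g(*bits) for bits in product([0, 1], repeat=3))
print("equivalent:", equivalent)   # True (distributive law)
```

Scale that up to tens of thousands of gates and you can see why doing it by hand was a superhuman feat.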

Scientists in a large lab in 1980 might be able to de-cap the chips in an iPhone and physically examine them with an electron microscope, but they could no more reproduce that than a scientist today can reproduce a living ant, with all its sophisticated behaviors, after examining it under a microscope.

So they didn’t have the digital analysis tools to examine and reverse-engineer the iPhone, they didn’t have the circuit design and layout software tools even if someone gave them the blueprints, they didn’t have the fabrication tools to make the required silicon, and they didn’t have the software technology to write the operating system.

Then there is cost. In the early 1980s a new semiconductor fabrication plant probably cost in the tens to hundreds of millions of dollars (if that). Today they often cost $5 billion, and TSMC just announced plans for a $16 billion fab. That’s what it takes to make the chips that make mobile devices work.
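
(The implied growth rate is striking. Taking $100M as a rough, assumed early-’80s figure:)

```python
import math

# Implied annual growth of fab cost: assume ~$100M in 1982 -> $16B in 2016.
cost_1982 = 100e6     # rough assumed figure
cost_2016 = 16e9      # the TSMC number cited above
years = 2016 - 1982

growth = (cost_2016 / cost_1982) ** (1 / years) - 1
doubling_years = math.log(2) / math.log(1 + growth)
print(f"~{growth:.0%} per year, doubling roughly every {doubling_years:.1f} years")
```

That lands in the neighborhood of Rock’s law, the old observation that fab cost doubles roughly every four years.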

In short, they wouldn’t have had the tools to make the tools to make an iPhone - and maybe another level or two beyond that - and probably couldn’t afford it either way.

I remember installing autorouting software on my Windows 3 box in the late ’80s from a 5.25" floppy I purchased at one of the local component supply stores. Can’t remember the name of it right now, but I was using it to design and lay out custom cartridges for my Commodore 64.

I remember now… EAGLE. (I think. There were a couple of others out there.)

There are also other technologies that could be scavenged from an iPhone. Touch screens were an infant tech at that time, and I don’t think high-resolution color LCD screens were seriously on anyone’s drafting board then. Or is the latest version OLED?

You are correct about them not having the tools to make the tools, probably to the power of at least 10. They would be able to discover enough to avoid some of the dead-end research lines. Not that that would necessarily be a good thing; science and engineering learn at least as much from their failures as from their successes.

So, did doubled give Roddenberry a piece of the action? :smiley:

Imagine taking your iPhone back in time, with the Lightning cable but not the wall brick. Would they be able to reverse-engineer the charging system without shorting it out, or worse? Then you could at least ooh and aah over the games and other apps that don’t need a cell/data connection. Even with the older 30-pin connector, could all the handshaking and voltage negotiation be figured out? If you know the phone takes 5 volts at 1 amp (thus 5 watts) to charge, would someone in 1980 or earlier be able to figure out how to deliver that to the right pins? Of course, if you did have the wall brick, you could go back to the 1800s and keep your iPhone charged with very little issue, since the input voltages are listed right on it and are very easy to produce.
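
(For what it’s worth, the 5 V x 1 A part is the easy bit; the handshake is the catch. Apple chargers are widely reported to advertise their current capability with fixed DC voltages on the USB data lines, set by plain resistor dividers - something a 1980 engineer could both measure and reproduce, if he knew to look. The voltages and resistor values below are commonly reported/illustrative, not official spec:)

```python
# Divider arithmetic for the widely reported Apple charger scheme: fixed
# DC voltages on the USB D+/D- lines (around 2.0 V and 2.8 V) tell the
# phone how much current it may draw. Illustrative values, not a spec.
V_BUS = 5.0

def divider_out(r_top: float, r_bottom: float) -> float:
    """Output of a simple two-resistor divider from V_BUS to ground."""
    return V_BUS * r_bottom / (r_top + r_bottom)

# Example resistor pairs that land near the reported signalling levels:
print(f"{divider_out(75e3, 50e3):.2f} V")  # ~2.00 V level
print(f"{divider_out(39e3, 50e3):.2f} V")  # ~2.81 V level
```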

Well, if you do forget to leave it at home next time, at least set the passcode so that they cannot get into it. Hackers were not really less smart in 1980 than they are now. If they could break in and examine the system, that would greatly influence the trajectory of hardware and system design.

And more importantly, what do you have on it? Do you want nerds from the '80s finding out about rule 34 that early?

Screw the 1980s, there would be no hope of building a first-generation iPhone even in 2000. The proof is that people were already trying and couldn’t do it at all. The hardware simply wasn’t there, even if you knew how it all worked and had an unlimited budget to build the software. I work in IT, and this stuff moves really quickly. You are looking at whole generations of tech in just a few years, and it is still moving as fast as ever. ’80s technology still has nostalgic value, but it is wagon-train level compared to today. The ’90s were a Ford Pinto.

Let me put it this way: could you build today the virtual reality devices you will see everywhere in a few years? The answer is no, because people have already spent a lot of money developing that technology, partially failed, and are trying again with better success. It will take a few more years for virtual reality to become viable.

I was an early adopter of the web. I always tell people that you could do most of the things back then that you can today, but you had to be tech-savvy, and there wasn’t much content in 1994-1995. By 1996 I had something that could do most of the things an iPhone can do today, but it was a full-sized computer with a CRT monitor. You certainly weren’t going to put it in your pocket or even take it off your desk. Tech years are even shorter than dog years.

It would be nice to just go back to 1980 and say, “This is a USB. This is the standard. Carry on.”

What the fuck? Only 4 wires? Everyone knows you need at least a full DB-25 connector for serial – assuming you can get enough throughput with serial. And what is this dynamically configuring hot-swappable bullshit? Get serious. Next thing you know, you will be telling me they are using FGMOS for non-volatile storage. Or that they make use of more than 640K of memory.

I agree, to a point. We always knew there was more to come - but there were clear roadblocks that no one had much idea how to surpass. Like you say, e-beam lithography was one idea that didn’t go that far. But here you would have clear and indisputable proof that a feature size way, way smaller than anything on even the most far-reaching technology roadmap was possible, and some clues about how it would look. That starts the thought processes going: how accurate do the steppers need to be? (Holy shit! But it is clearly possible, so I wonder how I could make one… And so on.) Knowing that something close to science fiction isn’t fiction but cold hard fact crosses a huge set of gaps in terms of enabling action. I’m not saying that having an iPhone in your hand would allow you to reproduce one in short order. But having one in your hand would likely make some significant changes in the manner in which the existing technology developed. In a parallel universe where the OP did forget to leave his iPhone behind, I very much doubt the parallel 2016 would merely be pacing us now. It could easily have gained a decade, maybe more.

No, but at its heart it’s still radio, so they could work out that much. Figuring out the modulation and encoding is another thing entirely.
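
(A toy illustration of the encoding problem: the same bits look completely different under different line codes, and that’s before a real cellular link stacks modulation, channel coding, and encryption on top. NRZ and Manchester here are just two simple examples, not what the phone actually uses:)

```python
# Same bit stream under two simple line codes -- an eavesdropper who
# guesses the wrong scheme recovers garbage. (Illustrative only; one
# common Manchester convention: 1 -> high/low, 0 -> low/high.)
bits = [1, 0, 1, 1, 0]

nrz = list(bits)                                   # 1 -> high, 0 -> low
manchester = [x for b in bits
              for x in ((1, 0) if b else (0, 1))]

print("NRZ:       ", nrz)         # [1, 0, 1, 1, 0]
print("Manchester:", manchester)  # [1, 0, 0, 1, 1, 0, 1, 0, 0, 1]
```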

I agree that careful examination of an iPhone by scientists and technicians in 1980 probably wouldn’t result in them being able to reproduce it (just from the likely damage of taking it apart, if nothing else).

I was going to say that there are a bunch of things in there that could potentially provide a technology boost at the time - but actually, so much of the technology is reliant on the exact process that was used to create it that I’m not sure many of the pieces are particularly ripe for reverse engineering.

For example, even the white LEDs used to backlight the screen: they’re fairly macroscopic, but taking them apart would only tell people what they already knew at the time - gallium nitride is a good candidate for blue LED emitters, and blue LEDs with a yellow phosphor make whitish light.
What they would not be able to figure out is the fairly specific process by which gallium nitride is made into a suitable material for LED emitters. That would still have to be invented; I don’t think it could be figured out backwards.

Lithium battery technology might get a boost. Despite their tendency to self-destruct when opened, and although development was already underway in the ’80s, having a modern example of a high-capacity lithium cell would probably bump the process forward by a few years.