Upper limit on processor speed

I remember reading in some journal that the upper limit for an electronic computer processor is about 10 GHz. The explanation had something to do with bandwidth, but not being an electrical engineer I could not understand it. At first I tried to apply Heisenberg's uncertainty principle: figuring that one billionth of a second is a really small time span, I assumed that put certain limitations on the energy. When I did this I came up with a minimum of about 3×10⁻⁷ eV. But that is a really small number, so I don't think that is the problem.
What is the upper limit and why?
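For what it's worth, the uncertainty-principle estimate above is easy to check numerically. This is just a back-of-the-envelope sketch reproducing the poster's figure; it doesn't settle whether that's the actual limit:

```python
# Energy-time uncertainty check: delta_E * delta_t >= hbar / 2,
# with delta_t = 1 ns (one clock period at 1 GHz).
HBAR = 1.054571817e-34  # reduced Planck constant, J*s
EV = 1.602176634e-19    # joules per electron volt

delta_t = 1e-9                   # seconds (one 1 GHz clock cycle)
delta_E = HBAR / (2 * delta_t)   # minimum energy uncertainty, joules
delta_E_ev = delta_E / EV        # same value in electron volts

print(f"Minimum energy uncertainty: {delta_E_ev:.2e} eV")
# Comes out around 3.3e-7 eV, matching the ~3e-7 figure quoted above --
# which suggests the units there were electron volts, not volts.
```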

When Seymour Cray was designing his supercomputers, he packed all the wiring into one "tower" cabinet. He had determined that by cutting wire length alone he could speed up the processing.

The constraint was the signal propagation speed through the wiring, which is a sizable fraction of the speed of light. (The electrons themselves drift far more slowly; it's the electromagnetic signal that travels at near-light speed.)
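Some rough numbers on why wire length mattered so much to Cray. This is a sketch; the 0.7c velocity factor is a typical ballpark for wiring, not a figure from the thread:

```python
# Signal propagation delay vs. wire length. Signals in a wire travel at
# some fraction of c (roughly 0.6-0.9c depending on the insulation);
# 0.7c is used here as a typical assumption.
C = 2.998e8          # speed of light, m/s
VELOCITY_FACTOR = 0.7

def delay_ns(wire_length_m: float) -> float:
    """Propagation delay in nanoseconds for a wire of the given length."""
    return wire_length_m / (VELOCITY_FACTOR * C) * 1e9

# Halving the wire length halves the delay -- at GHz clock periods
# (1 ns and below), tens of centimeters of wire eat whole cycles.
for length in (1.0, 0.5, 0.1):
    print(f"{length:4.1f} m -> {delay_ns(length):.2f} ns")
```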

So if you keep getting smaller, you can get faster, until you reach the size of the smallest gate.

There was an article a while ago in the NYTimes about the theoretical limit to computing power, based on hardware limits and the laws of physics. The author hypothesized the "ultimate apocalyptic laptop," which is basically a small quantum computer. You think your laptop gets hot? Try the ultimate laptop. It is basically a small sun sitting on your desk, or perhaps melting through it as it gets up to millions of degrees in temperature. The article was really astonishing, but alas, it is now accessible only by paying a $2.50 fee to NYTimes.com. Just search on the phrase "apocalyptic laptop" if you're really interested.

So then why does switching to photonics solve this problem? I see no fundamental reason why photonic devices would be smaller than electronic devices. I am currently doing research in this area for this very reason. According to the article I read, the potential increase in speed is something on the order of 10^6. (Quoting from memory, so give or take a factor of 10^3.)
I think there may be something more fundamental involved, but I must admit I am outside my realm of knowledge.

When I bought my first PC (Leading Edge Model D w/2 5-1/4" floppies, 4Mb memory, 7.14 MHz 8088 processor) I joked with the salesman that, one day, computers would shrink down to the size of a large paperback book, but would need a desk-sized liquid nitrogen tank to keep them cool. Yeah, I know, I kill me, too. Anyways, when IBM created the 1 GHz prototype chip, its biggest problem was overheating and melting. As for single-chip speed, I'm just guessing that there is a feasible limit, so that future computers will be multi-processor ones.

What I wanna know is: whatever happened to holographic memory? Storing files in a glass cube using lasers.

Too bad we’re not in Great Debates. We could have started a PPC vs. Intel war over processor heat generation. :slight_smile:

Think of it this way:

A 1 GHz processor means that one 'cycle' takes one billionth of a second. Light only goes a little under 12 inches in a billionth of a second (and the electrons in microchips don't even travel at the speed of light). Eventually you will have squeezed the maximum yield out of conventional microchip circuits.
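The light-per-cycle figure is easy to verify, and it shrinks fast as clock speed climbs. A quick sketch:

```python
# Distance light travels in one clock cycle, at various clock speeds.
C = 2.998e8  # speed of light in vacuum, m/s
INCHES_PER_METER = 39.37

for freq_ghz in (1, 3, 10):
    period_s = 1 / (freq_ghz * 1e9)                # one clock cycle, seconds
    distance_in = C * period_s * INCHES_PER_METER  # distance per cycle
    print(f"{freq_ghz:2d} GHz: light covers {distance_in:5.2f} inches per cycle")
# At 1 GHz that's about 11.8 inches; at 10 GHz, barely over an inch.
# On-chip signals, being slower than light, cover even less.
```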

There are other problems besides just the speed of electrons in a conductor.

Stray capacitance and inductance play a big part in high-frequency circuit design and can cause no end of problems. There are ways around it, up to a point, but at high frequencies a capacitor acts like a shorted wire and an inductor acts like an open circuit.

With electrical paths as close to each other as they are in a µP, between any two paths there is a "capacitor" (by definition). That capacitance acts more and more like a short circuit as frequency increases.

Any length of conductor acts like an inductor at high frequencies, too. Thus, in order to achieve faster speeds we must shrink the chips. It's a sort of double-shrinking effect: not just shrinking to squeeze more components in, but shrinking to counter inductive paths. Shrinking doesn't affect stray capacitance, though… that's handled separately.
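The frequency behavior described above is just the standard reactance formulas at work. A sketch, with made-up ballpark stray values (1 pF between traces, 1 nH of trace) purely for illustration:

```python
import math

def capacitive_reactance(f_hz: float, c_farads: float) -> float:
    """|Z| of a capacitor: falls toward zero (a 'short') as f rises."""
    return 1 / (2 * math.pi * f_hz * c_farads)

def inductive_reactance(f_hz: float, l_henries: float) -> float:
    """|Z| of an inductor: grows without bound (an 'open') as f rises."""
    return 2 * math.pi * f_hz * l_henries

# Illustrative stray values, not measured from any real chip.
C_STRAY, L_STRAY = 1e-12, 1e-9  # 1 pF, 1 nH
for f in (1e6, 1e9, 10e9):
    xc = capacitive_reactance(f, C_STRAY)
    xl = inductive_reactance(f, L_STRAY)
    print(f"{f:.0e} Hz: Xc = {xc:10.1f} ohm, Xl = {xl:8.3f} ohm")
# At 1 MHz the stray capacitor is ~159 kohm (nearly invisible);
# at 10 GHz it's ~16 ohm -- close to a short, exactly as described.
```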

Let’s keep our fingers crossed for the quantum computer :smiley:

Actually, theorists readily admit that light-based processors would be much BIGGER than your standard silicon wafer.

Why the photonic proc would be better:

  1. As mentioned above, signals on a silicon chip don’t travel as fast as light. Light signals would travel as fast as light (obviously :D).

  2. Light beams can intersect each other. Electronic signals on a chip can’t. By being able to have more than one signal going at a time, you increase the speed.

For those who don’t know, the “Quantum Computer” bases its calculations on the rotation of individual atoms. This is, obviously, the smallest you can get.

Other possible computation devices that are anticipated include “organic computers”, which would use artificial DNA sequences for calculation, and “molecular computers”, which would rely on streams of molecules to carry info. The latter is just a step up from the Quantum.

Sorry to be picky, Spoofe, but you can get smaller than individual atoms - in the context of quantum computing as well.

I think that parallel processing is the way forward - after the physical limits of chip technology are reached.
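One caveat on parallelism as the way forward: the achievable speedup is capped by whatever fraction of the work can't be parallelized (Amdahl's law). A sketch; the 10% serial fraction below is purely an illustrative assumption:

```python
def amdahl_speedup(serial_fraction: float, n_processors: int) -> float:
    """Maximum speedup from n processors when some fraction stays serial."""
    return 1 / (serial_fraction + (1 - serial_fraction) / n_processors)

# With even 10% of the work stuck serial, speedup can never exceed 10x,
# no matter how many processors you throw at the problem.
for n in (2, 8, 64, 1024):
    print(f"{n:5d} processors -> {amdahl_speedup(0.1, n):5.2f}x")
```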