I have to say that is one of the most poorly written NYT articles I have ever read. Anything is possible there. I was especially puzzled by their comments comparing really fast chips to making Wi-Fi device chips out of silicon. For such a big breakthrough, I am also surprised it will be published in a second-rate letters journal.
That said, I am fairly certain that typical cell phones do not operate at 2 GHz. I have no idea why they would need to be that fast. Most of them have chips that operate at a small fraction of that speed.
“On a pure CPU / memory / display / communications level, most modern cell phones should be considerably better gaming platforms than a Game Boy Advance. With Java, on most phones you are left with about the CPU power of an original 4.77 MHz IBM PC, and lousy control over everything.”
Most cell phones that I programmed for (and for which I knew the processor) were using an ARM7 or ARM9 processor. These days they are probably all ARM9s, which, according to Wikipedia, run at around 200 MHz.
I was able to get a 3D box spinning on a phone without any hardware support*, so the processor can’t be that bad. From what I could tell, though, screen writes, flash writes, etc. were all pretty slow. So doing a lot of math was fine, but interaction with any of the other components wasn’t very good. I think they cut a lot of cost by putting in pretty cheap wiring, which effectively slows the whole phone down, while of course still putting the CPU speed on the box so that it looks like it can do impressive stuff.
* Had to make my own fixed-point math functions, including sin(), cos(), etc. Pretty fun!
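For anyone curious, a minimal sketch of what those hand-rolled functions tend to look like: a precomputed lookup table, with angles measured in 256ths of a turn and results in Q8 fixed point (scaled by 256). The table size, scaling, and names here are my own choices for illustration, not from the original post.

```python
import math

# 256-entry sine table in Q8 fixed point. On a real no-FPU phone you'd
# bake this table in at build time rather than compute it with floats.
SINE_TABLE = [round(math.sin(2 * math.pi * i / 256) * 256) for i in range(256)]

def fp_sin(angle):
    """sin() for an angle in 256ths of a turn; result is Q8 (value x 256)."""
    return SINE_TABLE[angle & 0xFF]   # masking wraps the angle around

def fp_cos(angle):
    """cos(x) = sin(x + 90 deg); 90 degrees is 64/256ths of a turn."""
    return fp_sin(angle + 64)
```

Spinning a box then just needs these plus fixed-point multiplies for the rotation matrix, no floating point anywhere.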
The CPUs don’t, but that doesn’t mean that some small special-purpose processor doesn’t. I don’t know enough about signals and EE to know whether sending and receiving a 2.4 GHz signal requires silicon running at that frequency as well.
The article is referring to a single transistor operating at that frequency, not a processor, or any kind of digital logic for that matter. More to the question, a cell phone certainly has no need of a CPU operating at 2+ GHz speeds, although I think cell phones do contain signal circuitry that operates at the frequency of the transmissions (2.4 GHz). That’s probably what the article was referring to.
The article talks about an IBM achievement. 2 GHz is a bit low, but perhaps the journalist just heard “Cell” and got it horribly wrong.
Someone got totally confused.
A lot of cell phones these days operate at a radio frequency of around 2 GHz. This is the frequency that they transmit and receive data on, not the frequency that the CPU operates at. As Sage Rat said, the actual operating frequency of cell phone CPUs is more in the couple of hundred MHz range (or, in other words, around 0.2 GHz).
Why you would want to compare CPU clock frequencies to radio transmission frequencies is beyond me.
My best guess is that an engineer was trying to explain clock speeds, and said that most computers run at about 3 GHz today (the 3 GHz figure is mentioned later in the article) and was trying to explain to a non-technical person just how fast that is by comparing it to radio waves. I dunno. Just a guess.
I’m also less than impressed by the fact that the extremely high speed was reached using cryogenics. High-speed circuits are already reaching speeds of about 50 GHz, so while this probably is a breakthrough, it’s not a huge breakthrough of many orders of magnitude, which is what someone is trying to imply.
Don’t you need some kind of signal processor that can go at 2.0 GHz to receive the signal?
The radio-frequency signal (frequency-modulated carrier at ~2.45 GHz for the cell phones under discussion) is heterodyned down to an Intermediate Frequency (IF), typically of a few MHz, using a fixed-frequency Local Oscillator (LO) and a non-linear Mixer. This IF signal may be digitized directly and processed by a DSP (Digital Signal Processor), or an additional LO and mixer may be used to further lower the signal frequency before digitization and processing.
To perform DSP on a frequency-modulated signal, the clock speed of the processor needs to be many times the frequency of the signal. The only silicon devices that work directly on the 2.45GHz RF signal are analog amplifiers and the first-stage mixer down to IF.
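A toy numeric illustration of the mixing step described above. The LO frequency here is invented for the example, chosen to land the IF in the few-MHz range; real handsets pick their own plan.

```python
# A mixer multiplies the RF signal by the LO; the products contain the sum
# and difference frequencies, and a filter keeps only the difference (the IF).
RF_HZ = 2.45e9      # frequency-modulated carrier under discussion
LO_HZ = 2.445e9     # assumed fixed-frequency local oscillator (illustrative)

sum_hz = RF_HZ + LO_HZ       # ~4.895 GHz product, rejected by the IF filter
if_hz = abs(RF_HZ - LO_HZ)   # ~5 MHz intermediate frequency, kept

print(if_hz / 1e6)  # ~5.0 MHz -- now digestible by a few-hundred-MHz DSP
```

The point being: after this one analog stage, nothing downstream ever has to run anywhere near 2.45 GHz.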
The article was about IBM’s 500 GHz uP. He was trying to illustrate how fast that was, and mistook the RF of a phone for the clock speed of its uP.
The exact nature of the device is unclear from any of the reports that I’ve seen thus far. The OP’s linked NYT article just uses the term “chip” with no further description, which could refer to just a no-feedback analog amplifier with a transition frequency of 500GHz.
This site calls it a “silicon-germanium processor”, but doesn’t say how much processing it does at that clock speed. It’s a pretty safe bet that the first device clocked at that frequency is not going to be a μP or CPU by any reasonable definition of the terms (i.e. from the lowliest Pic to the Pentium-class CPUs that we’re all familiar with). It’s much more likely that it’s a low-level digital building block such as a flip-flop, possibly with some Boolean logic capabilities (similar to the basic logic block of an FPGA).
Usually when process breakthroughs happen, the high clock rate is for a simple ring oscillator. This is usually an odd number of inverters connected in a ring. So it is probably 3 inverters in a row toggling at 500 GHz.
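As a back-of-the-envelope illustration of why an odd ring of inverters oscillates, here is a discrete-time sketch in which one inverter switches per gate delay. This is purely illustrative, not a circuit model; the simulation style is my own.

```python
def simulate_ring(n_inverters=3, steps=18):
    """Simulate a ring of inverters, one gate delay per step.

    An odd ring has no stable state, so the tapped output toggles with a
    period of 2 * n_inverters gate delays.
    """
    nodes = [0] * n_inverters
    output = []
    for t in range(steps):
        i = t % n_inverters          # the inverter whose delay expires now
        nodes[i] = 1 - nodes[i - 1]  # its output is the NOT of the node before it
        output.append(nodes[-1])     # tap the last node as the oscillator output
    return output
```

With 3 inverters the output settles into a repeating 1,1,1,0,0,0 pattern (period 6 gate delays), so a "500 GHz" ring-oscillator headline really advertises how short a single gate delay is.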
Here is the IBM press release:
One of the interesting things is that this material operates at 350 GHz at room temperature.
gazpacho, I’d thought about a ring oscillator, but the TechSpot site that I linked to upthread specifically mentions “able to compute at 350GHz room temperature and over 500GHz when frozen to -451 F”. Even correcting for the scientific illiteracy factor, I figured that there had to be “useful logic” in there somewhere.
The IBM press release linked to by RaftPeople is so badly written that I can almost forgive all of the other journalists “downstream”, if that was their source material. Apparently, IBM itself came up with the cell-phone comparison:
Oh, dear. I think I’ll organize a press conference tomorrow, the climax of which will be when I hold in my hand a semiconductor device that (by the standards of the IBM press release) operates at over ONE THOUSAND times the frequency of their new 500GHz devices. What’s more, I’ll donate one to all journalists present! If you want a second sample, it’s only $10! And I’ll provide samples to any Dopers who want one, at $10 apiece!
Green Light-Emitting Diodes (LEDs) are available for a few cents apiece, and have been for decades. You can even buy them at any small-town Radio Shack. The optical output peaks typically at ~570 nm. Given that c (the speed of light) is 3E8 m/s, and v (velocity) = f (frequency) × lambda (wavelength), 570 nm light has a frequency of ~526 THz (~526,000 GHz), over 1000 times the clock speed of the new IBM device.
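The arithmetic above checks out; in code form, with the constants as given in the post:

```python
C = 3e8              # speed of light, m/s
WAVELENGTH = 570e-9  # peak optical output of a green LED, m

# v = f * lambda, so f = v / lambda
freq_hz = C / WAVELENGTH

print(freq_hz / 1e12)  # ~526 THz, i.e. ~526,000 GHz
```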
I’ve seen 3-D spinning boxes on Commodore VIC-20s, which have 8-bit CPUs running at 1 MHz and only 3.5 KB of RAM. Whatever phone you were working on, I assure you that the processor could have been much, much worse and still been able to do spinny boxes.
The IBM press release sounds to me like they are talking about a single transistor. Traditionally, when talking about the speed of a transistor, you use the frequency at which the transistor has a gain of 1. If they are talking about the speed in that way, then it makes sense to talk about cell phones running at 2 GHz.
I think that the IBM device has to be more than a single transistor, gazpacho: UIUC made a 509GHz transistor back in 1993, and that was apparently at room temperature (no mention of cryogenics in the linked article).
I guess that we’ll have to wait for the July issue of IEEE Electron Device Letters to find out just what the new IBM device is capable of.
After a little googling I found the following:
A few years ago researchers broke the 500 GHz mark, but using “indium phosphide and indium gallium arsenide.” I have no idea whether that is cheap or feasible to mass-produce, but I assume it is not.
This is from IBM’s announcement:
“For the first time, Georgia Tech and IBM have demonstrated that speeds of half a trillion cycles per second can be achieved in a commercial silicon-based technology, using large wafers and silicon-compatible low-cost manufacturing techniques,”
This announcement is about the economics of the high speed, not the speed itself. The article goes on to say that they feel they can achieve 1 THz. So the main point is that they have a material that will achieve high speeds, but unlike some of the other materials that also achieve high speed, this one is economical to manufacture.
And a little more googling found the following article from 2002, when IBM broke the 110 GHz barrier with this same technology; it gives more details about relevance, etc.
Why Cell Phones Were Mentioned in Other Articles
"IBM first revealed its SiGe technology in 1989, and later introduced it into the industry’s first standard, high-volume SiGe chips in October 1998. Since then, IBM’s SiGe technology has been adopted by a wide range of companies for a variety of applications, including RF components in cellular handsets, Wireless Local Area Network (WLAN) chipsets, high speed test and measurement equipment, and chipsets for optical data transmission systems. "
Benefits of SiGe vs Other Materials
“Work with these circuits demonstrates the technology’s ability to support communication speeds of over 100 gigabits-per-second. It also demonstrates SiGe’s much lower power consumption than the gallium arsenide and indium phosphide materials traditionally viewed as necessary for such high-speed operations.”