First there was the 8086 chip. Then came the 286, 386, 486, and Pentiums.
First there were 2kbps modems. I don’t know the exact sequence, but there came 14.4, 28.8, 33.6, and then 56k modems.
First there were 8-bit video games, then 16-bit, 32-bit, 64-bit, and the latest, 128-bit.
My question is: Why do the makers of computer hardware feel they have to go from one number to its double? Why couldn’t we invent a 986 processor chip before we invented the Pentium chip? Is there some unwritten law that says we have to automatically do things sequentially? Why couldn’t we invent a 256-bit video game before we invented the 32-bit?
There is a natural progression to technology that is dependent on a number of factors: manufacturing technology, software support, backwards compatibility, market demand, cost, etc.
Look at it this way (it's a shaky illustration at best, but it works):
To climb a flight of stairs, you take one step at a time. You don’t jump straight to the top landing.
Most technological progress builds on previously developed technology. That just means you’ll have to wait a while for that 256 bit video game.
–Kalél TheHungerSite.com “If our lives are indeed the sum-total of the choices we’ve made, then we cannot change who we are; but with every new choice we’re given, we can change who we’re going to be.”
The basic reason is that the underlying computer technology is binary, and as a result things tend to increase by powers of two.
The naming of the Intel chips is independent of the above (which is just as well, as those chip names haven’t increased by powers of two). That’s a marketing thing. We all think of a bigger version number as being better (MegaMax version 3.12 must be better than MegaMax version 2.47), so Intel kept making its numbers bigger while retaining the “…86” values to remind you of their upward compatibility.
Actually, Intel stopped using numbers to identify their chips when they found that they couldn't trademark a number, and competitors such as AMD were advertising their own chips as 386s and 486s. Now Intel uses names like Pentium (and Pentium II and Pentium III), which it can protect.
WGFF is right on with the power of 2 bit (sorry couldn’t help it). I might add that the Intel 8088 preceded the 8086. 8088 indicated an 8 bit data bus and 8086 indicated the more advanced 16 bit data bus on the chip.
A point in every direction is like no point at all
{{{I might add that the Intel 8088 preceded the 8086.}}}—Oblio
Actually, the 8086 preceded the 8088 microprocessor. The 8080 was the mainstream 8-bit offering from Intel that preceded the 8086.
The 8088 was selected for use in the original IBM PC for reasons of cost and ease of design. Other 8-bit Intel microprocessors of similar vintage include the 8008 and 8085.
Incidentally, Intel was founded to develop solid-state alternatives to magnetic core memory storage. Their big break came when they developed the first microprocessor, the 4004, under contract for Busicom, a Japanese calculator manufacturer, in 1971.
Fortunately for Intel, Noyce and Moore made the decision to buy back the rights to the chip. The rest, as they say, is history.
The 8088 is software compatible with the 8086, and is functionally identical to the latter with the following major exceptions:
The 8088 has an external data bus width of 8 bits, while the 8086 has a 16-bit data bus. 16-bit operands are fetched or written in two consecutive bus cycles, one byte at a time.
The 8088 has a 4-byte prefetch queue, while the 8086 has a 6-byte queue.
The prefetch algorithm of the 8088 was modified to suit the smaller queue: the BIU fetches a new instruction byte as soon as a 1-byte opening appears in the queue, while the 8086 waits for a 2-byte opening before fetching.
16-bit fetches and writes on the 8088 take an additional four clock cycles compared to the 8086, owing to the 8088's 8-bit interface.
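A rough way to picture that last point: on the 8088's 8-bit bus, every 16-bit operand crosses the bus in two transfers, while the 8086 moves it in one. Here's a minimal Python sketch of that arithmetic (the four-clock bus cycle matches the 8086 family, but the function names and the flat cycle count are illustrative, not a cycle-accurate model):

```python
def bus_cycles(num_words, bus_width_bits, clocks_per_transfer=4):
    """Estimate clocks to move 16-bit words over a data bus.

    An 8-bit bus needs two transfers per 16-bit word (one byte at
    a time); a 16-bit bus needs one. This is a toy model, not a
    cycle-accurate simulation of either chip.
    """
    transfers_per_word = 16 // bus_width_bits
    return num_words * transfers_per_word * clocks_per_transfer

# Moving 100 16-bit words:
print(bus_cycles(100, 8))   # 8088-style 8-bit bus: 800 clocks
print(bus_cycles(100, 16))  # 8086-style 16-bit bus: 400 clocks
```

The doubled transfer count is exactly why the cheaper 8088 made sense for the original IBM PC: same software, half the bus wiring, at the cost of extra clocks on 16-bit accesses.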
The progression of PC modems actually runs more like 110b, 300b, 1200b, 2400b, 9600b, 14.4kb, 19.2kb, 28.8kb, 33.6kb, 56kb.
Hardware manufacturers don't 'feel' they have to go from a number to its double; they end up working in powers of two because computers are binary at the core, so most things relating to computer hardware end up relating to powers of two. Also note that things don't always double: the modem speeds (even your list, which left a few out) don't double at each step, for example.
Since the x86 numbers were just model numbers, Intel could have called some chip a 986 chip if they really felt like it, but it would be kind of pointless. You could call the second version of a program number two or number ten, it doesn’t have any deeper meaning.
Really, this one is like asking why you can't have chapter 9 of a book before chapter 5. You could if you felt like it, but it wouldn't really mean anything. Witness that Intel has stopped using the x86 nomenclature for its chips, instead using the Pentium, Pentium Pro, Pentium II, Pentium III, Itanium, etc. names. There's something of a meaning behind that naming scheme (the Pentium machines are all similar under the hood, while the Itanium is radically different), but it's really just marketing hype.
There's a law that you can do easier things more quickly and cheaply than more complicated things. The bitness of video game consoles is really the bitness of the processor underneath. Building a 256-bit processor is far more difficult than building a 32-bit processor, requiring you to cram far more transistors onto a given chip and so increasing the cost, heat produced, design time, etc. It's a lot like the way aircraft makers had to build a plane that could go 200mph before they could build one that could go Mach 1.
Processors are made in powers of two because of the way memory and programming work, so no one is going to bother making a 48-bit processor.
Also, don’t expect to see a 256-bit game system anytime soon as there isn’t any real use for a 256-bit processor in the real world, so no one has made one. Game consoles just use a processor they can buy off-the-shelf, so until Intel, SGI, IBM, Sun, or one of the other chip makers comes up with a 256-bit chip consoles won’t have them.
Also, earlier computers always came with RAM in that same progression:
2k, 4k, 8k, 16k, 32k, 64k, 128k, 256k, and so on.
I have always attributed it to the binary nature of computers: adding one digit effectively doubles the number of possible values.
2-digit binary (counting from 0):
00 = 0
01 = 1
10 = 2
11 = 3 = max value
That's 4 possible values. To get more, your only choice is to add a digit, making it 3 digits long:
000 = 0
001 = 1
010 = 2
011 = 3
100 = 4
101 = 5
110 = 6
111 = 7 = max value
Now we have 8 possible values, double the previous count; add another digit and it doubles again…
Maybe Flinky’s question is kind of like a concern I have about razors. Remember when those double-blade ones came out and we were like, all “Wow! Two blades! The first one shaves real close, then the second one shaves even closer!!!” We all knew that shaving technology had peaked. It couldn’t get any better. Then, after dozens of dollars of research, our scientist friends came up with a triple-bladed razor!
Well, I have an idea, which I’ll reveal to you Dopers if y’all promise not to tell anyone else… Four blades! And I have yet another project in the works. The design is still in beta, but I’ll tell you that it has something to do with the number of blades.
I mean, couldn’t we have skipped over the three-blade stage? Likewise, couldn’t we have skipped over, say, the 32-bit (or 16-bit) stage for CPUs? We knew the benefits of doing so, right? If I were designing a CPU right now, I’d skip 128 and do 256.
I have one year from now to patent my 4-blade razor, right?