Why Only Binary?

This may sound like a really stupid question, but many people have asked me, and so I wonder: why do computers use only two states of electric current (“on” and “off”) to encode information? Why can’t we build computers that recognize 10 different voltage levels, or even 100, or 1000? Why do we have to stick with just plain old boring “on” and “off”? Forgive my ignorance in advance, thanks. :wink:

Well, the short answer is, we can. Analog computers exist, but they’re not practical. The reason is that we need to put processors on tiny, low-power chips, and it’s difficult to build such tiny circuits to handle more than two voltage states. Additionally, with each voltage state you add, processor design becomes exponentially more complex. And since computer programs are based on Boolean logic, base two just makes sense.
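To illustrate that last point, here’s a minimal sketch in Python (nothing hardware-specific, just the two-valued logic itself): Boolean gates only ever need the values 0 and 1.

```python
# Minimal sketch: two-valued (Boolean) logic maps directly onto on/off switches.
# Each gate takes inputs that are either 0 (off) or 1 (on) and yields 0 or 1.

def AND(a, b):
    return a & b   # on only if both inputs are on

def OR(a, b):
    return a | b   # on if at least one input is on

def NOT(a):
    return 1 - a   # invert the switch

# Print the truth tables for the two-input gates.
for a in (0, 1):
    for b in (0, 1):
        print(f"a={a} b={b}  AND={AND(a, b)}  OR={OR(a, b)}")
```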

Here is an interesting page about analog computers.

Because a switch can be in only one of two states: on or off. Transistors are used as on-off switches in digital computers, so digital computers work with two states.

  1. It’s not just voltages. Many binary storage schemes rely on physical properties (on/off, clockwise/counterclockwise, up/down, etc.) that CAN’T have more than two possible values.

  2. Multivalued logic is possible, but complicated. But as Martin Gardner pointed out (in his discussion of Korzybski in “Fads and Fallacies”), EVERY multivalued logic is still binary in the sense that a statement is either correct or incorrect. (If you drag in fuzzy logic, I’ll point out that you can recast fuzzy logic in terms of conventional logic. Quantum logic is something else altogether, but I’m going to conveniently forget about that.)

The IC circuits I learned computer logic on had voltage states that ran from 0 to 5 volts, with about a 0.7 V variance due to circuit construction differences. That is, off is 0-0.7 V and on is 4.3-5 V.

I once asked my teacher if it’d be possible to have multi-state bits using the remaining 3.6 V range. He said no, because the circuits were unstable between those voltages. Once you set a bit to off, it stays off (within the given range); the same with on. But if you tried to set the voltage to, say, 1.0 V, it could drift clear up to 4.0 V, or back down. And at any given moment it could cross either the 0.7 or 4.3 V threshold and settle into an off or on state when you really wanted one of the middle states.
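A toy way to picture those bands (the numbers are just the ranges from the post above, not any particular chip’s spec): anything below the low threshold reads as 0, anything above the high threshold reads as 1, and the wide band in between simply isn’t a valid logic level.

```python
# Illustrative sketch of TTL-style thresholds on a 0-5 V signal.
LOW_MAX = 0.7    # upper edge of the "off" band, in volts
HIGH_MIN = 4.3   # lower edge of the "on" band, in volts

def read_bit(voltage):
    if voltage <= LOW_MAX:
        return 0
    if voltage >= HIGH_MIN:
        return 1
    return None  # undefined region: the circuit gives no guarantee here

for v in (0.2, 0.7, 1.0, 2.5, 4.0, 4.3, 5.0):
    print(f"{v:.1f} V -> {read_bit(v)}")
```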

I’m sure that technology has improved since then, but new software still must run on older machines (at least one generation older). So there’s a chain of machines and software that anchors us to the binary system. If a stable multi-state bit system were devised, a completely different set of processors, media, software, and hardware would have to be designed. And if these were invented tomorrow, it’d be years, maybe decades, before their price came within range for consumers. (It was about 30 years the last time.)

AWB,

That’s partially right; stability is an issue, but there’s an even bigger problem with multistate switching. Suppose you have a transistor with three logic states (0, 1, and 2), and suppose your logic wants to switch from 0 to 2. The gate has to transition from 0 through 1 to 2. In that intermediate 1 state, we have an invalid (undesirable) logic condition that would be propagated through the rest of the circuit. You could get around this problem by clocking every single multistate transistor, but that would more than double your circuit size.
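Here’s a rough sketch of that glitch (the `quantize` function and step sizes are made up for illustration): ramping a three-level wire from level 0 up to level 2 necessarily passes through level 1, so anything watching the wire sees a spurious intermediate value it never asked for.

```python
def quantize(voltage, step=1.0):
    """Map a voltage to the nearest of the three logic levels 0, 1, 2."""
    return min(2, max(0, round(voltage / step)))

seen = []
for tenths in range(0, 21):        # ramp the wire from 0.0 V up to 2.0 V in 0.1 V steps
    level = quantize(tenths / 10)
    if not seen or seen[-1] != level:
        seen.append(level)         # record each distinct level the wire passes through

print(seen)  # [0, 1, 2] -- the unwanted intermediate "1" shows up on the way
```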

JoeyBlades pointed out the biggest problem with multi-state logic. Another reason it’s not done is that there isn’t really a lot to be gained. Since multi-state logic can be expressed exactly with binary logic, you don’t gain any new computational abilities, and the added complexity wipes out any gains in computational power and speed; for the same cost you could have simply added more binary logic to get the same performance.
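To make that concrete, here’s a small sketch (the encoding and function names are just invented for illustration): anything a three-valued digit can hold can be held exactly by two ordinary bits, so multi-level logic adds nothing you couldn’t already compute.

```python
def trit_to_bits(trit):
    """Encode one ternary digit (0, 1, or 2) as a pair of binary bits."""
    assert trit in (0, 1, 2)
    return (trit >> 1) & 1, trit & 1   # (high bit, low bit)

def bits_to_trit(hi, lo):
    """Decode the bit pair back into the original ternary digit."""
    return (hi << 1) | lo

for t in (0, 1, 2):
    hi, lo = trit_to_bits(t)
    print(f"trit {t} -> bits ({hi}, {lo}) -> back to {bits_to_trit(hi, lo)}")
```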

That’s not to say it will never be done; I could envision an optical computer with different wavelengths representing logic states, etc.

BTW, actual voltages for logic levels have been coming down in recent years; the circuitry on my desk is a mixed-level design that includes 2.5 V, 3 V, and 5 V logic (where a logic one is 2.5, 3, or 5 V, and a logic zero is 0 volts). Level translators interface among the different subsystems to allow them to be mixed.

Arjuna34

Hey, Snark, if you search about four to six months back you’ll see I asked the very same Q here.

IIRC, after a few (I think uncalled-for) snide remarks, the geeks took over and I got more information on the subject than I could ever use (or understand). Check it out… good luck.

You mean this one:

http://boards.straightdope.com/sdmb/showthread.php?threadid=33957

A lot of the points you might be interested in were covered, with a side foray into analog as well. Check it out.

Hey! I resemble that remark!!!

Stuyguy, thanks for the referral to the other thread. I had no idea someone had already asked this question.

Stephen Hawking was told that for every mathematical equation he put in his book, A Brief History of Time, he would lose a certain number of readers. Unfortunately, this is very true for me. I just don’t have a mind for math, so when I run into equations like the ones in the other thread, my mind lapses into “Huh?” mode.

But I find the idea of optical computers very interesting and I hope they can be developed. They sound promising.