I keep hearing from my geek friends about quantum computers and how amazing they will make the world. What exactly is a quantum computer?
Ordinary computers use bits to represent information. A bit is a binary digit, which is either a 0 or a 1. In a quantum computer, a given bit can be both 0 and 1 at the same time. This allows a sequence of n bits to represent 2^n states at once. If they can be made to work properly, quantum computers will revolutionize the world due to their amazing speed.
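To make that concrete, here is a toy sketch in Python with numpy (my own illustration, not real quantum hardware): the state of an n-qubit register is a list of 2^n complex amplitudes, one for each possible bit string.

```python
# Toy model of an n-qubit register: a vector of 2^n complex amplitudes.
# A classical simulation for illustration only, not real QC code.
import numpy as np

n = 3                                   # number of qubits (illustrative)
state = np.zeros(2**n, dtype=complex)   # 2^3 = 8 amplitudes
state[0] = 1.0                          # the classical state |000>

# An equal superposition gives every bit string the same amplitude:
state[:] = 1 / np.sqrt(2**n)

# Squared amplitudes are the probabilities of each measurement outcome;
# they always sum to 1 (up to floating-point rounding):
print(np.sum(np.abs(state)**2))         # -> 1.0
```

So the register is described by 2^n numbers, but a measurement only ever hands you back one n-bit string, which is where the trouble starts.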
If any bit can be 0 or 1, how can you possibly reassemble the data coherently? I thought a quantum computer was about saving space, power and time, not about larger memory capacity. I read that one problem with them is that the data might flow BACKWARD on occasion because of quantum mechanics, but current transistor design does not accommodate this.
Reassembling data coherently is the major problem with quantum computing. In theory, it can be done, although I don’t really understand how.
Representing 2^n sequences with only n bits would be a significant saving of space, provided you can actually read those n bits and get the right answer out. Quantum computers are no more powerful than ordinary computers in the computability sense: any problem that can be solved on a quantum computer can also be solved by a regular computer. I don’t know how much power they consume (in the ordinary sense of the word).
As I understand it, when programming a quantum computer you choose your algorithm so that the wrong answers “cancel themselves out”, leaving the correct answer to dominate the output. So you’re not using the qubits to save memory; instead, you use them to allow your quantum computer to literally try every possible answer at once, and hope that you were clever enough to program the computer in such a way that the right answer will fall into your lap.
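To see what “cancel themselves out” means, here’s a one-qubit toy in numpy (my own sketch, not anyone’s published algorithm): apply the Hadamard gate twice. After the first application both outcomes are equally likely, but the second application makes the amplitudes for |1> interfere destructively, so you read out |0> every time.

```python
# Interference in miniature: the Hadamard gate applied twice.
# A classical numpy sketch, purely for illustration.
import numpy as np

H = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)    # Hadamard gate

psi = np.array([1.0, 0.0])              # start in |0>
psi = H @ psi                           # amplitudes ~(0.707, 0.707): 50/50
psi = H @ psi                           # the |1> amplitudes cancel out

print(np.abs(psi)**2)                   # -> [1. 0.]: always measure |0>
```

Real quantum algorithms are built so that the same kind of cancellation wipes out the wrong answers to an actual problem.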
If you have access to a university library, look up Peter Shor; he’s been making quite a name for himself lately by designing these kinds of algorithms for quantum computers. He has, for example, designed an algorithm to factor large numbers by literally trying every possibility at once in the manner described above.
Math Geek is on the right track. QM allows for a superposition of states we just don’t see in the classical world, which opens the door to all sorts of interesting tricks. Instead of simple binary gates, all manner of new gates can be developed.
The sad thing is that quantum computers haven’t gotten very far because people aren’t exactly sure how to control them. The last computation I heard that was done that made everybody in the field fall off their chairs in excitement was…
get ready…
factoring the number fifteen.
We have some time yet before we end up with quantum processors for our personal computers (and in fact, may never actually see them on that level, because it is unclear what, exactly, they’d be useful for on those platforms). However, I can think of a few good science applications for these beasts, so we’ll probably see them there first. Give it a decade or two (barring some amazing breakthrough), I guess… then we’ll know more.
Rather than going directly to Shor’s papers, you might try Nielsen and Chuang’s textbook Quantum Computation and Quantum Information (Cambridge, 2000). It came highly recommended and I’ve been impressed by what (little) I’ve read of it.
On the other hand, there is the counter-argument that you only have to add a single qubit every two years in order to keep pace with Moore’s Law. More than that and you’re catching up. But I agree, the interest so far is more on possibilities than concrete systems.
This article is a couple of years old, but I think it explains the basics:
http://130.94.24.217/1998/0698issue/0698gershenfeld.html
Normally, a bit is either 0 or 1. An atom either spins this way or that way. Quantum mechanics tells us that we can have matter in more than one state, with different probabilities. You can have an atom spinning one way with probability p and the other way with probability 1-p. It is actually spinning both ways. Once you “look” at the atom, you force it into a definite single state (which is chosen according to the known probabilities).
In a quantum computer, we make a quantum register out of material that has such quantum properties. A normal computer acts on the bits in its register. A quantum computer operates on the probabilities. The quantum register starts out in some known set of states (each bit is 0 with probability 1/2 and 1 with probability 1/2). You then perform operations on those probabilities. An n bit register is actually in all 2^n states at once. The trick is to perform operations that force the register into a superposition such that the answer you want is the state with probability 0.99999999… and all the other states have probability 0.00000000…001. You then look at the register and, with high probability, get your answer.
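Here is what that last step looks like in a classical simulation (numpy again; the winning state 13 and the exact probabilities are made-up numbers for illustration):

```python
# Simulating the "look at the register" step: sample one basis state
# with probability equal to its squared amplitude. Illustrative only.
import numpy as np

n = 4
amps = np.zeros(2**n, dtype=complex)

# Pretend the algorithm has steered nearly all the weight onto one answer:
amps[:] = np.sqrt(1e-9)                # tiny residual amplitude everywhere
amps[13] = np.sqrt(1 - 15 * 1e-9)      # almost all of it on state |1101>

probs = np.abs(amps)**2                # measurement probabilities
outcome = np.random.choice(2**n, p=probs / probs.sum())
print(format(int(outcome), f"0{n}b"))  # almost always prints 1101
```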
For a more concrete example, consider factoring. There is an algorithm (Shor, 1994) for factoring large numbers using a quantum computer. With an N bit quantum computer, you could factor an N bit number in polynomial time (with high probability of correctness). The number to be factored is placed in the register, and the probabilities are manipulated until a factor of that number has a high probability of being the state of the register. Checking the correctness is easy (multiply the factors).
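As a sketch of how Shor’s algorithm hangs together for the toy case N = 15 (the order-finding loop below is done by brute force; that loop is the one piece a quantum computer would do exponentially faster):

```python
# The classical skeleton of Shor's algorithm for N = 15.
# Only the order-finding step needs a quantum computer; here it's brute force.
from math import gcd

N, a = 15, 7                           # a must be coprime to N; 7 works

# "Quantum" step, simulated: smallest r with a^r = 1 (mod N)
r = 1
while pow(a, r, N) != 1:
    r += 1                             # r = 4 for a = 7, N = 15

# Classical post-processing: an even r usually yields the factors
if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
    x = pow(a, r // 2, N)              # 7^2 mod 15 = 4
    print(gcd(x - 1, N), gcd(x + 1, N))   # -> 3 5
```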
The largest quantum computer built had 4 bits, and could factor the number 15 (3*5) “quickly”. This isn’t so impressive. Consider, however, that a large enough quantum computer could quickly attack standard encryption algorithms such as DES, and a 2048 bit quantum computer could break almost all RSA implementations. It is sometimes claimed you could also solve the traveling salesman problem and other NP-complete problems quickly, though that isn’t actually known to follow. In effect, the quantum computer breaks the Church-Turing hypothesis.
Anyway – I’ll stop rambling.
Correct me if I’m wrong, but doesn’t the Church-Turing hypothesis just say that any effectively computable function is Turing computable, and make no mention of the speed?
Ok – you’re right – I shouldn’t ramble so much. A quick search shows this paper showing that, in fact, a quantum machine would be consistent with the CT-Thesis:
*Originally posted by MHand *
The largest quantum computer built had 4 bits…
Actually, they’re up to seven now. May not sound like much of an improvement, but for every qubit added you in effect double the data capacity of a QC.