So: Quantum computers. From what I've gleaned, they're supposed to revolutionize computing: they'll be faster, larger, and, if we're lucky, they'll finally be able to make a nice tikka masala. About time.
I asked a roommate of mine who generally knows more about all things tech: "What's the deal with quantum computers?" He smiled lamely and shrugged. I went to the Straight Dope.
Here's what I think I know…not much. But, as I understand it, basically, quantum mechanics allows a bit to be not just a 1 or a 0, but a 1 and a 0 at the same time. Okay. Got it. I understand that's going to create some intense math. But why is this going to make computing faster? And what sort of math are we talking here? Are all digits in a quantum scheme one-zero hermaphrodites? Can you tell I have no goddamn clue what I'm talking about?
Well, partly that’s it – allowing superpositions (being partly 1 and partly 0 at the same time) allows you to compute a function’s value on all possible inputs at once. Then the trouble is reading out the right answer.
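If it helps, here's a toy sketch in Python. This is purely a classical simulation with arrays, not real quantum hardware, and the 3-qubit size and the example function f are things I made up for illustration:

```python
import numpy as np

# Classical simulation of a 3-qubit register: one amplitude per basis state.
n = 3
state = np.zeros(2**n)
state[0] = 1.0                        # start in |000>

# A Hadamard gate on every qubit puts the register into an equal
# superposition of all 8 inputs 000..111 at once.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
for qubit in range(n):
    op = np.array([[1.0]])
    for q in range(n):
        op = np.kron(op, H if q == qubit else np.eye(2))
    state = op @ state

# "Evaluate" a function on every input in one shot: here a phase oracle
# that flips the amplitude's sign wherever f(x) = 1.
f = lambda x: 1 if x == 5 else 0      # arbitrary example function
for x in range(2**n):
    if f(x):
        state[x] *= -1

# The readout problem: a measurement collapses all of this to ONE random
# basis state, with probability |amplitude|^2. You computed f on every
# input, but you don't get to see all the answers.
probs = state**2
print(np.random.choice(2**n, p=probs))
```

The cleverness in real quantum algorithms is all in arranging the amplitudes so that the measurement is likely to land on something useful.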
As for whether they actually are faster, that's still an open question. The best case we have is that we know an algorithm that factors a number in polynomial time on a quantum computer, but we know no such algorithm for a classical computer. "Polynomial time" means the running time still grows as the number gets bigger, but only as a polynomial in the number of bits used to express the number. Classically, the best method we know grows exponentially in the bit count, but there's no proof that no polynomial-time classical algorithm exists.
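To put rough numbers on the bits-versus-value distinction, a back-of-the-envelope sketch (assuming plain trial division as the classical method, which is the crudest option):

```python
# For an n-bit number, the smallest factor can have up to n/2 bits, so
# naive trial division may need about 2**(n/2) attempts: cheap in the
# VALUE of the number, but exponential in its number of BITS.
for n_bits in (16, 32, 64, 128, 1024):
    print(f"{n_bits:4d}-bit number: up to ~{2 ** (n_bits // 2):.3g} trial divisions")
```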
Mathochist: Thanks. I'm with you on the first paragraph, certainly. As for the second, why aren't we certain that quantum computers are or will be faster? I've heard the words "revolutionary" and "infinitely faster" used in conjunction with them…
Oh, and anyone else who filters in…try not to make the mathematical explanations too intense. I had to do some google-fu to understand Mathochist, which was good, but…well…let’s just say it’s been a while since I had to do much beyond long division.
The basic problem is that we really don’t know how fast classical computers are. Or to put it more accurately, we don’t know how hard almost any problems are to solve with them. Factoring a number may be fairly hard to do on a classical computer, or it may not be.
Beyond that, there’s marketing hype, but that’s not a mathematical issue.
I’ll second this. People say “revolutionary” because they’re trying to build buzz which will help get grants. Rarely trust anyone who says “infinite” about a technical subject outside of a technical context.
As for how we can fail to know how difficult factoring is on a classical computer: all we really know is how fast we've managed to do it so far, and we can prove certain speeds it can't possibly be done faster than, but there's a gap in between. Either someone will find a faster classical method than we currently know, or someone will find a proof that there is no faster method. Until then, well, we just don't know everything.
So as people who actually understand quantum computers/computing, you don't seem to be especially expectant or excited about their success. Am I wrong? We're basically in full-fledged conjecture mode then?
It’s more that I’m not trying to sell you anything. I think they will turn out to be better than classical computers at some tasks, but the point is that we don’t know. They have the potential to be much faster if you ignore certain problems with the way quantum systems work. Some people think that those engineering problems can never be completely overcome, but other people sell the idea based on the potential.
On the mathematical side – which is where my interest lies – people tend to state results modestly. Kauffman and Lomonaco, for instance, explicitly avoided claiming to have a fast quantum algorithm for calculating something called the "Jones polynomial", even though computing it quickly is known to be out of reach for a classical computer. I actually did try to find such an algorithm with a certain ansatz, but only managed to show that it was a blind alley. Maybe there is a fast quantum algorithm for that problem, but it's proving very difficult to find. If we do find it, however, it'll be the first real provable speed breakthrough for quantum computation.
Ok. Got it…with copious googling. Now I know what an "ansatz" is, though.
I guess underlying this is the fact that I really don’t understand the benefit of using ones AND zeroes for equations. It seems like it would just provide copious ifs and unknowns. Specifically, Mathochist said this:
“Well, partly that’s it – allowing superpositions (being partly 1 and partly 0 at the same time) allows you to compute a function’s value on all possible inputs at once. Then the trouble is reading out the right answer.”
How can there be a right answer with so many possible inputs?
Well, here’s an example, which is really where a large number of quantum algorithms come from.
Imagine a clock that shows a symbol every minute. The pattern repeats at least every sixty minutes, since then you're back at the same minute number, but it might repeat more often. Maybe every minute is the same symbol. Maybe it alternates between two, or cycles through three symbols. The "period" (the number of steps before it repeats the pattern) must be a divisor of 60: 1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, or 60. We can find the period by writing down all the values and looking at the results, since 60 isn't so many things to consider. If it were, say, 60 million, we would need a better method.
So, say we take two of the clocks and shift one by some number of minutes from the other. If we shift by a multiple of the period then the two clocks will look identical, but otherwise we'll see a number of differences. Let's further assume we've got an easy way of telling whether they're identical or not. Now we can just try all the shifts and see which ones give identical clocks. The greatest common divisor of all those matching shifts is the period.
But there’s still a lot of shifts to consider. A quantum computer allows us to try all of the shifts together, sort of like a filter. Everything that doesn’t line up right falls through the sieve, keeping only the shifts which make the clocks identical, and it only takes the time of checking one shift, rather than having to do them all one after another.
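Here's the brute-force classical version of the clock game in Python, in case it helps. The hidden period of 12 is just an example I picked:

```python
from math import gcd
from functools import reduce

# A "clock" is a function from minute -> symbol with some hidden period
# dividing 60. We pretend not to know the period and recover it.
N = 60
period = 12                               # hidden
clock = lambda minute: minute % period    # the repeating symbol pattern

# Try every shift, keeping the ones where the shifted clock matches the
# original at every minute -- exactly the multiples of the period.
matching_shifts = [
    s for s in range(1, N + 1)
    if all(clock(m) == clock((m + s) % N) for m in range(N))
]

# The greatest common divisor of the matching shifts is the period.
print(reduce(gcd, matching_shifts))       # -> 12
```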
…whereas a classical computer would have to try all the shifts with set values? Which of course would take far longer, but the calculations themselves would be simpler?
(the clock example was very, very helpful. Thanks)
I’m not all that excited because it’s been a few years since I’ve heard of any real advances in the field. I’ll admit I could’ve missed something, but these sorts of things tend to be big news, so I have to doubt it a little.
Quantum computing will be very cool if it ever becomes practical, but thus far I don’t see any reason to be certain about it.
I think that if a QC is ever built, it will solve the factorization problem completely. Here's why. Suppose you want to factor a thousand-bit number (around 300 decimal digits). If it has a nontrivial factor at all, the smallest one will have at most 500 bits. One thing you could do is try division by all numbers with at most 500 bits and see if one divides evenly. Unfortunately, there are 2^{500} of them. But a QC that can generate a qubit (quantum bit) string 500 places long can carry out all 2^{500} divisions simultaneously, since each qubit is in a superposition of 0 and 1. I do not understand how to get the results out, but people who should know have claimed they can solve that problem. They have actually used this to successfully factor a number (I think it was 15) with just a handful of qubits. The engineering problems involved in getting 500 qubits to all stay in this state of indeterminacy seem to me almost insuperable, but if they ever figure it out, watch out. Of course, not every computational problem is amenable to massive parallelism, but plenty are.
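For a sense of why nobody just simulates those 500 qubits on an ordinary machine, a quick back-of-the-envelope (assuming 16 bytes per complex amplitude, which is the usual double-precision size):

```python
# Simulating a quantum register classically needs one complex amplitude
# per basis state, i.e. 2**500 of them for 500 qubits.
qubits = 500
amplitudes = 2 ** qubits
bytes_needed = amplitudes * 16
print(f"{amplitudes:.3g} amplitudes, ~{bytes_needed:.3g} bytes")
# ~3.3e150 amplitudes -- vastly more than the ~1e80 atoms in the
# observable universe, so no conceivable classical memory can hold it.
```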
So what I conclude is that there is a lot of hype and a few solid facts. We can only wait and see.
The factorization problem is the problem of finding, for a number X, some (or all) numbers P and Q such that X = P*Q. Generally, the interesting case is when P and Q are very large primes, as applied in various encryption algorithms and such.
Given an X, we don't have a fundamentally faster way of finding a prime P and Q than checking possible divisors up to sqrt(X); there are cleverer classical algorithms, but they're still hopelessly slow for big numbers. There are various probabilistic methods for establishing whether a given number is prime, but they only help a little with the factoring itself. Checking every candidate up to sqrt(X) and dividing X by each can literally take more than the estimated age of the universe to compute classically for a fairly reasonably sized X (just a few thousand digits, IIRC).
If you can suddenly, instantly factor large numbers into their prime factors, you just broke a lot of commercial- and government-grade encryption. Congratulations. You'll either win a prize or disappear.
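For the curious, here's a toy illustration in Python of why factoring breaks things. This is textbook RSA with deliberately tiny primes; real keys are hundreds of digits:

```python
# Toy RSA: the public key is (X, e) with X = P*Q; the security rests
# entirely on nobody being able to factor X.
P, Q = 61, 53                 # the secret primes
X = P * Q                     # public modulus: 3233
e = 17                        # public exponent

message = 42
ciphertext = pow(message, e, X)   # anyone can encrypt with the public key

# An attacker who factors X = 61 * 53 recovers phi(X), hence the
# private exponent d, and can read the message.
phi = (P - 1) * (Q - 1)
d = pow(e, -1, phi)           # modular inverse (Python 3.8+)
print(pow(ciphertext, d, X))  # -> 42
```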
So are quantum calculations even currently possible? Maybe with some super-computer somewhere?
I at least now understand what a phenomenally different approach to computing this would be. But it seems ages off, from what the more knowledgeable folks here say.
No, it can’t be done on a “super-computer” precisely because it’s a completely different form of computation.
As for how far off it is, there are a bunch of interesting algorithms if you make all sorts of simplifying assumptions. What’s holding us back is the engineering problem that quantum states, which hold the information during the computation, tend to degrade over time and bleed out information. This is on top of noise effects, which are similar to those in classical computations, but more devastating since you can’t look at the computer halfway through the computation to correct noise midstream.
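A cartoon of that degradation problem, if it helps. The decay factor here is made up, and real decoherence rates depend entirely on the hardware:

```python
import numpy as np

# Phase damping on one qubit: the superposition (|0>+|1>)/sqrt(2) has a
# density matrix whose off-diagonal entries carry the "quantumness".
rho = np.array([[0.5, 0.5],
                [0.5, 0.5]])

decay_per_step = 0.9          # assumed coherence retained per time step
for step in range(1, 6):
    rho[0, 1] *= decay_per_step   # the off-diagonal terms shrink...
    rho[1, 0] *= decay_per_step
    print(step, rho[0, 1])        # ...toward 0, i.e. a plain coin flip
```

Once those off-diagonal terms are gone, you're left with an ordinary 50/50 random bit and the computation is ruined.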
To give you an idea of where we are, people have actually built a quantum computer and implemented the factoring algorithm we’ve been talking about on it. It worked exactly as predicted, and we now know that 15 factors into 5 and 3.
I have faith that 21 will soon be tackled and factored with equal tenacity.
Honestly though, this is really helpful. I used to read about quantum computing in the same state of mind I read about the Federal Interest Rate: slack-jawed bewilderment. I feel like my feet are at least wet now.