Quantum computers

So… besides cryptography, what else will quantum computers be good for?

I think you’re underestimating the potential power of quantum algorithms. At the level of 600-digit numbers the quantum algorithm is in theory vastly more efficient, with the potential to factorize such a number in a matter of weeks.
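
To put rough numbers on that gap, here’s a back-of-the-envelope sketch comparing the usual asymptotic cost formulas: the general number field sieve for the best known classical attack, and the roughly n^3 gate count usually quoted for Shor’s algorithm. Constant factors and the o(1) term are dropped, so treat the outputs as order-of-magnitude illustrations only.

```python
import math

DIGITS = 600
ln_N = DIGITS * math.log(10)      # natural log of a 600-digit number
n_bits = DIGITS * math.log2(10)   # the same number measured in bits

# General number field sieve (best known classical algorithm):
# exp((64/9)^(1/3) * (ln N)^(1/3) * (ln ln N)^(2/3)), o(1) term dropped.
gnfs_ops = math.exp((64 / 9) ** (1 / 3)
                    * ln_N ** (1 / 3)
                    * math.log(ln_N) ** (2 / 3))

# Shor's algorithm: roughly n^3 quantum gates for an n-bit modulus.
shor_ops = n_bits ** 3

print(f"classical (GNFS): ~{gnfs_ops:.1e} operations")
print(f"quantum (Shor):   ~{shor_ops:.1e} gates")
```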

A lot of the talk around quantum computing involves doing things that traditional computers already do, only faster. But one thing I think is more interesting is the potential for quantum simulation: using a quantum computer to simulate systems of many particles. You can do such simulations on regular computers, but the complexity grows exponentially with the number of particles. A purpose-built quantum computer could potentially do such simulations in linear time.
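
To illustrate why the classical approach hits a wall: the state of n two-level particles is a vector of 2^n complex amplitudes, so the memory needed just to hold the state doubles with every particle you add. A quick sketch, assuming 16 bytes per amplitude:

```python
# Memory needed just to store the quantum state of n two-level particles
# on a classical machine, assuming 16 bytes per complex amplitude.
for n in (10, 20, 30, 40, 50):
    bytes_needed = 16 * 2 ** n
    print(f"{n:2d} particles: 2^{n} amplitudes, ~{bytes_needed:.1e} bytes")
# 50 particles already needs ~1.8e16 bytes (about 18 petabytes).
```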

It really depends on how many qubits the quantum computer has; the more qubits, the larger the numbers it can attack. I have seen figures tossed around of 4,000 to 10,000 qubits to crack RSA-2048 quickly.

Of course, we are a long way from building a quantum computer with that many qubits. The recent excitement was over building one with just 5 qubits, so the gap is large and the engineering challenge is difficult.

It depends on how many qubits you have. A QC that could crack RSA-2048 in a reasonable time would require thousands of qubits, and the engineering challenges in building such a machine are quite immense.

I must admit I don’t know that much about cryptography, but the numbers in the QM textbook I got out while reading this thread suggest a quantum computer could factorize a 600-digit number in perhaps 9 days. I would guess, though, that for this to be achievable a quantum computer would need to perform as many operations per second as a classical computer, which at the moment is at best a pipe dream.

As far as I am aware all claims of using a quantum computer to factorize a number have been on numbers small enough that the classical algorithm is still comfortably more efficient than the quantum algorithm.
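
For a sense of the scale involved: the published demonstrations have factored numbers like 15 and 21, which a classical machine dispatches by trial division essentially instantly. A trivial sketch:

```python
def trial_division(n):
    """Factor n by checking divisors up to sqrt(n) -- trivial at this scale."""
    d, factors = 2, []
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

# 15 and 21 are the sizes of numbers factored in early quantum demonstrations.
for n in (15, 21):
    print(n, "=", " x ".join(map(str, trial_division(n))))
```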

5 years. Maybe as little as 2, absolutely no more than 10. Like everything else in computer technology, I expect we’ll see exponential growth in capacity once the fundamental problems are solved, and that’s pretty much happened now.

While I’m wearing my prognosticator hat, I’ll bet we see the first consumer products based on it before 2025.

something something nuclear fusion something

Sometimes it seems to me that there is a competitive element involved in overcomplicating Wikipedia articles (check out the articles that deal with economic theory; they’re practically unreadable).

Functional quantum computers capable of factoring numbers (albeit very small ones) have existed for 15 years now. How much further advanced than that are we now?

What physical or mechanical machinery does it take to create or manipulate one qubit?

(Noah to God: “Right. What’s a qubit?”
– Bill Cosby.)

The spin of a particle along a given direction is mostly independent of its other properties, and as long as you choose a suitable particle (a spin-1/2 particle such as an electron), a measurement of that spin can only give one of two results, making it ideal for use as a qubit.

In classical computers, data is manipulated by logic gates. An example is the NOT gate, which returns ‘1’ when the input is ‘0’ and ‘0’ when the input is ‘1’. The quantum equivalent of the NOT gate works by manipulating the particle with a magnetic field, with the effect that it swaps the coefficients of the qubit’s wavefunction.
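
As a concrete sketch of that coefficient swap: write the qubit state a|0> + b|1> as the vector (a, b); the quantum NOT gate is then the Pauli-X matrix, which exchanges the two entries. (The numbers here are just an arbitrary example state.)

```python
import numpy as np

# An example qubit state a|0> + b|1>, written as the vector (a, b).
# Amplitudes are complex in general; real values keep the example readable.
a, b = 0.6, 0.8                      # satisfies |a|^2 + |b|^2 = 1
state = np.array([a, b])

# The quantum NOT gate is the Pauli-X matrix: it swaps the coefficients.
X = np.array([[0, 1],
              [1, 0]])

print("before:", state)      # [0.6 0.8]
print("after: ", X @ state)  # [0.8 0.6]
```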

I believe they can also be really good at finding data in large unsorted data sets really quickly (this is Grover’s search algorithm, which needs only about √N steps where a classical search needs ~N).
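
Here’s a minimal state-vector sketch of Grover’s algorithm for N = 8 entries; the marked index 5 is just an arbitrary choice for the example:

```python
import numpy as np

# Minimal state-vector simulation of Grover's search over N = 8 entries.
N, marked = 8, 5
state = np.full(N, 1 / np.sqrt(N))        # start in the uniform superposition

iterations = round(np.pi / 4 * np.sqrt(N))  # optimal is ~(pi/4) * sqrt(N)
for _ in range(iterations):
    state[marked] *= -1                   # oracle: flip the marked amplitude
    state = 2 * state.mean() - state      # diffusion: reflect about the mean

probs = state ** 2
print(f"after {iterations} iterations, P(marked) = {probs[marked]:.3f}")
# Prints ~0.945: two iterations already make the marked item very likely.
```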

That lady from InfoWorld is quoting herself in a different magazine. Isn’t there some journalistic ethics guideline that prevents one from quoting oneself in a fractal support argumentative structure? It just seems so wrong, even if ultimately she’s right.

Being self referential it may indeed be a fractal support argumentative structure :smiley:

In general you would expect her to explicitly say that she wrote it (and not just as a footnote or reference) to avoid exactly this issue. Citing it without such disclosure is very poor.

In academia, of course, you cite your own work all the time, but that is expected, since you are laying out the trail of work (and, if it is important, reminding people you got there first).