I saw this article today:

Quantum computers pose a huge threat to security, and the NIST wants your help

Can someone explain quantum computing in simple layman’s terms? The Wikipedia article was dense reading.

Yes, that is a *horrible* article. Unfortunately it is all too typical of Wikipedia articles written by people who do not understand that Wikipedia is a general-purpose encyclopedia, not an unapproachable, technically dense reference for someone already conversant with the field. When asked to improve articles along these lines, the writers will often huffily complain that the material cannot be simplified and that you must simply know the subject.

Of course great scientists such as Richard Feynman had no problem explaining complex subjects in understandable terms: Richard Feynman - The Character of Physical Law - Part 1: The Law of Gravitation (full version) - YouTube

Here is a good video lecture giving an overview of quantum computing by Eric Ladizinsky, the chief scientist at D-Wave. He has no problem explaining it in simple terms, unlike many would-be technical writers on Wikipedia:

Eric Ladizinsky: Evolving Scalable Quantum Computers - YouTube

I know a lot of people here aren’t great fans of About.com, but this is a pretty good explanation that doesn’t jump off the deep end.

Quantum computers exploit a property of quantum mechanics whereby a particle can be in two states simultaneously.

It was said by the eminent physicist Richard Feynman that all of quantum mechanics can be gleaned from carefully thinking through the implications of the double-slit experiment. (Feynman also said, “If you think you understand quantum mechanics, you don’t understand quantum mechanics.”) Put most simply, this experiment shows that particles of atomic size and smaller can be in more than one place at the same time. More broadly, they can be in more than one state at the same time, leading to the famous Schrodinger’s Cat thought experiment, which takes this to an absurd end where a cat is both dead and alive at the same time (QM is very weird).

So consider a computer today. It can only do one operation (task) at a time. It can seem to do more than one thing at a time but that is because it is really fast and/or has more than one processor working. So, say you want to factor a number. You have to:

- Write your number.
- Find two more numbers that multiply to make your first number.
- Determine whether any of your factors can be factored again.
- Stop factoring when you reach prime numbers.

A computer today needs to iterate through each step one at a time. Because computers are so fast though they can factor a number like 15 nearly instantly.
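The steps above are just trial division, and you can write them out directly; a minimal sketch in Python (the function name is mine, not from any library):

```python
def prime_factors(n):
    """Factor n by trial division, one step at a time,
    the way a classical computer must."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:    # d divides n: record it and keep dividing
            factors.append(d)
            n //= d
        d += 1
    if n > 1:                # whatever is left over is prime
        factors.append(n)
    return factors

print(prime_factors(15))     # → [3, 5]
```

Each candidate divisor is tried one after another, which is exactly why the running time blows up as the number gets bigger.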

However, cryptography relies on using massively large numbers to factor. These numbers are so big that factoring them, even for super powerful computers, takes ridiculously long amounts of time. This makes the cipher secure.

However, a quantum computer can improve on that dramatically. Rather than doing one step at a time it can search out all possibilities simultaneously (assuming it has enough quantum bits [aka qubits] for the operation). Because the qubits can be both “on” and “off” at the same time they basically do all the calculations at once. This of course would break current encryption methods.

That said, there is a long way to go. While quantum computing has been proven to work, it has only been demonstrated at very small scales: scales so small that factoring even a small number is a challenge. IIRC the record so far is a quantum computer with 5 qubits, which is nowhere near enough to do meaningful work. It is estimated that to break RSA-2048 you’d need at least 4,000 qubits. We are a long way from being able to do that.

Also, quantum cryptography is a thing and, supposedly, if it is done right it is unhackable even by quantum computers.

I don’t think quantum computing can really be put into simple layman’s terms. I would say the Wikipedia article is a fairly good starting point. The usual way it is explained in hand-waving terms is that quantum computers can use quantum superposition to carry out multiple operations simultaneously, but this can be misleading.

The basic idea is that classical bits (how information is stored on a classical computer) can be in either state 0 or state 1, but a qubit (how information is stored on a quantum computer) can be in any quantum superposition of the states ‘0’ and ‘1’. Quantum superposition of two states means here that the quantum system is not definitively in either state 0 or 1, and we can only assign probabilities of finding it in state 0 or 1 upon making a measurement.
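A toy illustration of that last point (plain Python, not a real quantum library): a qubit’s state can be written as a pair of amplitudes, and a measurement only ever yields 0 or 1, with probabilities given by the squared magnitudes of those amplitudes.

```python
import math
import random

# An equal superposition of |0> and |1>: amplitude 1/sqrt(2) for each.
amp0 = amp1 = 1 / math.sqrt(2)

p0 = abs(amp0) ** 2          # probability of measuring 0
p1 = abs(amp1) ** 2          # probability of measuring 1
assert abs(p0 + p1 - 1) < 1e-9   # probabilities must sum to 1

def measure():
    """A measurement collapses the qubit to a definite classical bit."""
    return 0 if random.random() < p0 else 1

counts = [0, 0]
for _ in range(10000):
    counts[measure()] += 1
# counts will be roughly [5000, 5000]: each measurement gives one
# ordinary bit, never "both at once"
```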

If we allowed our imaginations to run away with us, we might imagine that a qubit is simultaneously in both state 0 and state 1, that manipulating a qubit is akin to manipulating a classical bit in state 0 and a classical bit in state 1 simultaneously, and that by adding qubits we could manipulate unlimited amounts of classical information at the same time. This would be true, if it weren’t for one important fact: if we perform a measurement on a qubit to see which state it is in, it will be either in state 0 or state 1, just like a classical bit.

So in fact, at first glance, it seems quantum computers have no advantage over classical computers and one big disadvantage in their probabilistic nature, which means that to determine the final (pre-measurement) state of the information we’d have to run a program several times just to get a probabilistic answer. E.g. with a classical computer, if we ran a simple program to add 2+3 we could run it once and say with 100% confidence our answer was ‘5’, but with a quantum computer we might have to run it several times just to say with 99% confidence our answer was ‘5’.
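A crude sketch of that “run it several times” idea, with a made-up stand-in for the probabilistic machine (the `noisy_add` routine is entirely hypothetical, invented just for illustration):

```python
import random
from collections import Counter

def noisy_add(a, b, p_correct=0.9):
    """Hypothetical stand-in for a probabilistic quantum subroutine:
    returns the right sum 90% of the time, an off-by-one otherwise."""
    if random.random() < p_correct:
        return a + b
    return a + b + random.choice([-1, 1])

def confident_add(a, b, runs=21):
    """Run the noisy routine several times and take a majority vote."""
    results = Counter(noisy_add(a, b) for _ in range(runs))
    return results.most_common(1)[0][0]

print(confident_add(2, 3))   # almost certainly 5
```

With 21 runs at 90% per-run accuracy, the majority answer is wrong only with vanishing probability, which is the sense in which a probabilistic machine can still give trustworthy answers.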

However, that is not the end of the story, as quantum data can be manipulated in clever ways that classical data can’t be, so in certain cases it is at least very similar to carrying out multiple operations simultaneously. This means that there are some quantum algorithms that can arrive at their answer(s) much more efficiently than classical algorithms doing the same task, even taking into account the above-stated disadvantage of quantum computing.

Unfortunately, though, understanding why quantum algorithms can outperform classical algorithms really does take a decent (technical) understanding of QM and of algorithms. I must admit I am currently looking at a comparatively simple quantum algorithm and it is more than a bit taxing to follow. I’m not sure there is any way to break it down so it would be simpler than that Wikipedia article, which is simple enough that it only really skirts the question of why quantum computing can be more efficient than classical computing.

Quantum cryptography is a real thing, but it solves a different problem than encryption does. Basically, it helps to prevent eavesdropping. Secure systems can be vulnerable to something called a “man-in-the-middle attack”. Basically, you think you’re establishing a secure connection with your bank, but you’re actually establishing a secure connection to the man in the middle, and he then establishes a secure connection with your bank. When the bank asks the MitM what your password is, he turns around and asks you what it is. When you tell him, he then turns around and tells your bank. This is a vulnerability, because meanwhile the MitM has copied down your password for his own use.

But this only works because it’s possible to make copies of classical data. It is not, however, possible to copy quantum information. So if your “password” were a quantum state, a MitM would have the choice of either passing it on to the bank unread (in which case there’s no harm done) or reading it himself and ruining it so the bank can’t read it (in which case it’s clear that there’s a problem). Combined with other appropriate security techniques, even that won’t help the attacker.
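You can see the “reading it ruins it” effect in a toy simulation in the spirit of BB84-style quantum key distribution. This is my own heavily idealized model, not real physics code: a “qubit” is just a bit plus the basis it was prepared in, and measuring in the wrong basis gives a random answer and re-prepares the state.

```python
import random

BASES = "+x"   # two incompatible measurement bases

def measure(qubit, basis):
    """Measuring in the preparation basis gives the encoded bit;
    measuring in the other basis gives a coin flip."""
    bit, prep_basis = qubit
    if basis == prep_basis:
        return bit
    return random.randint(0, 1)

def run(n, eavesdrop):
    """Send n random qubits; return the error rate on the rounds
    where sender and receiver happened to use the same basis."""
    errors = checked = 0
    for _ in range(n):
        bit, basis = random.randint(0, 1), random.choice(BASES)
        qubit = (bit, basis)
        if eavesdrop:
            eve_basis = random.choice(BASES)
            eve_bit = measure(qubit, eve_basis)
            qubit = (eve_bit, eve_basis)   # Eve re-sends what she saw
        recv_basis = random.choice(BASES)
        result = measure(qubit, recv_basis)
        if recv_basis == basis:            # compare only matching-basis rounds
            checked += 1
            errors += (result != bit)
    return errors / checked

# With no eavesdropper the matched-basis error rate is 0; with one,
# about 25% of the compared bits disagree, exposing the interception.
```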

Is it that the bank can’t read it, or that the bank would know that it has already been read?

Actually they are also working on ciphers that would be too difficult for even a quantum computer to crack (they think). Unfortunately it may be too cumbersome to use if it is secure and if they streamline it they open avenues of attack. It is a work in progress though.

Regarding quantum cryptography: as I understand it, if the MitM were able to intercept both the message and the entangled particles, he could decrypt and read the message undetected.

Once he’s decrypted and read it, he simply creates his own set of entangled particles, sends the appropriate parts of them onto the bank, then encrypts the message using the retained “partners” of those particles and sends that. That way neither the recipient nor the original sender will detect anything amiss. In fact, he could even make changes to the message that the bank receives. There will likely be some sort of confirmation message returned by the bank, so he’d also have to be able to process that.

Granted, this is complicated and requires intercepting two different streams, but it could be done. At least I think it could. Someone may be able to correct me.

Great:

But seriously, thanks all for the explanations.

Quantum computers aren’t simply faster; they efficiently solve a different set of problems, those in the quantum complexity classes. A lot of cryptography is in this set, but one thing we don’t necessarily know is how many problems “reduce” to quantum problems. To be short and somewhat vague: if there is an efficient reduction to a “fast” quantum complexity class, then that problem breaks; if there isn’t, nothing will change.

In theoretical computer science, a “reduction” is simply a way of converting one problem to another, even if it takes a long time*. For instance, you could reduce the problem 42+137 from an addition problem to a counting problem by gathering up 42 and 137 sticks, and counting them.
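That stick-counting reduction can be written out literally; a toy sketch (the names are mine):

```python
def add_by_counting(a, b):
    """Reduce addition to counting: gather a sticks and b sticks
    into one pile, then count the pile one stick at a time."""
    pile = ["stick"] * a + ["stick"] * b   # the reduction step
    total = 0
    for _ in pile:                         # the counting step
        total += 1
    return total

print(add_by_counting(42, 137))   # → 179
```

The answer is the same as ordinary addition; the point is only that one problem has been mechanically converted into another.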

Certain cryptographic techniques would break for sure, such as anything that relies on integer factorization being difficult. Symmetric encryption would be more or less fine (quantum computing makes attacking it easier, but not easy enough to be practical), and there are also a few algorithms that provably have no efficient quantum reduction. It would throw the world into slight turmoil, but it’s not apocalyptic. Notably, asymmetric encryption is borked, but that may be alleviated by the fact that man-in-the-middle attacks can be foiled by quantum properties (since a big purpose of asymmetric encryption is to prevent listening in on communications). Of course, there is a cost embedded in needing a quantum chip in your computer to do these things.

Here’s a worthwhile Security Stack Exchange answer:

- Strictly speaking, “reductions” have to be “efficient”, but that has a wildly different meaning in CS jargon than it does in regular talk. An algorithm can be both “efficient” (e.g. polynomial time) and still extremely slow (e.g. takes n[sup]50000000000000000[/sup] steps to compute).

The form of quantum encryption I know of does not use entangled particles to carry the transaction itself; what you use them for is to agree on a shared key. If there is a MitM attack, the properties of the entanglement mean the key sharing will fail to reach agreement on the key. Once you have a key, you can proceed with the transaction. For utterly attack-proof communication, use it as a one-time pad.
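For that last step: once both sides share a secret key as long as the message, one-time-pad encryption is just XOR, and without the key the ciphertext is information-theoretically unbreakable. A minimal sketch (here the key comes from `secrets.token_bytes`; in the quantum scheme it would be the key the two parties agreed on):

```python
import secrets

def otp_encrypt(message: bytes, key: bytes) -> bytes:
    """XOR each message byte with the corresponding key byte.
    The key must be at least as long as the message and never reused."""
    assert len(key) >= len(message)
    return bytes(m ^ k for m, k in zip(message, key))

# Decryption is the same operation, since (m ^ k) ^ k == m.
otp_decrypt = otp_encrypt

msg = b"transfer $100"
key = secrets.token_bytes(len(msg))
ct = otp_encrypt(msg, key)
assert otp_decrypt(ct, key) == msg
```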

The thing I have never understood about quantum computing is how you recover 4000 bits worth of information from the system in useful time. The intrinsic noise/stochastic properties would seem to place a time requirement that would make this infeasible.

Quantum computers always bring to mind the challenge put forward by the physicist David Deutsch in regard to Shor’s algorithm for finding the prime factors of large numbers. It’s all very well to talk about qubits exhibiting superposition and holding 0 and 1 values simultaneously, but Deutsch puts it this way: “When Shor’s algorithm has factorized a number, using 10[sup]500[/sup] or so times the computational resources that can be seen to be present, where was the number factorized? There are only about 10[sup]80[/sup] atoms in the entire visible universe.”

Where indeed? Deutsch posits that quantum computers that actually perform such almost inconceivably rapid calculations would be strong evidence of Everett’s many-worlds hypothesis, in which all these parallel computations are in fact occurring in Everett’s parallel universes, which he hypothesized precisely as an explanation for quantum superposition. I’m not sure if this really follows since we have no mechanistic explanation for quantum superposition, we just know it exists and therefore has this awesome potential computing power, but the idea is fascinating nonetheless.

But you still exchange entangled particles in order to agree on a shared key, right? So couldn’t my scheme be used to intercept the particles from the sender, agree on the key, and then do the same with the receiver?

I feel like pointing out that you may have heard of the famous “P=NP” which can be reformulated as “an NP-complete problem has a Karp-reduction to some Polynomial Time (P) problem”. A Karp-reduction is a reduction whose algorithm is in P, meaning it takes polynomial time.

However, if either the Karp reduction, or the target P-time algorithm takes something like n[sup]9999999999999999999999999[/sup] steps, it will be both extremely surprising and profoundly meaningless in the grand scheme of things. (Though of course it will raise many interesting questions about why, precisely, whatever exponent it ended up being is the lower bound on the reduction; that number would probably become a pretty well-known mathematical constant).

I completely disagree. That Wikipedia article is very poor. The average person reading the first paragraph would have no idea what it means. In fact, this very thread was started because that article is so poor.

There are many examples of better introductory articles on quantum computing, including Wikipedia’s own simplified version: Quantum computer - Simple English Wikipedia, the free encyclopedia

Other examples:

http://michaelnielsen.org/blog/quantum-computing-for-everyone/

Eric Ladizinsky has given several popular-level talks which could easily be edited to a synopsis that would be vastly better than the current Wikipedia article.

I honestly don’t know what the criticism of the first paragraph is - seems a very basic outline of the topic. The only possible criticism of the 1st paragraph I’d make is that it packs a lot of information in, but that’s more the nature of Wikipedia and links to the referenced topics are provided as you would expect.

Some topics naturally require some kind of prior knowledge to approach. You wouldn’t expect the Wiki article on Green’s functions to make any sense to someone who had never heard of a differential equation. Of course quantum computing is not that obscure a topic and you’d expect a basic outline, as is provided, but to go any further the reader will clearly need some prior knowledge (and in all honesty, in these days of Wikipedia and Google, that prior knowledge is easy to come by).

Not that the Wiki article is beyond reproach, but it is still, compared to many other articles on the subject, a good starting point (not that I’d claim to be an expert on quantum computing).

The first is just a précis of the Wikipedia article you are complaining about, based mainly on its first three paragraphs. I don’t know how you could like that article but not the main one! I notice, though, that someone has still flagged it for not being simple enough.

The second article is very good in that it focuses on why quantum computing can’t be explained easily, and it makes a nice suggestive point about the size of the classical computer that would be needed to simulate a quantum computer with a certain number of qubits. I would say this article is superior to the Wiki article, but its aims are also different: it provides a sketch by focusing in on a couple of key points rather than being an encyclopedia article.

The third article is appalling, in the whole article it only has two short paragraphs on the actual theory of quantum computing and one of those paragraphs is misleading for precisely the reasons I explained in my previous post.

[quote]

Eric Ladizinsky has given several popular-level talks which could easily be edited to a synopsis that would be vastly better than the current Wikipedia article.

[/quote]

I would tend to approach anyone from D-Wave with caution; the consensus among experts still seems to be that D-Wave machines most probably do not utilize quantum computing, despite claims otherwise. Its makers have also been accused, by one of the originators of those concepts, of misunderstanding the very concepts they claim their machines are based on.

Of course that doesn’t necessarily mean that Ladizinsky couldn’t explain quantum computing; however, from the little I’ve seen, it looks like the standard pop-sci slush that unfortunately seems to have become prevalent. The reason I dislike this overly metaphorical mode of explanation is that it is often just as likely to misinform as to inform. In particular he uncritically (at least in the small parts I watched) brings up quantum parallelism, which as I’ve explained is misleading (in fact the second article you quoted goes over this point too).

So, in “classical” digital encryption, which uses huge numbers to prevent easy factoring in any reasonable amount of time, quantum computing bypasses this by doing huge numbers of computations at once, versus a binary computer doing a billion or trillion computations a second? What is the size of these unfactorable numbers?

RSA 2048 (2048 bits) is a number 617 digits long.
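Easy to check: a 2048-bit number has about 2048 × log₁₀(2) ≈ 616.5 decimal digits, so 617 once rounded up.

```python
import math

bits = 2048
approx = bits * math.log10(2)        # ≈ 616.5 decimal digits
largest = 2 ** bits - 1              # largest 2048-bit number
smallest = 2 ** (bits - 1)           # smallest 2048-bit number
print(len(str(largest)), len(str(smallest)))   # → 617 617
```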

To be clear, the factorization stuff really only applies to public-key cryptography systems like RSA; symmetric cryptography doesn’t rely on factorization problems. But public-key schemes are usually used to exchange symmetric keys securely, since symmetric block/stream ciphers are hundreds or thousands of times faster than public-key ciphers.

All that said, the numbers used in RSA are quite big indeed. Cracking a 2048-bit RSA key by brute-force factoring would involve factoring a 600+ digit number (in decimal.) Currently, all the computers in the world could not do this before the heat death of the universe. (Stealing the private key is a lot easier.)

A good quantum computer could (potentially) put a significant dent in that brute-force attack, maybe doing it in 1/10 the time. But 1/10 of a quadrillion years is still a long time. And we’re nowhere near building a quantum computer that could actually do that, even if it had enough time. IMHO, a breakthrough in traditional factoring algorithms is more likely to threaten RSA than quantum computing is.