A 300 qubit quantum computer could calculate all information in the universe

This is impossible. It cannot even encompass its own state, since that would require knowing which way each quantum will flop. What does it say about Schrödinger’s cat?

Imagine you had two computers, each of which could compute the future state of the universe (if you could make one, you could make two). Call them A and B. Ask A whether B will print a 0 or a 1 at midnight on Dec. 31, and have A print the same bit at midnight on Dec. 31. Ask B whether A will print a 0 or a 1 at midnight on Dec. 31, and have B print the opposite bit at midnight on Dec. 31. Whatever they print, at least one of them must be wrong.
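The contradiction in that setup can be checked mechanically; a tiny Python sketch enumerating the four possible output pairs:

```python
# A must print the same bit B prints; B must print the opposite of
# what A prints. Enumerate all four possible (A, B) output pairs and
# keep only the ones satisfying both requirements.
consistent = [
    (a, b)
    for a in (0, 1)
    for b in (0, 1)
    if a == b and b == 1 - a
]
print(consistent)  # -> []  (no consistent pair: at least one computer is wrong)
```

This is the same diagonalization trick used to prove the halting problem undecidable.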

I hate these kinds of quips.
Occasionally, on a smaller scale, when actual facts are available, I will debunk them for myself.
But this one is so far out of all the ballparks, there’s no use even wondering about it.

We don’t know what we don’t know. Let alone how long it would take to calculate it. And calculate what? All the information? What the hell is all the information? Are all those thoughts I don’t tell anyone information? It better stay uncalculated!!

This statement is ludicrous and meaningless in so many ways. A moment of thought reveals it to be fluff.

On an existential but still credible level, it is impossible. As it calculates, the results of its calculations create new data, new information. So it will always be one step behind, forever folding the results of its own calculations back into “all the information.” To infinity and beyond.

Dumb fluff statements!!!

People such as David Deutsch think that the calculations are done with parallel universes and there are a huge number of them - not just a single universe.

As an alternative to branching parallel universes, there could be colliding parallel universes.

The article says that 300 qubits could be used to map the universe, not calculate it.

I’m not even sure what this means. What resolution of detail are we talking about? If the scale is small enough you can’t even talk about exact positions, etc.

And suppose this computer could hold all of this information. The computer would itself be part of the universe so it would have to contain complete knowledge about itself, which would include all the information it contained…

It’s qubits all the way down…

I can’t help thinking about cubits, and the nine billion names of God.

I think that’s the reference, yes.

You can almost see it on the ‘History’ channel: ‘Does the Bible contain the parameters for a supercomputer that will answer the question of life, the universe, and everything once and for all? Some people think so.’

Call me jaded, but this all just reminds me of computer scientists in the 1940s and ’50s saying that the entire world would only need five or six computers to meet all its needs. It’s describing a technology and a field that is so completely speculative as to practically make the discussion philosophical…

Quantum computing is absolutely nothing like electronic computing, and that’s why it’s so compelling. I believe the 300 qubit notion is that if you had a quantum computer with 300 qubits, then in order to emulate it on an electronic computer, you would need every atom/proton in the observable universe. This is because the amount of computation you can do in one step is exponential in the number of qubits you have, due to interference effects. These interference effects, however, mean that the results are non-deterministic in general, and you have to have a very specific algorithm implemented to be able to make any sense out of a quantum computation.
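To put numbers on that exponential blow-up, here is a minimal back-of-envelope sketch. It assumes the usual 16 bytes per complex amplitude (two 64-bit floats), and uses the common rough figure of ~10^80 atoms in the observable universe:

```python
# Memory needed to store the full state vector of an n-qubit
# quantum computer on a classical machine: 2**n complex amplitudes.
def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """16 bytes = one complex128 amplitude (two 64-bit floats)."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (10, 30, 50, 300):
    print(n, state_vector_bytes(n))

# 30 qubits already need 2**34 bytes (16 GiB); 300 qubits need
# roughly 3e91 bytes, dwarfing the ~1e80 atoms in the observable
# universe even at one byte per atom.
```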

What I’ve learned about quantum computing is from watching quantum physics lectures on MIT OpenCourseWare, and there it was stated (sometime in 2014) that the largest number a quantum computer had factored was something like 21. The fact that this type of computation has been done at all says that it’s not complete science fiction, but there are problems with its scalability. One of the things mentioned was that in terms of memory, quantum computers are much, much (much, much) smaller than electronic computers; our largest ones can even be simulated on electronic computers. The physical quantum computer, though, is actually much larger than the physical electronic computer that can simulate it.

That’s pretty close, but you need one more thing. You need a way of making the right answer more probable than the other answers.

All else being equal, when the quantum system collapses, all possible answers (right or wrong) are an equally likely result. Since most answers are wrong, you haven’t really gained anything; you have to retry as many times, on average, as it would take to compute the thing in the first place.

But there are ways to make some answers more probable than others. Shor’s factorization algorithm arranges things such that the period of a function is related to the final answer. It then does a quantum Fourier transform on the function, which has high-probability peaks at the desired frequency. When the wave function collapses, you’re likely (though not guaranteed) to get the period as a result. From that you can derive the factors.
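For a concrete (if entirely classical) illustration, here is the number-theoretic skeleton of that last step, with a brute-force loop standing in for the quantum period-finding; the function names are just illustrative:

```python
from math import gcd

def find_period(a: int, N: int) -> int:
    # Brute-force the period r of f(x) = a**x mod N. This is the
    # expensive step that Shor's algorithm replaces with a quantum
    # Fourier transform.
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1
    return r

def shor_classical(N: int, a: int) -> tuple[int, int]:
    # Classical post-processing: an even period r gives factors of N
    # via gcd(a**(r/2) +/- 1, N).
    r = find_period(a, N)
    if r % 2 != 0:
        raise ValueError("odd period; pick another base a")
    candidate = pow(a, r // 2, N)
    return gcd(candidate - 1, N), gcd(candidate + 1, N)

print(shor_classical(15, 7))   # period of 7 mod 15 is 4 -> factors 3 and 5
print(shor_classical(21, 2))   # period of 2 mod 21 is 6 -> factors 7 and 3
```

Everything here runs in exponential time classically; the only quantum part of the real algorithm is finding the period quickly.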

Please don’t waste your massively high IQ doing something as mundane as proving an engineering professor at MIT wrong. You should consider being a supervillain instead. Build a suit of armor and then take over a country. Why shoot for the moon when you can have the stars?

I think the point some of us are trying to make isn’t that the MIT professor is wrong, but that this article seems to be doing a poor job of explaining whatever it is he’s trying to say.

That’s not unusual with science or engineering reporting, unfortunately. In an attempt to simplify the concepts, they end up totally misrepresenting them.

And speaking of fluff, my computations lead me to the fact that a computer that size would instantaneously suck up all the dust and cat hair in the universe and overheat.

Resulting in a large explosion (or “big bang”) and creating a new universe.

Unfortunately for non-advanced-physics-majors, and by all accounts I’ve heard, it’s basically impossible to explain anything about quantum physics in simplified lay terms and still be anything approaching accurate about it.

Well, that would be 10^10 bits of information about every single atom in the universe. If I’ve got that right, around 3×10^9 decimal digits. Not 10000000000 separate pieces of information; rather, a single number with 10000000000 bits. Which is rather a lot.

I don’t think that is actually a known and agreed number, though. I’m not sure the theory is even completely agreed. But what there is, is fairly simple: the amount of information (entropy) in the universe is, I think, the total energy divided by the average temperature.
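As a back-of-envelope check of that formula (S ≈ E/T, converted to bits by dividing by k_B·ln 2), using deliberately rough, illustrative inputs for the mass-energy of ordinary matter and the CMB temperature:

```python
import math

# Illustrative round numbers, not authoritative figures.
k_B = 1.380649e-23          # Boltzmann constant, J/K
E = 1e53 * (3e8) ** 2       # ~10^53 kg of ordinary matter times c^2, in J
T = 2.7                     # cosmic microwave background temperature, K

entropy = E / T                        # thermodynamic entropy, J/K
bits = entropy / (k_B * math.log(2))   # convert to bits of information
print(f"~10^{int(math.log10(bits))} bits")
```

The answer lands in the 10^90 range, which is at least in the same neighborhood as other order-of-magnitude estimates of the universe’s information content.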

Right, but in this case they include a quote from the professor, and the quote is BS. Perhaps they misquoted him, or perhaps he was winging it.