I recently read in New Scientist an interesting claim: each time you add a qubit to your quantum computer, you double its computational capacity. This is somewhat analogous to Moore’s law, which describes the doubling rate of transistor density, except we don’t know how frequently we can expect to add qubits to our quantum computer. Let’s make the naive assumption that adding each qubit is exponentially harder than adding the last one, but that our ability to add qubits also improves exponentially, so the two effects cancel and we add exactly one qubit per year. Now we just need some characterization of the growth of our computing power in terms we are familiar with. I suggest the size of the largest integer that could be factorized that year using Shor’s algorithm.
Unfortunately I have no idea how to produce such a plot. I know I’ve seen some quantum gurus around here, so perhaps someone can shed some “light” on the issue.
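To be concrete about the kind of plot I mean, here is a rough sketch of what I had in mind. I'm assuming (possibly wrongly, this is part of what I'm asking about) that factoring an n-bit integer needs roughly 2n + 3 logical qubits, a figure I've seen quoted for one standard Shor circuit construction, and I'm ignoring error correction overhead entirely. The starting qubit count is arbitrary.

```python
# Rough sketch: largest factorable integer vs. year, assuming
#  - an arbitrary starting point of 10 logical qubits, with 1 qubit added per year
#  - Shor's algorithm needs about 2n + 3 logical qubits to factor an n-bit integer
#    (a commonly quoted figure; error-correction overheads are ignored)
import numpy as np
import matplotlib.pyplot as plt

start_qubits = 10          # arbitrary starting point
years = np.arange(0, 101)  # a century of adding one qubit per year
qubits = start_qubits + years

# Invert "2n + 3 qubits to factor an n-bit integer": n = (qubits - 3) // 2
bits = np.maximum((qubits - 3) // 2, 0)

# The largest integer factorable that year is roughly 2^bits
plt.semilogy(years, 2.0 ** bits)
plt.xlabel("years from now (1 qubit added per year)")
plt.ylabel("largest integer factorable with Shor's algorithm")
plt.title("Naive growth of quantum factoring power")
plt.show()
```

On a log scale this is just a straight line, since adding qubits at a constant rate doubles the factorable integer every couple of years. What I'd really like to know is whether the 2n + 3 relationship is the right one to use here.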