Does there have to be an end to Moore’s Law, which says that computer power doubles every 18 months?
Yes, if only because there are a finite number of subatomic particles in the Universe.
Unless we find a bunch of other universes. And then a lot more.
Hey, it could happen
Yes, and it’ll happen within about 75 years unless things slow down:
Moore’s Law isn’t a law so much as a good idea. The fact that Moore’s Law still seems to be working is a surprise to Moore himself. Gordon Moore, one of the founders of Intel, predicted back in 1965:
As you can see from the above statement, his original projection only ran until 1975. In 1975, Moore revised the prediction: the doubling would be every two years instead of every year. Again, that was only a prediction for the following decade. The fact that Moore’s Law will probably still hold for another decade or two is rather unbelievable. By 2020, processors will be about 1000 times faster.
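Rough back-of-the-envelope, in Python, of where that “1000 times faster” figure comes from (the ~2000 starting point and the exact doubling period are just illustrative assumptions, not anything Moore said):

    # Rough Moore's Law arithmetic: performance doubles every `doubling_period` years.
    def moore_factor(years, doubling_period=2.0):
        """Predicted improvement factor after the given number of years."""
        return 2 ** (years / doubling_period)

    # Assuming a ~2000 baseline: 20 years of two-year doublings.
    print(moore_factor(20))        # 2**10 = 1024, i.e. roughly "1000 times faster" by 2020
    print(moore_factor(20, 1.5))   # ~10,300x with the popular 18-month figure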
You might want to take a look at this thread, where the same conversation is taking place.
As qazwart points out, Moore’s law isn’t about speed, it’s about the most cost-effective level of complexity. Aside from that, CPUs are already going in the direction of multiple cores instead of higher speed per core, and there is no reason to assume you cannot simply put more “maximum speed” cores into the same package.
The other thread has more info
But hasn’t what’s allowed Moore’s Law to hold for so long been the fact that things have kept getting both smaller AND faster?
Sure, those certainly aren’t the only things, but they sure appear to my barely educated eye to be a major part.
Barring some “breakthrough” that may or may not ever come, aren’t those two advances about to hit a brick wall?
And if so, it seems to me Moore’s Law would start to falter, and increases in power/ability would be much more linear with time than geometric/exponential/non-linear.
Anyone who remembers the Pentium going from 60 to 75 to 100, 300, 666, 800 MHz must notice that clock speeds have in fact pretty much ground to a halt at 4 GHz and below.
Intel and others compensate with multi-core processors; if the software is written correctly (Hey, Microsoft!) then why interrupt the screen draw to decode the MPG stream or figure out where all your elves are or what the value in the next spreadsheet cell is? Do two or more things at once!
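A toy sketch of that “do two or more things at once” idea, in Python (the tasks and timings are made up, and real renderers and decoders obviously aren’t written like this):

    import threading
    import time

    def draw_screen():
        for frame in range(3):
            time.sleep(0.1)              # pretend to render a frame
            print("drew frame", frame)

    def decode_video():
        for chunk in range(3):
            time.sleep(0.15)             # pretend to decode a chunk of the stream
            print("decoded chunk", chunk)

    # Run both tasks concurrently instead of interrupting one to service the other.
    tasks = [threading.Thread(target=draw_screen), threading.Thread(target=decode_video)]
    for t in tasks:
        t.start()
    for t in tasks:
        t.join()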
This of course has a limit too. IBM mainframes, VAX and Alpha machines, and IBM AS/400s all famously tried this end run around the limits of Moore’s Law; all it did was delay the end a bit longer.
Whether it’s computing speed, air travel speed, network speeds, disk capacity, or even real estate prices, most progress follows an “S” curve: from relatively flat, to going up, to levelling off at the new height. (Except maybe real estate, which has a hump because it’s going down on the other side.) People during the rise look at the curve and say “This will go on for ****”, as in “Real estate will keep going up”. There is always a limit.
Progress is just a series of these S-curves overlaid with each other. As each technology reaches its limit, another comes along to excite us with its potential. The big difference is that with the new level of computing power, home users can do things that were too difficult and expensive for professionals only a few years ago. Photo manipulation is trivial; video editing is trivial; computer modelling, even compression encoding like MPG or DIVX, is compute-intensive but trivial with modern technology.
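Incidentally, the “S” curve described above is basically a logistic function. A minimal Python sketch, with arbitrary parameters just to show the shape:

    import math

    def s_curve(t, ceiling=1.0, midpoint=0.0, steepness=1.0):
        """Logistic curve: nearly flat, then rapid growth, then flat again at the ceiling."""
        return ceiling / (1.0 + math.exp(-steepness * (t - midpoint)))

    for t in range(-6, 7, 2):
        print(t, round(s_curve(t), 3))   # rises fastest at the midpoint, levels off near the ceiling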
Eventually, Quantum Computing will be all the rage.
Clock speeds may be flattening, but maximum processor throughput is still increasing steadily. Mostly this is due to adding extra cores on the CPU. However, even here there’s got to be a maximum (and not just one given by size constraints): adding more cores makes things like synchronization harder, and massively parallel computing is extremely difficult for the average programmer. Many “truths” from current programming practice no longer hold in a parallel world: redundant computations may sometimes be good because of the reduced need for synchronization, etc. It may get to the point where computers could go faster by adding more cores, but it’s irrelevant, as we don’t know how to write software that makes use of the added cores (automatic parallelization is, for all intents and purposes, a pipe dream: many algorithms take considerable insight and ingenuity to parallelize, cf. Floyd-Steinberg error diffusion).
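A contrived Python sketch of that “redundant computation can beat synchronization” point (the table and chunk sizes are made up, and multiprocessing is used purely for illustration):

    from multiprocessing import Pool

    def build_table():
        # A lookup table every worker needs; cheap enough to just recompute.
        return [i * i for i in range(1000)]

    def work_shared(args):
        chunk, table = args              # the table is shipped to every worker
        return sum(table[x] for x in chunk)

    def work_redundant(chunk):
        table = build_table()            # recomputed locally: nothing shared, nothing to coordinate
        return sum(table[x] for x in chunk)

    if __name__ == "__main__":
        chunks = [list(range(i, i + 250)) for i in range(0, 1000, 250)]
        with Pool(4) as pool:
            # Same answer either way; which is faster depends on whether recomputing
            # the table is cheaper than sharing it between processes.
            print(sum(pool.map(work_redundant, chunks)))
            print(sum(pool.map(work_shared, [(c, build_table()) for c in chunks])))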
Mainstream CPU designs are only just starting to make the switch from single to multi-core. We’re witnessing a silent revolution!
It’s not at all clear just how much of a speed up quantum computing will provide.
Nor when and where it will happen
It is both happening, and not happening.
I’ll explain with more details later, but there are very good reasons to believe that quantum computing will never be cost effective, even if it can be made to work in the first place.
More practically, I think it has moore to do with the fact that electrons have a finite size, which ultimately limits how small you can make components.
Oooh…
Was that an intentional pun?
Considering Moore’s Law only works because new CPUs are multicore, I’d say it’s dead or dying. You’re not doubling the performance of a single core every 18 months. From now on, they’ll just be adding performance gains via traditional methods like smaller dies, but mostly banking on more cores for a good part of that performance gain. I really don’t think we’re seeing a jump like we saw from, say, the slowest Pentium III to the fastest Pentium 4.
I guess there’s an argument that if multicore wasn’t feasible we’d be looking at energy-hungry, water-cooled, high-clock-speed designs, but I’m skeptical the industry would resort to multicore unless it had to. Then again, multicore is pretty efficient per watt.
I’m not going to complain about computing power plateauing for general use, personally.
It means that things will get more efficient. Instead of programmers (bad programmers, I’m looking at you, Microsoft) assuming they can make bloated software and it’ll run “at least as fast” as the old software, they’ll have to trim it down and streamline it. The average Microsoft Word document opens at the same speed now as it did ten years ago. And computers have certainly sped up in that time.
Also, it will mean that laptops will eventually become equal, or at least more nearly equal, to desktops in computing power, although there’ll no doubt still be a price difference. And it’ll drive the cost of desktops down. Don’t even get me started on not having to replace my damn PC (or at least its major components) every 18 months if I want to play new video games. I just stopped that shit after I realized how much money I was flushing down the toilet.