We’re at about 4.0 GHz right now. Most people don’t need that for almost any program currently, save video editing and a few things I’m not thinking of.
Factoring in currently conceivable but not too outlandish developments, such as better game graphics, at what point will higher processor speeds become redundant for home computer usage? 15 GHz? 30 GHz? 50 GHz?
It never really will. Just like with the occasional bandwidth enhancements that have been made to this board in the past, any increase in speed will soon be used up. As processor speeds increase, software manufacturers will add more and more intensive features. Try running a modern program geared towards a 2.0 GHz machine on a 200 MHz Pentium MMX, for example. Assuming it can run at all, it will be EXTREMELY slow.
We’ll keep finding new things for our computers to do. Obviously there are other ways to advance the capabilities: already we shove a huge burden onto our graphics cards, which are as powerful as an entire PC from a few years ago. Wander around your house and think about what else could be controlled, or communicated with. Think about true full-screen video phones. Imagine; it’s what the computer geeks do all the time.
“It never really will. Just like with the occasional bandwidth enhancements that have been made to this board in the past, any increase in speed will soon be used up.”
That was simply an analogy, and a fairly apt one at that. As more resources (CPU speed, server bandwidth, etc.) become available, more features are added such that the resources are still consumed.
For many uses, processor speed already is redundant. I’m typing this message on a five-year-old 500 MHz PIII, not because I’m poor and can’t afford anything better, but because there’s no reason for me to upgrade the computer at this time. If I were playing games, I might have a reason. I’m a programmer, and I develop networking software. I have a lot of computers at home (since I work from home), and the fastest box I currently have is a Sun Ultra 2 with dual 300 MHz UltraSPARC processors (and 15K RPM drives, 1 GB RAM, etc.).
There will ALWAYS be a reason to push processor power, memory, hard drive size, etc.
The game Unreal Tournament 2004 will be released next week. It will take up 5.2 GIGABYTES of your hard drive. A computer I bought in 1999 had a 20 GB hard drive; five years later, one game would occupy just over a quarter (5.2/20 ≈ 26%) of that drive.
The graphics continue to get better, but so do the video cards. Until a game looks like Finding Nemo or Shrek, people will keep clamoring for more. But even with better “eye candy”, there is more going on underneath. Better AI (artificial intelligence) for you to play against takes up large amounts of processor power. Real-time physics is making its way into games now, which also eats up cycles.
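To see why real-time physics eats cycles, here’s a toy sketch (Python, purely illustrative, not from any actual game engine): the simplest collision check compares every pair of objects, so the work grows with the square of the object count, and it has to run every frame.

```python
def count_collisions(positions, radius=1.0):
    """Naive O(n^2) pairwise collision check: compare every pair of circles."""
    hits = 0
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            dx = positions[i][0] - positions[j][0]
            dy = positions[i][1] - positions[j][1]
            # Two circles of equal radius touch if their centers
            # are within 2 * radius of each other.
            if dx * dx + dy * dy <= (2 * radius) ** 2:
                hits += 1
    return hits

# n objects means n*(n-1)/2 pair tests per frame:
# 100 objects -> 4,950 tests; doubling to 200 objects -> 19,900 tests.
```

Real engines cut this down with spatial partitioning, but the point stands: richer simulated worlds burn CPU faster than linearly.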
This is just a tiny niche of users, but it pushes the development of hardware. As Q.E.D. said, whatever advancements are made will quickly be consumed by users, who will then demand more.
The only problem is that it can make coders fairly lazy, since they won’t worry about writing the most efficient code and will simply put code out the door. Not pointing fingers at anyone in particular, but it does happen. In the end, however, improvements to technology are mostly a good thing. Even though the main reason users get really fast processors and high-end video cards is games, there are practical purposes as well.
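As a toy illustration of the “lazy code” point (Python, a hypothetical example, not aimed at any real program): both functions below produce the same result, but the careless one copies the entire string on every iteration, doing quadratic work where linear work suffices. Fast hardware hides the difference on small inputs, which is exactly how such code ships.

```python
def build_report_slow(lines):
    """Careless version: each += copies the whole string so far (O(n^2) total)."""
    report = ""
    for line in lines:
        report += line + "\n"
    return report

def build_report_fast(lines):
    """Efficient version: collect the pieces and join once (O(n) total)."""
    return "\n".join(lines) + "\n" if lines else ""
```

On a handful of lines nobody notices; on hundreds of thousands, the slow version crawls on any processor.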
I think it’s funny. All along, people have said “why would we need a better <hard drive/CPU/internet/whatever>? No program around uses more than what we have now!” That’s backwards logic: no company is dumb enough to release programs that nobody can run. Every single time the capacity of computers has increased, people have found ways to use it. But you’re not going to have tons of programs on your computer that need orders-of-magnitude better machines to run, because why would someone sell a program no one can run?
I have been using computers for more than 12 years. It is my opinion that:
1. High-end computers are becoming cheaper and cheaper. For example, today most people can afford a 4.0 GHz Pentium. Twelve years ago the best computer was probably a 486, and you had to sell your house to buy it!
2. I don’t know if this is a consequence of 1 (i.e. people can buy more high-end stuff than before), but I believe that the gap between computer power and software requirements has increased.
Check some of the computer games. In the old days, minimum requirements might be a 33 MHz 386 with 2 MB RAM, while most people had 40 MHz 386s and 4 MB RAM. Nowadays games might require 1 GHz and 64 or 128 MB RAM, while most people have more than 2 GHz and 512 MB or 1 GB of RAM.
Hard drives are cheap and really big now. Many people have 200 GB, which has room for UT2k4 nearly 40 times over. Do you know ANYONE who has problems with HD space who doesn’t save lots of video? It’s pretty hard to fill your HD with just games nowadays.
If you don’t play games, you can buy a dirt-cheap computer and run WinXP and all the latest programs easily. It’s only for games and certain high-end programs that you need the latest hardware.
I have a 10 Mbit internet connection at my college dorm. It’s a lot more than I need; you could give me a tenth of that and I would be just as happy. Once you can stream full-quality video, you seldom need more bandwidth anyway.
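A rough back-of-envelope check of that last claim (the bitrates below are my assumptions, roughly DVD-class MPEG-2 video plus multichannel audio, not figures from the post):

```python
def fits_in_link(video_kbps, audio_kbps, link_mbps, overhead=0.10):
    """Check whether a stream fits on a link, leaving headroom for protocol overhead."""
    stream_mbps = (video_kbps + audio_kbps) / 1000.0
    usable_mbps = link_mbps * (1 - overhead)
    return stream_mbps <= usable_mbps

# Assumed figures: ~6,000 kbit/s DVD-class video + 448 kbit/s audio.
print(fits_in_link(6000, 448, 10))  # -> True: fits on a 10 Mbit link
```

Under those assumptions a 10 Mbit link carries full-quality video of the era with room to spare, which is why extra bandwidth beyond that feels redundant.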