I am also waiting to purchase a new gaming computer. I have had my current one for 18 months, and my next purchase will come when Star Citizen launches for retail. It is fitting, since the Wing Commander games were the first that ever led me to upgrade my hardware.
I’m pretty sure the answer to that is no. The price point is too high, so the people who are interested need, at most, new graphics cards, not a whole new PC.
As for a more casual gamer like me, I think I may just now actually need a new CPU. And I’ve not even gotten into the i-series of processors yet.
Add in that thin clients are back in a big way via VMware and other vendors. For many uses, a PC isn’t required, just a desktop on a server somewhere.
The time when you bought a new machine to get the new OS is behind us, considering Windows 10 is a free download.
Anyways, this is an interesting thread. I’ve never thought about it, but now that I do, I realize that my (formerly Windows 7, now Windows 10, with some compelling advances) machine is at least four years old, and it has never occurred to me that perhaps I should get a new one soon. Years ago, it seems, you did that every other year (or at least thought about it), but yes, the lifetime of a PC has definitely become longer.
And just to illustrate the PC gaming market’s growth against the overall PC market’s shrinkage: this month the GTX 970 GPU became the most popular GPU on Steam, overtaking Intel’s HD 4000, with almost 4% of the active user base owning one, or about 7.5 million people.
We’ve now got almost 90 million active Steam users with hardware comparable or better than a PS4, and about 15 million with twice the power of a PS4 or better.
I think much of this comes not from people ceasing to buy PCs, precisely, but from what we used to think of as “the PC” expanding into a wide range of products. We now have PCs, laptops, tablets, and smartphones, any one of which is powerful enough to handle almost any non-gaming application. Further, each of those has lots of competitors that emphasize different features and appeal to different markets. Until or unless someone miniaturizes devices so thoroughly that everybody carries a computer the size of a keychain, one that automatically connects to whatever screen and output you want, this process will likely continue.
Tru dat. The desktop computers at the agency I work for had been running Windows XP for roughly forever, but when the end of Microsoft’s support for that OS approached, they responded by switching us to virtual desktops. All we do, all we can do, on the physical PCs at our desks is connect to the virtual desktop that, as you say, lives on a server somewhere.
Once you’re logged into the virtual desktop, it’s pretty much indistinguishable from a regular desktop, and it also has the advantage that when you’re teleworking, you have access to literally everything you can access on your computer when you’re at work.
And yeah, that has to seriously stretch out the replacement cycle for PCs at offices that go with this sort of setup. A PC really only needs replacing under those circumstances when it can’t even handle the minimal demands of essentially being a remote terminal.
Yeah, there’s the “casual home user” market which consists mainly of people who browse the internet, deal with digital cameras and printers, and occasionally do stuff with software like MS Word, Photoshop (prob. Elements) and other light productivity software.
These users are going to tablets in droves; why would you want a PC, with all the hassle and components, if you just want to fire up a browser and watch movies on Hulu? Or look at the local news page, or shop online? PCs for the most part far outpace the hardware requirements to do those things, which is why a tablet running a less powerful processor optimized for power consumption can do the job just as well as a PC.
Then there’s the PC gamer; these guys have a replacement cycle that’s mostly driven by the graphics available in new games. And that replacement cycle is getting longer and longer, as hardware gets more and more capable, and to some degree, monitors aren’t getting larger in step with capability either. In other words, modern hardware may be able to do 115 fps at 2560x1600 with the highest settings, but most gamers aren’t going to have a monitor that can display that.
Then there are the professional users- developers and various kinds of editors and content creators. They probably (I don’t know for sure) still need very capable PCs, but they’re a small part of the market.
Then there’s the business market: most business PCs aren’t used much differently than the casual home users’ — browsing, using Word, running an email client, and serving as a thin client for a lot of corporate apps. These aren’t driving PC specs any more than the home users are.
So you really only have one market segment that’s probably still crying for higher performance, and one that is crying less than before, and others that aren’t crying at all. This is a different situation than say… 2003, when ALL the segments wanted higher performance just to do their tasks.
And they’re probably the hardest segment to track since they’re the least likely segment to just buy a Dell or HP out of the box.
I usually upgrade something every year or two, usually a ~$150 video card, better CPU or more RAM, and build a mostly new PC every 4 or 5 years. I never have gamed on a ready-built PC.
I had my first computer for seven years, and my second computer for six; hopefully, considering how much I paid for it, I’ll be able to keep this one at least five years.