My computer is 8 years old and still works like it's brand new.

It’s an HP desktop. Paid $1,500 at Microcenter.

So what’s the deal here? Did I just buy a good computer? Or did MS finally fix their shit and make a decent product?

Because before, it seemed like I could keep a computer for 3 or 4 years before I got the “Blue Screen of Death”.

8 years ago was the inflection point in CPU performance. Up until then, CPUs doubled in processing speed every two years or so. After that, the progress has been much more gradual.
I built my Core i7-2600K Hackintosh in 2011, and it’s still my primary machine, and easily fast enough for everything I do.
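
If speed really did double every two years, the compounding would be dramatic. A quick toy calculation in Python (an illustration of the claim above, not a benchmark):

```python
# Toy illustration of "doubles every two years" (the claim from the
# post above, not measured data): projected speedup after `years` years.
def projected_speedup(years, doubling_period=2):
    return 2 ** (years / doubling_period)

print(projected_speedup(8))  # 16.0 -> an 8-year gap "should" mean ~16x
print(projected_speedup(6))  # 8.0  -> even a 6-year gap "should" mean ~8x
```

Nothing remotely like a 16x gap shows up when you compare a 2011 CPU against a current one, which is the whole point.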

And BSODs should not be caused by computer age - that is usually a sign of a failing hard drive or RAM, and should be easily fixable.

I am on a Toshiba Satellite from 2012 and it runs beautifully.

The power cord is a bit loose and a few keys stick a bit, but it runs great and is still as fast as when I got it.

Despite being 6 years old, when I run specs on new computers…they don’t seem all that much faster or better. I feel like we went through a slow-down in computer improvements compared to the 80’s and 90’s, when computers were outperforming old ones every year. Like…by a lot.

I’m on a mid-2010 MacBook Pro, and it’s fine. I use another mid-2010 MacBook Pro as a Windows station, and it’s fine, as well. My main desktop is more recent (late 2013 MacPro), but the old one in the basement acting as a server is a 2006 MacPro, and it’s showing no sign of any problems whatsoever. (I did upgrade the CPU at some point, though, to bring it to something like 2009/2010 MacPro processing levels).

Now, I don’t use any of these as a high performance gaming system, but I do use them for Photoshop, Lightroom, and Logic Pro. There is definite performance difference between the newest desktop and the oldest laptop, but the oldest laptop doesn’t seem to be running any slower than it was when I bought it, and I was perfectly fine and able to use my Adobe Suite on it, though a bit slower than a modern system. Everything else runs fine, so far as I can tell. No major slowdowns or annoyances. And for the type of games I play on these systems, there’s no issue, either.

BSoDs often come from updated programs or hardware. If you are lucky, they won’t happen at all. Hardware is really reliable these days. What OS are you using, btw?
Most of the extra transistors you can put into processors these days go to cache and extra cores, not for speed, so if you are not pushing the processor you won’t notice their absence. Not having enough memory might slow you down eventually depending on what kind of software you run, but chances are you won’t notice that either.

Yeah I’d say 8-10 years ago computers got fast enough and Windows got stable enough that you don’t need a new computer as often. Another reason many people bought new computers in the past is because Windows XP got slower and slower and after a couple years people couldn’t take it any more. A reinstall of XP would have mostly helped but many people just bought another computer. I liked the XP interface a lot but it wasn’t that stable or secure.

Firstly, CPUs stopped getting significantly faster at each step. Second, unless you play games or do specialist work, software developers don’t need the extra horsepower - CPUs have long been fast enough to run Office and web browsers. Thirdly, Intel and AMD concentrated on power reduction and increased core counts. In 2010 the typical CPU had two cores with premium models having four. Now we have four, six, or eight cores, and CPUs are hyperthreaded for even more performance. But you’ll only see that performance if your software makes use of those extra cores.
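
The “only if your software uses those cores” caveat is easy to see in code. A rough sketch in Python (hypothetical toy workload; real gains depend entirely on how parallel the job is):

```python
from concurrent.futures import ProcessPoolExecutor

def cpu_bound(n):
    # Stand-in for real work (think an export or encode step).
    return sum(i * i for i in range(n))

def run_serial(jobs):
    # Uses one core no matter how many the CPU has.
    return [cpu_bound(n) for n in jobs]

def run_parallel(jobs, workers=4):
    # Only this version can benefit from extra cores.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(cpu_bound, jobs))

if __name__ == "__main__":
    jobs = [200_000] * 8
    assert run_serial(jobs) == run_parallel(jobs)
```

Same answers either way; the parallel version just finishes sooner on a multi-core chip, while software that never splits its work up this way gets nothing from the extra cores.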

If you look at an Intel NUC box you’ll find a PC that is faster than your current PC, is smaller, is quieter, and consumes less power. You can’t add a GPU, though, so they’re useless for games.

That said, I would be concerned about your PC failing. There are two components in particular: capacitors on the motherboard and in the PSU, and the hard drive. If you feel like spending money and your next PC is going to be a desktop then buy yourself a SSD - 400 to 500 GB is the sweet spot right now - and clone your current HDD on to the SSD. Your PC will start blazingly fast and it will thoroughly rejuvenate your PC.
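
(For what it’s worth, “cloning” is conceptually just a raw byte-for-byte copy of the old drive onto the new one; in practice you’d use the SSD vendor’s migration tool or something like Clonezilla rather than roll your own. A minimal Python sketch of the idea, with hypothetical device paths:)

```python
def clone_drive(src_path, dst_path, chunk_size=4 * 1024 * 1024):
    """Raw byte-for-byte copy, chunk by chunk (essentially what `dd` does).

    In real use src_path/dst_path would be raw devices (hypothetical
    examples: /dev/sda -> /dev/sdb), and dst must be at least as large
    as src. Returns the total number of bytes copied.
    """
    copied = 0
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        while True:
            chunk = src.read(chunk_size)
            if not chunk:
                break
            dst.write(chunk)
            copied += len(chunk)
    return copied
```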

The CPU on my old workhorse computer came out in Oct. 2011. (Double check that, didn’t mistype it this time.)

Still an “adequate” computer.

One common problem with people having their computer get “slower” is they have not just malware but a ton of useless startup crap sucking cycles in the background.

If you keep things neat and tidy, it will not “slow” down.

I’ve definitely felt some slowdown on startup for Windows 7. And this was even while removing as many programs on startup as I could. That said, it was nowhere near as bad as Windows XP over time, which was nowhere near as bad as Windows 98.

I did finally get an SSD and reinstall Windows. But the reason for the reinstall was just that I got only a small SSD, and reinstallation was easier. The startup time was never that bad–at least, for an HDD. Now that restarting takes like 30 seconds, going back would probably feel like an eternity.

I don’t even mind applying updates anymore with an SSD.

^Great point. Same story for me, and my HP is going on 7 years old. I don’t think I’ve done anything beyond some HP updates. I go thru mice and keyboards and even monitors faster than the box.

Wikipedia says that Pentium 4 processors were shipped from November 20, 2000, until August 8, 2008.

The P4 was a dog. Intel secretly offered heavy discounts, and over the model run cranked up the speed and cache size, so there was significant, noticeable improvement over the run, and again when the model was replaced.

I’m still using a P4 at home. It still works. The startup sequence is slow with WinXP on a slow hard disk, not using hibernation, but the main issue is that web browsing is painfully slow – one of the reasons I would prefer to use ad blocking, no-script, and old web browsers.

My 22 year old Power Mac 7100 never developed any wearing-out symptoms aside from the backup (PRAM) battery dying. It was supplanted by my first laptop but remained in regular practical use until 2006, when I would still remote into it and perform work tasks on it that ran faster under MacOS 8.6 than on MacOS X of that era. Since 2006 it doesn’t stay booted and running on any kind of ongoing basis but now and then I’ll start it up and it obligingly makes its bong! sound and comes up ready to work.

Works and works well are two different things. I owned my last truck for 14 years and was used to every vibration and noise it made, and it worked well. It wasn’t until I got a new car last year that I remembered how much quieter and smoother a drive/ride could be.

I had my old PC for 10 years and went from Vista to Win 7, from HDD to SSD, and it worked ‘well’ for what I did (no gaming or video encoding) with few issues. This year I finally built a replacement PC with an i7 and Win 10. I still don’t do any gaming or video encoding, but everything is just quicker and smoother, especially videos. It may just be a second quicker here and there, but it adds up.

Yes, my 2012 laptop improved its performance back when I did the free upgrade to Windows 10. Windows 10 seems to be even less intensive than Windows 7, which is what my computer came with.

I never used Windows 8.

I recently purchased a new desktop after the old one would no longer boot. It would display a hardware error and shut down. I have no idea what broke on it, but I figured it was time to just buy a new one anyway. I purchased that one in 2009, but upgraded the power supply, the processor, the RAM, the hard drive, and the GPU (twice). By the time it finally died, it was running an i7-860 (no, I’m not missing a zero…) and 16GB of DDR3 1333 RAM, which was the fastest CPU the motherboard could handle, as well as the most (and fastest) RAM it could hold. Performance was not bad at all. Had it not died, I could have easily tolerated it for another couple years, I’m sure. It was even running two 4K (at 30Hz) screens with a GTX 1060 installed.
I’m kind of glad it finally died, though. It gave me an excuse to buy the machine I’m sitting behind right now, which is absolutely amazing…

I just replaced my old machine, which was built in October of 2009. I had upgraded the video card twice, cloned and replaced a failing hard drive once, replaced probably every fan in the case once, and upgraded the RAM once. Other than that hard drive failure (about 7 years in) and the occasional fan bearing dying, it had chugged along just fine - the CPU had no problems keeping up.

What got it was probably a failing motherboard - blue screens of death started proliferating a few months ago after a particularly hot couple of days. At around 8-10 years, power supplies and motherboards in particular start getting a little dicey, and the latter can be a pain in the ass to swap out compared to every other component. Especially with older chips. In my case I figured it just wasn’t worth fucking with anymore and got a new, moderately future-proof machine (and finally upgraded to Win10).

But yeah 8 years is a good run and as long as it is doing its job you might as well stick with it. It’s a bit like a car - usually it is more economical to fix it and keep it running as opposed to getting a new one. At least until the repairs become so numerous and tedious as to not be worth the hassle anymore ;).

ETA: Looks like Bear_Nenno just wrote my post for me :D. Including the I7-860, which was exactly the chip I was using.

Biggest issue on the software side is keeping up to date with all the updates from MSFT and the PC maker, not loading crapware, avoiding the majority of programs booting with the PC, etc. If you’ve got a clean image, and keep it that way, performance should not degrade. Performance may even increase over time as Windows keeps dieting.

Now you’ve done it. :wink:
I also had an HP desktop of about the same vintage; it gave me the BSoD this summer and wouldn’t boot anymore.

I have had two computers go south on me in the past 10 years or so, but neither had anything to do with Microsoft; both were hardware issues. A computer should be able to last years; I have a 10-year-old iMac that still runs just as well as it did when I bought it.

I think the main reason people get new computers is, they need some new software that requires a faster CPU than the old one has, or a better graphics card than the old one can support.

Unlike some people here, my i7-860 and board are still running solid. It’s been overclocked to 3.6GHz for the past eight years, even. I have a new gaming PC, but the 860 serves as a family computer and a way for my kid to mooch off my Steam account.