I’ve had my current system for 2 years now. It’s a 1.0 GHz Athlon with a 40 GB hard drive, 256 MB RAM, CD burner, USB ports, DSL modem, plus a good sound card and video card. I haven’t done a thing to it as far as upgrading is concerned. All the systems I had in the past had at least one or two components upgraded before I switched to a whole new system. Ten years ago I had a Packard Bell 486 DX33 with 4 MB RAM, a 170 MB hard drive, a 2400 bps modem, and no CD-ROM or sound card. In the four years I had it, I upgraded the RAM to 8 MB and then again to 16 MB, the hard drive to a 1.2 GB drive, the processor to 66 MHz, and the modem to 14.4 and then to 33.6, plus a CD-ROM drive and sound card were added. The OS was upgraded from Windows 3.1 to Windows 95. I easily poured enough money into that machine to have bought a new system in the interim. Most of those upgrades became necessary because of rapidly changing technology. My current machine has never proven itself inadequate for anything I do, and I don’t expect to be buying anything for it within the next year.
So is your post evidence that we do? Not meaning to be sarcastic, but you seem mainly to have left the question in the title and gone on to tell us what you’ve done.
Anyway, I can’t speak for all people, but I can say that I upgrade my system when I need to, until I have to replace too significant/expensive a part (e.g. the motherboard). At that point I buy a new computer, or, as it was in my case, wait two years saving up the money and THEN buy a new computer.
Oh, and I always buy mid-range; I don’t see the point in getting something really expensive that’s not that much better.
I’ve seen a lot of PC history, and IMO the incidence of upgrading has slackened off somewhat in the last few years relative to past times. The hardware included with most major manufacturers’ out-of-the-box systems is good enough to run 90%+ of what the average user wants, and the improvement from the bundled hardware currently being shipped to an upgrade is not as compelling as it used to be.
Having said that, there is still a huge community of tweakers and upgraders out there, but as I have gotten older the desire to upgrade just for the sake of having the hottest rod is less attractive than it used to be. Gamers are really the users on the cutting edge of modding for speed, but beyond that the hardware being shipped today is incredibly powerful for the dollar. Horsepower increases in hardware (on all fronts) have significantly exceeded the requirements of the current 98-to-XP generation of OSes and should carry most people quite a bit longer than in the olden days, unless some monster OS comes out that everyone must have. With sufficient memory I can run XP very nicely on anything from a 3-year-old 500 MHz PIII to a new 3 GHz machine.
I tend to buy right off the bleeding edge, because it’s a lot cheaper. That said, I think the “arms race,” such as it was, has slowed because there’s no real difference to Joe User between a blindingly fast 1.5 GHz chip and a 2.5 GHz chip. I just upgraded to a 1.7 GHz Athlon a few months ago, and that wasn’t really by choice; I was just changing motherboards, so I needed to get an Athlon CPU. Do I notice the difference between my new Athlon XP and my old P3 Coppermine? Not really. These days, RAM and the video card seem to make a bigger difference.