I’d like to find a review of what percentage increase in frame rate one can gain on average from overclocking (for instance, testing on ten random machines). I’m not interested in speed increases of individual components like the CPU or hard drive, just the end-result frame rate.
I would also like to find some sort of study of the lifespan of a gaming machine (i.e. how long before one buys a new computer) if you run it until you have to start running the newest games at the lowest settings, expressed as a ratio to how much money you spent on it at the beginning, without upgrading components or overclocking it at any point.
This can’t be answered definitively, because for one thing it depends on what level of cooling is used.
E.g. with liquid nitrogen you might get a 100% increase; with air cooling you might get 10% or less. It’s like asking the average speed of an aircraft in a world with microlights and Blackbirds: the range of speeds is so large the answer would be meaningless.
In practical terms, though, with only air cooling, I’d say the majority of real-world FPS increases are under 20%, and barely noticeable outside of benchmarking programs.
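For what it’s worth, if someone did run the ten-machine test the OP describes, the number they’d report is just the average of the per-machine relative gains. A minimal sketch of that arithmetic (the FPS figures below are made-up placeholders, not measurements):

# Hypothetical stock vs. overclocked FPS for each test machine (placeholder numbers only).
stock_fps       = [62, 45, 80, 55, 70]
overclocked_fps = [68, 49, 85, 60, 78]

# Per-machine percentage gain, then the average across machines.
gains = [(oc - st) / st * 100 for st, oc in zip(stock_fps, overclocked_fps)]
print("per-machine gains (%):", [round(g, 1) for g in gains])
print("average gain (%):", round(sum(gains) / len(gains), 1))

The spread between machines would likely be as informative as the average, for exactly the cooling reasons above.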
Edit: Lifespan is also impossible to answer, because there are usually settings on some games that can’t practically be run with current hardware, e.g. Crysis at higher resolutions. So it’s inherently subjective, based on what you consider ‘playable’, which for some is 30 fps and for others 60 or more.
Otara
There isn’t a solid answer. Some games are more CPU-driven while others are GPU-driven. Some will take advantage of multiple cores, some won’t. It depends on the hardware configuration and the software being tested.
As far as the lifespan question goes, I think you’ll find that programmers are making games more GPU-intensive, so playing them will require more powerful graphics cards. A top-level gaming box from a year ago would still have a decent quad-core processor by today’s standards, but the graphics card would be pretty obsolete.
It’s also asking us to foresee the future. Overall increases in speed are slowing down, because the power requirements are getting too large.
I’d say a reasonably fast gaming computer now could be expected to last two years and still play new games well, if not at maximum settings, unless a big change comes through, which looks unlikely at this stage.
‘Normal’ overclocking almost never gives a substantial enough change to significantly extend this. The bigger decision now is whether you want to go SLI or not, which in my view isn’t worth the annoyance for any longevity gains you’ll get.
Well, as for the lifespan question, I can at least provide one data point. Back in May 2006 I put together a decent gaming machine for around $1200 – a good Socket 939 motherboard, an Athlon 64 3700+, 2GB of DDR-400 memory, and a GeForce 7900GT. (The 7900GT was an overclocked-by-default “Signature” card from eVGA, and it cost me $400 at the time.) It ran most new games at reasonably high settings on my 1280x1024 monitor. Today it will still run new games, but I’m having to turn more and more graphical bells and whistles off to keep gameplay smooth. I wouldn’t expect it to run Crysis without setting everything to “Low” at maybe 800x600, but it’s plenty good enough for Half-Life 2, Portal, and the like.
So I’d say 2 to 2.5 years is a reasonable lifespan for a modest gaming machine on a budget of a grand or so. I’ve finally decided to move up, and I just ordered the parts for a completely new build from Newegg. (Core 2 Duo, 8 gigs of RAM, and a GTX 280. :D)
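To put that single data point in the ratio form the OP asked about, the arithmetic is just purchase price divided by years of useful life. A rough sketch using my numbers (the 2.5-year figure is my own estimate of “runs new games at decent settings”, not a measurement):

# Cost per year of useful gaming life for the May 2006 build (my own rough figures).
purchase_price = 1200.0   # USD at build time
useful_years   = 2.5      # until new games start needing the lowest settings

cost_per_year = purchase_price / useful_years
print(f"~${cost_per_year:.0f} per year of new games at decent settings")

That works out to roughly $480 a year; your own ratio will obviously shift with where you draw the ‘playable’ line.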
Also, FWIW, for a while I was overclocking my CPU by 25%, from its stock speed of 2200MHz up to 2750MHz, all on the air cooler that came in the box. There was a modest improvement in my 3DMark06 benchmarks, but I never really saw a significant improvement in most games. I had to increase the core voltage quite a bit to keep it stable, resulting in somewhat high temperatures, so eventually I turned it back down to stock speed.