There have been a lot of technological hoops for gamers chasing bells and whistles to jump through over the last four years - DirectX 10 being one of them, dual- or quad-core processors another.
However, the game industry itself phases in that kind of hardware requirement slowly and carefully, since it depends on a large install base to sell its games. I still don’t have a DirectX 10 card, nearly a year after its launch, mostly because of my financial situation and because no DirectX 10-exclusive game has been made yet.
Defining the average gamer is very hard, because the average gamer is a person who plays games, not a person who keeps on top of the median hardware requirements. I meet plenty of gamers who identify as such with four-plus-year-old machines. They’re quite content playing Warcraft III, World of Warcraft, Counter-Strike: Source, Starcraft, et cetera. To enter the realm of “average gamer” where it’s pertinent to talk about hardware, we need to look at the “hardcore” gamer - i.e. the gamer with disposable income who wants to stay ahead of the curve. This is a rare breed, constituting only a few percent of the gaming population.
The hardcore tech gamer will be looking at equipment in this range or above:
CPU: Intel Core2Duo 6500+
Video: Nvidia 8800 series or better. (DirectX 10 compatible)
RAM: 4GB+ DDR3. (1066-2000 MHz)
HDD: 10k RPM for OS/games, slower drives for storage. (Movies, music, documents, pictures, etc.)
Screen: A 4 ms response time is a frequent requisite. (Low response time arguably reduces common LCD issues like ghosting) Personally, I’m going to stick with my 21" colour-corrected CRT screen until I can get an LCD that equals it in performance without making my wallet feel I’m touching it in the bathing suit area.
Cooling: Water cooling is all the rage for the time being, but at the very least get a good Zalman CPU fan with good thermal paste. (I’ve got that and 4x 120mm fans mounted in my case door, and one in the roof of the case) If you tell a hardcore tech gamer that you’ve put the stock fan on a top-of-the-line CPU, he’s going to punch you in the throat.
Input: Tastes in keyboards and mice vary a great deal from individual to individual. Hand-fit and responsiveness, as well as preferred button layout, are good indicators. (I’m still on my old MX518) High DPI (dots per inch) is the usual “objective” measurement of performance.
All in all, that equipment will set you back a few thousand dollars at least, particularly if you’re buying it pre-packaged. Most hardcore tech gamers prefer to buy it in parts and put it together themselves. I do, too. It gives me a feeling of being in control, more pleasure from working on the machine, and more pride-based incentive to take good care of my computer. (Defragmented once a month, an internet-based backup service runs twice a week, and I also run a temperature tracker and check it frequently. My Intel Core2Duo 6600 has never peaked above 35 degrees Celsius)
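If you’d rather roll your own temperature tracker than rely on a vendor tool, a minimal sketch in Python might look like the following - assuming a system where the psutil library actually exposes temperature sensors (it does on Linux; on Windows you’re generally stuck with vendor utilities), and treating my 35-degree ceiling above as nothing more than an arbitrary warning threshold:

[code]
# Minimal CPU temperature tracker sketch.
# Assumes psutil is installed and the platform exposes sensor data
# via psutil.sensors_temperatures() (Linux does; Windows does not).
import time
import psutil

THRESHOLD_C = 35  # warning threshold in degrees Celsius (arbitrary choice)

def log_cpu_temps(interval_s=60):
    """Print the hottest reported sensor temperature every interval_s seconds."""
    while True:
        sensors = psutil.sensors_temperatures()
        readings = [entry.current for entries in sensors.values() for entry in entries]
        if readings:
            hottest = max(readings)
            warning = "  <-- running warm" if hottest > THRESHOLD_C else ""
            print("%s  %.1f C%s" % (time.strftime("%H:%M:%S"), hottest, warning))
        time.sleep(interval_s)

if __name__ == "__main__":
    log_cpu_temps()
[/code]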
On to some speculation about the video game industry. To be clear up front: I’ve never worked in it, but I have been part of a few indie games on the development side - mostly modelling, artwork and writing, some bug-hunting, but never coding.
My experience, as an observer, is that the company developing a game tries to set a hardware threshold fairly current with the start of the development cycle. I.e., if you were to start now, you wouldn’t do badly to use the list I posted above as a requirement list. Some companies prefer to pioneer new technology, while others prefer to appeal to the lowest common denominator - Crytek being an example of the first, with both their Far Cry and Crysis games pushing the envelopes of their generations, and Blizzard being emblematic of the second, with games like Starcraft and Warcraft III having taken off thanks to their accessibility.
In an established company making franchise games, like Ubisoft with its Rainbow Six: Vegas games, the development cycle is quite short - usually no more than two years at the outside. This means you have good cues as to where the hardware install base will be. For games attempting to pioneer new technology, it’s a lot harder - like S.T.A.L.K.E.R., which spent the better part of a decade in development, or Crysis, which gamers still mock as only being playable on rigs described as “brand new gaming computers resembling the monolith from 2001: A Space Odyssey, made from obsidian by the proud dwarves of Middle Earth.” Those games have long, aching development cycles, and it’s hard to tell what the average gamer’s hardware will look like by release, since we’re looking at non-linear development.
It’s my firm belief that companies like ValvE and Blizzard have the right idea. Blizzard will take a game concept and work on it until it’s done. Then they will work on it some more, freezing the hardware requirements at a certain point while they polish it to a fine sheen, organize ritualistic sacrifices of imported animals, and repeatedly sell their soul to the highest bidder (what, you thought the Diablo series was a coincidence?), and by the time it’s released, it will look like the best game out there from two years before. But the gameplay will be made of awesome and the crowd will be salivating with anticipation.
ValvE, while not retaining the painstaking “it’ll be done when it’s done (which means overdone)” philosophy of Blizzard, has a huge advantage in Steam. Steam sales of other companies’ games finance their operations, so they’re not under the same strain to release games on a yearly basis as, let’s say, any other company out there. And they have access to a continuously updated overview of the Steam community’s hardware, which is reliable data about their core demographic. Which is, put simply, invaluable - lending them a nearly console-like frame of development. (In February, Valve claimed Steam had reached over 15 million unique and active accounts. Which, I might add, is around ten times the projected sales of an average PC game. One might simplify by saying that there are roughly as many Steam users as there are Xbox 360 owners) Valve has, in both the past and present, taken advantage of this to release highly polished games easily playable by the vast majority of their subscribers. Team Fortress 2, Portal and Half-Life 2 are good examples of this, as is the ambitious Left 4 Dead, in which Valve left the graphical bar where it was a year or two ago and used the extra resources found in most computers to throw a lot more enemies at you. According to demo videos and the hype, you will be literally swarmed by zombies.
[anecdote]
I spent some time last year talking with two people I know who work in visual development at Funcom here in Norway, and they subscribe to the theory of the “Plateau in front of the Uncanny Valley.” This is basically the belief that within the next decade, CGI visuals will reach a photorealistic plateau from which we cannot improve except in scope. At that point, video card development will slow down and eventually come to a halt, and the video card’s function will be taken over, incorporated into the CPU and standardized. They admitted there are a lot of flaws in the current state of that theory, but they believe the outcome is inevitable, because there’s a very finite limit to how good computer graphics can get - i.e., indistinguishable from our own sight.
[/anecdote]
I have already completely failed to be blunt and short, so forgive me for rambling, but I wanted to address your last question directly. Video games make money when they’re sold, and as such, you must grab the customer’s attention at the time the game goes on sale, because that’s the most crucial point of income for the company selling it. Later on, sales are far more subject to diminishing returns, eaten into by resales and piracy. Now, you might see that this is an argument that cuts both ways: it’s at once an argument for appealing to the largest install base and for putting the biggest goddamn bells and whistles you can get on your game. Very few people will buy a shooter from the shelf if the back of the box looks just like the shooter they bought last year. Word of mouth about its high quality is good, yes, but for there to be any word of mouth, people must have already bought it. This is the Catch-22 that violated, say, Psychonauts’ right to life and breeding, and which has given rise to such abject monstrosities as the Madden series.
And it’s very, very depressing.