What kind of computers are gamers using and what do game developers believe they use?

Yes, but it’s an MMORPG, so there are regular updates. The last expansion upped the hardware requirements, and the next probably will, too. But you can always turn down the graphics settings.

Of course. If you can’t get the games for the Mac, you don’t buy a Mac – or you end up with one gaming computer and one computer for everything else. That’s what my son-in-law did. He bought a Windows machine for games only, and a Mac for work.

I’m willing to bet that whatever computer the OP has will run Montjoie (the linked game) well enough. I’ve never played the game, but let me explain my reasoning.

First, it’s a turn-based game. This means that even if the game takes a very long time to load, stutters badly when panning the map, and pauses for several seconds between turns as it hits the swap file because it’s out of physical RAM, it will still be fun to play. This isn’t a first-person shooter, where every frame per second gives you an advantage or makes you miss. Second, those are, as mentioned, very modest specs. I have an old computer (heavily upgraded, but with the original motherboard) from 2002 that only has 768 MB of RAM, and I bet it would run the game. I would just need to close every other program before running it.

That said, it’s too bad they don’t offer even a tiny, 10-minute demo just to see how well it runs.

Anyway, to answer your question, these survey results from Steam show what the “average” Steam user has for a system. According to the link, 25% of gamers have less than the 1 GB of RAM needed to run Montjoie, and 8% don’t have the 128 MB of VRAM. I’d be surprised, though, if the game refused to run at all with only 512 MB of RAM. I’m guessing it would run, but just stutter a bit as it hits the swap file.
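Since those two shortfall figures come from the same survey but we don’t know how much the groups overlap, the total share of users below min spec can only be bounded. A quick sketch of the arithmetic (the 25% and 8% are the figures quoted above; everything else is bookkeeping):

```python
# Bound the share of surveyed users excluded by the min specs, using the
# two figures quoted above: 25% lack the 1 GB of RAM, 8% lack the 128 MB
# of VRAM. Without knowing the overlap between the groups, we can only
# bound the combined total.
lack_ram = 0.25
lack_vram = 0.08

# Best case: everyone short on VRAM is also short on RAM.
excluded_low = max(lack_ram, lack_vram)
# Worst case: the two groups don't overlap at all.
excluded_high = min(lack_ram + lack_vram, 1.0)

print(f"between {excluded_low:.0%} and {excluded_high:.0%} fall below min spec")
```

So somewhere between a quarter and a third of surveyed systems miss at least one requirement, which is exactly why a hard cutoff (rather than graceful stuttering) would be a bad move for the developer.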

Changing the subject a bit, I’d like to note that Dwarf Fortress requires an absolutely top-of-the-line processor (that $1,500 Core 2 Extreme wouldn’t be wasted) and plenty of RAM in order to run well. But its graphics are nothing but 2D ASCII text and could have been handled by computers from 20 years ago. Not all games’ requirements come down to “eye candy.” Some developers really need more power to expand gameplay in new directions.

I’m a hardcore gamer with a top-of-the-line system, and I don’t think I spent a cent over $2,400 on it. I can’t imagine what $10k would look like sitting on my desk. But to answer the OP: those spec requirements are pretty basic, so from the looks of that game, the developers do have the lowest common denominator in mind, and if you still don’t meet those requirements, perhaps some upgrades are in order.

I’m very happy with my Commodore 64 thankyouverymuch.

Okay, I have a pretty high-end rig, but it didn’t cost much more than $2,500 a couple of years ago, and I recently upgraded the graphics card. But over at Falcon Northwest, building an absolute “top of the line” PC costs around $8.5k (and I cut a couple of corners; I figured I might only need ONE 1 TB hard drive and no SSD, for instance). That may be down to that manufacturer’s prices, though. I haven’t priced out the individual components for the same computer and factored in building it myself, but I suspect you could still save yourself a nice amount.

shrug

You asked how someone could spend $10k on a computer. Judging by this response, it seems you already knew exactly how they can do it before you even asked, so, um… :confused:

I wondered how you could spend $10,000 on a computer for GAMING. Not for professional CAD or video work. I can see a company spending $10k on something like that. I can’t see it being spent even on the bitchin’est gaming comp ever made. Unless you’re paying a 200-300% retail markup.

Most of the comments from non-PC gamers who complain about having to spend $3k for a decent gaming PC are looking at extremely bloated prices from retailers.

I remember browsing through Best Buy’s hardware section and seeing a GeForce 8800 GTX for $950. The same card on Newegg went for $400. Ridiculous.

Speaking as someone who works in video games, I’d like to point out a few things:

First, there’s no such thing as the “average” gamer rig, especially when you start talking about “casual” games (a category that I imagine would include Montjoie!). As the Steam survey shows, there’s quite a wide range of hardware even in the group of people that use Steam, which is biased toward core gamers. The casual market has an even wider range.

Second, as several have pointed out, the published minimum spec for a game isn’t necessarily the true minimum the game will run on :dubious:. While one factor in determining the min spec is the game’s actual hardware requirements, there are several other factors involved as well. A min spec might be inflated to avoid the need to support earlier versions of Windows (because a particular OS function might not exist in Win98), or to reduce the tech support calls that the company gets for “tricky” older systems.

Sometimes requirements are “coded” to mean something else. “At least X MB of Video Memory” might mean “a video card supporting X feature”, but game companies don’t necessarily expect that the user knows whether their card supports that feature, so instead they look at the available video cards and pick a requirement that correlates well with the required feature. It’s not ideal, but the alternative is to either publish requirements that few users understand, or spend extra time and money trying to implement features in a variety of ways to support hardware that has different feature sets.
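To make the “coded requirement” idea concrete, here’s a toy sketch of how a VRAM number could be chosen as a stand-in for a feature check. The card names and specs below are entirely hypothetical, not drawn from any real survey or product line:

```python
# Toy illustration of a "coded" requirement: given cards with a VRAM size
# and a flag for some required feature (say, a particular shader model),
# pick the smallest VRAM amount such that every card with that much
# memory or more also supports the feature. All card data is made up.
cards = [
    {"name": "CardA", "vram_mb": 64,  "has_feature": False},
    {"name": "CardB", "vram_mb": 128, "has_feature": True},
    {"name": "CardC", "vram_mb": 128, "has_feature": False},
    {"name": "CardD", "vram_mb": 256, "has_feature": True},
]

def vram_proxy_threshold(cards):
    """Smallest VRAM amount at which every card with at least that much
    memory also supports the required feature."""
    for t in sorted({c["vram_mb"] for c in cards}):
        if all(c["has_feature"] for c in cards if c["vram_mb"] >= t):
            return t
    return None  # no VRAM amount cleanly correlates with the feature

print(vram_proxy_threshold(cards))  # 256: CardC has 128 MB but lacks the feature
```

In this made-up lineup the game might genuinely only need 128 MB, but publishing “256 MB required” is the only VRAM number that reliably implies the feature, which is exactly the kind of inflation described above.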

Also, as several people have alluded to, people are getting faster computers these days. Most machines even come with halfway decent 3D video now, which is (from a game programmer’s standpoint) pretty awesome*. If you, as a game company, suspect** that you’re only eliminating 5% of your audience by requiring a particular feature, but using that feature means cutting development time (and therefore development cost) significantly, there’s a good chance you’ll require that feature even though your game doesn’t really “need” it. More generally, faster CPUs mean you can use development methods that sacrifice performance for the sake of development effort. Depending on your perspective, you could see that as being lazy or as being conscious of your expected return on investment.

* For a variety of reasons, it’s often easier to program even a “2D” game in such a way that it uses 3D video features. It requires better hardware, but it can reduce development time, increase the variety of video effects available, and reduce the amount of art the game needs.

** Usually this type of info comes from surveys, and it’s hard to know how reliable the info is. Steam survey results (since they’re public) are actually pretty useful to the game industry as a whole, but it’s important to remember that they’re biased toward the type of gamers that use Steam.
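The “eliminate 5% of the audience to cut development cost” tradeoff is just an expected-value comparison. A back-of-the-envelope sketch, where every number is made up for illustration except the 5% figure from the paragraph above:

```python
# Hypothetical numbers: 100k potential sales at $30 each; requiring the
# feature drops 5% of buyers but saves $200k in development cost.
potential_units = 100_000
price_per_unit = 30.0
audience_lost = 0.05       # share of buyers who lack the required feature
dev_cost_saved = 200_000.0

revenue_lost = potential_units * price_per_unit * audience_lost
net_benefit = dev_cost_saved - revenue_lost

# With these numbers the requirement pays for itself, so you'd impose it.
print(f"revenue lost: ${revenue_lost:,.0f}, net benefit: ${net_benefit:,.0f}")
```

The same arithmetic flips the other way for a huge-selling title, which is part of why mass-market developers like Blizzard keep requirements low.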

:dubious:

Mate, I do this with a 5-year-old system that cost me $850. And it wasn’t top of the line when I bought it. Methinks you might be overestimating the requirements to run WoW a tad.

There have been a lot of technological hoops for gamers looking for bells and whistles to jump through over the last four years: DirectX 10 being one of them, dual- or quad-core processors being another.

However, the game industry itself phases in that kind of hardware requirement slowly and carefully, since it depends on a large install base to sell games. I still don’t have a DirectX 10 card, nearly a year after its launch, mostly because of my financial situation and because there hasn’t been a DirectX 10-exclusive game made yet.

Defining the average gamer is very hard, because the average gamer is a person who plays games, not a person who keeps on top of the median hardware curve. I meet plenty of gamers who identify as such with four-plus-year-old machines. They’re quite content playing Warcraft III, World of Warcraft, Counter-Strike: Source, Starcraft, et cetera. To get to an “average gamer” for whom it’s pertinent to talk about hardware, we need to look at the “hardcore” gamer – i.e., the gamer with disposable income who wants to keep ahead of the curve. This is a rare breed, constituting only a few percent of the gaming population.

The hardcore tech gamer will be looking at equipment in this range or above:
CPU: Intel Core 2 Duo 6500 or better
Video: Nvidia 8800 series or better. (DirectX 10 compatible)
RAM: 4 GB+ DDR3. (1066-2000 MHz)
HDD: 10k RPM for OS/games, slower drives for storage. (Movies, music, documents, pictures, etc.)
Screen: A 4 ms response time is a frequent requisite. (Low response time arguably reduces common LCD issues, like ghosting.) Personally, I’m going to stick with my 21" colour-corrected CRT screen until I can get an LCD that equals its performance without making my wallet feel I’m touching it in the bathing suit area.
Cooling: Water cooling is all the rage for the time being, but at the very least get a good Zalman CPU fan with good thermal paste. (I’ve got that and 4x 120 mm fans mounted in my case door, and one in the roof of the case.) If you tell a hardcore tech gamer that you’ve put the stock fan on a top-of-the-line CPU, he’s going to punch you in the throat.

  • Input: Tastes in keyboards and mice vary a great deal from individual to individual. Hand fit and responsiveness, as well as preferred button layout, are good indicators. (I’m still on my old MX518.) High DPI (dots per inch) is the usual “objective” measurement of performance.

All in all, that equipment will set you back a few thousand dollars at least, particularly if you’re buying it pre-packaged. Most hardcore tech gamers prefer to buy it in parts and put it together themselves. I do, too. It gives me a feeling of being in control, and I get more pleasure from working on the machine. It also gives me more pride-based incentive to take good care of my computer. (Defragmented once a month, an internet-based backup service runs twice a week, and I also run a temperature tracker and check it frequently. My Intel Core 2 Duo 6600 has never peaked above 35 degrees Celsius.)

On to some speculation about the video game industry. For proper disclosure: I’ve never worked there, but I have been part of a few indie games on the development side. Mostly modelling, artwork, and writing, with some bug-hunting, but never coding.

My experience, as an observer, is that the company developing the game tries to set a threshold fairly current with the start of the development cycle. That is, if you were to start now, you wouldn’t do badly to use the list I posted above as a requirement list. Some companies prefer to pioneer new technology, while others prefer to appeal to the lowest common denominator: Crytek is an example of the first, with both their Far Cry and Crysis games pushing the envelopes of their generations, and Blizzard is emblematic of the second, with games like Starcraft and Warcraft III having rocketed due to accessibility.

In an established company making franchise games, like Ubisoft with its Rainbow Six: Vegas games, the development cycle is quite short – usually no more than two years at the outside. This means you have good cues as to where the hardware install base will be. For games attempting to pioneer new technology, it’s a lot harder – like S.T.A.L.K.E.R., which had a development cycle of ten years, or Crysis, which gamers still mock as only being playable on rigs described as “brand new gaming computers resembling the monolith from 2001: A Space Odyssey, made from obsidian by the proud dwarves of Middle Earth.” Those games have long, aching development cycles, and it’s hard to tell what the hardware base of the average gamer will look like by release, since we’re looking at non-linear development.

It’s my firm belief that companies like Valve and Blizzard have the right idea. Blizzard will take a game concept and work on it until it’s done. Then they will work on it some more, freezing the hardware requirements at a certain point while they polish it to a fine sheen, organize ritualistic sacrifices of imported animals, and repeatedly sell their souls to the highest bidder (what, you thought the Diablo series was a coincidence?), and by the time it’s released, it will look like the best game out there from two years before. But the gameplay will be made of awesome, and the crowd will be salivating with anticipation.

Valve, while not retaining Blizzard’s painstaking “it’ll be done when it’s done (which means overdone)” philosophy, has a huge advantage in Steam. Steam sales of other companies’ games finance its operations, so it isn’t under the same strain to release games on a yearly basis as, let’s say, any other company out there. And it has access to a continuously updated overview of the Steam community’s hardware, which is reliable data about its core demographic. That is, put simply, invaluable – lending Valve a nearly console-like frame of development. (In February, Valve claimed Steam had reached over 15 million unique and active accounts. Which, I might add, is around ten times the projected sales of an average PC game. One might simplify by saying that there are roughly as many Steam users as there are Xbox 360 owners.) Valve has, in both the past and present, taken advantage of this to release highly polished games easily playable by the vast majority of its subscribers. Team Fortress 2, Portal, and Half-Life 2 are good examples of this, as is the ambitious Left 4 Dead, in which Valve set the graphical bar a year or two back and used the extra resources found in most computers to put a lot more enemies on screen. According to the demo videos and the hype, you will be literally swarmed by zombies.

[anecdote]
I spent some time last year talking with two people I know who work in visual development at Funcom here in Norway, and they subscribed to the theory of the “plateau in front of the uncanny valley.” This is basically the belief that shortly – within the next decade – CGI visuals will reach a photorealistic plateau beyond which we cannot improve, except in scope. At that point, video card development will slow down and eventually come to a halt, and the function of the video card will be taken over, incorporated into the CPU, and standardized. They admitted there are a lot of flaws in the current state of the theory, but they believe it to be inevitable, because there’s a very finite limit to how good computer graphics can get: i.e., indistinguishable from our own sight.
[/anecdote]

I have already completely failed to be blunt and short, so forgive me for rambling, but I wanted to address your last question directly. Video games make money when they’re sold, and as such, you must grab the customer’s attention at the time the game goes on sale, because that’s the most crucial point of income for the company selling it. Later on, it is far more subject to diminishing returns, like resales and pirating. Now, you might see that this argument goes both ways: it’s simultaneously an argument for appealing to the largest install base and for putting the biggest goddamn bells and whistles you can get on your game. Very few people will buy a shooter off the shelf if the back of the box looks just like the shooter you bought last year. Word of mouth about its high quality is good, yes, but for there to be any word of mouth, people must already have bought it. This is the Catch-22 that violated, say, Psychonauts’ right to life and breeding, and that has given rise to such abject monstrosities as the Madden series.

And it’s very, very depressing.

As I pointed out already, one of the things I like about WoW is that it can be adjusted, and it works on old, slow systems. The better the system, the better it works.

My 4-year-old Windows system won’t run the Burning Crusade expansion at all (and I’m not bothering to update it). A different system of slightly earlier vintage runs it, but when I go into Shattrath City, the framerate drops to about 3 (5 to 7 on a really good day). I have most of the options turned down or off on that system. On my MacBook Pro (one of the first dual-core models), I can turn all of the options up all the way, and my Shat City framerate still stays up at 15 to 25. On the new Mac, it’s better. A buddy of mine has a high-end Windows computer that gets framerates in Shat of about 50 to 75. He set it to dual-boot Linux and tweaked it until it would run WoW, and he’s getting framerates of 150 to 200 (yep – same computer, and performance in Windows emulation under Linux is 2-3 times what it is running natively under Windows).

So, again, my point is that the game works well on a lot of different computers (including Macs), and you don’t need to have a “hardcore gamer” system to use it. It scales its performance to meet the system.

Well, sure. One of my brothers plays WoW on a 1.66 GHz Athlon with 1 GB of RAM and a GeForce FX 5200. It looks butt-ugly and the framerate is terrible sometimes, but he’s still hooked. Don’t feel bad for him; I’m buying him a new PC in a couple of weeks.

My gaming PC was built around 8 months ago: an E6750 Core 2 Duo, 2 GB of RAM, and an 8800 GT. I put it all together with a bunch of quality components. I wanted it to run fast, cool, and quiet, with good overclocking capability for later. It cost me around £800 at the time. We pay considerably more than those in the US for the same components…

Looking at what £800 could buy me now makes me cringe a little. Still, considering I use my PC 4-8 hours a day, I think it’s worth it. I just wish I’d stretched to a quad core.

That’s exactly why it’s a bad deal to spend too much on a PC. Get the mid-range stuff that’s the best price/performance bargain, because 12 months later the high-end stuff will be the same price.

Au contraire!

It would be quite a challenge to build a computer that merely meets those minimum requirements. The fastest P3 Intel ever made was 1.4 GHz.

Which, of course, suggests that those “requirements” are a complete fabrication. Obviously no one ever tested the game on a system with the “minimum required” specs.

[QUOTE=Gukumatz]
[snip]
[/QUOTE]

Actually, the era of huge accelerating performance gains is already over. There’s progress being made, but it’s largely linear these days. Storage is still going strong, and memory is good, but raw horsepower (read: CPU) and graphics cards aren’t quite keeping up.

I am playing it with everything maxed. Runs fine. Though I don’t generally check my frames, I’m getting enough that I don’t notice any frame loss or lag.