Unfortunately we’re in an era of stagnation. Never before has a generation of video games treated cross-platform porting between PC and consoles as pretty much mandatory, so almost all games are locked into whatever the consoles are capable of. But we’re looking at keeping mid-range 2005 technology as the standard for a large part of a decade, in an industry where technology advances so quickly. And because the developers are targeting this aged technology, they can’t really push the boundaries of what’s available on the hardware side. You’ll see this on this list where, for example, COD 1, 2, 3, and 4 show significant graphical improvement, but 5 and 6 are basically the same thing with different art.
ATI designed the Xbox 360’s graphics, and the current ATI card coming out right now is roughly 20-30 times more powerful than what they put into the Xbox. 20-30 times! And yet we’re going to see games designed for a system 1/20th as powerful for years to come. There are a few cases where developers put in the extra effort to make a better version than the console game available for PCs - for example the upcoming Dirt 2 is delaying the PC release for a few months while they beef up the graphics. Their aim is to create the best-looking racing game ever made. Which is great - but these efforts are relatively rare.
One of the great things about gaming was the rapid advancement. There were always releases trying to push the edge of what was graphically possible. The last major release that attempted this was Crysis in 2007. The philosophy of pushing the edge, trying to create the best experience that technology can give you, appears to be dead, and I think this may end up being looked back on historically as a relatively dark age in gaming.
It is nice to see. I agree that the graphics thing is too bad. They need to figure out some way to make them backwards compatible or something. You know, use a lot of features that can be turned off for the console versions. You might not get the desired 20-30x better graphics, but at least they wouldn’t look so horrible on the PC.
Heck, if they can pull it off, they could even allow mods to the actual console hardware.
SenorBeef, I’d much rather have people putting effort into art design rather than raw graphics quality. I suspect we’re rapidly approaching a plateau where design is going to be more important than pushing polygons and textures. It doesn’t hurt that good art design costs an insignificant fraction of what staying on top of the tech curve does, and it has a better return on investment.
Yes. YES. All the games now may have unbelievable graphics but they have ZERO artistic design. The color palette for games now is typically something like this:
And there are horrid ridiculous-looking “bloom” effects that send unending torrents of syrupy light to saturate every inch of the landscape. And cheesy-assed “motion blur” effects that look less real than Jenna Jameson’s tits.
Every PS3 game I’ve seen looks like this. It is indeed a “dark age” for games, or more properly a “brown age.” Seriously, the 3DRealms games like Duke Nukem and Shadow Warrior were more interesting to look at than the dusty and bleak visuals of even the best of today’s games.
I don’t think these are mutually exclusive or even that there’s an inverse relationship. The people writing the game engines and the people creating the art are totally different departments - you don’t have to choose one over the other. Arguably there’s a positive relationship - new technology like programmable shaders and GPUs with more flexibility allow a game designer to put more tools in the hands of the artist.
I disagree; invariably, when people are working with a new technology, the art direction suffers. The reason for this, I think, is a slightly counterintuitive one. When designing a game, the people in charge (be this the actual designers, the marketing team, or whomever) generally believe that they need to make their game “look better” than the competition in order to distinguish it. Not a bad assumption, really. There are two ways they can do this. #1: Use all the latest shiny technology, for more polygons, better antialiasing, bumpier bump mapping, or whatever the heck the GPU-trick-of-the-week is. OR #2: They can invest real time and resources into art and art design.
#1 is apparently cheaper and/or easier, because it seems like most companies, when given the choice, go that route. It’s only when they’re stuck on a plateau, on mature console hardware, that they realize “Well, heck, option #1 is off the table, so we’ll actually need to hire us some proper artists and get a distinctive look going”. Invariably, most of the games with the best art and art direction appear either when A) a developer is looking to support a wide installed base of PCs (i.e. have relatively manageable system requirements) or B) it’s late in the lifespan of the console they’re developing for, and most of the graphical tricks the system can pull have been tried.
Just as an offhand example, I cite Okami (PS2, ported to Wii). Inarguably a game with a very developed, distinctive, and (for most folks anyway) visually appealing art style. And yet, the original design of the game was in a much more realistic, un-art-inspired visual style. It was only when it was determined that the PS2 really wasn’t gonna be able to hack it if they went that route that they scrapped the realistic visual model and went for the woodcut/watercolor style that became the game’s signature look.
I can say with some confidence that I would not have enjoyed the game nearly as much (Heck, I probably wouldn’t even ever have played it, without the distinctive style to draw me in) had they been able to stick with the realistic art.
There’s another reason, too, Airk, and it’s one of the big reasons I dislike most modern games.
Good graphics are immensely, disgustingly expensive. You wind up spending most of your time, effort, and budget on that. Sound has improved a lot, too, but in terms of what it adds to the mix versus graphics, sound is a better investment. And improving the story and gameplay and AI are much, much better investments. Despite this, all the reviewers are basically sucking the graphics dong and greedily lapping at the sticky fluid of pixels. (I just grossed myself out.) The entire industry is inbred and incredibly dishonest.
Frankly, I’m satisfied with graphics now. ATI and NVIDIA are not, and they are at least half the reason everything’s gone downhill. They had to keep pushing things too fast. They had to keep coming up with nonsensical, incomprehensible numbering schemes that nobody understands (so that consumers literally don’t know what they’re buying). Basically, they treat their customers like mushrooms. Is it any wonder that customers moved on to consoles?
I’ve actually been satisfied with graphics several times during my life, and I still blame 3d for sucking so much of the -art- out of games. I’m not really interested in photorealism. It’s not only boring, but it’s inappropriate for many styles of games, yet it seems to be regarded as a sort of holy grail for all genres. “Progress” in graphics, by and large, hasn’t pleased me for a while. About the only good thing to come out of it recently is cel shading.
I agree that this happens sometimes, but it’s not the only circumstance under which it happens. To take one of the two games I’m playing right now - Bioshock (I know, I’m late) came out at the start of the new generation of consoles. By this logic it’s new and shiny and could have gotten by with graphical gizmos, but it’s one of the most distinctive games in terms of an art and design department giving it its own style.
The other game I’m playing right now is Arma 2, which is a military simulator - not much need for an art department because the focus is on trees and terrain and you can get by with generic-looking buildings. But the draw distance is insane - it models units, buildings, and trees up to IIRC 3 or 4 miles, and terrain further than that. It’s graphically impressive without an art department, not in an irrelevant super-poly-count way, but in a way that is actually very meaningful in terms of immersion and gameplay - you can have tank fights at over a mile, you can call in artillery when you notice an enemy force on a hill 3 miles away. It takes a tremendous amount of horsepower to do this, which is why it’s inherently a PC exclusive. Which is how the whole industry used to be - if your game required processing horsepower, you’d be a PC exclusive. But this is getting quite rare - as companies are economically forced into multiplatform releases, games that attempt to push the boundaries of what’s possible and/or aren’t really in a genre compatible with consoles are simply dying off. 8 years ago, the market was swarming with high-quality flight simulators, for instance. Now almost none.
Out of the thousands of posts I’ve made, hundreds of which you’ve probably read - you’ve thought they were all wrong? That’s quite an insult.
And I’ve been around a few game coders and designers. It’s always the same deal: they start out with one kickass idea. Then they try to make it happen, only to hit some tech hangup or another. So they end up dialing it down a bit, or a lot, until they end up with something sellable that they’re not happy with at all. Then some new miracle tech hits the market, and finally their original problem is solved, but do they materialize that perfect first idea? Nope. They get a new, grander, bigger, prettier one. Which bogs down halfway through. And they end up with a product they hate. Ad nauseam.
And it’s not like us gamers crave that latest, newest, bleeding-edge tech either. We’re obviously OK with the existing state of tech, else we wouldn’t bother with it at all. And it’s not like the new gimmick is going to draw more people in, because non-gamers don’t buy the absolute latest tech, and existing gamers always bitch and moan when we have to upgrade our rigs just so we can play the new games (see: Oblivion, Crysis…).
But somehow, the message gets lost in the noise. And oooh, bloom on lens flare !
The review sites are another huge problem. They spend time on two things: gameplay and graphics. Yes, it’s nice they acknowledge how important gameplay is, but they routinely diss games for having graphics which are only as insanely great as last year’s graphics. It’s kind of a WTF moment when you realize this, because they’re knocking products over what is now purely extraneous detail.
Of course, another issue is that too many companies don’t know how to minimize system requirements or debug their games. They got real lazy during the mid-90’s and it’s become something of a lost art. Game taking up more memory than it should? Who cares, they’ve all got 8 gigs of RAM, right? The game is nigh-unplayable and crashes every five minutes? Who cares, we’ll just patch it.