Evolution of graphics

Might want to check your facts. Bioshock is far from “coming out at the same time as the new generation of consoles” - according to GameFAQs and Wikipedia, Bioshock released 7/21/2007, while the Xbox 360 launched 11/22/2005. That’s a gap of almost two years, and, as you point out, it’s launching on hardware that was sort of pre-matured due to being, essentially, older PC hardware. Your example does not stand. At all.

I just consider it rose-colored glasses, combined with a very, very niche product. Most games since the dawn of 3D that have pushed their graphical capabilities have not used them in a remotely gameplay-enhancing way, so titles that released with stringent hardware requirements were almost inevitably “look how bumpy our bump mapping is! (DirectX-One-Version-More-than-You-Have Required)”.

As kind of an aside, there looks to be a real quantum leap in graphics between about '94 and '97. It’s especially noticeable in the sports games. After about '97 or so, the images get much more refined and realistic, but the basic look is the same. Prior to that, there was a much more primitive look. What happened in that time frame that pushed graphics from where they were in the immediate post-NES/Genesis era to where they are now?

Games being on CD-ROMs would be my guess. You can only cram so much data and textures on a series of floppies or in a console cartridge.

ETA: another development, at least as far as PCs are concerned, is the emergence of dedicated 3D graphics cards, which did all the texturing computations and freed the central processor for more important tasks. The first 3Dfx cards shipped in '96.

Game designers finally got used to polygons after years and years of sprite-based graphics, and it occurred to someone, somewhere that polygons didn’t really look very good at all unless you applied textures to them.

Woooo! Lens flare!
I’m actually ok with not having to upgrade my PC every year to play the latest videogames. Personally, I think graphics are more or less good enough for now. I’d like to see some thought go into making the game environment more realistic and immersive in terms of gameplay and characters.

I think the continuing sales of the Wii prove that you can have crap graphics as long as your games are fun. I don’t want a golf game that is so detailed that the swing is affected by how tight your shoes are tied and what brand they are.

Also FWIW I proposed a replaceable GPU for consoles in an earlier thread and the idea was quickly and violently shot down.

Since when does the Wii have crap graphics? They may not be as good as the 360 or PS3, but that doesn’t automatically mean they’re crap.

Woops, my bad. I thought it came out in early 2006 for some reason.

I still think it serves as a decent example. The game was relatively cutting edge - or at least not behind the curve - and yet clearly featured a massive amount of effort in terms of art and design.

If you’re going to claim the hardware is “pre-matured” then we’re in for an even bigger dropoff in advancement. If in 2007 developers were already good at squeezing as much out of the Xbox as they could, where do we go in the years after that?

There have always been high-profile technical achievement games where the gameplay hasn’t been that good - for example anything from id Software. But once those techniques are pioneered, they’re either licensed or copied by games that actually do fuse them with great gameplay and great art while staying technically impressive. Half-Life was ultimately built on ground paved by the Quake engine, and Deus Ex came from the Unreal engine. They both benefitted from technological advancement, and yet broke new ground in non-technical ways too.

Where would we be right now if we were still using, say, 1999 level technology? Sure, companies would’ve been forced to differentiate themselves in new ways, but the overall experience would not be anywhere near what it is now. I use that year as an example because I’ve heard that this console generation is supposed to last a decade - in 2015 we’ll be using 2005 technology. Compare to where games are in 2009 vs 1999. Not just in pure technical merit, but the overall artistic achievement that the new technology made possible.

Apologies if this has already been brought up as I’ve really only skimmed the thread, but I feel the need to throw this out there.

Everyone’s talking about hardware evolution and new games being designed for what is essentially 5 year old hardware based on the current generation of consoles. My understanding has always been that comparing hardware specs between consoles and computers is an inexact science because what goes into consoles isn’t the same as what’s being sold to PC enthusiasts off the shelf. The hardware is specialized for very specific sorts of tasks.

If we look at a 360 for example, the specs I managed to turn up have it containing a 3.2 GHz tri-core “Xenon” CPU, a 500 MHz ATI “Xenos” graphics chip with 512 MB of RAM, and 512 MB of GDDR3 RAM for system memory. Now I’ll confess that I don’t quite know what all that means exactly, as my understanding of computer hardware is very basic, but if we were to take “comparably” available hardware and build a desktop out of it, would it have the same graphical capabilities as a 360? Is that even possible to do? Is this stuff commercially available to common end users? And if it is, would it be available for the same commonly accepted price point of $300 or so? And if we took this hypothetical desktop and ran the PC version of Bioshock or COD4 on it, would the games look as good as they do on the 360 and PS3, or would we have to turn the graphical settings down?

I’m not necessarily taking a stance on this as I really don’t know the answers, but I think the question of hardware might be more complex than simply looking at what was cutting edge circa 2005 and automatically assuming that’s what went into the current generation.

Ugh. Please don’t start the PC vs console hardware/number game debate.

If you try to find a PC that matches the raw numbers of consoles you’ll end up with absolutely ridiculous PC specs, but (and this is a big but) consoles are limited in other ways (most notably resolution, inability to upgrade over their life, etc.), so usually a decent PC will look BETTER than a console without matching numbers. However, a poor console-to-PC port (one that doesn’t bother giving the engine a clean-up for PC) will look much, much worse. It’s been a while since I looked at PCs in America so my prices might be off, but I’d say a PC for about $1000 (as long as you know what parts to put together) will beat up a console at the beginning of its life, $750 after about a year, and $500 after two years. After that, any PC will be as good or better as long as it’s not some piece-of-junk eMachines-type deal.

Unfortunately I think the debate should end there, as it seems so obvious to me, but there are some serious defenders on both sides of this issue.

Err yes, absolutely. That’s because the 360 is a PC. It’s not like a PC or comparable to a PC: it *is* a PC. Those are desktop components, and not really good ones at that (but they don’t really need to be, because a TV’s resolution is way below that of a computer monitor, so a 360 doesn’t need to do fantastic rendering work).

The only thing that differentiates a 360 from a desktop PC is that every single 360 is built with the exact same hardware. Meaning game devs can optimize their games for that specific hardware combination instead of having to make their game compatible with gajillions of different configurations, including obsolete ones.
Which is why games designed for PCs are an absolute breeze to port over to the 360, and games ported from the 360 to the PC tend to run like shit (I’m looking at you, GTA 4).

Yup. Or rather, nope: common end users will have trouble finding equivalent components because they’re obsolete. Parts sellers don’t stock them anymore; it wouldn’t be profitable. Who would buy 512 MB of RAM when 4 gigs is dirt cheap? Who’d buy a 20 gig hard drive? These days there are USB thumb drives bigger than that. And so on, and so forth.

Pretty much, yes. Probably lower than that actually - again, these are all really old components.

I’m not even sure the Xbox runs CoD 4 or Bioshock at their highest levels of detail, field of view etc… However, my guess would be: no, a PC with Xbox specs probably wouldn’t run Bioshock as well, because a PC would need to render it at a much higher resolution. However, having played Mass Effect on both the 360 and a medium-end PC, I’ll pick the PC version anytime. Graphics are much crisper, and I can aim with a mouse.

Before the Xbox, consoles were generally an entirely different, custom-designed sort of hardware. The original Xbox was pretty much just a PC in a box - a 733 MHz Pentium III processor and a custom Nvidia GPU roughly in the GeForce 3/4 class. The current generation continued that trend - the graphics chips are made by ATI (Xbox 360) and Nvidia (PS3). Neither system uses an x86 CPU, but the PowerPC CPU in the Xbox is a pretty conventional design that should be easy enough to compare to. The PS3 CPU is an oddball, but since most cross-platform games run better on the Xbox, it seems reasonable to discard it in terms of performance.

The Xbox 360 uses something roughly equivalent in power to a Radeon X1900 XT card. It’s actually a bit more capable, because it was developed in between the R500 and R600 generations and took on a few R600 features like a unified shader architecture. But that’s the closest comparison. The PS3 uses a Geforce 7900GT that has been reduced to half the raster and texture units. Neither card was cutting edge at the time of the release of that console - they were midrange, with reduced hardware capabilities to control costs.

Because they’re made by the same companies using pretty much the same architecture (the ATI unit was between generations and incorporates features of both, the PS3 unit is a straight stripped-down 7900GT), they’re easy to compare. And since the GPU is, by far, the most critical factor in 3D gaming (games are very rarely limited by anything but the GPU), that comparison tells you most of what you need to know.

So, putting the two GPUs head to head:

The ATI Xenos GPU on the Xbox 360 has 48 unified shaders that run at 500 mhz core clock, around 300 million transistors, 8 render output (ROP) units, and a fill rate of 4 gigapixels/sec, with directx9/shader model 3.0 support, and 512mb of 700mhz GDDR3 memory. (The memory is actually shared - it’s 512mb for both system memory and graphics memory).

If you compare this to ATI’s newest chip, the Radeon 5870, it has 1600 unified shaders that run at 850 mhz, with 32 ROP units, a fill rate of 27.2 gigapixels/sec, directx11/shader model 5.0 support, and 1gb or 2gb of 1.2ghz GDDR5 memory.

That doesn’t tell the whole story. Not only does the Radeon have 1600 shaders (the name is sort of misleading, since they’re actually little mini general processing units specialized for this sort of data, not just shading) compared to 48 on the Xenos, and not only do they run at an 850 mhz vs 500 mhz clock, but the improvements in architecture mean that each shader does more work per clock cycle. It’s like why a current i7 CPU at 2.7 GHz can run circles around an old Pentium 4 CPU at 3.6 GHz even without factoring in the multicore aspect. The 5870 does more work at the same speed, has 33 times as many actual shader units, and each one runs over 60% faster.

I may be wrong on this, but based on my reading the XBox 360 has 512mb of total memory available to it, using the same memory for both normal system memory operations and graphical memory. Compared to a typical PC that will have 4gb of system memory, and 1gb of dedicated video memory.

The 5870 uses GDDR5, which is faster than GDDR3 - and it has 2-4x as much running at 1200mhz vs 700mhz. The memory bandwidth available to the 5870 is 153.6 GB/sec compared to 22.4GB/sec available to the Xenos.

In raw pixel pushing power, at 27.2gpixel/sec it does about 7x as much as the Xenos at 4, but that doesn’t tell the whole story either.
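Just to put the raw ratios from those figures in one place, here’s a quick back-of-the-envelope in Python - treat it as ballpark, since it deliberately ignores the per-clock architecture improvements, which only widen the gap:

```python
# Figures quoted above for the Xbox 360's Xenos vs the Radeon 5870.
xenos  = {"shaders": 48,   "clock_mhz": 500, "fill_gpix_s": 4.0,  "mem_bw_gb_s": 22.4}
hd5870 = {"shaders": 1600, "clock_mhz": 850, "fill_gpix_s": 27.2, "mem_bw_gb_s": 153.6}

for key in ("shaders", "clock_mhz", "fill_gpix_s", "mem_bw_gb_s"):
    print(f"{key}: {hd5870[key] / xenos[key]:.1f}x")   # 33.3x, 1.7x, 6.8x, 6.9x

# Naive shader throughput = unit count x clock. The real-world gap is bigger,
# because each newer shader also does more work per clock.
naive = (hd5870["shaders"] * hd5870["clock_mhz"]) / (xenos["shaders"] * xenos["clock_mhz"])
print(f"naive shader throughput: {naive:.0f}x")         # ~57x
```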

The 5870 also has new capabilities - the shaders in the unit are able to execute fairly arbitrary code, so any workload that suits the kind of data a GPU is good at processing can be moved onto it. For instance, physics processing can be offloaded onto the shader units of the newer cards. Say, for example, you want to show a wall being destroyed. Previously you’d probably just use some preset animation, or use a limited amount of debris and let the CPU handle it. Now you can build that physics calculation into the GPU’s workload and get a wall that fragments into a thousand pieces, each of them bumping up against and interacting with other particles - tumbling or turning into dust.
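To make that concrete, here’s a toy sketch of the per-particle math involved - plain Python/NumPy for readability, and purely illustrative: a real engine would run this same kind of update as a compute shader or physics kernel on the GPU, not in NumPy, and the numbers (1000 fragments, a 60 fps timestep) are made up for the example:

```python
import numpy as np

# Toy debris simulation: every fragment runs the same tiny program, which is
# exactly the kind of embarrassingly parallel work that maps onto hundreds of
# GPU shader units. (Illustrative only - not how any particular engine does it.)
N = 1000                                        # fragments from the destroyed wall
pos = np.random.rand(N, 3) * [4.0, 2.0, 0.1]    # start inside the wall volume
vel = np.random.randn(N, 3) * 2.0               # blast scatters them outward
dt = 1.0 / 60.0
gravity = np.array([0.0, -9.8, 0.0])

def step(pos, vel):
    vel = vel + gravity * dt                    # same math for every particle
    pos = pos + vel * dt
    on_ground = pos[:, 1] < 0.0                 # crude ground collision
    vel[on_ground, 1] *= -0.3                   # lose energy on each bounce
    pos[on_ground, 1] = 0.0
    return pos, vel

for _ in range(120):                            # simulate two seconds at 60 fps
    pos, vel = step(pos, vel)
```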

It has the capability of doing more sorts of lighting, shading, filtering, and post-processing effects. It can handle more detailed textures, more and better antialiasing, and different and better filtering techniques.

I planned on writing a comparison here of the PS3’s graphics processor compared to current Nvidia lineups, but I’ve already written quite enough, and the comparison would actually be uglier, since the PS3’s graphics have both less horsepower and less flexibility/capability than those of the Xbox 360.

Yes, more or less, if you ran it at the same settings.

You couldn’t build it with the exact same hardware, but you could substitute equivalents. You need an x86 platform to run Windows, so neither of their CPUs would work, but you could get an Intel or AMD processor of the same average performance.

At the time of release of the consoles, no. The parts would’ve cost more simply because you have to build a whole computer around it, and the consoles benefitted from economies of scale of cranking out millions of identical units.

Now, yes, easily. A Radeon 4350 will outperform the graphics of an Xbox 360 and it costs right now on Newegg $35. Or $20 after rebate. No shit. Look yourself.

You could run it at settings roughly equivalent to what the Xbox 360 looks like. Running games on the Xbox 360 is already running them with the graphics settings turned down - max settings for a game on the PC are far better than the settings an Xbox 360 runs at.

For instance, I run games at 1680x1050 resolution, or roughly 1.7 megapixels. PC users with bigger monitors run games at 2560x1600 resolution, or a bit over 4 megapixels. On the Xbox 360, COD4 runs at 1024x600 resolution, or 0.61 megapixels. Not only that, but the target frame rate for the game on the Xbox 360 is 30 frames per second, while my PC (which is not high end) runs it at 70+. People will tell you that you can’t see anything over 24, but it’s nonsense - the difference in smoothness between 30 and 60 fps is readily apparent. Not only that, but I have all the settings maxed: 4x FSAA, max anisotropy, max texture detail and view distance, etc. - all settings that are graphically superior to the Xbox 360. And this is for a game that didn’t even put significant effort into making the PC version better - there are other examples, like Mass Effect or Dirt 2, where the PC release of the game was delayed so that they could make the graphics better.

It’s a stark difference, like watching HDTV and then going back to SDTV. Before, SDTV just seemed normal, but now that you’ve seen something better it looks like crap. My video card broke last month and I borrowed my friend’s Geforce 7950GT - which has much more horsepower than either of the consoles’ GPUs - and I had to stop playing my more graphically advanced games because they looked like crap and the frame rates sucked.

Again, because for the most part consoles nowadays are basically just mini-PCs, they actually are quite comparable. Hopefully I’ve done a good job of demonstrating this.

To be clear, it still had better graphics than a console. I’m saying it looked like crap by comparison, because I’m used to so much better. If you can, see what COD4 looks like on a current midrange PC vs a console.

I keep seeing this claim, but how many of you people are actually running desktop PCs with your monitor at a higher resolution than that of a 1080p HDTV? (1920x1080) A quick scan of a couple of large, expensive LCD Monitors on TigerDirect shows me the following max resolutions:

1920x1200
1920x1080
2048x1152

None of those is a really significant change from the resolution an Xbox can be called upon to process, so let’s do away with this whole “consoles (or at least, the PS3 and 360) don’t have to push/don’t have the ability to push the same resolution as a PC” line. Yes, most games run at a “mere” 1280x720, but the TV is more than capable of better these days.

That’s nice and all, but that’s not exactly going to be running your games. :wink: While you could probably build a better system for $300, I doubt you could -buy- a better system for $300. Which is relevant.

Either way though, this is completely irrelevant to the debate; you have yet to refute the point that, in fact, with huge wide-open technical ceilings, the vast majority of game developers are more likely to produce games with large numbers of largely irrelevant graphical bells and whistles than they are to put actual art development into a game - resulting in games that, for a lot of people, actually look worse.

I think PC enthusiasts consistently underestimate the cost associated with PC gaming - and I’m not just referring to dollars. The more important factor in the decline of PC gaming in favor of consoles has been the relative cost in terms of time. Building a decent gaming PC uses up significantly more time than buying and setting up a console, not just because you actually have to build the thing (or alternatively pay twice as much for a Dell POS), but because it requires expertise in computer hardware far beyond what your average video game player has these days. You can argue GPU power and clock speed all you want, you can point out that components superior to those in any console can be had for quite a reasonable price, but none of those things matter to the vast majority of gamers anymore. Most of us want a machine we can take home from the store, plug into the TV, pop in a game, and immediately play. No muss, no fuss, no spending hours and hours learning the difference between SDRAM and DRAM.

Consoles have matured to the point where they can deliver graphics that would have blown people away 5 years ago, and still look good today. More importantly, the graphics horsepower available in these last two generation of consoles is more than sufficient for a suitably creative set of artists to produce fantastically immersive and memorable work in 3D, something that wasn’t true before. Games like “The Beatles: Rock Band,” “Portal,” and “Super Mario Galaxy” may not be able to boast of their polygon counts or enormously complex shader effects, but they can boast of distinctive and clever art design (not to mention some of the best game design and gameplay in the history of video games), and their enormous sales suggest that consumers are more than happy with that tradeoff.

To summarize:

$300 console + $60 game + 10 minutes setup time >>> $300 PC-with-better-graphics-power + $60 game + 100 hours of learning how to build a PC in the first place.

Well… so what? Sure, a 1080p TV has a lot of pixels, but the Xbox is just upscaling a lower resolution rendered image. If you play a nintendo 64 on a 60" 1080p TV, it’s going to fill up all those pixels too. Also upscaled.

I play at a res of 1680x1050, which is a bit below 1080p res - but my graphics card is actually rendering 1680x1050 pixels, no upscaling. So I have (in my cod4 example) about 3 times the effective resolution.
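The arithmetic, for anyone who wants to check it (just the resolutions already mentioned in this thread):

```python
# Pixel counts for the resolutions mentioned above.
def megapixels(width, height):
    return width * height / 1_000_000

print(megapixels(1680, 1050))        # ~1.76 MP - my monitor, rendered natively
print(megapixels(2560, 1600))        # ~4.10 MP - big 30" PC monitors
print(megapixels(1920, 1080))        # ~2.07 MP - a 1080p HDTV panel
print(megapixels(1024, 600))         # ~0.61 MP - what COD4 actually renders on the 360
print(1680 * 1050 / (1024 * 600))    # ~2.87 - the "about 3 times" figure
```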

I don’t actually know, since I never buy prebuilts. Couldn’t you buy an off the shelf PC with the minimum specs you could find and then stick a 4350 in there for under $300?

Practically, though - people already have PCs for web surfing, running applications, etc. It’s significantly under $300 to take that PC that they’re going to have anyway and turn it into a gaming machine.

The general trend in gaming is to put more and more effort into art. How much detail does the average game now have compared to the average game 10 years ago? With few exceptions, pretty much any random game you can pick will have both better graphics and more detailed art than the games that came before it. I’m sure there are some quirky arthouse games that buck the trend, but to use the COD4 example again - COD4 is both graphically superior to COD1/2/3 and has more detailed art, and it looks better for it. This attitude of “oh, they only care about the graphics, not the art/gameplay/etc.” generally seems like snobbery to me - the sort of people who also make sure to tell you how much better independent films are than mainstream stuff. The reality is that pretty much every aspect of gaming - from AI to physics to graphical effects to art - has consistently improved on average over the years.

Well, yes, but none of the other hardware would be up to running COD4 (by way of example).

Uh, what? The Xbox 360’s CPU is roughly equivalent to something like a single-core Athlon 64 at 2 GHz, so you plug in an E3200 for $53 or an A64 X2 5200 for $66 and you’ve got more CPU. A $60 case with an adequate power supply, a $50 minimal but functional motherboard, a gig of DDR2 system RAM for $15, a DVD burner for $15, and boom, you’ve got a computer under $240 that’ll give better gaming results than an Xbox 360, and be a general purpose computer too.
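Quick tally of that parts list (prices as quoted above, so ballpark only), plus the $35 Radeon 4350 from earlier since you need something to push the pixels:

```python
# Rough total for the build above, using the cheaper of the two CPU options.
parts = {
    "Intel E3200 CPU":          53,   # or the $66 Athlon 64 X2 5200
    "case with power supply":   60,
    "basic motherboard":        50,
    "1 GB DDR2 RAM":            15,
    "DVD burner":               15,
}
subtotal = sum(parts.values())
print(subtotal)          # 193 for the base box
print(subtotal + 35)     # 228 with the $35 Radeon 4350 - still under $240
```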

But the thing is - pretty much everyone owns a PC, right? So you don’t need to build a completely new PC for gaming purposes - just throw a decent video card into your computer and it’s now both a general use computer and a gaming machine. So if you have some system with even low end modern hardware, and you plug in a $35 ($20 after rebate!) Radeon 4350, you’ve just turned your computer into a gaming machine superior to an xbox 360.

I don’t think you guys appreciate how quickly computer technology improves and how cheap it gets. 2005 was a long, long time ago in computer terms.

More art != better art.

Similarly, higher polygon counts and texture resolution != better art.

Can such factors contribute to better art? Sure. Do they always? No, not even close.

Case in point: Look at the “Monkey Island” progression in the OP. You see dramatic improvements in the artwork in each of the first three games of the series, with a particularly large jump in quality between the second and third games (unless you’re one of those weird MI purists that hates the cartoony look… but I digress ;)) that can be directly linked to the massive increase in resolution between those two games. The hand-painted backgrounds and TV-quality animation simply would not have been possible given the constraints of the technology used in MI1 and MI2.

So better technology = better graphics = better art, right? Not so fast. Let’s move on to the next game in the series, “Escape From Monkey Island.”

Eugh.

That’s some ugly shit.

But at the time, MI4’s graphics were considered top-notch. It was a huge deal to be able to render an adventure game in full-3D - “Grim Fandango” and “The Longest Journey” were the only games to do so before MI4. Although MI4 was criticized for its controls and story, its graphics were generally considered a success at the time.

Again, I’m not saying that graphics power and artistry are mutually exclusive - MI3 is a lovely example of how a smart, talented art team can take advantage of recent advances in graphics technology to push a game franchise to new heights. But citing numbers and system specs as an argument for artistic merit is fundamentally flawed. It’s roughly akin to claiming that Stephenie Meyer is a better writer than Ernest Hemingway because she uses a word processor with spell check and auto-formatting.

They can. But they don’t. They simply render a middle-of-the-line resolution and then scale it up, which is really not the same thing as rendering at a higher resolution. You can play an old DOS game with a 640*480 resolution on a high-def screen; it won’t make it a high-res game. There’ll just be more physical pixels inside each big displayed block that used to be a single pixel.

That’s neither here nor there - jihi wondered whether console tech was simply so far ahead of PCs that keeping it for 10 years made sense. I addressed the first part: it isn’t.
Whether keeping it as the entire industry’s standard for 10 years anyway makes sense is a different question - I personally think it does and that games look good enough for me right now, but then I haven’t seen next-next-next gen graphics to make an honest comparison or analysis. But if I could play, say, some Oblivion clone on a system that’d make it look like a Pixar film or Hollywood-grade CGI, I think I’d enjoy it :wink: