To add to my point: What is it that a GTX 280 on a PC has to do that a Radeon 2600 doesn’t that levels the playing field?
You launch a game. Your PC goes into full screen game mode. Your GPU fires up full speed. 95%+ of your CPU time is dedicated to running this game. How is it different now than a “dedicated gaming box” other than that a few percent of the resources are running things in the background?
The GTX 280 is outputting more pixels than the Radeon 2600. It’s doing it faster and with greater memory bandwidth. It’s using more shaders. But they’re both doing the same job - rendering instructions onto the screen. They’re even both using the same graphics API.
Consoles are little computers. Current generation consoles are specialized with mid-range hardware from 2004 or 2005. A 1:1 comparison is perfectly valid. Okay - so let’s say that the Xbox only has to use 1% of its processing power for background processes and a PC has to use 2%. If the PC has 3 times the raw power, do you really think the console is somehow coming out ahead because it has less overhead?
They’re doing the same thing, the consoles are just doing it mid-range 2004 style.
For those who are interested, this link compares h.264 encoding on the PS3, the GTX 285 and the i7 965 XE. The PS3 (not sure if they are only using the Cell or if they are also using the GPU in the PS3, but if I remember right, access to the GPU when running Linux on the PS3 is limited, so it could just be the Cell) and the GTX 285 were close, and the i7 was behind both of them.
SenorBeef, I can’t be arsed to argue against someone who wants to compare apples and oranges.
You keep saying “mid-2004.” I have played the same game on a 2004-era GPU and an Xbox 360, and - if the effects are the same - the FPS numbers aren’t even close. It’s a joke. The dedicated platform runs it without stuttering, usually at 60 FPS.
I don’t care a whit about convincing you, because you’re not convincing anyone who matters. “Raw power” simply does not mean anything, and you have no 1-to-1 numbers that involve, oh I dunno, actual gameplay and framerates on the systems you’re comparing. Until you have that, you have nothing but abstracts on a page.
How is it apples and oranges? They’re both computers that even use the same Direct3D API to make calls to hardware with similar graphics architectures. The Xbox 360 is closer to a modern PC in hardware terms than a PS2 or GameCube is.
The 360 does gain advantages in uniformity, when developers know exactly what their target system is - they can tweak the game to get the most out of the hardware available for that system. It gets a lot out of the hardware it has. But we’re talking about an industry where advancement happens at a very fast pace - where processing power doubles every few years. How can something locked into 2004-2005 technology possibly expect to keep up for most of a decade with a product that’s always on the cutting edge of industry technology?
What qualifies as mid-2004 to you? Going from memory, that’s the era of somewhere around a GeForce 7800 GTX and an Athlon X2 4200 or so, with 2 GB of DDR2.
I may have the development timing of the xbox wrong. It was released in Q2 2005, right? So maybe late 2004/early 2005 technology.
What game are you talking about? What are the relevant stats?
Really? Nothing?
So a PC from 10 years from now, which will have dozens of times the processing and graphics capability, will be inferior to the xbox 360 because it’s not a dedicated gaming machine?
There’s no point for you at which better and better hardware, running through the same API, doing the same calculations, and serving the same purpose becomes more capable?
The CPUs of the PS3 and Xbox 360 are of different architectures than x86. You can argue that they have different strengths and weaknesses. But the Xbox’s GPU is pretty much a Radeon 2600. And the PS3’s GPU is pretty much a GeForce 7800. And that’s where the real work is done as far as gaming.
So why can’t we compare that when the same companies are producing new and improved GPUs on roughly the same architecture using the same API to achieve the same purpose?
Really?
Well, first, I own both an xbox 360 and a PC, so I can see the difference first hand.
Most console games aren’t rendered in 1080p or even close. Often they can’t even hit 720p. They often cap their frame rates at 30. They often use no or little anisotropic filtering or FSAA. They use certain tricks to keep the frame rates up like aggressive LOD bias and low draw distances. And because they can test using the exact same system the target audience will have, they can see where they have to go through and change in-game events and maps in areas that would otherwise drop the frame rates too low.
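To make the LOD bias / draw distance point concrete, here’s a rough sketch of the idea in C. The structure, names, and thresholds are all made up for illustration - this isn’t from any actual engine - but it shows the trade: anything past a short cutoff doesn’t get drawn at all, and the bias pushes objects onto cheaper models sooner than a PC version typically would.

```c
/* Hypothetical object with a few detail levels; names and numbers are
 * illustrative only, not from a real engine. */
typedef struct {
    float distance_from_camera;
    int   lod_count;            /* e.g. 3 meshes: high, medium, low detail */
} Object;

/* Console-style tuning: a short draw distance plus a bias that switches to
 * lower-detail meshes earlier, trading image quality for frame rate. */
#define MAX_DRAW_DISTANCE 150.0f  /* cull anything farther than this */
#define LOD_BIAS          2.0f    /* >1.0 drops to low detail sooner */
#define LOD_STEP          25.0f   /* distance covered per detail level */

/* Returns -1 to skip the object entirely, otherwise the index of the mesh
 * to draw (0 = highest detail). */
int pick_lod(const Object *obj)
{
    if (obj->distance_from_camera > MAX_DRAW_DISTANCE)
        return -1;  /* low draw distance: don't render it at all */

    int level = (int)(obj->distance_from_camera * LOD_BIAS / LOD_STEP);
    if (level >= obj->lod_count)
        level = obj->lod_count - 1;
    return level;
}
```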
Serious question - is it even possible in your mind that a PC can deliver superior graphical output to a console? If current ones don’t, how much more advanced do they need to become? Will the PCs of 2015 beat out an Xbox 360? Or will the Xbox 360 always have the edge because it’s a specialized device?
November, 2005. Also, the graphics chip, IIRC, was based on upcoming PC chips from ATI–in short (again, IIRC), it incorporated features that weren’t yet on the general market.
The GPU is not the only place work is performed, and the Cell and Xenon processors are both designed for the workload typically found in games (physics and graphics, vector operations, high memory bandwidth).
I’m not sure if you are a programmer or not, but if you are, you should take a detailed look at those processors. I think you will appreciate that they optimized them for physics and graphics workloads, much more so than any general-purpose CPU on the market at that time.
Very good point. I’ve been an avid PC gamer since the 80’s. I still spend a fair amount on boxed games (when they fall to the secondary or tertiary price point) but it’s almost a habit and they remain little played.
The reason? For the price of 4 full priced games I have a life sub to Lord of the Rings Online and have been playing the life out of it for months.
I find this to be pretty unconvincing. DRM has been politicized by loud voices, but they’re still very much marginalized. I’ve been dealing with DRM since I was a kid. I remember opening the King’s Quest manual to go to page X to find a word to put into the game to continue playing! Now that’s bad DRM. If anything, DRM has gotten less obtrusive and annoying, but unfortunately the complaint culture of the internet is alive and well.
PC gaming will never die. It will always have some kind of niche. It’s just too easy to develop on PCs and too many people want to play games on their PCs. It may always be the “fourth console” but it will always be a force in gaming. Too many big companies have a lot to lose if gaming died on the PC, most notably Intel, Microsoft, Nvidia, and ATI. I wouldn’t be surprised to see them subsidizing ports or starting their own game studios if game releases on the PC become threatened. Actually, Microsoft runs its own studio, so we’re 1/4 of the way there.
It can’t be both a niche and a force. But I agree with you, PCs are the fourth console. But I think that means that PC gaming has basically marginalized itself as less important/special/whatever word you want to use in the minds of the general public behind the Wii, Xbox 360 and PS3 (in that order).
Which is devoted entirely to developing Xbox 360 games. The last PC game Microsoft developed from the ground up for the PC was Flight Simulator X and that was almost three years ago.
No, but it’s the primary factor in modern gaming. You get a better gaming experience with a fast GPU and a slow CPU than with a slow GPU and a fast CPU. It’s easier for the GPU to be a bottleneck, especially at higher resolutions.
You could attach a supercomputer to a Radeon 2600 or GeForce 7800 and your rendering performance wouldn’t increase dramatically under typical scenarios.
The CPU isn’t heavily involved in the actual process of rendering, so I’m not sure what graphics processing it’s doing.
I’ve only programmed as a hobby, and not recently. I’ve never touched machine code. I do understand CPUs in a very general way - pipelines, registers, FPUs, SIMD, etc. I’m not that familiar with either the PowerPC or Cell architecture, which is partly why I’ve focused on the GPUs, which I am familiar with - the other parts being that the GPU is the dominant factor in modern rendering, and that console GPUs share the same architecture and use the same APIs as PC GPUs, so they’re directly comparable.
This is a weird turn in this debate. People advocating for consoles generally don’t try to portray them as computationally as powerful as PCs - usually they stress other perceived advantages.
I have to start with an apology to SenorBeef - I was unusually short last night, because I was pretty pissed at a game (Knights in the Nightmare, if anyone’s curious). I still think we’re talking past each other, but I was more rude than was necessary. :o Now…
I wish I could tell you - I don’t know how programmers get more out of a console later in its life cycle than earlier. But it does happen.
I was actually going on your comparison from earlier in the thread: Radeon 2600/Geforce 6000 era. I played on that generation myself, so I feel pretty confident when I say the 360 surpasses them in terms of actual frames rendered.
The comparison I planned to make was this: play the game Prototype on a 2004-2005 era GPU at, say, 800x600. Compare the frame rate to that same game on a modern console: 60 FPS. This is the sort of applied data that matters.
That’s more of a general challenge to the thread, tho: I’ve never heard of anyone actually trying to play Prototype on a computer from five years ago.
I haven’t said this, actually. I was basing my defensiveness off your comparison of an Xbox 360 to a 2004-2005 era PC, based on nothing but their raw power. Again, it is not 1:1.
This is a good point, actually, and goes some way toward evening out my challenge of an Nvidia 7800 vs an Xbox 360 using the exact same game. I still submit that they wouldn’t be in the same league, as far as framerates, even if you turn off AA, AF, and reduce the resolution to what consoles render natively. (Ignoring the fact that most gamers with HDTVs have neither the capability nor the willpower to study pixels long enough to discern how many lines their favorite game is rendering natively - upscaling is sufficient for most TVs.)
Again, very no. I’ve never been of the opinion that the 360 or the PS3 is as powerful as a modern PC. Hell, I’ve been playing a lot more on my PC since I got a nice Radeon HD 4870 on sale this year. I’m taking objection to the idea that the 360’s actual output of graphics power is comparable to a PC from 2004, simply because the clock speeds match up. Hell, just the fact that the 360 can run the same games and some similar apps as PCs with only half a gig of RAM should be enough to prove that, yes, they are different types of machines, and simply comparing the “raw power” between the two means little.
I’m well aware that PC rendering has surpassed consoles for this generation. I’ve only seen it do so (with affordable GPUs) around 2007-2008, however.
Yeah, this tends to be a hostile issue for me since I love PC gaming and I see it slowly dying due to multiplatform design.
No, I agree with this. They manage to squeeze every last drop of performance out of a console over time. I don’t think the effect is nearly as much as the speed of advancement in the industry though. That is, X years of programming experience tricks won’t match up to X years of hardware development.
If I get the game, I may be able to do that - I have an old system with an Athlon 3200 and a GeForce 6800 GT from 2004. Is it more accurate to say that the consoles are from a Radeon 2xxx/GeForce 7xxx era? The PS3 has essentially a 7800… I don’t remember how much later it came out than the 360.
According to this article, the game runs on the 360 at 1120x640 with a cap of 30 fps. I’m not entirely sure what that means, though, as I’m not sure why games cap FPS. I’m guessing it’s because a game that drops heavily from its best frame rate feels laggier - it feels worse to drop from 60 fps to 20 than from 30 to 20. So that information doesn’t give me the actual average frame rate.
Is there any trick to forcing the xbox to display a frame rate?
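In case it helps picture the capping point, here’s a bare-bones sketch of what a 30 fps limiter boils down to (my own illustration using standard POSIX timing calls, not anything from a console SDK or a real engine): every frame gets padded out to the same ~33 ms slice, so frame times stay uniform instead of swinging between very fast and very slow frames.

```c
#include <time.h>

/* Minimal 30 fps frame limiter sketch. Holding every frame to ~33 ms keeps
 * frame times even, so a heavy scene that takes 25 ms to render doesn't feel
 * like a sudden drop the way it would against a 16 ms (60 fps) baseline. */
#define TARGET_FRAME_NS 33333333L  /* one thirtieth of a second */

void limit_frame(const struct timespec *frame_start)
{
    struct timespec now;
    clock_gettime(CLOCK_MONOTONIC, &now);

    long elapsed_ns = (now.tv_sec - frame_start->tv_sec) * 1000000000L
                    + (now.tv_nsec - frame_start->tv_nsec);

    if (elapsed_ns < TARGET_FRAME_NS) {
        struct timespec pause = { 0, TARGET_FRAME_NS - elapsed_ns };
        nanosleep(&pause, NULL);  /* sleep off the rest of the 33 ms slice */
    }
}
```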
The other issue is that it may be an unoptimized port - they may program it to squeeze every drop of performance out of the 360, and then just port it to the PC without doing the same optimization for that platform because they figure it can run acceptably anyway due to the faster hardware.
I misunderstood you. I think I referred to both modern PCs and the PCs of 2004-2005 in the post you originally responded to. I thought that you were indicating that the Xbox still outperformed both contemporary PCs and modern ones.
I agree that given its streamlined/purpose built nature, and the fact that developers now have years of experience tweaking for the platform, the xbox probably does as well as a mid to lower high end 2004-2005 system.
My point is that the hardware has come a long way since then - and it’ll still be years before the next console generation comes.
I don’t know about that. I love PC gaming – turned my back on consoles near the end of the eighties and haven’t been impressed by one until the Wii brought a little novelty in. Everything else seemed neutered in ways that make for a very unsatisfying experience when compared with PC gaming. (Yeah, it’s ironic that a console with such poor graphics capabilities won me over a little; I guess my main resistance to console gaming is the pitiful controls.)
That said, I have always been willing to slap down some cash for a good game.
I didn’t blink when Half Life 2 came out - it looks great, and I played the hell out of the earlier incarnations. I still haven’t played that damned game, because its DRM won’t let me, although it is installed from the retail box. I’ve got a “Steam” account, and I’ve installed that son-of-a-bitch on three separate machines.
I just got a brand-spankin’ new machine, and thought I’d give it another shot. I’m pretty sure that the issue is that I’ve got PowerIso installed, and use its Virtual Drive feature. Not that I’m trying to run Half Life 2 from a virtual drive, mind - just that it’s there. I’m not about to shell out for one computer that does what I need it to do for work and another computer that provides an environment that meets arbitrary requirements for Valve’s “trust.”
It’s beginning to look like I am going to have to download a pirated copy of Half Life 2 if I’m ever going to play the fecking thing. As it is, after each attempt I just give up and install some other purchase, that’ll actually run. (This time I settled on Grand Theft Auto IV.)
One thing is certain: Valve went from my “I will definitely consider purchasing anything by this publisher” column straight over to “Under no circumstances (no matter how great it looks) will I ever buy anything from this publisher.”
If Blizzard turns around and pulls some shit like that with Starcraft 2, I don’t know what I’ll do. It’ll probably involve a clock tower, though.
For a traditional CPU this is correct, but for a CPU like the Cell, it’s a different story. If you look at that link I posted regarding video encoding, it indicates they were able to get similar performance between a Cell and a GTX 285. So the point is that the CPU in the PS3 is available to perform that type of work.
I agree that the GPUs are more directly comparable between the consoles and the PC; my primary disagreement is that because the CPUs are customized toward the gaming workload, you can’t discount them when trying to compare the console and a PC with respect to potential gaming performance.
If, on the other hand, we were talking about a different workload, branchy code with random memory access and no SIMD operations, then the story would be reversed.
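To make that contrast concrete, here’s a toy example of the two kinds of loops (my own illustration, nothing to do with the encoding article): the first is the straight-line, same-operation-over-adjacent-data code that the Cell’s SPEs and a GPU chew through; the second is the branchy, random-access style where a conventional general-purpose CPU comes out ahead.

```c
#include <stddef.h>

/* SIMD-friendly: the same multiply-add applied to adjacent elements.
 * A compiler can vectorize this, and it maps cleanly onto SPEs or GPU threads. */
void scale_bias(float *out, const float *in, float scale, float bias, size_t n)
{
    for (size_t i = 0; i < n; i++)
        out[i] = in[i] * scale + bias;
}

/* Branchy with random access: each step takes a data-dependent branch and
 * then jumps to an unpredictable index. Big caches, branch prediction, and
 * out-of-order execution on a general-purpose CPU win here. */
float chase(const float *table, const int *next, int start, int steps)
{
    float acc = 0.0f;
    int i = start;
    for (int s = 0; s < steps; s++) {
        if (table[i] > 0.0f)
            acc += table[i];
        else
            acc -= 0.5f * table[i];
        i = next[i];  /* pointer-chasing style random access */
    }
    return acc;
}
```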
I’m not really an advocate for consoles or PCs - I’m just trying to make sure there is an accurate understanding of strengths and weaknesses.
I enjoy the topic (different computing platforms) and have spent quite a bit of time looking for the best platform for a simulation project I’ve been working on (artificial life and neural networks). It’s highly parallel, so I looked at: multiple Intel CPUs, multiple Cells and multiple GPUs. One of my critical routines works great on the GPU (GTX 280). The other critical routine is a problem: the GPU wants every thread performing the same operation at the same time, the Cell wants my data to be adjacent to take advantage of SIMD, and the Intel CPUs just plain don’t have enough cores.
I’m hoping that Intel really brings out Larrabee with a bunch of cores, and that each core can operate on an independent piece of data (unlike the GPU and Cell).