I’ve been hearing rumors that the PS3 and Xbox 360 are going to support 3D gaming. This seems strange to me. Most console games struggle to hold 30 fps at 720p. Even on modern PC hardware, going 3D means a significant performance hit, because the engine has to render two views, one for each eye. That’s asking a lot of the rendering hardware, so my question is: are the rumors true? And if so, how are they pulling this off?
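To make the “two screens” cost concrete, here’s a minimal sketch of why stereoscopic 3D roughly doubles rendering work: the engine draws the whole scene twice per frame, from two cameras offset by the interpupillary distance. All names here are made up for illustration; this isn’t from any real engine.

```python
# Sketch of stereoscopic rendering cost: one displayed frame = two
# full scene renders, from cameras offset horizontally by the
# interpupillary distance (IPD). Names are illustrative only.

def eye_positions(camera_pos, ipd=0.064):
    """Left/right eye positions, each offset along x by half the IPD (meters)."""
    x, y, z = camera_pos
    half = ipd / 2.0
    return (x - half, y, z), (x + half, y, z)

def render_stereo_frame(draw_scene, camera_pos):
    """One displayed frame costs two full scene renders."""
    left_eye, right_eye = eye_positions(camera_pos)
    return draw_scene(left_eye), draw_scene(right_eye)

# Stub "renderer" that just counts draw calls, to show the 2x cost.
draws = []
render_stereo_frame(draws.append, (0.0, 1.7, 0.0))
print(len(draws))  # 2
```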
It isn’t the performance hit (which is negligible once AA is disabled). It’s that gamers don’t like getting headaches, vomiting, and spending $200 on glasses.
And 120 Hz monitors had terrible ghosting issues. The polarization in the glasses made already-too-dark games even harder to see.
That said, when it all worked, it was pretty cool. My friend ended up trading them away for something stupid within a month of having them, though.
Oops. This is mostly about Nvidia’s 3D Vision thing. I’ve no idea what consoles are doing.
But also headache-inducing. I mean, they even had to mandate taking a break every 15 minutes.
As for how the consoles are handling 3D: I can only guess that they’ll scale back the graphics to pull it off. Our brains don’t seem to notice the graphics level as much in 3D. For example, see Avatar, which looks awfully fake in 2D but supposedly very realistic in 3D.
They already DO handle 3D gaming via 3D TVs, and they run pretty well. Also, they don’t “struggle” at 30fps at 720p; a good many are locked at 60fps (though I honestly don’t think it makes much of a difference). People say 30fps looks sluggish, but that’s complete BS. Movies don’t look sluggish and they run at 24fps. I know it’s a bit different, but it honestly doesn’t matter that much.
Anyway, yeah, I know of at least one game that’s already out for the Xbox 360 and supports 3D: Avatar. Not that it’s a good game, but still, it does 3D. Also, 3D is terribly overrated for movies; it might be better for video games, but not until they fix some serious issues with it. Nintendo might be onto something with their new 3DS, which doesn’t require wearing special glasses, but that remains to be seen.
Rad Racer, man. You could hit Select and it’d go all green-and-red 3D, and you could wear those glasses with one green and one red lens (like the ones you got when you went to see MoonWalker).
Sluggish wouldn’t make sense to me. 30fps means it might be a bit choppier, but I wouldn’t think the game would actually slow down; it’d just skip frames.
30 fps feels sluggish because it never stays at 30 fps consistently. Any time the level asks for more, it can dip significantly. It’s not really the 30 fps itself that feels sluggish, it’s the dips below it.
As for most Xbox games being locked at 60 fps, I believe that’s incorrect. Most games with demanding 3D graphics are locked to 30 fps on that console.
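The dips hurt more than the raw number suggests, especially with vsync: miss the frame deadline even slightly and the display holds the old frame for a whole extra refresh interval. A rough sketch of that quantization, assuming a simple double-buffered 60 Hz display (the function name is made up):

```python
import math

# With vsync on a 60 Hz display, a new frame can only be shown on a
# refresh boundary (every ~16.7 ms). So a frame that takes just over
# the 33.3 ms budget for 30 fps waits for the *next* refresh, and the
# effective rate drops straight to 20 fps. Assumes simple double
# buffering; the function name is illustrative.

def effective_fps(frame_time_ms, refresh_hz=60):
    interval_ms = 1000.0 / refresh_hz               # ~16.7 ms per refresh
    slots = math.ceil(frame_time_ms / interval_ms)  # refreshes consumed
    return refresh_hz / slots

print(effective_fps(33.0))  # 30.0 -- fits in two refresh intervals
print(effective_fps(34.0))  # 20.0 -- barely over budget, big visible dip
```

That cliff from 30 straight to 20 is the “dips significantly” part: there’s no smooth in-between with vsync.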
Now, the Avatar game on the 360 is what I’m wondering about.
The Xbox 360 doesn’t seem to have the horsepower to render a game at 1080p, never mind in 3D. Are the game’s textures lowered in quality? Detail levels reduced? Is the 3D tech they use different from what Nvidia uses?
The console can’t even run most games at 720p (the last Halo renders at less than 720p, for example), so where is it getting the buffer space for two 1080p scene renders?
Basically, I’m wondering whether this is a much better 3D solution than what Nvidia currently has for the PC, or whether the graphics detail is just being lowered to heck in order for this to run.
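To put rough numbers on the buffer question, here’s a back-of-envelope calc assuming 32-bit color and ignoring depth and AA buffers (which only make things worse):

```python
# Back-of-envelope color framebuffer sizes, assuming 32-bit color and
# ignoring depth/AA buffers, which only make the numbers worse.

def color_buffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 * 1024)

print(round(color_buffer_mb(1280, 720), 1))       # 3.5  -- one 720p buffer
print(round(2 * color_buffer_mb(1920, 1080), 1))  # 15.8 -- two 1080p buffers
```

Two 1080p eye buffers alone would blow well past the 360’s 10 MB of eDRAM, so whatever the console games are doing almost certainly involves lower render resolutions, tiling, or packed half-resolution 3D formats rather than two full 1080p renders.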
I would assume that, due to hardware limitations, the graphics detail is lowered. I don’t know by how much; I never played the game, let alone on a 3D TV. But Nvidia has the advantage of using its newest GPUs and the horsepower that comes with them.
I do realtime 3D work, so I’m one who looks for these things, and I find the difference between 30 and 60fps quite noticeable. I use 60fps as my “goal” framerate, generally speaking; depending on the onscreen action, 30 can be problematic. Also, if you want to see where film gets ugly, watch for the tearing in horizontal pans: choppy.
I agree it’s noticeably different, but I don’t think that, properly faded/blurred, it actually looks bad. I think the biggest problem is that people seem to want the depth of field to be infinite.
As for film, didn’t old cameras have a blurring effect on fast pans that made the tearing much less noticeable?