Why is console gaming online so crap?

“720p” was going out of style on the PC in 1996.

Well, technically, yeah. Few next-gen games (with, again, some exceptions) actually render at 1080p; they just upscale to it. I misspoke. However, very few players (even elitists, hey) can tell the difference, and it certainly takes care of the ‘fuzziness’ that was originally brought up in this thread.

Okay now, if we’re going to play the divisive fanboy game, I could ask you how often your framerate dips below 60 because of background processes or not-quite-enough graphics cards. The best console games never do. :wink: But then, one can always spend even more for better graphics…

Not necessarily, but it’s fun to see the poo-flinging PC-console party start.

With that being said, no, more people in a game don’t necessarily make it better. Better graphics are nice, but I’m simply not going to play on a PC, because I don’t particularly like computers and I don’t like the dance of upgrading them.

Sweet Jesus, leave it alone, willya?

Capt. Ridley’s Shooting Party: Friends Lists. Get them filled. Send invites. Problem solved.

Also, it’s not so much that I’m a staunch console defender. I like playing these games on the consoles. The problems that you guys have with the consoles simply don’t exist in my world.

Wow, I guess a lot of console players must be in dire need of glasses or laser surgery.

If you can’t tell the difference between a 0.9-megapixel image and a 2.1-megapixel one, you have issues.
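For reference, the back-of-the-envelope math behind those numbers (a quick sketch, assuming 1280×720 and 1920×1080 frames):

```python
# Rough pixel-count comparison between 720p and 1080p frames.
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080)}

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels (~{pixels / 1_000_000:.1f} MP)")

# 720p:  921,600 pixels (~0.9 MP)
# 1080p: 2,073,600 pixels (~2.1 MP) -> roughly 2.25x the pixels
```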

Let me see… how about never? In fact, consoles consistently have MORE frame-rate issues than PCs with decent hardware. Just check out ANY review of ANY game released on both platforms.

The first thing mentioned when comparing the two versions is invariably: the PC has better graphics and a much more stable frame rate.

When come back, bring pie.

Gonna disagree with this. Yes, they render at 720p and upscale to 1080, but when they’re done correctly they don’t so much “struggle” as they do “lock it in at 60fps regardless of what’s happening.” Are there exceptions? Sure, but those exceptions aren’t from console hardware faults so much as programming faults (recent examples being Fable 2 or Last Remnant, which have stutters and slowdowns regardless of what they’re being rendered at). Now obviously PC hardware is superior, but the X360 and PS3 can easily render at 720p with no problems; it’s just that the games themselves are sometimes shoddily programmed, something I’d lay at corporate’s door for pushing out games before they’re ready.

I don’t think frame-rate issues can be solely laid at the feet of developers. Sure, some level of optimization can be done, but ultimately the minuscule amount of system and video RAM, the relatively slow nature of that RAM, and the five-year-old GPU architecture are limiting factors. 720p, as mentioned above, was a resolution PC games left behind in the nineties. I would expect games to be rendered at that rez with no problem, and yet there are problems, and the games you mentioned are NOT the only ones with fluctuating frame rates, by FAR. Not only that, but MOST AAA console games don’t even render at 720p, but at even lower resolutions.

And it’s not just about resolution, AA makes a tremendous difference in image quality, and again, most AAA titles on console do not use it, because the hardware is simply not capable.

Are we talking about the same generation of consoles, here? If so, on what are you basing these claims?

From the resolution dissection post I linked to, 80 out of the 99 games listed for Xbox 360 render at 720p or higher. So only about 19% of the (generally) AAA titles listed don’t get up to 720p. In what universe does this count as “most AAA console games?” And, AGAIN, I have to mention that every single one of these games upscales to 1080, and most consumers can’t tell the difference. That’s why, y’know, they have people zooming in on raw screen captures of Halo or GTA and counting the pixels to discern the native resolution: because when it’s on a TV, it pretty much looks exactly the same as anything HD.
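To illustrate the pixel-counting trick in rough terms, here’s a toy sketch (my own simplification, not the actual methodology those dissection threads use): count the staircase steps along a hard edge that spans the full height of a 1080p capture, and the step count approximates the native render height.

```python
# Toy estimate of native render resolution from a 1080p screen capture.
# Premise: an upscaler stretches the aliasing "steps" on a hard edge but
# doesn't add new ones, so the number of steps across the full frame
# height roughly equals the native vertical resolution.
def estimate_native(steps_counted: int, aspect: float = 16 / 9) -> tuple[int, int]:
    """Native height ~= step count; width follows from the aspect ratio."""
    native_height = steps_counted
    native_width = round(native_height * aspect)
    return native_width, native_height

print(estimate_native(720))   # (1280, 720)  -> rendered at 720p, upscaled
print(estimate_native(1080))  # (1920, 1080) -> genuinely native 1080p
```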

And as for anti-aliasing, 69 titles, or 86% of the ones that hit 720p or better, apply 2xAA or better. Sorry, but I wouldn’t call that “simply not capable.” If you want to, tho, go ahead. :slight_smile:

My bad. I was looking at the wrong list (I’ve got a similar thread link with more games on the list). And yes, even on my list (when I’m not looking at the wrong console!) about 80% of titles render at 720p or better.

The AA numbers seem wrong on your list, though. I’m SURE some of those titles have been said not to have AA by both devs and reviewers. Also, some of the comments on AA lead me to believe these are not accurate. For example, some games claim to be able to switch AA dynamically based on frame rate. I doubt this very much, since hardware from the time, heck, even hardware now, cannot do that. A lot of the reason consoles can’t do AA in some titles isn’t just the higher processing power needed (and GPUs of the time had to work harder than modern ones do to achieve the same effect) but also the increase in frame-buffer size required to render AA. Consoles just don’t have enough video RAM to achieve this without a monumental FPS drop.
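To put the frame-buffer point in rough numbers (a sketch assuming a typical 4-byte color target plus a 4-byte depth/stencil buffer stored per sample; real layouts vary by engine and hardware):

```python
# Back-of-the-envelope framebuffer sizes at 720p with multisample AA.
WIDTH, HEIGHT = 1280, 720
BYTES_PER_SAMPLE = 4 + 4  # assumed RGBA8 color + D24S8 depth/stencil

for samples in (1, 2, 4):
    size_mb = WIDTH * HEIGHT * samples * BYTES_PER_SAMPLE / (1024 ** 2)
    label = "no AA" if samples == 1 else f"{samples}xAA"
    print(f"{label}: ~{size_mb:.1f} MB")

# no AA: ~7.0 MB, 2xAA: ~14.1 MB, 4xAA: ~28.1 MB
# For comparison, the Xbox 360's fast eDRAM is only 10 MB, which is why
# 720p with MSAA has to be rendered in tiles on that hardware.
```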

Anyway, I’ve hijacked this thread enough.

My main point was to call out **Least Original User Name Ever** on his “Meh” comments, which he constantly drops every time a feature of PCs that is notably, downright obviously better than consoles gets mentioned. And to point out that consoles for the most part do not render games at 1080p.

You may continue with your scheduled thread.

Fair enough! The only thing I can figure is that console hardware has an advantage over PC GPUs of the time because PCs, even fresh installs, are thinking about things like security, networking, and multitasking with various other programs, while console OSes are built solely for single-program performance. Not sure how likely AA switching is, tho, in any program.

Well, there’s no accounting for taste. My guess is that console gamers (present company included) get really used to the perks of having a friends list and picking out what they’re going to play based on that, rather than picking a game first and deciding how and with whom to play afterwards. It’s just one of the ways that PC and console gaming cultures are so disparate in what they expect and want out of a given multiplayer game.

And I envy that integrated social aspect of consoles. We have Steam on PC, and with 15 million users it’s not too shabby; that’s more than PS3s have sold, though not quite up to Xbox 360 numbers. But the entire scene is rather fragmented on PC. We have Xfire, Pulse, Steam, Games for Windows Live (which ties in a little bit with XBL), Blizzard is coming out with a client as well, plus the non-gaming social stuff like AIM, Live, Twitter, etc.

It would be nice to have one piece of software where you could see everyone and what they’re playing, communicate with friends, see achievements, etc., like on the consoles, or at least like XBL, which seems to be the top dog and the best-done system for consoles.

That’s because you don’t know any different, which is why you keep suggesting friends lists as a solution to the problem described in the OP. Friends lists are a shit solution to organizing online games; if I thought they were what I was looking for, I’d be using them already.

Online console games are superior to PC games in one key area: no hackers or people using cheaterware. Yeah, there are people who exploit loopholes in games, but I don’t need to worry about people shooting me through walls, having unlimited life, being able to track my guy across the map, etc.

PC gaming online sucks for that reason alone.

Xfire?

But Xfire is just another system to worry about. I have Xfire and I have Steam, and I have Trillian handling my two other social networks, and I have Games for Windows Live and I have Pulse. And I’m sure when Diablo III and StarCraft II come out I’ll be on some Blizzard social network too :frowning:

I want ONE system for all things.

And Cubsfan, cheaters and hackers stopped being a problem for most games a long time ago. Most games come with very reliable anti-cheat software, and those that don’t usually have a built-in system to prevent it or have netcode engineered in such a way as to make it impossible to cheat in a meaningful way.

It still happens on occasion, sure, but it’s not really a big problem at all.

Um. Likewise?

Really, a unified friends list across games is not only a great idea, but a great idea that’s been implemented very well and with no hiccups. Steam seems to come closest, but that only accounts for games through Steam (right?)

I can’t believe there’s a heated argument to be had. I suppose folks have strong opinions on everything, though.

Steam does tag along for the ride when playing other games, as long as you launch that game through the Steam portal. You then have access to the social networking aspect, but unless the game you’re playing was actually purchased through Steam (or is a Valve game bought at retail) you won’t have access to Steam matchmaking and achievements.

Steam is great but it’s got nothing on XBL. I love being able to play a game while talking to another friend playing a different game, then we both decide we want to play together so I can just invite him on in or we can play a third game altogether. It’s just amazingly well integrated into the console itself.

Friends lists are a great way to organize games, there’s no doubt, but I love the PC ability to choose servers as well. In all honesty there’s no reason why consoles can’t do this, they just don’t, which is annoying. Both options would be the best of both worlds.

How long has it been since you’ve been on Steam? You can do these same things on it.

Right.

Well, hell. That’s all it took? Implement both?

Maybe I’m dense, but I don’t see the value in picking your server. From what I gather, it’s like Gears of War (the first one, at least) with regard to finding a game. What happens after the first game you chose ends? Are you booted back to the lobby to search for another game?