The death of PC games is nigh!

Hmm, I think I misspoke here. Is 720p 29.97 progressive or 59.94 progressive? It’s certainly not 60i.

Answering my own question: 720p seems to be 60p, 30p or 24p. Does anyone know what frame rate the next-gen consoles will support? I’d expect 60p, but I haven’t heard.

You can’t compare these straight up. You’re two feet from the 19" monitor, and eight feet from the 42" plasma. In terms of visual experience, the real question is one of dots per degree of visual field. I’m too lazy to do the math, but this is why consoles look okay even though they have crap resolution - you’re sitting much further from the display. Of course, unless you have a giant television, this also means that the game will be less immersive, because it occupies a much smaller portion of your visual field (and if you have a giant television, the low resolution becomes apparent).
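The math being waved off here is simple enough to sketch. Below is a rough Python back-of-the-envelope calculation of horizontal pixels per degree of visual field for the two setups described above; the specific resolutions (800x600 on the monitor, 1280x720 on the plasma) are assumptions for illustration, not from the post:

```python
import math

def pixels_per_degree(diagonal_in, aspect_w, aspect_h, horiz_pixels, distance_in):
    """Approximate horizontal pixels per degree of visual field."""
    # Screen width from the diagonal and aspect ratio
    width = diagonal_in * aspect_w / math.hypot(aspect_w, aspect_h)
    # Total horizontal viewing angle, in degrees
    angle = 2 * math.degrees(math.atan(width / (2 * distance_in)))
    return horiz_pixels / angle

# 19" 4:3 monitor at 800x600, viewed from 2 feet (24")
monitor = pixels_per_degree(19, 4, 3, 800, 24)
# 42" 16:9 plasma at 1280x720, viewed from 8 feet (96")
tv = pixels_per_degree(42, 16, 9, 1280, 96)
print(round(monitor, 1), round(tv, 1))
```

Perhaps counterintuitively, the faraway 720p plasma comes out considerably denser per degree (roughly 59 px/deg) than the close-up 800x600 monitor (roughly 23 px/deg), which is exactly why console graphics can look fine across a living room.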

I have a funny experience to relate that seems to support the point Alien is making. In the early '80s, I took my trusty Intellivision console and hooked it up to a portable TV/Radio/tape deck so I could test whether apparent resolution could be improved by shrinking the screen. The TV screen must have been no more than 5 or 6 inches across.

Well, I was astounded at how “good” the graphics were. I could barely detect any pixelation; everything looked smooth and crisp. Of course, everything also looked small, but there was a definite improvement in visual quality. Same principle of increased DPI, so I think it does make a difference.

The PSX didn’t excel at anything at the time. It had just launched. Multi-platform games are a minority, and while I can definitely see the logic behind picking them, it is superficial. Multiplatform games very rarely take full advantage of their platforms for various reasons, which I’ll get into if you ask me to. Hopefully, you already know why. Our goal here is to compare what looks best visually on both platforms.

I wouldn’t be so assertive in a debate on a topic I lack expertise in.

It seems you’re forgetting that we’re not judging gameplay or genres. Just graphics. I much prefer Japanese RPGs myself. As they say, there is no accounting for taste.

Yes. I don’t disagree.

Again, you are correct. My argument is that PCs are going to lose most of their strengths as time goes by.

I’ve been reading that sentence over and over and I can’t really make much sense of it.

That’s what you say. I say different. I believe my methodology is quite valid. You pick good games and compare the price they sell for at launch.

There was no reason to only pick out multiplatform games. In any case, I’ve lost interest in convincing you. Let’s wait until someone who works at a game store comes here and offers us an opinion based on his empirical experience.

I’m sorry if I misled you. Yes, absolutely, console games are where the money is for both publishers AND system makers. Consoles are usually sold at or near cost, with the exception of the Xbox, which is definitely sold at a tangible loss. I don’t know which other consoles were sold at a loss and for how long, but it really doesn’t matter because I agree with you.

Once again, that is what you claim. It is annoying. I’ve spent a lot of time detailing my reasoning and providing examples. If they do not satisfy you, that’s fine. I’m not a prophet; I could be wrong. I simply believe I am right. Time will tell. You may also take into account that a real live PC/console developer agreed with most of what I said.

Don’t mix past, present and future when you’re speaking about my predictions. They all take place after the next-gen consoles have launched. My conclusion is based on the premises I have laid down for why consoles are going to take over: development costs, increases in resolution, yadda yadda yadda. You can always go back and read my earlier posts, as well as pochacco’s, for that.

Again, you’re mixing present with future.

It was a rhetorical question. It is indeed coming to the Xbox 2. I just wanted to debunk your weak assertion that “real” RPGs somehow belong to the PC for whatever reason.

Are you honestly saying that open RPGs somehow can’t be handled by consoles? Of course they can. Heck, consoles have been handling MMORPGs, for christ’s sake. It doesn’t get much bigger or more complex than that.

God deliver us from fanboys. Some people are still gaming on Macs too. I say: more power to you! (Now all the Mac fanatics will be after my blood :slight_smile: )

You may dismiss whatever you want. When you’re done dismissing, go play on Xbox Live and see if you start changing your mind. The only advantage of PCs when it comes to multiplayer is the community-run dedicated servers. It’ll be interesting to see what happens on that front when console games start trying to get 24-64 people playing at the same time. Right now, I think any console game that supports more than 16 simultaneous players must connect to dedicated servers run by the company publishing the game.

As far as the whole 64-bit thing goes, you just don’t know what you’re talking about. I’m sorry if it sounds harsh, but it’s the truth. You ought to have mentioned the upcoming PhysX card for the PC. It really looks interesting. My rebuttal would’ve been that by the time it comes out (a year or so) and actually starts influencing PC gaming (another 3-4 years at least), a new generation of consoles would’ve come out, and they’d just go and include that in their architecture.

Of course. My mistake (sorry, squeegee). But it’s not like you thought; it’s the other way around. A 19" monitor at 800x600 and a 42" HDTV at 1080i both have roughly the same pixel density, around 2,700 pixels per square inch. Such an HDTV will display roughly 2 million pixels in total, while the monitor has only 480,000 because it’s smaller. So obviously, the machine driving the HDTV has to be more powerful than the other for the frame rates to be equal.
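As a sanity check on the density comparison above, here is a small Python sketch (screen sizes and resolutions are the ones quoted in the post). It puts both displays in the neighborhood of 2,700-2,800 pixels per square inch, near enough to equal, which is the point:

```python
import math

def pixels_per_sq_inch(diagonal_in, aspect_w, aspect_h, res_w, res_h):
    """Pixel density in pixels per square inch of screen area."""
    diag_units = math.hypot(aspect_w, aspect_h)
    width = diagonal_in * aspect_w / diag_units
    height = diagonal_in * aspect_h / diag_units
    return (res_w * res_h) / (width * height)

monitor = pixels_per_sq_inch(19, 4, 3, 800, 600)   # 19" 4:3 at 800x600
hdtv = pixels_per_sq_inch(42, 16, 9, 1920, 1080)   # 42" 16:9 at 1080(i)
print(round(monitor), round(hdtv))
```

The two densities differ by less than one percent, so the HDTV really is pushing about four times the total pixels at the same density.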

Well, as you said, when you use the word “powerful” about a computer/console, people (at least me) tend to think about what’s in the box, not the monitor. But since you do, I would say that by stating that consoles are visually more powerful than computers, IMO you start off from the worst position possible. I won’t rule out that consoles may become very powerful in the future, but visuals are an area where I believe computers, at least technologically, will have the edge for a long time. If consoles take over from the computer completely, I’m pretty certain visuals will be the last to fall. I have written much about that in this thread.

Strangely enough, you’re one of the first console gamers I know who has made such a statement; almost every cross-platform gamer I’ve met or discussed this with online agrees that computer games do look better than console games. Just a reflection, not to be taken as evidence.

Nowhere did I imply that we should discuss bad games. The point is: you cannot cherry-pick just one game to represent an entire multi-billion dollar industry. We should rather look at a group of games. That doesn’t mean we should include bad games; it only means we should broaden our view.

Sorry. You made the claim - you bring the cite. I’ve only brought two cites myself to this thread, and neither of them addressed the OP specifically, and it really doesn’t matter that much since this is more of a point-of-view type of discussion. But don’t ask others for data if you don’t bring outside data yourself.

However, I’ve been reading up a bit the last couple of days, and it does look as if the future of gaming will change (or shift) over the next few years. I’m starting to lean toward the Xbox as the #1 console, the PS3 in trouble, and PC gaming bouncing back. I’ll bring some more information later, tomorrow probably.

RaftPeople, since you’ve cited the performance of the CPUs coming to the consoles, out of curiosity, what is the retail price on these babies? It strikes me that if three Power5s, more powerful than any Pentium on the market in your view, are available in a console for $299 (or whatever), then I can get the most powerful CPU on the market for less than $100 apiece. Can you tell me if this is correct?

Please see my self-correction in post to Gozu above about frame rates.

Anyway, the concept is simple: the more you magnify an image, the more “pixelized” it becomes, in the end to the point where you can count every pixel in the original image with your own eyes. And since everything you see on the screen is made up of pixels, pixel density defines detail richness.

In gaming today (even at high resolutions without AA) this can be seen quite clearly with a thin black horizontal/vertical line edging an object, for instance the edge of a painting or a box. You’ll see the black line distorted, like this:
-------------------___________
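The magnification idea is easy to demonstrate. Here is a minimal Python sketch of nearest-neighbour upscaling (the simplest possible magnifier, chosen purely for illustration): each source pixel becomes a factor x factor block, which is exactly what makes individual pixels countable when a small image is blown up large.

```python
def magnify(image, factor):
    """Nearest-neighbour upscaling: every source pixel is expanded
    into a factor x factor block of identical pixels."""
    out = []
    for row in image:
        # Repeat each pixel 'factor' times across the row...
        expanded = [px for px in row for _ in range(factor)]
        # ...then repeat the whole row 'factor' times down the image.
        out.extend(list(expanded) for _ in range(factor))
    return out

img = [[0, 1],
       [1, 0]]  # a tiny 2x2 "checkerboard"
print(magnify(img, 2))
# -> [[0, 0, 1, 1], [0, 0, 1, 1], [1, 1, 0, 0], [1, 1, 0, 0]]
```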

I think that Gozu also said earlier that Microsoft will require Xbox developers to support 720p for Xbox 2; however, where I’ve read about this, it only says they will recommend it.

I think it would be GREAT if this were required, because by forcing console game developers to make higher-resolution textures the first time around, porting games to the PC would become much cheaper. Sadly, today many console-to-PC ports have many of the same low-quality textures as the console version.

Just so we’re clear about this: AFAIK, 720p HDTV is today equivalent to 1024x576 (16:9 ratio) and is equal to the 1024x768 standard on computer monitors (4:3 ratio). From what we know, this is the maximum resolution expected to be supported by most console games in the next few years. It’s the minimum resolution used by most PC gamers, except those with older equipment.

You should also know that 720p should actually be 1280x720, so most HDTV manufacturers are cheating here to save a few bucks in production cost. Some HDTVs also have a 1024x1024 resolution so they can display native 1080i by skipping some horizontal lines.

It probably isn’t worth going into detail here, but this somewhat ignores the effects of antialiasing - the graphics are supersampled at 2x or 4x or whatever of the final output resolution. The effect is that the final output has perceptually higher resolution than the pure display resolution. If I’m looking at a 1920x1080 rez display that uses 4x antialiasing, then I’m arguably viewing a display subsampled from a 3840x2160 render (four samples per pixel means 2x along each axis). You’re probably quite familiar with this, but it seems a point worth emphasizing.
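For what it’s worth, the arithmetic for ordered-grid supersampling (one common AA scheme; real GPUs often use rotated grids or multisampling instead) can be sketched in Python. Under the usual convention, “4x AA” means four samples per pixel, i.e. a 2x2 grid, so 1920x1080 at 4x is sampled at 3840x2160; you’d need 16x to reach 7680x4320:

```python
import math

def supersample_grid(width, height, aa_samples):
    """Effective sample grid for ordered-grid supersampling,
    assuming the sample count is a perfect square (4 -> 2x2, 16 -> 4x4)."""
    per_axis = math.isqrt(aa_samples)
    assert per_axis * per_axis == aa_samples, "expects a square sample count"
    return width * per_axis, height * per_axis

print(supersample_grid(1920, 1080, 4))   # -> (3840, 2160)
print(supersample_grid(1920, 1080, 16))  # -> (7680, 4320)
```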

My understanding was that it will be required, but I agree it’s unclear what the final specs will be.

Alien, I’m unclear what point you’re making. 720p is exactly 1280x720 pixels at 1.0 aspect. There are more than a few substandard consumer HD displays that leverage 1024x768 or whatever to display 720p, but they’re resampling, and the display is not showing 1:1 pixels. 1280x720 is not equal to 1024x768 pixels – it’s a completely different aspect. I really must be missing your point here.

I agree, there are too many substandard HD sets out there. I’m really missing how this is pertinent… ?

OK, on rereading, I think I see where you’re going with this. Your equating 720p with 1024x576 (wtf?) was confusing. To recap:

My assumption was that 720p would be the minimum HD resolution on the next-gen consoles. My understanding is that MS has mandated this. I’ll see if I can google up a cite. Of course, the consoles must also support SD (NTSC/PAL) resolutions for some time, but they’ll finally be in the HD world, minimum.

My posit was that this would tend to move games up to 1080i support, because most home HD receivers will support this and high end gamers will want it. There are several tenuous assumptions here, but it doesn’t seem very unreasonable.

My question was: if the consoles move to a greater support of 1080i resolution in the next year or so, what is the resolution difference with PC games? My assumption: little difference.

Clear?

All this talk of consoles supporting HDTV seems to be glossing over a rather relevant point: HDTVs aren’t cheap. The cost of a decent-sized HDTV will obliterate any hardware price advantage consoles have over PCs. Now, you may argue that people will be buying HDTVs for regular viewing anyways, but of course people are buying PCs for other purposes anyways too.

OK, on the “minimum” standard, here’s what we know about XBOX2:

So 720p will be the minimum requirement. XBOX2 will support 1080i (XBOX1 supported 1080i) so 1080i will be the maximum requirement.

Isn’t “maximum requirement” an oxymoron?

Anyways, the highest resolution supported by next-gen consoles would be 1080p.

Point taken. I was driving toward 1080i being a “not unusual” requirement for next-gen.

Yes, that is correct, and that has been the assumption made throughout this thread. We’ve always compared the cost of consoles to that of the PC hardware expressly added to an already existing basic system to play games.

As far as HDTV pricing goes, I expect moderately priced HDTVs in the 27" to 32" range to become the standard within 2 years. Of course, while the likes of 42" or 50" HDTVs will go down significantly in price, they will remain expensive (just like big high-end CRTs have been expensive. Nothing new there), so in the end, everything cancels out.

The point: The broadcasting standard (ATSC) defines 720p as 1280x720, but few HDTVs support that (and they don’t have to). Instead they support a slightly lower resolution, either by skipping lines or simply using a lower resolution (like 1024x576, HDTV is 16:9, remember?). This standard also defines 1080i as 1920 x 1080, and 1080p which currently, AFAIK, isn’t in use.

We’ll see how many games support HDTV over the next few years; I hope it will be all of them. I’m pretty sure it will take more than 5 years before HDTV has replaced the standard 4:3 set in a majority of households due to the price, but console owners are likely to upgrade faster.

Your assumption presumes that console owners also have an HDTV and that game developers make games supporting 720p. The latter will increase production costs and will be seen as a waste of money by developers if too few console owners own an HDTV. I predict we’ll see both, with the blockbusters surely supporting 720p. Unless Microsoft makes 720p a requirement in their license, as your cite seems to suggest.

I already said two pages ago that HDTV is the only thing that I really expect will close the visual gap between console and computer gaming, but I also said above (or meant to say) that HDTV resolution really isn’t that high compared to what present-day computer gamers use. They have already been gaming at resolutions similar to HDTV 720p for years, and if game discussions on various game boards are anything to go by, the hardcore crowd is currently going for the 1600x1200 mark.

I disagree with this. Most HD sets above the bottom of the barrel are 1280x720 or better. I just clicked around on Best Buy’s web site, and most HD flat panels were either 1366x768 or 1280x720. Only the smaller, much less expensive (and mostly 4:3) sets were lower resolution than 1280x720. What is true is that very few (almost no?) sets support native 1080i resolution. In fact, I’m using an HP2335 monitor to view native 1080i video rather than downsample, because I need 1080 for my work.

It’s definitely sounding like that. This is why I’m so optimistic that the graphics gap is going to narrow quite a bit in the next couple of years. I guess we’ll see.

You should have a look at this offsite discussion. It’s six months old. And by the way, as for the 1080i format, that’s really a 540p resolution (it’s two interlaced 540-line fields; that’s why i = interlaced, p = progressive). So 720p is still the best you get on HDTV. As far as I understand, 1080p won’t be supported by the next-gen consoles, nor are there any broadcasts in 1080p, as opposed to 1080i. I have to confess I have no insight into how HDTV manufacturers adjust the broadcast signal to native resolution. I’ve even read that some don’t maintain the 1:1 pixel ratio in the process. Go figure.
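The “two interlaced fields” point is just a row split, easy to sketch in Python (the 1080-element list below is a stand-in for actual scan lines): each 1080i frame delivers the even-numbered rows in one field and the odd-numbered rows in the next, so any single field carries only 540 lines.

```python
def split_fields(frame_rows):
    """Split a progressive frame's rows into two interlaced fields:
    even-numbered rows in the top field, odd-numbered in the bottom."""
    top = frame_rows[0::2]     # rows 0, 2, 4, ...
    bottom = frame_rows[1::2]  # rows 1, 3, 5, ...
    return top, bottom

rows = list(range(1080))  # stand-in for 1080 scan lines
top, bottom = split_fields(rows)
print(len(top), len(bottom))  # -> 540 540
```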

(As for Best Buy, I suggest you go directly to the manufacturers’ sites and look for “native resolution”, not “horizontal resolution”. They may differ.)