I’m sorry, but this is just not the case. 1080i is 1920x1080 resolution, period. The detail on all the ‘odd’ lines is offset by 1/59.94th of a second in time, but there is spatial information there that is distinct from the ‘even’ lines. Claiming that the interlace of 1080i somehow halves the resolution is silly.
Here is a list of the ATSC formats. 1080p and i are both listed as 1920x1080.
There is some small amount of cheating – for example, prosumer HDV cameras like this one encode video to MPEG at 1440x1080 with a 1.333 pixel aspect ratio. But actual 1080i broadcasts are at a 1.0 pixel aspect ratio, so you’re really getting all of those pixels if you have a monitor that can display them.
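If it helps, the arithmetic behind that anamorphic trick is simple – here’s a quick sketch (my own illustration, not from any spec document):

```python
# HDV-style anamorphic encoding: 1440 stored samples per line, stretched by a
# 1.333 pixel aspect ratio back out to a full 1920-wide display raster.
stored_width = 1440
pixel_aspect = 4 / 3            # the "1.333" PAR

display_width = stored_width * pixel_aspect
print(display_width)            # 1920.0 -- same display width as true 1080i,
                                # but with only 3/4 of the horizontal samples
```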
I’m not a broadcast engineer, and perhaps one will come along in a moment who can clarify this even further. But I work with HD video on a daily basis, and your claim that 720p is better resolution than 1080i is just not correct.
I wasn’t looking at the horizontal resolution, I was looking at the “native resolution”. I don’t dispute that some sets are < 1280x720, but you said “most”, which I didn’t find to be the case. I’d also previously looked at manufacturer sites before buying a 1080i 1920x1200 LCD monitor for my video work, so I’ve had some past familiarity with this issue. I’ll admit I’d restricted my sampling to LCD flat panels, so if you want to trot out a bunch of rear projection displays that have poor resolution, maybe you’ll make a case.
Almost forgot: to the best of my knowledge, there’s no such thing as “540p”. If you want to make the case that one 1080i field @ 1920x540 is “540p”, I suppose you’re entitled, but I don’t know of any standards body that has defined such a thing.
The whole interpolation thing is just not a very intuitive concept. In terms of the amount of raw image data that is streamed to the screen, the amount needed for 1080i is 12.5% higher than that needed for 720p. That doesn’t mean that the image looks 12.5% better, though. Most people think it looks a whole lot better than that, while a few with discerning eyes say they prefer the stability of 720p.
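If you want to check that 12.5% figure yourself, the arithmetic is straightforward – this little sketch (mine, and it only counts raw luma samples, ignoring chroma subsampling and compression) works it out:

```python
# Raw pixels pushed per second: 1080i sends 60 fields of 1920x540,
# 720p sends 60 full frames of 1280x720 (the 59.94 vs. 60 Hz detail cancels out of the ratio).
pixels_1080i = 1920 * 540 * 60   # one field per refresh
pixels_720p  = 1280 * 720 * 60   # one full frame per refresh

print(pixels_1080i / pixels_720p)   # 1.125 -> 1080i carries 12.5% more raw data
```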
So go ahead and do an actual comparison. Just demonstrate that you are actually doing something other than asserting without support.
Again I ask you for more details on exactly how a console game (the cream of the crop) is supposed to look better at half the resolution and half the refresh rate or worse than you will find on a PC (also playing the cream of the crop).
Ah, the old fallacy of authority. We know nothing about your expertise, and are rather more interested in arguments and evidence.
No, you’re the one focusing on graphics – as I recall I hit a number of different criteria in my past messages.
Indeed, and my argument has consistently been that I see no indication of that just yet. This is not the same as saying that there is not an impressively booming game console market, which there obviously is.
It’s simple: provide support for your assertions. Cites. Backing for assertions. Facts.
You don’t have any methodology to speak of, not to mention any results to go with that lack of methodology! You have done no comparison. You have simply (again) asserted that this is so and that is that. Prove it or stop repeating it. You’ll note I actually did the work and sifted through Gamespot to collect a sample, while you have still avoided any semblance of investing some effort to validate your assertions.
The reason to pick out multiplatform games was (very clearly, as I stated) to provide a common frame of reference in the comparison exercise. For example, graphics quality is highly dependent on what is actually happening on the screen – it’s easy for Silent Hill to have amazingly good graphics because most of the game is static. It’s a lot harder to do that in a game like Half-Life, which is in constant motion, always being redrawn and sometimes at a rapid pace. Do you finally see my point? Also, different publishing houses have different approaches to pricing – just look at the low price of the Chronicles of Riddick game, for example.
You need a frame of reference, period.
I don’t know how many more times I have to point out that this is entirely wrong. Consoles are money-losers. Look:
The one possible exception (among consoles that actually succeeded) to this rule is Nintendo, and only to a limited extent. Look again:
Spare me any more unsupported, unfounded, unwarranted denial. Consoles are sold at cost only towards the end of a cycle, when hardware redesigns have cut production costs enough to reduce the losses on hardware sales (but then, consoles drop in price throughout their shelf life, so I suspect they always and consistently lose money, even toward the end of a cycle).
You’ve given your opinion, and it’s not been especially well supported, because you’ve come out with some rather surprising statements that require a modicum of hard evidence, or at least speculation and extrapolation of adequate quality. You don’t seem to understand that you can’t gainsay what someone else says merely on the strength of your opinion. This is GD: pull out some cites or modify your arguments if you can’t make any further progress.
As if I haven’t already. I repeat, you are voicing an opinion without particularly good support. You are speculating on the future of gaming in a supposedly informed manner, and yet your primary conclusion has been merely that the end of PC gaming is nigh. A ridiculous conclusion for all the reasons already stated.
You could at the most argue for a downturn in PC gaming that would in all probability be cyclical in nature. But until consoles replace PCs, until HDTVs come with built-in consoles, or something like that, your argument is specious.
Oh yes, and one game title on a console that is as yet unavailable really constitutes a “debunking”… sure thing!
Again, the exception rather than the rule. As I said before, PC RPGs are “deeper” (actually, those were someone else’s words). Console RPGs are role-playing “light”. XBox has tackled this problem by going after PC-style RPGs, and has had a measure of success with a few titles. Again, exception rather than the rule.
I am not by any definition a PC fanboy (which is a particularly weak way to address an argument; you’re obviously a console fanboy, but you don’t see me bringing that up).
Look! You made a glimmer of a sound point by laying down some arguments after being prompted and prodded! Aside from LAN parties and such small-potatoes multiplayer, I would say the community dedicated servers are a significant attraction right now, and one that is generally lacking from the console world.
This coming from someone whose idea of a discussion is automatic and unsupported denial? I’ve been exposed to harsher treatment than yet another unsupported statement.
This has precious little to do with my oft-repeated point that 1) 64-bit computing has brought (and will bring further) performance/quality increases, with the important caveat that it requires corresponding 64-bit software, 2) that it is still in its infancy and must be adopted further, and 3) that it confers advantages on games that make strong use of AI, etc.
I’d find you a cite on the last one, as I distinctly recall reading as much in a variety of sources, but I’ve run out of time.
I must admit all this talk about the actual technology comparisons is rather lost on me, so I’m not going to overly address it. You guys have gotten caught up in the hardware argument, and I’d like to see some of you address other points in the debate.
As I said earlier, I am primarily a PC gamer. We have a PS2, but it’s mainly for the kids, because the consoles don’t really offer games in the genres my fiance and I prefer (RTS and MMORPG), and where they do, the console falls short in the genre. (I’ve tried the EverQuest Online Adventures game for the PS2, by the way, and although it was an OK game, it definitely fell way short of any PC MMO in pretty much every way, even with a keyboard, which is essential for an MMO.)
Now, I also stated that we have multiple gamers in the house. Multiple PCs for the household are sensible as well as feasible, because PCs have so many uses. The kids need one to do their schoolwork, my husband needs one to do his work, and I need one because…well, because I love my PC, and I ain’t sharing it with anyone.
Now, consoles may very well look as good as PC games with the addition of an HDTV to hook it to, but how many homes currently own one HDTV, let alone several?
On the other hand, multiple-PC households have been a growing trend the last 5 years, and the average home according to IDC now has 1.5 PCs.
Take World of Warcraft, for instance. We have 3 players in the house, and we frequently play at the same time. If we wanted to play a console MMO (there’s currently only one, it’s not in HD, and it’s not doing well compared to PC MMOs, but for the sake of argument let’s assume there is a comparable MMO for the console), we’d need 3 TVs and 3 consoles, of which we currently own one, whereas we already own 3 computers and plan on adding a fourth in the next year because of the PC’s diversity. That’s not going to happen, and we’re definitely not going to buy 3 HDTVs to do what our PCs already do now.
Another factor to look at is demographics. Who is playing these games we’re talking about?
The average age of gamers across the board (PC and console) is rising, with a median age of 23 (cite to follow after I raise the other half of this point). The other part of that is a growing market demographic that gaming (both PC and console) has traditionally overlooked, the adult female.
So there you have the demographic of both adult females, as well as online gaming to add into the debate, both of which I think will be powerful factors in the PC vs. console debate, especially when you factor in multiple PC households.
Like I said, while a lot of people want and are willing to purchase multiple PCs, I don’t see multiple-HDTV households becoming a large factor for quite a long while, so the “which looks better” argument becomes kind of moot for multiple-gamer households if you’re going to argue you need an HDTV to rival PC-quality graphics.
Why would people want to argue over who gets to use the HDTV/Console tonight, when everyone can be satisfied playing their preferred game on their own PC?
I also think you’re going to have a harder time convincing adult women to play console games more often than they play PC games (especially when they likely share a household with other gamers), but that’s just my personal anecdotal opinion/observation, and shouldn’t really be taken into account for the sake of this debate as such.
COMPLEXITY is the key reason why PC games will one day rule. Look here for a brief description of what looks to be an exciting new game!! Just a brief taste:
I already explained why we were focusing on graphics earlier. It’s really obvious; I think you just got caught up trying to argue against every single thing I say.
Nah, no cites. Too much work.
Already addressed.
Already addressed. Silent Hill looked like crap, by the way. Anyhow, I asked a retail guy what his experience with pricing for PC and console games was, and he said:
“New games usually come out at the same price. Afterwards, PC games tend to drop more in price. On average, I’d say PC games are $5 to $10 cheaper.” This is consistent with both our findings, so I’m assuming we can put this issue to rest.
Some consoles sell at a loss, some at cost. As I said, doesn’t matter.
It’s possible, doesn’t matter either way.
Nah, no cites. Too much work.
Time will tell. Our arguments have become cyclical, that’s for sure.
Question: can consoles handle nonlinear rpgs?
Answer: absolutely. Q.E.D
I wasn’t referring to you.
You shifted from consoles vs pc to 32 vs 64 bits. You must first establish that upcoming consoles have “less bits than pcs”. Turns out it’s not true.
Alien, sorry for the delay in responding to this, I haven’t been back here for a few days.
Power5 vs Pentium, "in my view"
You say “in my view”, but really it is in the view of the data, and of all the unbiased, independent analysts.

The Data

Integer Performance:
Xeon @ 3.6 GHz: 1,800
Itanium2 @ 1.6 GHz: 1,535
Power5 @ 1.9 GHz: 1,452 per core (2,904 per processor)

And yes, the Power5 is most definitely more powerful than any Pentium, Xeon, Itanium, UltraSPARC or Athlon on the market, including the new dual-core Pentiums and the new dual-core AMDs as well.

To Your Question of Price

Valid question. I have wondered the same thing. Power5s aren’t cheap, but then again, neither is Intel. The processor is a variant of the Power5 (the PowerPC 976), just like the Mac has a PowerPC 970, which is a variant of the Power4. They may save money in a variety of ways, like reducing the L3 cache (it probably won’t be the 32MB cache that is on the chip that crushed HP and all others in the TPC-C benchmark).
I would guess they will do the same types of things they are doing in the Cell, which is removing stuff that doesn’t help for game type processing, or that they can work around through software/compilers.
I’m really not sure where the most money is spent when making a processor, so I couldn’t tell you what they will do to get down to a reasonable price per unit.
You’re absolutely right, the hardware talk is getting us nowhere. Your argument, however, seems to be that the future of gaming consumers is multi-person households, which is an interesting idea. In the end it’s who’s playing the games that counts, not the hardware. Will gamers continue to play games after they get past their school years and settle down and marry? Some, yes, but how many?
IMO, it pretty much boils down to game genres. There’s a huge potential for consoles to totally own the fixed-camera genres (fighting games, racing, jumping puzzles). I played “Beyond Good and Evil” on the PC not that long ago, and the fighting scenes (fixed-camera mode) were quite different from the rest of the game and a pain with the mouse. It can be done beautifully on the PC, as proven in Jedi Knight & Jedi Academy. Playing an action game with a stick is like begging for mercy (or death). While console owners will continue to love their action games, it will never be gaming the way gaming is done with a precise tool.
Looking at the future, it’ll be interesting to see if the game industry continues to increase game complexity (not counting MMORPGs), or settles for better graphics inside a window of simplified gaming. The future might hold huge open-landscape levels for single-player action games, where the player can engage in a multitude of tasks that require more than a console controller.
On the other hand: has any ground-breaking, intuitive game ever been released on a console? Or will there even be any more ground-breaking, intuitive games at all? Doom, Quake, Half-Life, Descent, System Shock, Deus Ex, etc. – they all seem like ages ago. I’m sure someone will throw Halo into the mix, but IMO that was a mediocre game by all standards. Some might say that Far Cry, tactical shooters, and MMORPGs have been pushing the game industry the last couple of years, and surely strategy games have made giant leaps lately, but I’m still waiting for the next big game that everybody will talk about 5-10 years from now.
In the interest of sharing data, it might be interesting to add the current releases by platform for the last 3 months:
PS2 = 60 (65)
PC = 56 (107)
Xbox = 50 (51)
GC = 21 (23)
The primary number is the number of games rated by reviewers; the second includes additional games released during that time period. Doesn’t look like PC gaming is going away yet.
If you’re still talking about graphics, this has been proven to be true beyond a shadow of a doubt. Computers push more pixels per unit of screen area, and PC gamers are already playing at the same or higher resolutions than next-gen console gamers.
Nope. 1080i is two interlaced 540-line fields. Together they make up the 1920x1080-pixel picture (only half of those pixels are pushed per field). That’s what the i stands for in 1080i, as opposed to the p in 1080p. Pictures with fast-moving objects are hurt by this.
Randy Hoffner, ABC Television Network says:
Top of page, highly recommended if you’re interested in this.
You have been stating the superiority of the next-gen console processors in this thread, quoting benchmarking results available today. And now you claim that these are not the CPUs we’ll see in the next-gen consoles? That next-gen console CPUs will be a downgraded version? Care to explain?
Currently, top-of-the-line desktop CPUs cost between $500 and $1,000. If there are 3 CPUs even better than the best stuffed into a $299 console, can I disassemble the console, take out the CPUs and use them for something else? That was really my question.
Alien, if you want to argue that “interlace is bad”, that’s your privilege. But you’re conflating this with resolution, which is misguided.
1080i is not 540p (there’s no such thing) times two. It’s 1920x1080 resolution. None of your cites dispute this; they just talk about how “bad” interlace is. Different subject.
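To make the field structure concrete, here’s a minimal sketch (my own illustration using numpy, not anything out of a broadcast spec) of how two 1920x540 fields weave into a single 1920x1080 frame – all of the spatial samples are there, they’re just captured at two slightly different instants:

```python
# Weaving two 1920x540 fields into one 1920x1080 frame, as a deinterlacer would.
import numpy as np

top_field    = np.zeros((540, 1920))   # sampled at time t
bottom_field = np.ones((540, 1920))    # sampled ~1/59.94 s later

frame = np.empty((1080, 1920))
frame[0::2, :] = top_field     # even lines come from one field
frame[1::2, :] = bottom_field  # odd lines come from the other

print(frame.shape)  # (1080, 1920) -- full spatial resolution, with the two
                    # halves offset in time rather than missing
```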
As I stated from the beginning, the processor in the Xbox2 is a custom variant of the Power5, which is being called the PowerPC 976. Optimized for the game environment.
As for the downgrade:
All of the mfg’s sell chips with different configurations of L1, L2 and L3 cache and clock speed, bus speed, etc., and each is at a different price point.
Don’t worry, both the IBM (Power5) and the HP (Itanium) machines had extremely expensive options, but it still tells you something when the best Power5 config performs 300% better than the best Itanium config.
Some key points:
Microsoft takes on significant cost and risk by switching from one platform and instruction set to a different one. Not a trivial decision; they must have seen a benefit.
The processor they’re switching to is proven in the marketplace to be well beyond anything else available (>x2, although as we know some of that gap is being narrowed by AMD’s and Intel’s dual cores, not all of it).
IBM is 4 years ahead of AMD and Intel with dual-core technology (the Power4, the chip before the Power5, was dual core in 2001; AMD/Intel in 2005).
IBM’s multithreading is far more effective than Intel’s hyperthreading.
They will be placing 3 of these processors in the console. Now, it is possible that they are putting 3 cores on 1 die instead of 3 dual-core procs; let’s assume that is the case, which means >x3 instead of >x6.
AMD and Intel release dual cores, so it’s back to >x1.5.
But it will run at 3 GHz, AMD/Intel at 1.9 GHz, so now we’re back to >x2 (rough arithmetic sketched after these points).
Low-level benchmarks testing just chip performance show the Power5 winning significantly.
High-level benchmarks testing the throughput of a full system show the Power5 winning at about the same rate as the low-level benchmarks.
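Here’s that multiplier chain written out as plain arithmetic, just to make my assumptions explicit (every factor below is an estimate from the points above, not a measurement):

```python
# Back-of-the-envelope only: each factor is an assumption, not a benchmark result.
advantage = 2.0             # ">x2": one dual-core Power5 chip vs. the best single x86 chip
advantage *= 3 / 2          # 3 console cores instead of the Power5's 2 -> ">x3"
advantage *= 1 / 2          # AMD/Intel go dual core, halving the gap -> ">x1.5"
advantage *= 3.0 / 1.9      # 3 GHz console clock vs. ~1.9 GHz dual-core x86
print(round(advantage, 2))  # ~2.37, i.e. the ">x2" figure above
```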
Can You Grab the CPU from the Xbox2 and Use It Somehow?
Sure.
But I assume you know it’s not the same instruction set as x86, so you couldn’t run Windows, etc.
For me personally, I am in the middle of an AI project that requires far more power than I have on my PC.
I plan on purchasing either 10 Xbox2s or 10 PS3s and linking them together to get the power I need; it should be cheaper than doing the same with PCs.
That’s why the PS3 and the Cell have me excited, because each of those boxes has 8 processors, which means I’m really getting about 80 procs.
You’ll have some serious hurdles to overcome. They’d need to be hacked in order for you to run unsigned code on them, and there is the matter of linking them up, which is also non-trivial. Then you’d need to get your hands on development tools to code your application for them. I hope you’re not in a hurry.