PS4 to be announced Feb 20th - Your predictions

And this is one of the best things about console gaming, in my opinion. PCs will always (thanks to upgradeability, hobbyist interest, etc) be at the very cutting edge of gaming. But the Xbox 360 came out in November of 2005; I suppose it’s possible a new Xbox will be out by Christmas 2013, but some time in 2014 seems more likely. By the time major publishers stop producing Xbox 360 games, someone who bought this console on release will have enjoyed at least eight years of new releases, and possibly more. And during those eight years, our console gamer will never have had to worry about system requirements, or buying more RAM/upgrading video card, etc. Eight years of solid mainstream gaming.

PCs can’t do that. Seriously, can you imagine declining to upgrade a gaming system for eight years? Even if you’re just upgrading the video card and buying more RAM after four, you’re very nearly at the cost of a new console right there.

Granted, there’s a cost to this - Xbox 360 games look better now than they did when the console first came out, but the improvement is nowhere near the leaps PC gaming has made over the same period. But if the most important thing to you is new content with high production values (but not necessarily cutting-edge graphics), consoles are certainly the best buy.

In short: Consoles hold their value over the years much, much better than gaming PCs.

Given the number of $80-$100 media extenders, none of which is optimal for games, I wonder what that means for console pricing? $100 for a box and $50 per Bluetooth controller could be a sweet spot.

I agree, something in that price range becomes an easier impulse purchase.

I think the PC really only requires a single upgrade over the course of a console generation.

If you bought a PC a few months after the Xbox 360 came out, you’d still be playing all of the new games on it… but at a console-quality experience: you’d be getting 20-30 FPS at 720p (or slightly lower).

Now if you bought it a few months BEFORE the Xbox came out, then yes, you probably would have had to buy a new GPU a year or two into the generation: there was a significant shift in GPU architecture around that time, and games eventually started taking advantage of it in ways older GPUs couldn’t keep up with. But that would have been a $200-or-so upgrade a couple of years after your purchase, and, assuming you buy a good number of games, you’d probably have saved four times that amount by now on Steam sales alone compared to consoles.

Now, please understand that I’m disputing the reality of the situation - I’m not disputing the fact that you’d need to be tech savvy to understand the situation in the first place. I agree with you there.

THAT is the real barrier to entry on PC. You need to know, and probably be interested in, PC hardware. It’s not about having to upgrade your PC every x months - that is simply untrue for the most part; it’s really about upgrading because you want to.

I upgrade my PC because I like immersive gaming experiences. I don’t want to game at below-HD resolutions at 25 FPS with low-res textures, outdated lighting and shadowing, and jaggies all over the place. If I wanted that experience I’d buy a console.

Yup; the major obstacle to the “mainstream” success of PC gaming is that most people either don’t care or can’t be arsed.

Don’t really see that changing. Though if Gabe Newell somehow performs the absurd magic he was talking about in his speech where he said the “steam box” would launch at $100, well…

Reasonable points. Bottom-line, if you want a premium gaming experience, and you’re willing to pay for that premium, PC gaming is the better platform. No question.

With regard to the costs involved, though - note that as soon as you factor in a $200 graphics card (which you almost certainly will need at some point in the PC’s life cycle), you’ve more than doubled the initial outlay of even a brand-new console on release day. A newly released console might go for $400, while a midrange gaming PC would probably run you $800. Add $200 for a single video card upgrade over the 8-year cycle, and that’s $1,000 over a console’s lifetime - compared to $400 for the console. And that video card upgrade really isn’t optional, because new PC games released five or six years into the cycle are going to be relying on a “modern” card.
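
To put rough numbers on it - these are just the ballpark figures from this thread ($400 console, $800 PC, one $200 GPU upgrade, an average $10-per-game saving on PC, an 8-year generation), not real pricing data:

```python
# Back-of-the-envelope math using the ballpark figures from this thread;
# every number below is an assumption, not real pricing data.

console_cost = 400              # new console at launch
pc_cost = 800 + 200             # midrange gaming PC plus one GPU upgrade
lifetime_years = 8              # rough length of a console generation
saving_per_game = 10            # typical new-release price gap, PC vs console

hardware_gap = pc_cost - console_cost                  # $600
games_to_break_even = hardware_gap / saving_per_game   # 60 games

print(f"Hardware gap over the generation: ${hardware_gap}")
print(f"Full-price games needed to close it: {games_to_break_even:.0f} "
      f"(about {games_to_break_even / lifetime_years:.1f} per year)")
```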

If you’re buying a lot of games, Steam might allow you to make up the difference - but that really will take a lot of games, and it’s far from a certain proposition. One can’t deny that PC gaming at its best delivers a better experience than console - but it’s far, far more expensive, and most folks really aren’t going to get that much of a benefit from it, IMHO. (Of course, there are also amazing games for PC that just won’t ever come out for console - but there are also single-console-exclusive titles.)

I don’t necessarily disagree.

I would never recommend PC gaming to anyone not interested in gaming as a hobby. But as a gaming platform, PC is the best platform for the hobbyist, IMHO.

All the gaming hobbyists I know own a gaming PC. Some of them also own consoles though. Makes sense for a hobbyist to own as many platforms as they can.

I can imagine it. It would be like owning a console - and I don’t mean that in a snarky way. You’re essentially asking “can you imagine how crappy the PC gaming experience would be if you just picked a certain point in technology and stayed there for 10 years?” and then praising consoles for exactly that. PCs can do that - a 2005-06 gaming PC can still play all the games it could play then, so it can still run multiplatform console ports at console-like settings. You’re saying it would be torture to do that on a PC, but that it’s all part of the fun on a console. I don’t get it.

Well, if we all still played Atari 2600s today, think how much value we’d have gotten out of those things. Thirty years and millions of modern mainstream games.

It will be interesting to see whether we get a tablets-vs-consoles war in the future. Tablets have some of the PC’s advantages (they’re made by people who are actually trying to compete on hardware capability, so they improve at a really fast rate, just like PCs have - smartphones and tablets have come an incredible way in five years), but they still have the all-in-one accessibility of consoles.

Unfortunately, tablets are built with an enormous handicap - they have to use parts that draw roughly a tenth of the electrical power of PC/console parts, so even though they’re advancing fast, they’re starting from so far behind that they have a long way to go. Tablets will never compete directly with PCs, because both advance quickly and PCs can keep using ten times the power; but if the current generation of consoles launches behind the times, tablets are going to catch up and surpass them at some point in their life cycle.

Sorry for being late in responding to this - I forgot about the thread. I should’ve been more specific: when I said that used sales were on the decline, I didn’t mean in terms of popularity, but rather that I think it’s just a matter of time before the Powers That Be find ways to eliminate them. The publishers of the games don’t get a lick of profit from a used game sale - and from their point of view, a $30 used sale at Gamestop is a $60 new sale lost, money they would have gotten if that consumer had been forced into buying new instead of used.

There’s already been a long history of efforts to stop used games - heck, CD keys were a thing back in the 1990s. Plenty of PC games require you to tie the game to your Steam or Origin account, making it impossible to sell that copy used to someone else. Lots of console games come with one-time key codes you need to enter to unlock multiplayer - you can sell the game used, but whoever buys the used copy can’t access multiplayer.

And honestly, I don’t think Sony or Microsoft could care less if Gamestop went out of business. They would just keep selling their games at other retail stores like Best Buy or Target or Walmart - or, heck, online. It won’t happen tomorrow, but I won’t be surprised if used games are basically extinct 20 years from now. As soon as one of the companies decides it will gain more sales (from forcing everyone to buy new instead of used) than it will lose (from bad publicity, ill will, etc.), it will eliminate used games.

$600 over 8 years? Remember the average $10 mark-up on console titles compared to PC. Buy six or more new games a year and you make up most of that difference.

A Kotaku article is stating that the next Xbox will have an 8-core, 64-bit, 1.6 GHz CPU, an 800 MHz DX11.1 GPU, some custom units to offload the CPU (codecs and the like), and 8 GB of RAM.

Not a bad-sounding springboard, considering it’s twice the RAM and twice the hardware threads of the average Ultrabook (4 GB, and a dual-core i5 with Hyper-Threading).

That difference, of course, only concerns the nominal prices for full priced games. The average actual purchase price of PC games is quite a small fraction of that.

The only parts of all that that sound any good are DX 11.1 and the 8 GB of RAM. Eight slow cores is just an awful idea, IMO. Part of why the PS3 never reached its full potential is that it’s just not that easy to multithread games: it takes an entirely different programming philosophy, it has major drawbacks, and it’s often just not practical.

You can take something that does a lot of similar processing on a bunch of independent data - say, a video encoder - and hit pretty near 8 times the performance on 8 cores. But for almost everything else, you simply can’t come close to perfectly multithreading a program. There’s going to be one main game thread, and everything else is secondary, so the one core running that main thread clogs up the whole works and bottlenecks everything.
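
That bottleneck is basically Amdahl’s law. Here’s a minimal sketch of how quickly it bites - the serial fractions are made-up illustrative numbers, not measurements from any real engine:

```python
# Amdahl's law: if a fraction s of the work is stuck on one thread,
# the best possible speedup on n cores is 1 / (s + (1 - s) / n).
# The serial fractions below are illustrative guesses, not engine profiles.

def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

for serial in (0.05, 0.30, 0.50):
    print(f"{serial:.0%} serial -> {amdahl_speedup(serial, 8):.1f}x on 8 cores")

# ~5% serial  -> ~5.9x  (encoder-style, embarrassingly parallel work)
# ~30% serial -> ~2.6x  (a heavy main thread eats most of the benefit)
# ~50% serial -> ~1.8x  (extra cores past this point buy very little)
```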

Now, clock speed isn’t everything, but 1.6 GHz on pretty much anything is way behind the times - even low-end desktop chips run well over 2 GHz. The 1.6 GHz figure makes me worry that it’s a cell/tablet processor, since 1.6 GHz is pretty current for those; but as I said, GHz isn’t everything. A quad-core 1.6 GHz tablet/cell processor is not half as fast as a quad-core 3.2 GHz desktop - it’s more like a tenth as fast. I’m pulling that number out of my ass; it’s probably actually worse than that.
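
For what it’s worth, here’s the back-of-the-envelope version of that guess. The per-clock throughput (IPC) ratio is a pure assumption, not a benchmark:

```python
# Rough illustration: throughput ~ cores * clock * instructions-per-clock (IPC).
# The 4x per-clock advantage assumed for the desktop core is a guess, not measured data.

tablet  = {"cores": 4, "clock_ghz": 1.6, "relative_ipc": 1.0}
desktop = {"cores": 4, "clock_ghz": 3.2, "relative_ipc": 4.0}

def relative_throughput(chip: dict) -> float:
    return chip["cores"] * chip["clock_ghz"] * chip["relative_ipc"]

ratio = relative_throughput(desktop) / relative_throughput(tablet)
print(f"Desktop chip is roughly {ratio:.0f}x the tablet chip under these assumptions")
```

Clock alone only accounts for a 2x gap; the rest is per-clock efficiency, which is exactly where low-power parts give up the most.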

I’d much rather have one core from a modern i7 running at 3.5 GHz than 50 cores of whatever CPU they’re going to use; the single-core machine would likely line up better with the real-world requirements of game programming.

Ironically, it’s the reverse for video cards - “800 MHz DX 11.1 GPU” is too little information. GPU work actually is much easier to parallelize, so the number of cores/pipelines, the memory bandwidth, all that stuff matters. There are 800 MHz high-end video cards with 10-20x the performance of 800 MHz low-end ones.

If we’re talking the same CPU, just different clocks, then it is 1/2.

I think his point is that they aren’t going to be the same architecture.

So is the “actual purchase price” of most console games if you allow the same amount of time between release and purchase that you need to get a big deal on a PC title.

Can we stop arguing about some theoretical “actual purchase price” that has far more to do with the behavior of the purchaser than it does with the cost of games?

I bought XCom for $35 at release. Console price $60.

Wrong. Amazon had it for $35-40 very soon after release. Comparable console deals can be found; you just have to look for them. It’s a bit more work, for sure, but I’ve never known a PC gamer who was afraid of a little work.