PS4 to be announced Feb 20th - Your predictions

Well, anything, really. The complexity and realism of physics and interactions with the world. The complexity of AI routines. Tracking and calculating more things about the world - worlds can be more dynamic and have more interactions. More things and more complex things can exist within the world. It’s hard to give specific examples without talking about specific types of games.

The idea that games will cost 10x as much this generation is absurd. Why would that even be? New developments in APIs and game development tools actually lower the cost of developing games. Newer graphics APIs simplify the process for both programmers and artists. We’re actually going to see a decrease in the cost of certain aspects of making games simply because of the upgrade in APIs.

Now - if having better technology were the only reason that games cost more to make, then why have we seen an increase in development budgets between 2005 and the present? If we’re using the same hardware and development costs go up, then it’s not the new capabilities of newer hardware that are driving up the price.

One of the factors is actually the amount of money spent crippling games to work on old technology. A significant portion of development time is spent not creating new content, but getting that content to work within the rather restrictive limitations of the hardware. Artists have to cull their models to lower the polygon count because the engine can’t handle it. World designers have to constantly prune out little bits here and there from their worlds because the hardware can’t handle them. All the effort spent trying to get a modern-looking game to run on old, crippled hardware costs time and money. We have to pay more to make sure our games are sufficiently crippled to work.

Yeah, I don’t hate it as much if the dev actually puts some effort into maximizing the PC version of the game. Back in the early 2000s, we’d have games developed for PC - made into a proper PC game - and then crippled down from there to work on console. Often there’d be a separate but related game that would be released on console. So games would still use the technology available first, and only be dumbed down later.

A few years into the current generation, that flipped. Most games simply became targeted towards the weakest system, so PC games became dumbed down as a consequence. This isn’t just in terms of graphics, but often in terms of map design, world detail, difficulty (when you design certain types of games to be playable with awkward little thumbsticks, they become ridiculously easy on PC, or, if multiplayer, time-to-kill values end up super low), and all sorts of other ways.

Occasionally you still see a developer go through the effort of making a proper PC game, or at least taking advantage of some of the PC’s strengths - Battlefield 3, for example, is massively better on PC than it is on consoles - but it’s the exception now and not the norm.

And I wonder why that is. It couldn’t possibly be because it’s more expensive to create all the bells and whistles for the 10% of people with PCs good enough to turn them all on, could it?

Seriously. Why would they spend the money to build the shiny version if only (let’s be generous here) 50% of their players are going to see it?

Missed edit window. -_-

To add: I don’t know why games cost so much. I figure developers, however, are more likely to know what games cost than we armchair theorists.

It’s more expensive to add any development time to a game, yes. Your assertion that only 10% of PC gamers have the hardware to appreciate games more technologically advanced than current gen console games is pretty absurd.

But your implicit assertion that it’s a slam dunk that it’s not worth it is contradicted by actual cases in which it’s done. Do you think that the developers of Battlefield 3, Just Cause 2, Codemasters racing games, the aforementioned Tomb Raider game, etc. are too dumb to know what you just said?

So yes, most of them feel that putting out a superior product isn’t worth the extra work, so we all have to suffer from the limitations of the lowest common denominator. Which sucks for gaming in general.

You “don’t know”, and yet you’re willing to make the argument that game development costs have risen because of the increase in available technology, and link to an article that ridiculously asserts that costs will go up 10x with the next generation of consoles. This is directly contradicted by the increase in game costs within a console cycle. So you’re wrong on this.

I listen to a lot of gaming podcasts, and yesterday I was listening to the GameSpot one with Kevin VanOrd as host. He always has some industry insider on as a guest - developers, writers, critics, even publisher CEOs.

In the last episode he had an indie developer on and they were talking about the PS4 and the topic drifted to what the new technology means for indie devs.

He pointed out that one of the things that made him excited about the new technology was that with new technology comes new tools to create games, tools that abstract away some of the complexity of game development. He pointed out that today an indie developer can put out a game like Journey - something that technically could have been done last generation, but at that time would have easily taken hundreds of developers and an investment outside the scope of any indie dev to actually create.

He expects that with new, more powerful hardware and new, more powerful tools, his ability to create more immersive worlds and more interesting gameplay goes up a notch.

In a few years, indie devs will be able to bring us gaming experiences that only a big-budget studio/publisher could bring us a few years ago.

I don’t need to be able to explain WHY for something to be true. Also, I guess you didn’t really READ the article, because that’s not actually what it said.

What it said is that the LAST console generation created that kind of cost increase and that people need to be careful that this one doesn’t, because the industry can’t sustain that.

Reading. Go figure.

A lot of people do know and budgets to develop a blockbuster game have been ballooning for a long time. Hell, it will cost the developers of Skullgirls (a small, indie fighter) over $150,000 to do a simple game update:

Yes, you do. You’re trying to make the argument that increased technological capability is causing development costs to increase. To make this argument, you have to show that this has actually happened. You haven’t done so. And in fact development costs have been going up even within the current console cycle, which directly contradicts the idea that it’s the increased technological capability that a new console release brings that’s driving up the cost.

The difference is greatly exaggerated. At least based on what I’ve seen on other sites, the average development cost roughly doubled rather than going up 10x.

Don’t play games with dancing around implicit arguments. The “gee I have no idea why costs are increasing, and surely you can’t try to say I meant that it was a downside of technological advancement which is what we’ve been talking about all thread!” thing comes off as disingenuous.

Incidentally, I think increasing the technological level of games through procedural and algorithmic factors is actually quite cheap. Better lighting, particle effects, etc. are really just a matter of a few programmers and time. Take Shattered Horizon as an example - it’s one of the best-looking games available, but it was made by a small team on a shoestring budget. The increased costs come from elsewhere. More advanced development tools and APIs actually increase worker productivity, leading to lower cost. Now, if you keep scaling up the size/content/etc. of your world anyway, the total costs may increase even if the cost-per-resource goes down, but this is a development decision rather than an absolute limitation.
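Just to illustrate the point about these effects being mostly programmer time rather than content cost, here’s a toy sketch (Python, purely illustrative - not from any actual engine) of the per-frame update at the heart of a basic particle effect. The algorithmic core is a few dozen lines, not an army of artists:

```python
import random

# Toy particle system: each particle is (position, velocity, remaining lifetime).
# A real engine would run this on the GPU, but the core logic is this small.
GRAVITY = (0.0, -9.8, 0.0)

def spawn(n):
    """Spawn n particles at the origin with randomized velocity and lifetime."""
    return [([0.0, 0.0, 0.0],
             [random.uniform(-1, 1), random.uniform(2, 5), random.uniform(-1, 1)],
             random.uniform(1.0, 3.0)) for _ in range(n)]

def update(particles, dt):
    """Advance all particles by dt seconds, dropping the ones that expired."""
    alive = []
    for pos, vel, life in particles:
        life -= dt
        if life <= 0:
            continue  # particle expired
        vel = [vel[i] + GRAVITY[i] * dt for i in range(3)]  # apply gravity
        pos = [pos[i] + vel[i] * dt for i in range(3)]      # integrate position
        alive.append((pos, vel, life))
    return alive

particles = spawn(100)
for _ in range(60):                  # simulate one second at 60 fps
    particles = update(particles, 1 / 60)
print(len(particles), "particles still alive")
```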

Oh, I should add that I too think that phones and tablets will supplant dedicated mobile game devices - the value is just too good.

The irony in this case is that tablets and phones are far more like what PC gaming used to be - every year their capabilities increase noticeably, and new games come out all the time that technically surpass anything before them.

And the current state of the art, tablet-wise, is pre-Xbox. The latest Infinity Blade looks good for what it is, but you wouldn’t confuse it for a LAST generation game.

Pre-Xbox? Do you mean the 360? Because they’re definitely ahead of the original Xbox. If the current gen of consoles aims low, tablets will pass them up sometime in the late 2010s. But keep in mind that an iPad 4 pushes about 5-6 times as many pixels as most console games when you compare them.
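Rough back-of-the-envelope numbers on that pixel claim (assuming the iPad 4’s 2048×1536 Retina display versus the 720p or sub-HD render targets most console games actually use):

```python
# Pixel-count comparison: iPad 4 Retina vs. typical console render targets.
ipad4 = 2048 * 1536          # Retina display resolution (~3.1M pixels)
console_720p = 1280 * 720    # a true 720p render target
console_subhd = 1024 * 600   # sub-HD target used by many console shooters

print(round(ipad4 / console_720p, 1))   # ~3.4x the pixels of a true 720p game
print(round(ipad4 / console_subhd, 1))  # ~5.1x the pixels of a sub-HD game
```

So it’s closer to 3.5x against a genuine 720p game and 5x or more against the sub-HD games, which is where the 5-6x figure comes from.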

I’m not saying tablets are currently more powerful - I’m saying that manufacturers are in a competition to field the best product, to push the edge, and game developers are using this technology to push the edge in gaming. That makes tablets more like the golden age of PC gaming than they are like consoles. The fact that they’re behind consoles in an absolute sense is of course due to size/energy constraints. But their rate of advancement blows consoles out of the water.

See, now you made me go reinstall Infinity Blade and Infinity Blade II and compare.

(see http://www.flickr.com/photos/33743995@N00/sets/72157632886747244/ )

Infinity Blade 1 has no atmospherics, small map sizes, and carefully crafted, but simple objects.

IB2 has a few light source shading effects, 20-30 ‘environmental’ objects, slightly larger maps, and shadows attached to more detailed character models… what do you think?

Hard to say without seeing it in action. I don’t have IB2. I do have Real Racing 3 on an iPad 4, and it looks as good to me as console racing games do. Maybe more minimalistic, but it’s a track-focused game rather than driving through a city or something.

I think you’d find that tablet games look good for tablet games but are often simplified in subtle ways to account for architectural limitations.

There are also other aspects that make it hard to compare; pushing pixels at Retina™ resolutions takes some power, as well.

Apple’s branding likewise makes it hard to say ‘it’s running equivalent to Tegra X hardware’.

And looking at the current bleeding edge for tablet graphics:

Shows they’ve got a ways to go before reaching the hardware even the cheapest PC has.

Alright… I think I’ll just up and admit I was wrong. I can’t imagine something that looks like the link below will be console-level gaming… if it is… holy crap, are we in for a good time… if not, then the PC guys will be in for a good time. :slight_smile:

Battlefield 4: Official 17 Minutes "Fishing in Baku" Gameplay Reveal - YouTube!

Battlefield 3 was one of the few games where they actually made a proper PC game first and then crippled it down for consoles, so I’m not sure whether what they’re showing off would be representative of the console version.

Rumors I’ve read are that the PS4 is going to have a GPU roughly in the range of a 7850, which is actually better than I’d expected. So it’ll have a fairly okayish GPU with a bad CPU, which will actually be sort of counterproductive in some ways and limit game development.

Read a rumor that the Xbox 720 is going with an ARM (phone/tablet) architecture with 16 cores. So take everything I already said about the folly of the PS3’s CPU design and double it. Dumb dumb dumb.

If that’s true, it looks like the PS4 will be a significantly better system this time around.

Combined with GDDR5 vs the DDR3 rumored to be on the nextbox, yeah, I’d say the PS4 will win the technology game. But Microsoft is making Kinect a big part of their system. If they can manage to appeal to a more casual market, they might retain the crown next gen.

I’m saying: I don’t see games looking like that video running on a $300 piece of hardware. The part where they come upon the construction site and there are hundreds of birds? That’s pretty impressive.

GDDR5 isn’t clearly superior to DDR3. They’re good at different things. GDDR5 has tons of bandwidth; DDR3 has better latency. It’s better to specialize them, i.e. DDR3 for main system RAM and GDDR5 for graphics, to maximize their usefulness in their niche. I guess there are other benefits to unified memory, though.
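To put rough numbers on the bandwidth side (illustrative figures only - the rumored PS4 spec is GDDR5 at around 5.5 GT/s on a 256-bit bus, versus DDR3 in the 2133 MT/s range on the nextbox):

```python
# Theoretical peak bandwidth = transfer rate (MT/s) * bus width (bits) / 8, converted to GB/s.
def peak_bandwidth_gbs(transfers_mts, bus_width_bits):
    return transfers_mts * bus_width_bits / 8 / 1000

# Illustrative numbers based on the rumors, not confirmed specs:
print(peak_bandwidth_gbs(5500, 256))  # GDDR5 @ 5.5 GT/s, 256-bit -> ~176 GB/s
print(peak_bandwidth_gbs(2133, 256))  # DDR3-2133, 256-bit        -> ~68 GB/s
```

So on raw throughput the GDDR5 setup is a couple of times ahead, but that says nothing about latency, which is where DDR3 has the edge.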