What common traits do console AAA games tend to have?

Compared to PC-only games, what tends to distinguish AAA games which are only on consoles or were made primarily for consoles and then maybe ported to PC? I am particularly curious about the graphical traits of console-based AAA games.
What PC elements* do they underuse or not use at all?

Please note that I’m only asking about tendencies; exceptions may be worth pointing out, but they do not invalidate significant trends.

*By “PC elements”, I mean both elements which are usually found on PC and elements which are available on PC even if they represent a minority of machines.

I’m not sure about graphics, but the first thing that came to mind is simplified control/UI schemes. Consoles don’t have a mouse and a gazillion keys, so on the bad side they don’t have all of the options that a PC can provide; on the good side, it prevents too much bloat (“hit CTRL-SHIFT-3 for your alternate weapon skill…”).

Aside from a few genres having a much larger footprint on one platform or the other (RTS on PC, sports on console), the differences between PC games and console games have largely evaporated.

That is a benefit of controllers I hadn’t thought of.

As for my own answer:
Console-based AAA games make little use of physics, dynamic lighting, and specular lighting. They make extensive use of detailed matte textures and bump/normal maps (I don’t know how common displacement maps are). Colors tend to be diffuse and desaturated, and there’s lots of brown and grey.
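To make the “matte vs. specular” distinction concrete, here is a toy sketch of the two classic per-pixel lighting terms being contrasted. Function names and constants here are my own illustrative choices, not from any actual engine:

```python
# Toy lighting terms (hypothetical names/values, not real engine code).
# A "matte" look leans almost entirely on the diffuse term; the specular
# term is what adds tight, shiny highlights.

def lambert_diffuse(n_dot_l, albedo=0.8):
    """Lambertian diffuse: brightness depends only on surface-vs-light angle."""
    return albedo * max(n_dot_l, 0.0)

def blinn_phong_specular(n_dot_h, shininess=32):
    """Blinn-Phong specular: a highlight where the half-vector aligns with the normal."""
    return max(n_dot_h, 0.0) ** shininess

# Surface facing the light: strong diffuse, full specular highlight.
print(lambert_diffuse(1.0))        # 0.8
print(blinn_phong_specular(1.0))   # 1.0
# Slightly off-axis: diffuse barely changes, but specular falls off fast,
# which is why skipping it gives that flat, matte look.
print(lambert_diffuse(0.9))
print(blinn_phong_specular(0.9))
```

Skipping the second term entirely (or keeping its contribution tiny) is essentially what a “detailed matte texture” look amounts to.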

There is often little interactivity with the environment. When playing Thief 2014, I was struck by how many bookshelves there were. So many books, yet all those assets were little more than detailed wallpaper.

An awful lot of the game looks like this: http://media1.gameinformer.com/filestorage/CommunityServer.Components.SiteFiles/imagefeed/featured/square-enix/eidosmontreal/thief4/review610.jpg
They often use humans as enemies and allies and put emphasis on people’s faces. Far Cry 4 seems particularly proud of its human faces and goes out of its way to show them off even though they hit the uncanny valley pretty hard.

Thief 2014 and Far Cry 4 were both also released for the PC.

Games are as colorful or not as the developer wants.

Games are as interactive or not as the developer wants.

The most popular physics engines (Havok, PhysX, Euphoria) are all platform-agnostic.

I don’t understand the “faces” comment at all.

There are now very, very few PC-only AAA games, because publishers generally don’t want to limit their audience if they can get away with it. The ones that remain tend to be 4X, RTS, or simulator types that are difficult to simplify enough to work with a controller.

And that’s the major difference - the controller. As people have said, the graphics and physics engines on PC and Console games are the same now.


Having said that, the PC versions of AAA games often have extra graphics options for those with powerful machines, but the recent gen consoles tend to just turn these options on by default as far as I can tell.

Here is a screen capture of a small detail from the new Thief game. Other than a darker image from the Xbox 360, I don’t see much difference between the PC and the two new consoles, but the older consoles have much simpler textures.

Graphics for console games are typically worse than PC. They may start out as good or even better than PC, but because the console is a stock device, it doesn’t change over the 5 or so year lifespan. PCs continue to increase in processing power and graphics capabilities every year.
From my own observation, AAA console games tend to be story-driven open worlds with simple gameplay mechanics. Call of Duty, Bioshock, Assassin’s Creed, GTA, Halo. Games where you can just sit down alone or with a friend and run, jump, drive and shoot a bunch of stuff and then either put it down or continue where you left off later.

The same games are popular on the PC, with the addition of strategy games (StarCraft, Command & Conquer), sandbox builder games (Minecraft) and MMO games (World of Warcraft). Games with more depth (requiring more buttons) and where the point and click mouse interface is easier than a controller.

Minecraft is actually more popular on consoles; it hit the same sales milestone there in roughly half the time.

I’d say it’s the opposite, really. A 104-key keyboard means 104 individual things you can do without “CTRL-SHIFT-3”. On a controller, though, you’re stuck with clumsy radial menus to make up for the lack of buttons. A common complaint these days is shooters with a 2-gun or 4-gun limit, a direct result of games being made with consoles in mind instead of keyboards, where you commonly use the number row as hotkeys for all your guns. The fact that DOOM was able to give you eight guns way back when, while Far Cry 4 today restricts you to four, isn’t fixing bloat; it’s just cutting out options to make itself controller-accessible.
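The difference between direct number-row binds and a cycled controller loadout can be sketched in a few lines. Everything here (names, weapon lists) is an invented illustration, not from any real game:

```python
# Hypothetical input schemes: direct hotkeys vs. a cycled slot list.

# Keyboard: one direct bind per number-row key; eight weapons, eight keys.
KEYBOARD_BINDS = {str(n): f"weapon_{n}" for n in range(1, 9)}

class ControllerLoadout:
    """Controller: a single 'next weapon' button cycling a small slot list."""
    def __init__(self, slots):
        self.slots = list(slots)
        self.index = 0  # start on the first slot

    def cycle_next(self):
        self.index = (self.index + 1) % len(self.slots)
        return self.slots[self.index]

pad = ControllerLoadout(["rifle", "pistol", "shotgun", "grenade launcher"])
print(KEYBOARD_BINDS["7"])   # weapon_7: any weapon is one keypress away
print(pad.cycle_next())      # pistol: reaching the shotgun takes two presses
```

With the keyboard mapping, any of eight weapons is reachable in one press; the controller scheme trades that directness for a shorter slot list you have to cycle through.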

ARK: Survival Evolved is coming out for the Xbox One and PS4 next year after being initially developed for PC. Looking at the game Wiki, I count thirty-two keybinds for things like hotkeys, opening various menus, whistles to control your dinosaurs, going prone/crouching, tribe commands, etc. If it had first been designed with a controller in mind, I doubt it would have half as many controls or options available.

Keyboard controls are, of course, a major reason why RTS and simulation games continue to be largely the domain of PCs.

It sold 20 million copies on PC as of early this year. It didn’t hit that number on TWO consoles combined until later this year. Also, it sold those 20 million copies on PC with VIRTUALLY NO ADVERTISING. It wasn’t until the console versions came out that advertising for the game hit hard, and they all promoted the console versions.

What I’m getting at is that the comparison, as you stated it, means absolutely nothing, other than that you can possibly sell tens of millions of copies of your indie game on PC without any advertising, while it takes two different consoles with a combined 160+ million install base to maybe match it, after a $50 million marketing campaign, that is.

Well, maybe what it means is you’re stupid as dirt to take up Sony or Microsoft on some type of exclusivity deal. Millions of fans await on ALL platforms.

And that’s being borne out by the market, it seems. For the most part, games are all multi-platform nowadays. Control schemes are the biggest reason to keep a game exclusive. All other exclusives are basically paid for directly by Sony/Microsoft.

You didn’t read my link did you?

Minecraft sold its 20 millionth copy on the PC this past June (four years after going on sale). It reached that number on the Xbox 360/Xbox One in May 2015 (three years after going on sale).

As for advertising, Minecraft has never advertised as far as I know (a quick Google turns up nothing). Not on consoles, not on the PC, and not on mobile (which has sold 30 million copies, by the way). But foam swords and shirts featuring Minecraft characters and books teaching kids how to craft have been available since way back in 2011. It’s always had popularity, and it’s always had a fervent fanbase pushing the game on new players.

That said, my comparison was just to illustrate the fact that Minecraft isn’t a “PC game” and hasn’t been for a long time. People will play Minecraft wherever it is.

I disagree with this. Minecraft wouldn’t exist if it wasn’t FOR PC. It’s the PC fanbase that picked it up when it was mostly a great idea that barely functioned, and helped its developers iterate on that idea until it reached mass appeal.

It was only then ported.

I think that makes it a PC ass PC game.

The big differences are in the graphics configurability and the controller.

By that, I mean that on a console, you generally have your graphics, and they’re pre-set based on the console hardware. You can vary your gamma or something, but that’s about it.

On a PC, you can make trade-off choices: you can downgrade graphics for systems that can’t run the game at full frame rate with every quality setting checked. IMO, console graphics usually land somewhere around the 2/3 to 3/4 mark: good, but only about three-quarters as good as the best PC graphics. So, sort of by default, you’re downgrading your graphics on a console relative to what a tricked-out PC could do.
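The trade-off described above could be sketched as a tiny auto-settings heuristic: step the quality preset down until a frame-rate target is met. Everything here, including the ~30% gain per step, is a made-up illustrative number, not a measured figure:

```python
# Hypothetical quality-preset picker: drop tiers until the target holds.
# The 1.3x frame-rate gain per step is an invented, illustrative assumption.

PRESETS = ["low", "medium", "high", "ultra"]

def pick_preset(measured_fps, target_fps=60, start="ultra"):
    i = PRESETS.index(start)
    fps = measured_fps
    while fps < target_fps and i > 0:
        i -= 1          # drop one quality tier...
        fps *= 1.3      # ...and assume roughly 30% more frames from it
    return PRESETS[i]

print(pick_preset(40))   # medium
print(pick_preset(60))   # ultra
```

A console effectively ships with this choice already made for you; on PC you (or the auto-detect) get to run it yourself.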

But the flip side is that in general, that console game is tested to hell and back, and there’s no driver consternation, or finagling your settings or anything like that- it just runs and you play the game.

So it kind of depends on what’s important to you personally.

Of course, I’m not arguing with that. But it moved on to consoles within a year of its PC launch and has proven to be just as popular (and in some ways, moreso) away from the PC. It’s an everybody game, just like every other game outside of most RTS and 4X titles.

I think this view is outmoded… by many years now. First, graphics differences are hard to really assess in a blanket form like 3/4, or what have you; it varies from game to game. The main differences, especially this early in the console cycle, are draw distances, LOD distances, shadow, lighting, and post-process effects quality, IQ, and frame rate (sometimes resolution as well).

In some games these differences add up to a nice little upgrade, nothing major; in others, they make a VERY noticeable difference. The gap will continue to widen as time goes on, too.

And, I haven’t had to “fiddle with drivers and settings” for a very long time. Unless you count clicking “yes” on the prompt to update things - which is something that is happening on consoles now too.

Now, I personally LIKE to play around with settings for a minute or two before jumping into a new game, but most games automatically set options based on your hardware, and both Nvidia and AMD also provide game profiles based on your hardware, which will set graphics options for you automatically.

PC gaming nowadays, on modern hardware and a modern OS, is just as plug-and-play as consoles. And (sadly, actually, ’cause this sucks whenever it happens, regardless of platform), for every Batman: Arkham Knight on PC (a rarity nowadays, thankfully), there is a Master Chief Collection on consoles.

I think you’re misunderstanding me; I was saying that if you have some sort of beastly Core i7 computer, 32 gigs of RAM, and a GeForce 900-series GPU, you’ll be able to run better graphics than an Xbox One or PS4 if the game supports that, and that similarly, if you’re struggling along with an i5, 8 gigs of RAM, and a GeForce 600-series GPU, you can downgrade your graphics for a higher frame rate… if you so choose.

Plus, the Xbox One/PS4 probably kicks out graphics at about 75% of the quality of the i7 computer listed above in a graphically intensive game like, say, one of the Battlefield games. It’s hard to measure, I agree, but the consoles are usually good, not great, graphics-wise, and pretty much foolproof.

Again, I’m not sure how you’re arriving at that figure… And no, they aren’t foolproof. From frame-rate drops down to the teens in some games, to issues with screen tearing, LOD pop-in, etc., the experience can vary wildly.

Battlefield 4, for example, is a 30 FPS, 900p experience at the equivalent of medium settings on PC, except for LOD, which is low.

These things make a big difference in graphics, immersion, and gameplay.

If you have a physical copy of any console game, and you have the console it plays on, you can play it, no matter what, for as long as you have that console.

If you have a PC game, you probably need to upgrade your graphics card/processor/driver every 6 months or so in order to play the new games.

Nah. I upgrade video cards every 2-3 years and that’s more out of choice than necessity. My processor (i7-860) was released in 2009.

This is just… well, completely and utterly clueless… unless you’re being sarcastic?