What games make the most use of the PC's power?

In what ways can PCs do more than consoles?
What games make the most out of the PC’s increased capabilities?
What capabilities would you like to see game developers make greater use of?

RTS games are a good example, both in graphics and in AI. Really any graphics-heavy game: the visual quality on PCs can far outstrip consoles. And then there are the frame rates: consoles seem to settle for 30 fps, but PC games can achieve 60 fps or more. 4K / UHD gaming is another area: a relatively modest GPU can run games at 4K at better-than-console graphics settings. Games with good AI can really stretch the PC - look at Galactic Civilizations 3.

Take a look at Ashes of the Singularity, The Witcher 3, GC3, etc.

I don’t think there is anything that a PC can do at the moment that a console cannot do as well. Modern consoles are basically just snapshots of PC hardware. They share all of the same major components, and their designs are the same. They literally use the same chips for their subsystems: the same chip is soldered onto a console motherboard as onto a PC motherboard.

The only difference is that PC hardware keeps getting ever faster. The Xbox One and PS4 aren’t very out of date yet, though; plenty of people have PCs slower than a PS4. The extra grunt of a state-of-the-art PC isn’t utilized in any novel ways yet - just higher resolutions and more frames.

Pretty much the only difference today is that the openness of the PC platform allows for a lot more experimentation with games (which is both good and bad). Also, the precise control of a mouse and the habit of PC gamers to sit two feet from a high-resolution screen allow developers to jam a ton of small things onto a screen. Either console could do these things too; Microsoft and Sony just choose not to allow it.

If you wanted to compare 1999 consoles to 1999 PCs, then you’d see some serious differences.

Modern consoles can’t even drive standard 1080p HD TVs at 30 fps for many newer games. They have to drop to 900p (Destiny, Witcher, Watch Dogs) or even 720p (Battlefield 4, Star Wars).

More pixels and texels per second, then. Which is definitely an improvement, but it still falls short of the PC’s potential.
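To put rough numbers on those resolution/frame-rate gaps, here’s a quick back-of-the-envelope sketch I put together (raw fill only - it deliberately ignores per-pixel shading cost, which is where most of the real GPU work goes):

```python
# Back-of-the-envelope pixel throughput: a crude proxy for how much harder
# each render target is. This ignores shading cost per pixel, so treat the
# ratios as illustrative, not as real GPU benchmarks.
def pixels_per_second(width, height, fps):
    """Raw pixels that must be filled each second at a given resolution/rate."""
    return width * height * fps

targets = {
    "720p30 (console floor)": pixels_per_second(1280, 720, 30),
    "900p30":                 pixels_per_second(1600, 900, 30),
    "1080p60 (PC baseline)":  pixels_per_second(1920, 1080, 60),
    "4K60 (high-end PC)":     pixels_per_second(3840, 2160, 60),
}

floor = targets["720p30 (console floor)"]
for name, pps in targets.items():
    print(f"{name}: {pps / 1e6:.0f} Mpx/s ({pps / floor:.1f}x the 720p30 floor)")
```

By this crude measure, 4K60 demands 18x the raw fill of a 720p30 console title - which is why "just" more pixels and frames still eats a lot of hardware.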

Seems like PC games could make use of voxels/tessellation/particle systems for qualitatively better lighting/animations/physics.
As another example, could mid to high range PCs handle using softbodies and dynamically destructible environments rather than rigidbodies and the canned destruction of games like Battlefield and Rainbow Six Siege?
I note those possible features for discussion but if there are others, I’d like to know them.
AI: does the PC enable better AI, or are games with good AI simply more common on PC?
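To make the softbody question concrete, here’s a minimal mass-spring sketch (my own toy, not any engine’s actual code): a rigid body integrates one transform per object, while a softbody has to integrate every particle and relax every spring, every frame - that’s where the extra CPU cost comes from.

```python
# Toy 2D softbody step: Verlet-integrate each particle, then do one
# relaxation pass over the springs. Real engines run many relaxation
# iterations per frame over thousands of particles, which is the cost
# rigid bodies avoid.
import math

def step_softbody(positions, prev_positions, springs, rest_len,
                  dt=1 / 60, gravity=-9.8):
    """One step: returns (new_positions, positions) for the next call."""
    new = []
    for (x, y), (px, py) in zip(positions, prev_positions):
        vx, vy = x - px, y - py                 # implicit velocity (Verlet)
        new.append([x + vx, y + vy + gravity * dt * dt])
    # Single Jacobi-style pass pulling each spring toward its rest length
    for i, j in springs:
        dx = new[j][0] - new[i][0]
        dy = new[j][1] - new[i][1]
        dist = math.hypot(dx, dy) or 1e-9
        corr = 0.5 * (dist - rest_len) / dist
        new[i][0] += corr * dx; new[i][1] += corr * dy
        new[j][0] -= corr * dx; new[j][1] -= corr * dy
    return [tuple(p) for p in new], positions

# Two particles stretched to twice the rest length snap back in one pass:
pos = [(0.0, 0.0), (2.0, 0.0)]
new, prev = step_softbody(pos, pos, [(0, 1)], rest_len=1.0)
```

Scale that inner loop up to a destructible building’s worth of particles and springs and you can see why it stresses a CPU in a way canned destruction doesn’t.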

True, but the major limiting factors are that console hardware remains static for years (I suppose each specific generation is static forever, aside from offering different hard drive capacities) and cost. There’s no reason why you couldn’t have a console with a top-of-the-line i7 processor that handled physics and AI as well as a PC, and a cutting-edge graphics processor that handled 4K, except that Sony doesn’t think you’ll pay $900 for a console with those capabilities. And they’re probably right. It’s a different market, and the market for consoles is essentially the stripped-down-PC market.

Eh, I don’t know about that. Show me a PS4 that could run something like Total War: Attila or Total War: Warhammer (holy redundant titles, Batman!) and I’ll show you a PS4 on fire!

CPU power is the big differentiator right now. GPU power too, but to a lesser extent, since most games going for amazing technical graphics need to make sure they run on console first and foremost.

Now, CPU power is still locked behind poorly multi-threaded graphics APIs with a lot of CPU overhead. Enter DX12, and that CPU power (even entry-level i3s significantly outperform console CPUs) is freed to do all manner of things.
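As a toy illustration of why that API overhead matters (the per-call costs and thread counts below are made-up round numbers for the sketch, not measured DX11/DX12 figures):

```python
# Toy draw-call budget model: if the graphics API burns a fixed CPU cost
# per draw call on the submission thread(s), the frame's CPU budget caps
# how many draw calls you can issue. Numbers here are illustrative only.
def max_draw_calls(frame_budget_ms, cost_per_call_us, threads=1):
    """How many draw calls fit into the per-frame CPU budget."""
    return int(frame_budget_ms * 1000 / cost_per_call_us * threads)

budget_ms = 1000 / 60  # ~16.7 ms per frame at 60 fps

# Hypothetical high-overhead API, single submission thread:
legacy = max_draw_calls(budget_ms, cost_per_call_us=40)

# Hypothetical lower-overhead API with multithreaded submission:
modern = max_draw_calls(budget_ms, cost_per_call_us=10, threads=4)

print(legacy, "vs", modern, "draw calls per frame")
```

Cutting per-call cost and spreading submission across cores multiplies the draw-call ceiling, which is the core promise behind DX12-style APIs - more distinct objects, effects, and AI-driven actors per frame.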

In terms of pure GPU right now, you’re basically talking about better bandwidth and faster computation on PC vs. consoles, which usually “only” amounts to longer view distances, better LOD, better lighting/shadowing, better image quality, higher resolutions, and better performance/frame rates. Of course, those things can add up to make a significant difference, depending on the game.

I think we’ll start to see really significant differences over the next few years. PC gaming only keeps on growing, and that growth is likely to get developers to invest more into their PC games.

And in many cases they STILL can’t hold a solid 30 FPS, dropping in some cases into the teens.

Certainly many games exist that crush the specs of a PS4, but every game could run at 1080p60 on the PS4 (even Total War) if that was the target of developers. It isn’t, of course, because games would still look like late-generation PS3 games.

I just don’t see it as a novel use of computer hardware compared to console hardware. An example of a novel difference from the past is non-local multiplayer. You could connect to another PC with a modem since the late ’80s, and over broadband by the mid-’90s. When was that possible on a console? 1998 with the Dreamcast? And that didn’t get decent until the broadband adapter in 2001. Even once it was possible, it was a feature in only a handful of console titles, compared to hundreds of PC titles. It wasn’t a pointless gimmick, either: being able to play a game with a friend who wasn’t there was amazing.

Thinking about it some more, I think you could give the modern PC a point over consoles for its multi-display capability and maybe hot-swappable control schemes. Neither of those are nearly on par with non-local multiplayer, but they’re pretty nice.

For a game like Total War, it wouldn’t be just about the graphics. There are literally tens of thousands of troops on screen at a time. It’s a key feature of the game. You can’t do that on a PS4.

Tribes 1 could host 64 v 64 multiplayer matches back in the late ’90s; consoles STILL can’t match that.

I wonder what Dwarf Fortress would run like on a console? Hell, the menu system alone would make gameplay a nightmare, never mind the processing power required.

At certain settings, perhaps. The number of individual troops/unit is an option that depends on PC power, and if for whatever (admittedly crazy) reason they wanted to make Rome II on the PS4, it would run similarly to lower-capability PCs. The game isn’t “not Total War” if it’s running at those settings.

ARMA III is another game that requires a very high spec PC to run at full settings, even though it came out two years ago. It has very sophisticated AI and realistic projectile ballistics.

Check out this video to see the insane amount of detail they have put into modelling the trajectory of every bullet:

Mainstream shooters like Battlefield 4 don’t do this; they have extremely simplified models.
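For a rough sense of what that extra detail buys, here’s a toy integrator (my own sketch - the drag constant is an illustrative made-up value, not Arma’s actual ballistics data) comparing a simplified drag-free shot against the same shot with quadratic air drag:

```python
# Compare a "simplified" ballistic model (pure parabola, no drag) with one
# that integrates quadratic air drag per step - the kind of per-bullet
# detail a sim like ARMA pays CPU time for. drag_k is illustrative only.
import math

def fly(v0=850.0, angle_deg=0.5, drag_k=0.0, dt=0.001, g=9.81):
    """Euler-integrate a bullet until it returns to muzzle height; return range (m)."""
    a = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(a), v0 * math.sin(a)
    while y >= 0.0:
        speed = math.hypot(vx, vy)
        vx -= drag_k * speed * vx * dt          # quadratic air drag
        vy -= (g + drag_k * speed * vy) * dt    # gravity + drag
        x += vx * dt
        y += vy * dt
    return x

no_drag = fly()                  # simplified model
with_drag = fly(drag_k=0.001)    # same shot with drag lands noticeably shorter
print(f"{no_drag:.0f} m vs {with_drag:.0f} m")
```

Even this crude model shows the two approaches disagreeing by hundreds of metres at rifle ranges; a full sim also tracks wind, bullet drop per ammo type, and velocity-dependent damage, multiplied across every round in the air.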

I came in to say ARMA as well. I’ve got an i7-4790K, R9 290X, SSD, and 16GB of RAM, and it still lags a lot in Cherno.

By that logic there’s nothing a PS4 could do that a PS2 couldn’t. Witcher 3? Sure, just bring in the fog, cut down the world and the resolution, and increase the load screens…

That’s a strawman and you know it. I’m saying a PS4 port of Rome II could probably be made to work at settings comparable to what is *currently available* on lower-level PCs.

You said that “a key feature of the game” is having “literally tens of thousands of troops on screen at a time.” I deny that assertion since many people play and enjoy the Total War series without being able to access it at those levels.

I actually can’t find anything concrete online - in the unmodded game, what are the maximum army sizes at each graphics setting?

One of the reasons I liked the Command & Conquer series (#4 did not exist!) was that you could have unlimited units. I’ve yet to find a replacement but have high hopes for Ashes of the Singularity.

I don’t know. Each unit type can have a different number of men, so it would even depend on your army composition. You could have, I believe, up to 50 stacks per faction.

I guess it comes down to what is or isn’t key to the game, which I’m guessing might end up being subjective. I tried to track down benchmarks to see if an equivalent CPU could run the game, but even a CPU a bit more powerful couldn’t run the game at normal settings above 25 FPS. Couldn’t find anything below normal, though.

I think it’s actually MMOs. No, stop, hear me out.

MMOs these days have much nicer models than are generally used in RTS games (because the ones in RTS games don’t need to look good when they’re half a screen tall), but they can still be called upon to display a TON of them all at once.

I’ve never seen my PC grind like it does in crowded areas in MMOs, especially if people are throwing around lots of spells with particle effects too.

Damn edit timer…

Also, you objected to the Witcher 3 on a PS2, but I’d bet most of the RPG elements of the game would fit fine on a PS2, so long as we make the game look like ICO and reduce the landmass a bit.

I guess my point is: if the question is “What can a PC do that current consoles can’t?”, a VALID response is Total War: Attila battles with the grandeur and scale of thousands and thousands of troops on the field.

You coming in and saying “well, if you play the game with 200 troops instead, it’ll run just fine on a PS4” doesn’t make my answer invalid. A PS4 still can’t do a Total War battle like a modern PC can. This is definitely an example where PCs can do more than consoles, even if the game just happens to have an engine scalable enough that a cut-down version could maybe, possibly run at 25 FPS on a PS4.