Anybody else get completely complacent about computer graphics?

Maybe it’s just because I’ve seen them get incrementally better, but awesome graphics in a computer game just don’t do it for me anymore. I get a new game and I just don’t notice the graphics after about 10 minutes, no matter how good they look. This is making me think that true photo-real graphics in a game aren’t going to be all that awesome.

If it’s a fun game, graphics don’t matter as much.

Depends on the game. For games like Red Dead Redemption or The Division, I think the immersive nature of the game, aided by their amazing visuals, is a big part of the fun. You can certainly make a fun low-res shooter but it’s not going to transport me to post-plague New York/DC or the 19th century frontier like those games do.

With the caveat that I’m not a hard-core gamer, I’ve never been one who really cared about graphics and, honestly, generally prefer old school 2D games. Like, for example, I just worked my way through Chrono Trigger on the SNES and am working my way through Final Fantasy VI (or III US). The immersion of a game is cool, and playing sports games or racing games with the latest graphics is fun, but I just like the aesthetic of 8- and 16-bit graphics. Like I don’t even like the updated 2D stylings of FF6 on iOS. Just give me a good story and good gameplay.

But, then again, I think Pac Man is visually gorgeous.

Immersion is cool, and having the best graphics helps for that, so it’s not like I’m anti-great graphics. But my heart is still in the simple 2D stylings. The only graphic style I really don’t like, or don’t think aged well, is some of the early 3D, you know, like “Alone in the Dark” and that really blocky, choppy look early 3D had. That game made up for it with atmosphere, and the 3D was really cool back in the day, but it would have been a good game in 2D, as well.

Graphics aren’t really going for Wow Factor anymore. Photorealism is always nice, and it helps for games like Tomb Raider or Assassin’s Creed, where there’s a movie-like quality to the story and action, but that’s not the be-all end-all. As @Jophiel says, it’s about immersion these days.

Or open to any interpretation you want, with amazing stylisation, sometimes deliberately emulating crude graphics of yesteryear, giving it a very particular atmosphere.

I completely agree. I find the graphics of games of 5-10 years ago already sufficiently photo-realistic for me. It is fun to look at some detail in amazement, but in the end it is about gameplay. I recently played Return to Castle Wolfenstein again, which is some 20 years old, but still great fun, even though the graphics indeed look dated. Even Doom is still fun (but preferably in a slightly updated engine).

I completely get why remasters are in vogue: I’d love a real remaster of these games in modern graphics (i.e., with the same levels and game mechanics), although it is possible that this would take the fun out of it.

The current state of computer graphics amazes me, but there does not seem to be much point in going for even more detail.

We’ve gotten “true photorealistic” graphics, what, ten times now? And then, a year or two later, someone comes out with even better graphics, and suddenly the previous “photorealistic” is so fake, and obviously you need to upgrade.

Meanwhile, most of the recent Steam games I’ve enjoyed have been sprite graphics, and often ludicrously cartoonish sprites.

I compare two games with relatively similar graphics qualities: Skyrim, and Fallout New Vegas. I loved the former and hated the latter, in large part because of the graphics.

It’s not the technical proficiency that matters to me any more: it’s the aesthetic. One game had a highly varied palette, and I could stare at those purty colors all day. The other was mostly beige rock and blue sky, and I couldn’t stand to look at it.

It’s a little like appreciating oil paintings, once folks have figured out the basics of mixing oil paints. At that point, sure, every now and then someone might come up with a new formulation that makes a brighter green or a darker black, and that’s cool; but I’m gonna be way more interested in what they paint than what paints they use.

While brown on brown with more brown is boring (looking at several games here), it’s not something I would fault New Vegas for. Between the real-world setting of the Mojave and the game setting of post-nuclear war, lots of brown is reasonable. Not that there can’t be great beauty and lots of color there, especially in the way the sun lights the area at various times of day, but that is something the game definitely did not capture well.

In any case, I tend to think that the worst thing that can be said about a game is that it’s pretty, if that’s the biggest compliment it earns. A game can be pretty and have good mechanics, a good storyline, and other things that make a game fun to play. But if the main response is “Well, it looks good”, that’s not much praise.

Generally speaking, I’m all for better looking graphics. But that’s totally dependent on your hardware. Not everyone has the most whiz-bang new GPU or monitor to display those.

I’m more impressed by good art (not sure if that’s the right term) where the designers have put in the effort to make sure things look as correct as possible at any resolution. Color, shape, etc. are all where it’s at: RDR2 looks fantastic on an Xbox One, and it looks fantastic on a much beefier machine, because the Rockstar crew did the art right. I liked looking around while playing RDR2 as much as actually playing; I’d find myself exploring just to see what different areas looked like. And I can say that parts of RDR2 are eerily accurate renditions of the Black Hills in SD, western Nebraska/Wyoming, and parts of Texas/Louisiana.

But there is a certain expectation; we expect our trees to look like trees these days, not vaguely triangular or conical abstractions of trees like in 2001. We tend to expect some degree of atmospheric effects to obscure stuff at a distance. We expect accurate shadows, etc. I don’t think anyone will be overly impressed if some game designers decide to render each leaf and twig individually for maximum verisimilitude, but we’d definitely notice if they stepped back and abstracted things more than they currently do.

I just saw this ‘trailer’ for Epic’s new software, MetaHuman Creator. Seems like it would fit in this thread.

To address the OP, I still notice well-done realism. In the right games, it adds considerably to immersion in the game world. But it is indeed incremental, and I don’t expect any more generational leaps like the one from 8-bit to 16-bit, but rather additions of things like ray-traced lighting and tech advances to bring what we can display on a monitor into a VR format. So I’m not surprised that some people get blasé about the latest advances in graphic realism, but I still think they’re pretty neat and (depending on the game) still notice it. Or, even more so, notice when it’s missing, like muddled textures on some surfaces or poor fire/water/weather effects.

As for the rest of it, these discussions seem to bring out a reverse Stormwind Fallacy. The Stormwind Fallacy is an RPG (D&D) thing where people claim that you can’t be both a serious roleplayer interested in character development and also be invested in the mechanical side of the game with stats and character optimization, etc. It’s usually heavily wrapped in the implication that only True Roleplayers are Doing It Right and the other guys are schmucks. Conversely, these discussions always seem to bring out the I Play It For The Game declarations with the implication that caring about foo-foo shit like pretty pictures means you’re not a True Gamer.

Everyone likes games that are fun. That’s a no-brainer; it’s like saying “I like food that tastes good”. Even people who prefer their food to be attractively cooked and plated still want it to taste good, and still understand that a hotdog wrapped in paper can be tasty. And, of course, “good graphics” runs a huge spectrum. You can have attractive pixel sprites or attractive cartoony stuff or attractive minimalism, etc. SuperHOT would suffer for having realistic dudes attacking you, but it would also suffer if they were replaced with pixel columns. There are a million low-res 2D shooters out there mechanically similar to Cuphead, but Cuphead’s art makes it a lot of fun on top of the mechanics. Shooting bandits on a hillside is mechanically about the same in Skyrim, Oblivion, or Morrowind, but I’d rather shoot the dudes in Skyrim than the muddled potato people in Oblivion. A lot of stuff in Minecraft looks great, but Minecraft with ray tracing looks amazing.

Yeah, sure, there’s a ton of games that look nice but aren’t much fun. There’s also a lot more games that look like shit and also aren’t fun since at least an attractive game implies a certain degree of skill and budget. And while I’d rather play a “fun” ugly game than a “not fun” pretty game, I’d really rather play a fun pretty game where “pretty” means a cohesive and aesthetically attractive theme which can come from a wide spectrum of styles.

And for a lot of us, it’s getting to the point where the weak link is actually our eyesight. I’ve lost track of the number of posts I’ve seen comparing “amazing” PC graphics to “laughably bad” Xbox or PS graphics, where I literally couldn’t tell the difference, even side-by-side. I’m sure there’s a measurable improvement, but if I can’t see it, what’s the point?

Honestly, I think the last major “graphics update” that I even noticed was between the original Halo and Halo 2. Everything since then has basically been tweaking, as far as my eyes can tell.

There’s that, too. I have a hard time identifying frame rate dips or differences in refresh rates that other (younger) people see as huge differences. Also, sometimes there just isn’t a huge difference in some graphics or settings. Your modern PC game might include twenty graphic toggles or sliders and half of them have minimal impact on how the world looks. Are the shadows under the bushes slightly fuzzy or slightly fuzzier?

Yes. The artistry matters more than technical details. The problem for game developers is that it’s very subjective. It’s easy to develop to numbers (this many polygons, that size textures, etc.), but having an aesthetic vision and then making it happen is hard. And even if the vision is implemented well, because artistic tastes vary, no one else may like it anyway. Ouch.

In my own dabbling in digital arts, my major hurdle is that computer screens simply can’t represent some colors. Look at the image linked below. The colored shape covers the gamut of computer screens. The gray area covers colors that cannot be displayed, because of hardware limitations. There is definite potential for more colors.

CIE L*a*b* Colorspace
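To make that concrete, here’s a rough Python sketch of the usual gamut test: convert a CIE L*a*b* color to CIE XYZ and then to linear sRGB, and see whether any channel lands outside [0, 1]. The D65 white point and the XYZ-to-sRGB matrix are the standard published values; the two sample colors are just illustrative.

```python
# Rough sketch: does a CIE L*a*b* color fit inside the sRGB gamut?
# Uses the D65 reference white and the standard XYZ -> linear sRGB matrix.

D65 = (0.95047, 1.00000, 1.08883)

def lab_to_xyz(L, a, b):
    """CIE L*a*b* -> CIE XYZ, relative to the D65 white point."""
    fy = (L + 16) / 116
    fx = fy + a / 500
    fz = fy - b / 200
    def f_inv(t):
        return t ** 3 if t ** 3 > 0.008856 else (t - 16 / 116) / 7.787
    return tuple(w * f_inv(f) for w, f in zip(D65, (fx, fy, fz)))

def xyz_to_linear_srgb(X, Y, Z):
    """CIE XYZ -> linear (pre-gamma) sRGB channels."""
    r =  3.2406 * X - 1.5372 * Y - 0.4986 * Z
    g = -0.9689 * X + 1.8758 * Y + 0.0415 * Z
    b =  0.0557 * X - 0.2040 * Y + 1.0570 * Z
    return r, g, b

def in_srgb_gamut(L, a, b):
    """Displayable only if every linear channel lands in [0, 1]."""
    return all(0.0 <= c <= 1.0 for c in xyz_to_linear_srgb(*lab_to_xyz(L, a, b)))

print(in_srgb_gamut(50, 0, 0))     # True: a mid gray is easily displayable
print(in_srgb_gamut(50, -128, 0))  # False: a hyper-saturated green is not
```

Any color that fails the test is real and visible to the eye but sits in that gray region of the diagram: no combination of the screen’s three primaries can produce it.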

Unfortunately, I know what you’re talking about. I play some games with a buddy of mine who’s about 5 years younger than I am (I’m late 40s, he’s early 40s) and he’s got a humongous 1440p monitor and can see stuff well before I can on my 24" when we’re playing.

While I can see the differences between the various sliders and stuff, I’m not convinced that they really add or detract from the gameplay in a well written game. I mean, having the super-duper water reflections turned on so that the puddles in the swamp reflect just so isn’t going to be the thing that helps you with your immersion in the game, so long as they’re reasonably realistic looking. Same for the shadows of individual blades of grass, or the distance fog, etc…

True. I guess that’s part of what I was getting at. You might turn everything on and post a high-res screenshot of a PC scene next to a PlayStation scene, and I can see the differences, with one being obviously superior. When I’m actually sprinting across the map, though, I probably won’t notice the difference in shadow quality or water texture, provided that they’re reasonably close.

Another point I’d like to make: movies vs. video games. While some games are going for realism, many movies are going in the opposite direction. The camera captures images that already represent the world differently than the human eye perceives it. Then the images are processed and filtered, deviating from human perception even more. This is because the goal of a movie is not to record the world, but to tell a story, and realism sometimes gets in the way.

And the same holds for computer-generated images used by movies. Often, they’re not trying for realism but have an artistic vision instead.

The take-away is that in the arts (games, movies, etc) it’s the aesthetic that matters most.

Interestingly there’s starting to be a lot of overlap between video games and film/TV.

If you look up how they film The Mandalorian, they actually use a giant virtual set built with the Unreal engine and huge curved LED screens, where the background and lighting are displayed dynamically during filming.

ILM Used ‘Fortnite’ Tech to Make Virtual Sets for ‘The Mandalorian’ | WIRED

You can actually see it here; they turn off the screens a couple of seconds later.

Right now, I’m spending a lot of time in a game from about that era. And the trees are polygon trunks, with several intersecting flat planes to make the foliage. It’s good enough that you can clearly tell that they’re trees, but photorealistic they ain’t.

But… if you zoom in on those flat planes of foliage, they’re the right foliage. If you have a tree that’s shaped like a maple tree and has bark like a maple tree, when you zoom in on the foliage, you see maple leaves, and in fall, those leaves will be red. When you see a trunk of a beech tree, if you zoom in, the foliage is beech leaves, and in fall, they turn yellow. You could make a game with perfect photorealism, but still get details like that wrong: In that sense, the old game is actually more realistic than many.

As an aside, the word “photorealistic” is telling: Photorealistic isn’t the same thing as realistic. Photorealistic means looking like a photograph, but a photograph also doesn’t look like the real thing. We settle for mere photorealism because, in many ways, our equipment simply isn’t capable of full realism. For instance, the dynamic range we can see in reality spans a factor of about ten million, while an 8-bit monitor has only 256 brightness levels per channel and a real contrast ratio on the order of a thousand to one.
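To put rough numbers on that gap, here’s a quick back-of-the-envelope in Python, expressing each range in photographic stops (doublings of brightness). Treating the monitor as a plain 256:1 or 1,000:1 contrast ratio is a simplification that glosses over gamma encoding, but it shows the scale of the difference.

```python
import math

# Contrast ratios expressed in stops: stops = log2(ratio).
ranges = {
    "human vision across a scene (~10,000,000:1)": 1e7,
    "8-bit monitor, naive 256:1 reading": 256,
    "typical SDR monitor (~1,000:1 contrast)": 1000,
}
for name, ratio in ranges.items():
    print(f"{name}: {math.log2(ratio):.1f} stops")

# ~23.3 stops vs. 8.0 or ~10.0 stops: the display covers well under
# half the range the eye can handle in a real scene.
```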