Are games really that much more buggy nowadays?

From the Assassin’s Creed 3 thread:

I see this a lot, but is it really true?

Whenever somebody brings this up, my mind instantly jumps to one game from the past: Ocarina of Time. OOT was basically a giant bug; entire communities have been dedicated to finding, dissecting, and playing with every bug in every version of the game. You could bug the game into not having a sword, use items while riding Epona, get to places that shouldn’t be reachable (or not reachable yet), and plenty more. And that’s without a GameShark.

However, I’ll give it some credit: all of the intended content worked fine. There was no way to completely mess up the main quest, or become unable to complete a sidequest because you accidentally went to Lake Hylia before the Fire Temple.

Or maybe we should talk about Pokemon RBY, where RAM-manipulation tricks let you do things like encounter impossible Pokemon and potentially corrupt your save. But again, to its credit, these bugs were hard to find on your own.

The way I see it, when someone says games are more buggy, there are several distinct claims to untangle:

[ul]
[li] There are, by pure quantity, more bugs.[/li]
[li] There are, in proportion to game content, more bugs.[/li]
[li] More bugs, or a higher proportion of them, are high-impact bugs (more “A” bugs) – i.e. gamebreakers, or bugs that require workarounds to complete major game content.[/li]
[li] A higher proportion of high-profile games have them, compared to long ago.[/li]
[li] They’re more prevalent because of the ability to patch.[/li]
[/ul]

I think number 1 is a no-brainer. 2 and 3 are possible, and 4 has something to it. I don’t think 5 holds at all though.

I don’t think it’s because developers can just fix it by patching. It’s because games are, by and large, larger, and have more potential interactions. Once you get into destructible environments, thousands of quests each with multiple outcomes (including effects on dialogue or content in other quests), and the like, you’ve introduced enormous complexity into your system. Combinatorially, there is no way even a good QA team can test all possible configurations and orders of quests. The best they can do is test the routes they consider players most likely to take.
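To put rough numbers on that (my own back-of-the-envelope arithmetic, not anything from a real QA department), here’s how fast the order-testing problem blows up once quests can be done in any sequence:

[code]
import math

# Illustrative only: if N quests can be completed in any order,
# exhaustive testing means covering N! orderings -- and that's
# before counting branching outcomes within each quest.
for n in (5, 10, 20, 50):
    print(f"{n:>2} quests -> {math.factorial(n):.3e} possible orders")

#  5 quests -> 1.200e+02   (trivially testable)
# 10 quests -> 3.629e+06   (already beyond manual QA)
# 20 quests -> 2.433e+18   (hopeless, even automated)
# 50 quests -> 3.041e+64
[/code]

Even pruning that down to orderings players could plausibly reach doesn’t change the basic picture: testing the likely routes is the only realistic option.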

I also expect that, in development, there’s a similar number of completely unacceptable bugs to years ago. They still have to fix dozens of truly gamebreaking, completely obvious bugs, so their development cycles still involve squashing a lot of bugs – by sheer volume probably more than were squashed years ago, but by proportion, fewer.

This is also, incidentally, why indie games are usually much more bug-free. It’s not that they put so much more care in; it’s that they’re smaller. Even Minecraft, while infinite, is merely procedural, which means that as long as your procedural generation code is relatively bug-free and behaves as expected, you only need to monitor a handful of interactions, because all other interactions should follow that template. There’s no Skyrim-esque chain-reaction bug where talking to this guy causes a shovel to fall on the other side of the continent, and if this happens between 6 and 7 PM (in-game time) it hits an important NPC and ruins a sidequest.
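A toy sketch of why that helps (entirely hypothetical code, not Minecraft’s actual generator): if generation is a pure function of the world seed and coordinates, one battery of tests over the generator effectively covers every world it can ever produce.

[code]
import random

def generate_chunk(seed: int, cx: int, cz: int) -> list[int]:
    """Hypothetical terrain generator: 16 column heights derived
    purely from (world seed, chunk coordinates) -- no hidden state."""
    rng = random.Random(hash((seed, cx, cz)))  # deterministic for int tuples
    return [rng.randint(1, 64) for _ in range(16)]

# Because the output depends only on its inputs, one property-style
# test template covers every chunk of every world:
for seed in range(100):
    for cx in range(-5, 5):
        chunk = generate_chunk(seed, cx, 0)
        assert chunk == generate_chunk(seed, cx, 0)   # reproducible
        assert all(1 <= h <= 64 for h in chunk)       # heights in range
[/code]

Contrast that with a hand-built world, where every scripted object and NPC is a one-off that has to be tested individually.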

Does this mean there aren’t AAA games with inexcusable or bad bugs? Of course not! I’d probably agree that some of ACIII’s were inexcusable. Hell, Skyward Sword had a completely inexcusable bug where going after the Desert Dragon first would ruin your game – one that could have been found with the radical step of doing the final three levels in a slightly different order. Hardly a difficult interaction to test. I just think that the higher number of bugs has more to do with the extra features, length, and complexity we expect of major games, and the speed at which we require them to be done, than it does with carelessness or an attitude of “we can fix it with patches”.

Dude, trying to get PC games running in the 80’s and 90’s was a total crapshoot sometimes.

Elder Scrolls II is insanely buggy, for example.

Was that due to bugs, or due to the fact that environments were less standardized? I always thought it was the latter.

Ah yes, I heard Arena and Daggerfall were insanely buggy (in fact, Bethesda themselves said so), but I didn’t mention them alongside OOT and Pokemon because I’d never tried them myself.

I agree. And at least now, you can find patches. In the 80s, they were impossible to find. Several games were unplayable or unwinnable as a result. (Impossible Mission pops to mind for me). In the 90s, some companies put patches online, but not enough. And if a company went out of business, good luck finding the patch.

Working as intended :smiley:

Ultima IX was a total pain too. And not just because it was terrible. It was almost impossible to get running even with the patches, of which they released a ton.

The software development industry has shifted over the past twenty years to a more fluid, dynamic model: develop, test, patch the big glaring problems, release, and then patch everything else later. Rinse and repeat as needed. This has made its way into the gaming industry.

Back in the 80’s, finagling a game into running meant rewriting your autoexec.bat, config.sys, and anything else you could think of so as to scrape up just a few more K of RAM.

I remember buying Diablo I, putting the CD in the drive, and it just installed and ran. I’d never seen that before. They may have released some patches for it later, but they weren’t required in order to play the game. I haven’t seen that since, not even with other Blizzard releases: the first patch for Diablo II was actually available for download before the game was released.

Total War: Shogun 2 was remarkably stable right after release. I was actually shocked, because previous Total War games had generally been quite unstable for a while after release.

I remember religiously annotating and archiving the CDs that came with my monthly Joystick magazines - half demos, half patches and video drivers they scoured the 'net for. I had a whole drawer full of the bloody things, kept around for years on end because that one had that patch for that game.
God bless the Internet.

Anyone who thinks games are getting buggier needs to go back and play the original, unpatched Daggerfall. Or Temple of Elemental Evil. Or, God help them, Battlecruiser 3000 A.D.

Hell, anyone remember the remake of Pool of Radiance? This is the game that had a bug that would reformat your hard drive.

Or even the patched version of Daggerfall.

There’s definitely the odd game that gets pushed out far too soon and will be buggy at release because of it, but I’d say that older games tended to be far more buggy, at least on PC. As others have noted, even getting games to run could be difficult. The other aspect is that it was much cheaper to make games, which led to a lot of amateurish studios putting out games. Nowadays, games have gotten so expensive to produce that most of what you play is going to be a AAA title from an AAA company that can afford a high-quality QA staff.

That’s the one I was trying to remember! :smack: Fortunately I never fell prey to that, but I was horrified at how terrible that bug was, and always remembered that as the pinnacle of awfulness whenever people complain about buggy games.

I think people are conflating the fact that patches come out more often with games being buggier. I also suspect that there are more bugs, because there is more code, but I’m generally happier now than I was with, say, Darklands – which was an awesome game, but had some serious bugs.

That said, there is a tension between marketing-driven release dates and feature sets on the one hand (“it must be out by Christmas, and we need Capture the Flag!”) and code quality on the other. If the date won’t move, and marketing won’t budge on features, then it ships broken and you patch it as you can. Particularly when there’s a manufacturing step – pressing discs, printing boxes and the copy that goes on them – the version that ships on day one can already have a patch waiting before it even hits shelves.

I don’t regard that as a technical problem. It’s a marketing and planning problem.

Yeah, the mid-late 1990s through about 2003 or so were a dark time as far as buggy computer games were concerned.

Prior to that, game developers had to be a bit more careful: without widespread internet connectivity, a glaring bug would be a game killer.

Once the internet came along, the bug rate went WAY up, from what I can tell – to the point where it seemed like most games shipped at what should have been the beta-testing stage, in terms of both the number and severity of bugs, and might then be patched later, if the game was successful enough.

You ran the real risk of spending $40-50 on a game, finding out that it didn’t work right, and that you’d get a single half-assed patch a couple of months later because the game wasn’t popular enough to warrant much more developer time.

Console games have been relatively bug-free, and for whatever reason, PC games have gotten better: most of the ones I’ve bought over the past 4-5 years haven’t had the issues that prior ones had. Most patches have been about gameplay balancing and tweaking rather than fixing outright bugs.

PC games in the early 80’s like The Ancient Art of War or Karateka would often just randomly crash without warning or explanation. When you fired up one of those games, you knew an hour’s worth of progress could be wiped at any moment. To add to the frustration, most of these games didn’t offer a save feature, so if you crashed during the final boss fight, sucks to be you.

Nowadays, you have about a 90% chance of any PC game working out of the box as long as your system meets the required specs. Console games are essentially at 100%. It’s quite a relief.

Oh, and for anyone who still disagrees: remember that the original Metroid for NES had a glitched “secret world” several times the size of the intended playable world – a byproduct of the engine happily interpreting out-of-bounds memory as map data. Can you see that happening in a modern game?

Some of this relates to survivorship bias, too. By which I mean people are more likely to remember classic games from 20-30 years ago that were mostly bug-free compared to the stinkers that are rightfully lost in the mists of time.

Uninstalling Myth II was something of an adventure.

There might be more minor bugs now, simply because games are a lot more complex than they used to be. But unforgivable, game-breaking, hard-drive-destroying bugs were a much more common issue “back in the day”, so the situation has definitely improved, even if the absolute number of bugs might be higher today.

I had no trouble playing (or winning, once I mastered some of the trickier rooms) Impossible Mission.

When I first bought a game console it was so freaking exciting that when I put a game in it would HAVE TO WORK! How cool is that, after years of rewriting autoexec.bat and such?

Now I can’t get past one of the first missions in Saints Row: The Third, and thus the game is broken for me, because a mission won’t start.