That’s not my entire point. It’s not just the issue of requirements, it’s things like a CTD or Dave’s Mom finding out that she’s got to update her display driver to run Pokemon: Death Arena.
Consoles are winning and will continue to win because they work for the average user without requiring any sort of technological contortions by a user who just wants to kill Nazis or beat up on Dhalsim. Oh, and those games also tend to work out of the box, rather than requiring a patch because someone needed to make their target release date.
Personally, I don’t like consoles. I can’t stand D-pads and thumb sticks and all that other crap. I have a Wii because the family likes to play on it with me - Sins of a Solar Empire isn’t exactly a family-friendly game…at least not for group play.
PC gaming is cutting its own throat over and over. Because of that, consoles have risen in prominence and will continue to do so.
What are you talking about? How many software and firmware updates have the XBOX and PS3 received to date? 10?
Let me guess: you’re going to say that the updates are easy to install, just press a button. Well, you’re right, navigating to ATI’s or Nvidia’s website and clicking on a button (“Download latest drivers”) is sooooo much more complicated!
Patches are also a fact of life for BOTH PC and console games. This is due to the ever-increasing complexity of games. They are just going to have bugs, no matter what. Period.
The Wii and PS3 are notoriously bad for this. Seems like a new patch comes out every other week.
And aside from firmware updates, there are many examples of console games that are bugged. Some are show-stopper bugs (i.e., at some point the game becomes unplayable).
As noted, bugged games are not unique to PCs. What’s more, a PC generally gives you more ways to find a workaround (third-party drivers, rolling back to older drivers, tweaking this or that). On a console, if it doesn’t work you just have to hope and wait for the manufacturer to provide a fix someday.
I have also noticed an alarming reach into players’ pockets on consoles that you rarely see with PC games, things such as being able to unlock more content if you pay a fee. Ad placement has also gotten rather bad, to the point of absurdity in some cases.
So, buy your “cheap” Xbox, then pay $60+ for games rather than $50 or less, pay for Xbox Live, pay for added content, enjoy ad spam in your games, and if your game is bugged, shelve it until (if) it gets patched. Honestly, it looks like less and less of a bargain.
Well…yeah, it is. For one thing, you have to know which type of card you have (and if you have a premade system from Dell or someone, that might not be immediately obvious), you have to know which website to go to, and you have to install it yourself. It’s possible to muck it up, such as by installing new drivers over old or grabbing the wrong drivers, and so on and so forth. Screw up your video and you’re gonna have a hard time getting it working again.
Most of this is not that difficult, no, but compare it to: Turn on console. Console says ‘you need to update now. Click OK.’ You click OK. Console updates. You play your game.
PC gaming requires a certain amount of basic education and knowledge that a not insignificant number of people just don’t want to bother with. And I say this as someone who vastly prefers the PC for gaming.
It is more complicated. When my Wii needs a firmware update, it finds out by itself, asks me if I want to install the update, I click “Ok,” and everything is fine.
Even having to go to some website adds an unnecessary and counter-intuitive level of complexity to installing updates or patches, and it’s just one more thing that makes PC gaming totally esoteric compared to the console side of things.
Some people seem to enjoy comparing this component to that one. My husband designs computer systems when he’s bored. However, I’m not interested in this. I don’t want to spend time doing that. I certainly don’t want to spend time at most gaming message boards, trying to find useful information.
I want to put my game in and PLAY. I am very wary of upgrading to a PS3, because I don’t want to install patches and upgrades. I want both my software AND my hardware to work right out of the box, with minimal effort from me. As far as I’m concerned, making the hardware and software work is the job of the codewriters/designers/whoever is selling the thing. Not my job.
It’s the same thing with a car. Sure, I’ll maintain it, in the sense of getting fluids refilled and various items checked. But there’s almost no comparable maintenance on a computer or a console, other than maybe defragging a hard drive now and then. When I get in my car, I want to turn the key and GO. I do not wish to adjust the timing, or fiddle with the carburetor, or adjust the transmission in some way.
I really, really like some PC games. However, it’s such a nuisance to install them, find out why they won’t work, and then try to fix it, that I’m reluctant to buy yet another PC game, when I can buy a PS1 or PS2 or even dust off my SNES carts, plug them in, and play.
In addition - if a patch on a console turns out badly, the absolute worst that can happen is you can’t play games on the console. If a patch on a PC turns out badly, no more games, or web, or email, little Timmy can’t do his homework, can’t pay the bills online, not sure how much money we’ve got in the account anyway, etc. etc…
This doesn’t seem like a good way to compare the two, since almost no one has to decide whether to get a computer or a console. Pretty much everyone has a computer. The question is whether to get a console too. But since you already have a computer, you have to “patch” it (I’m not sure which patches you’re referring to, though) whether you have the console or not. So the fact that you have to patch it is irrelevant to the question at hand: whether or not you should also get a console.
Most of the updates for the games on the PS3 are only if you are hooked up to the internet and/or if you are going to play online. If you never hook your PS3 to an internet connection and just buy games to play locally, you would probably never need an update or patch. Don’t be afraid. Join us.
I’m a network engineer and also support multiple small companies. I know all about the joy of patches.
Now ask your mom what kind of video card she has. I’m sure she’ll know if she should go to Nvidia’s site or ATI’s site - right?
Mom doesn’t play games? What about the one she just bought for the little kid in the family? She can just go ahead and pick up an Xbox, but how much RAM does [insert name here] have?
One is designed to be easy, one isn’t. Which is a typical consumer going to buy?
If a patch on a PC goes bad, there’s a lot more it can mess up. And console games (at least until now) have had more incentive to work out the bugs before launch, since they can’t be easily fixed after the fact.
Not to mention PCs have more things to update–video drivers, sound drivers, OS updates, individual games, unrelated programs…there’s just more potential for things to go wrong.
If your PC is working fine, and you buy a new game that requires you to upgrade your video drivers, you are installing a patch (i.e. the new drivers) solely for the sake of that game. If the patch screws up your PC, so you are stuck with 640x480 16 colors, your entire PC is useless until you can fix it.
And yes, it can be an either/or - do you buy a lower end PC that does all your email, web, bills, etc, but can’t play new games at anything beyond the minimum settings (if that) AND buy a console for gaming, or do you take the same money, and buy a higher end PC that handles gaming as well?
This is not necessarily so. A mid-to-high-end rig has a decent lifespan of about 2-3 years of top-notch gaming, usually followed by a small upgrade that buys another 2 years or so. That’s about the life of a console. And the small upgrade need not cost much at all, plus that gaming PC will also handle editing your family’s videos and photos, manage your DVD and MP3 collections, balance your checkbook, and probably help out with work, not to mention email, web, etc.
An example:
I built a rig for one of my friends a few months ago: quad-core processor, a single 8800 GTS, 20-inch monitor. Total cost ~$800. That rig will play Crysis at a high resolution on medium-high settings, and if it handles Crysis it handles any other game out there right now.
In 2 years he’ll either get another 8800 GTS (probably very cheap by then) for an SLI setup, or sell his 8800 GTS and put the proceeds toward a mid-range card for around $140 (minus whatever the old card sells for).
At that time he’ll be playing BOTH older games and cutting-edge games at settings the console games being released then won’t come close to matching. By the time the new consoles hit the shelves, he’ll probably upgrade his rig again, and it’ll likely be cheaper since he won’t need new peripherals and might even keep some of his internal components. And the cycle repeats.
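To put rough numbers on that, here’s a minimal back-of-envelope sketch (in Python, just because it’s handy for this sort of arithmetic) of the either/or question posed above: one higher-end PC that does everything, versus a cheaper PC plus a console. The PC-side figures ($800 rig, ~$140 upgrade, ~$50 games) are the ones quoted in this thread; the low-end PC price, the console price, the online fee, and the game count are placeholders I made up, so treat the output as an illustration of the shape of the comparison, not actual prices.

```python
# Back-of-envelope cost sketch for the "one gaming PC" vs. "cheap PC + console"
# question raised above. PC-side figures ($800 rig, ~$140 mid-life upgrade,
# ~$50 PC games) come from the thread; everything marked PLACEHOLDER is a
# made-up number to fill in with real prices.

def gaming_pc_route(rig=800, midlife_upgrade=140, games=10, game_price=50):
    """One higher-end PC that handles email/web/work AND gaming."""
    return rig + midlife_upgrade + games * game_price

def pc_plus_console_route(basic_pc=400,        # PLACEHOLDER: low-end PC for email/web/bills
                          console=400,         # PLACEHOLDER: console price
                          games=10,            # PLACEHOLDER: games bought over the cycle
                          game_price=60,       # console games run $60+ per the thread
                          online_per_year=50,  # PLACEHOLDER: online service fee
                          years=5):
    """A cheaper non-gaming PC plus a console and its ongoing costs."""
    return basic_pc + console + games * game_price + online_per_year * years

if __name__ == "__main__":
    print("Gaming PC route:    ~$%d" % gaming_pc_route())
    print("PC + console route: ~$%d" % pc_plus_console_route())
```

With those placeholder numbers the single-rig route comes out ahead, but the real point is the one already made above: the console’s sticker price is only part of the bill once $60 games, online fees, and a second machine for everything else get counted.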
Or in other words, PC gaming is just a different beast. It’s got its ups (always on the cutting edge, able to do a LOT more than just gaming) and its downs (can be expensive and unintuitive at times).
Back to the OP: again, I do believe that if system vendors were giving Joe Schmoe a PC that could handle 3D gaming AND software developers made sure game engines scaled and stopped with the invasive copy protection, the PC would be THE gaming platform, as it once was. The market is already there. It’s built in, fer chrissakes! We need the industry to move on this.
The OP has convinced me to delete the PC game torrent I am seeding. You’re right, it’s wrong to deprive the developers of their money for a game.
Except where they release sequential ‘upgrades’ that should have been included in the original game, and end up quadrupling the price one pays to play a decent version of the game. That is really annoying.
The flipside of this is that in the console world, game developers are forced to refine their programming techniques to squeeze more and more power from the same hardware. (And there is big progress; compare the earliest PS2 games to something like God of War II or Final Fantasy XII.) In the PC world, developers can get away with wasteful and inefficient programming because enough customers will just keep buying more RAM or whatever.
This sucks when it’s actually what’s going on. And it does happen, but rarely.
What I do sometimes see now (and I’m happy to see it) is developers releasing updates that include EXTRA content. I heard people say what you just said about games like Neverwinter Nights. Nonsense! The devs have added actual game content along with bug fixes and improvements for free, sometimes YEARS after the game shipped. That doesn’t mean the game was unfinished; it was fine upon release. This is just extra support for the fans.
And this is yet another staple of PC gaming that I don’t see on consoles. Devs of (some) PC games will go that extra mile releasing new content for their games, much of it for free.
Really? Will the computer lose the ability to run the games it used to?
No, but it might not keep up with the bleeding edge. Then again, the Xbox can’t play Xbox 360 games either.
This was the point I was making: the console will continue to handle games for a few years only because it’s completely frozen, with no advancement for years in a field where capabilities change noticeably from year to year.
The quality of the technology is fixed at release and becomes more obsolete every day. And people point to this as a big advantage.
We’re coming right off a console development cycle, so the current consoles aren’t that far behind current PCs. They’re definitely behind, but it’s not a massive gap. But where will we be in 3 or 4 years? PCs will have advanced considerably and the consoles will be left way behind. It’ll be a jarring difference, like how in the last cycle the PS2 looked like total garbage after you’d played Half-Life 2.
People also greatly exaggerate the speed at which computers become obsolete and their price. People talk about spending thousands of dollars a year, when the most expensive computer I ever built, when I felt like I was splurging, was around $1200. You can build a capable core system for half that.
For what it’s worth, I’m not anti-console. I think they have their role - they’re much better for social playing… your friends can come over and play multiplayer games with you without having to bring their computers. They’re better at party games. And so if I got a console, it’d probably be a Wii - because it’s not trying to be a gaming PC, it’s playing into its role as a fun, different, more casual gaming system, whereas the 360 and PS3 seem like they’re trying to be crappier gaming PCs.
I want to add that I’m not, like… a PC fanboy. I’m not trying to be elitist and say “my chosen gaming platform is superior to yours!” like in those 360 vs PS3 type arguments.
If it weren’t for the very existence of my (vastly) preferred method of gaming being threatened, I wouldn’t be saying this stuff.
But with consoles gaining popularity, and the PC market declining (and being EA-ized), I feel like I need to defend the strengths of it.
I view the Wii as the sort of thing you’d want in a console: not the greatest hardware, but that doesn’t matter, because with the slow development cycle, pushing the edge of technology isn’t what a console does well. It just has to be good enough.
It has novelty, convenience, and party game factors going for it. It doesn’t take directly away from the PC market in my view because it doesn’t try to cut into what the PC does best. So I could see the Wii and PC coexisting quite well.
But the 360 and PS3 try to usurp what the PC does best - and apparently they’re succeeding, despite, in my view, being worse at it.