Fourth generation consoles (1989) Is PC gaming dead?
Fifth generation consoles (1993) Is PC gaming dead?
Sixth generation consoles (1998) Is PC gaming dead?
Seventh generation consoles (2004) Is PC gaming dead? checks watch yeah I guess it’s time for this question to start popping up again. At least it won’t be really frequent until the next big console generation. It’s silly enough to see it come up every six months or so, but I can’t wait until they announce the specs of the PlayStation 4 or whatever is next and PC gamers are told yet again that THIS console will kill off the PC once and for all because it is SOOO much more powerful than ANY computer ever can be.
The PC gaming market isn’t going away, and no matter how often it is said, it’s not some obscure niche market that only gets pity releases. It’s always strange to me when these debates come up. Consoles are good (though I don’t play them), PC gaming is good, and the market can support both, as each has different strengths and weaknesses.
If we’re going to do this again, it’s going to be polite and factual. Not saying it isn’t right now, but this is something a lot of people get heated about, so it usually descends into fanboy vs. fanboy. If that happens in this thread, it gets closed.
I’m not in the US, and in my part of the world (Australia) computer stuff- especially “Gaming” components- has been not so much expensive as “not as cheap as you’d think” until fairly recently. Actual computers themselves haven’t been too bad, but graphics cards and all the sort of stuff needed to actually get the proper “Gaming Experience” has been a bit costlier. Even trying to get a “Gaming” computer with respectable but not top-of-the-range specs set up is going to cost you around $1100 or so these days. The difference is, that $1100 system is going to give you a decent gaming experience for a lot longer now than its equivalent five years ago would have.
I’ll add that I’ve been working in the electronics retail industry for a long time- I know what it all costs, and even at “Mates Rates” there was still a period before about 2007 where it seemed every upgrade to the computer required something else to go with it (a new processor required a new motherboard, which meant new RAM, a new graphics card, etc), and whilst $1000 might have been a slight exaggeration, it was pretty easy to spend $600 without really trying just to upgrade an existing computer to get performance better than the “Minimum Requirements” on the new games that were coming out.
Also, games here are- and have always been- bloody expensive. If you’re spending $90 or so (and an Australian dollar is now worth the same as a US one) on a computer game, you want it to look good, and that means having a graphics card that supports higher resolutions, anti-aliasing, HDR, and so on.
And $400 a year is still the equivalent of buying a new Console every year, so as much as I’m a dedicated PC Gamer and have been for a very long time (early 1990s), the reality is that being a PC gamer does cost more than being console exclusive if you’re looking to get a decent gaming experience.
I don’t view this as a good thing. It means that games are no longer pushing a technological edge. They’re not trying to be more real or look better or model a more detailed physical world or improving AI. It means development is stuck in a rut because our target computers are fixed at the mid-low range from 6 years ago.
We used to live in a world where games would constantly amaze us with their beauty and realism and physics and AI, every year something better than the last came along. We no longer live in that world. We’re stuck. In an industry where amazing advances are made, where processing power doubles on a regular short term basis, we’ve decided that we’re not going to take advantage of that and instead choose to stagnate.
This is not a positive. If you think it’s a positive, then would you be happy if the Nintendo was the last console ever made, and the $150 you spent on it in 1985 lasted you 25 years?
Pretty much everyone already owns a PC, for normal PC tasks. So you’d subtract the cost of having a non-gaming PC that the person would otherwise have, and that marginal increase is the effective price of owning a gaming PC. Not only that, but since they’ll build a computer that’s good at everything else as well as gaming, they’ll benefit from having a smooth, fast system to do their normal computing tasks instead of a $200 shitbox that freezes every time you try to load a webpage.
Have you played some of the new release games of the past two years or so? Bioshock 1 & 2 looked great and had amazing stories, FarCry 2 looked breathtaking and was a fun game on the first playthrough or two, Fallout 3 & Fallout: New Vegas are amazing both graphically and from a gameplay POV too. There’s no shortage of PC Games that still make you go “Wow!”, and things like the upcoming X-Com reboot and Bioshock: Infinite only confirm that they’re still continuing to do impressive things with PC gaming technology- but the tech just isn’t “churning” as fast as it used to, and on the balance I do think that’s probably a good thing for the most part.
Since I don’t like consoles that much, it wouldn’t bother me in the slightest to be honest. Consoles now have only caught up to computers; for years they were way behind.
Except it doesn’t quite work that way. You can’t generally take a run-of-the-mill “Word Processor” that someone’s had for a couple of years and decide to whack in a 3D card, double the RAM, and install an i5 processor. You basically have to replace the motherboard, and the RAM, and the power supply, and then install the new stuff- so pretty much all you’ve got left from the word processing system at that point is the tower, the hard drive, the network and sound cards, and the DVD burner.
In much the same way, you can’t really take a run-of-the-mill car like, say, a Toyota Camry and competitively enter it in the Paris-Dakar Rally without doing such extensive modifications to it that all you’ve got left is a car that looks like a Toyota Camry but otherwise is completely different.
Sure, if you spend $1100 on a system now it’ll run well for everyday tasks and play games with the graphics set to “Ultra-Bling” for some time into the foreseeable future, but you and I know that because we’re computer people.
The average punter doesn’t think along those lines- they want a computer for Internet, E-Mail, Facebook, Youtube, And “Work Stuff” (usually word documents and/or spreadsheets), and they want to pay as little as possible for it. Much like your car is fine for getting to and from work, and possibly going on long drives, and that’s what you purchased it to do.
The Paris-Dakar Rally Special can do the same thing, of course, but the reality is you’re not likely to pay a hefty premium for the option to enter your car in gruelling cross-desert international rallies when the reality is that it’s not likely something you’d ever do in the first place.
I think the cost has dropped considerably however in the past few years. I just helped a friend build an HTPC that he wanted to be able to play games on. Cost him $600. Plays games at a much higher resolution than the consoles, at double the framerate, at much higher graphics settings/view distances/improved effects and lighting. And on top of that it’s a home theater box that records his TV shows and streams netflix + hulu + amazon video on demand, etc.
That’s ~$200 more than a console would have been, does more than a console ever will, and does the things a console does a lot better. I guarantee that on the next Steam sale he’ll make up those $200 in savings.
Never happen. There are essentially NO advantages.
Your point A is irrelevant. Gone are the days when “Must have this crazy superawesome PC to run this game” was a selling point. Now it’s a negative. Why did Starcraft 2 sell so well? I daresay the very low minimum system requirements have something to do with it. There’s no advantage in building games for bleeding edge hardware. That’s why people aren’t doing it. It’s expensive, and it limits your audience. And even if it doesn’t actually raise your minimum sysreqs, adding features that only the top 25% of people will “see” (in quotes because they might not even realize they are seeing them) isn’t going to see much return on investment.
Point B is just… how are they going to make games that run beautifully on next-gen hardware when they don’t even know what next-gen hardware is? Just doesn’t make any sense to me.
PC gaming isn’t “dying” but it has, essentially, completely CEASED to be the “lead” platform for big releases, and it’s never going to get that back. Multiplatform releases are going to be the rule from here on out, until some platform establishes crushing dominance. Here’s why:
It costs too much to develop for. Games as a whole have gotten a ton more expensive over the past decade. Factors of ten here. And that’s in “developmentally retarded” space that straddles PCs and current gen consoles. Developing games that actually take advantage of the features of modern PC hardware makes it even more expensive, while simultaneously reducing the size of your target audience. Bad plan. And that’s ignoring the extra cost of QA on the PC platform.
Piracy. Yup. It’s been said already, but it bears repeating. It’s way too easy to pirate software on the PC. Pretty much any buffoon who can google can do it. Pirating software on consoles, while certainly possible and popular in certain circles, requires more effort than the average lazy person is willing to put forth.
The PC will remain an excellent platform for indie developers because of the low barrier to entry that Kinthalis cited - it’s really easy to take a small team and build something cool on the PC, and you don’t have to do battle with Sony or Microsoft for licensing etc, and since your team of 1-2 programmers and 1-2 artists isn’t going to be building a monstrous 3d world with bump mapping, 3x antialiasing, dynamic lighting, blah blah etc, you sidestep all the reasons it sucks to develop for the PC - namely, extremely high costs to take advantage of all that crazy tech, and an extremely reduced market because only enthusiasts have the tech to run your game. The indie space is going to be increasingly representative of what people think of when they say “PC gaming” over the next few years. Or at least, I hope so, because otherwise, the space is going to stagnate further.
It’s all about cost. Developing a AAA title for the PC only is no longer really cost effective unless you are a Blizzard - which is to say, you have Scrooge McDuck levels of capital that allow you to develop the game “until it’s done” even if that takes 5+ years, and you are essentially your own publisher, so you get every cent the game makes when it’s finished. This is, essentially, an entirely unrealistic picture of the situation for most developers. (Aside: I’m really curious how much money StarCraft 2 has actually made for Blizzard so far. Yeah, it sold a lot of copies, but it must’ve cost a mint to develop.)
Oh, and for the folks who keep holding up Steam as why PC gaming is thriving… I have a question. How many of the games you’ve bought on Steam have been for full price? I get the distinct impression from reading this thread that a lot of people are, like me, using Steam as a sort of “bargain bin” where you pick up all the stuff you missed out on for, essentially, pennies on the dollar. While this is great for people playing PC games, again, I’m not sure how much revenue it’s really generating for the developers. While I’m sure there are titles (Civ V, stuff by Valve) that come out on Steam and sell a lot of full priced copies new, I suspect most of the stuff people buy off Steam is part of the great “Steam Sales” that sell things for mighty cheap. And well, a glorified, very nice bargain bin is great for bargain hunters, but I’m not sure it speaks well for the health of the platform overall.
I don’t think it’s dying, and personally, I see it as a viable platform for my gaming for the first time in quite a while. Here’s why:
I have two things going against me when it comes to PC gaming. First, I’ve always been a console guy; I have no patience for games not working, for troubleshooting my computer to play a game I just bought, etc. I tried PC gaming in the past (the distant past, early to mid-90’s) and always found it to be an enormous hassle, despite the fact that back then I had a pretty sweet computer for the time. So I went all console, all the time. Pop in a disc and it plays. Easy peasy.
Second, like a lot of people, I switched to Macs several years ago, and obviously gaming didn’t play any part in that decision, so I just wrote PC gaming off completely.
When Steam was released for the Mac a few months ago, I decided to give it a try. It’s been nothing like I remembered PC gaming to be. I just counted, and in the few months it’s been available I’ve bought 15 games off Steam, mostly casual, but some (older) AAA titles. With one exception, every single one of them has been trouble free. The one exception was a release that the developer completely botched and fixed with a patch the next day. I can live with that.
My hardware is pretty limiting due to age, so I can’t run anything terribly recent. But the experience has been positive enough that I’m already thinking about upgrading my computer earlier than I would otherwise need to so I can run some of the newer stuff, especially Starcraft and Civ V once the Mac version hits later this month. I’m really hoping to see more stuff come out for the Mac (so far so good, but obviously it’s still 10-1 in favor of Windows), but I can boot into Windows when necessary.
Anyway, my recent impression is that PC gaming is better and easier/more trouble free than I ever remember it being, that Steam is completely awesome, and that my old biases against PC gaming are mostly gone. It’s still not perfect, and I’m still buying the big recent games (like FO:NV) for the Xbox due to my aging hardware, but once I get a newer computer, I can see converting to being primarily a PC gamer.
As someone who mostly plays video games socially these days, there’s one very clear advantage to consoles: a one-time expenditure of <$500 gets you the ability to play games with three of your friends (up to as many as six in the case of some titles, such as Rock Band 3). I can only speak anecdotally, but it seems like most of the people my generation (mid-20s) and younger view this as their primary mode of gaming. Big-screen TVs are much more amenable to this style of play than trying to organize a LAN party or clustering around a computer monitor.
That being said, PC gaming is clearly not dying. It was comatose for a while, but as numerous folks have pointed out, the rise of Steam, MMOs, and casual games have reinvigorated the PC market. And of course, certain game types just work better on PC - there’s a reason why RTS- and MMO-driven Blizzard is the most successful PC developer in existence.
What I think may be dying are the AAA juggernauts of yore - the graphics-driven shooters and the like. Whereas ten years ago, companies like Id and Epic were seen primarily as PC game developers, today they’re either multiplatform or viewed as technology developers. Epic, for example, rose to prominence on the sales of Unreal Tournament and its sequels, but its most recent success has been the console-centric “Gears of War” series and the continued development of the Unreal engine (rather than actual Unreal games). Even Valve, surely the most PC-centric of the top-tier shooter developers, is solidly multiplatform now, as its recent games have had simultaneous releases on PC and consoles.
The fact that PC gaming involves so much less hassle nowadays is definitely one of the reasons the PC market is growing.
That plus Steam, D2D, GOG.com, and other digital download providers are tempting people to try gaming on the PC, and like in the case of thefifthgear this might lead to people trying more hardcore games, or even upgrading their rigs.
Moreover, once they try it they are likely to see the difference, which is huge, when it comes to playing games on the PC vs on the console.
Playing games on the console, just feels archaic by comparison. Take a simple example:
I feel like playing Peggle. My comp is always on, or in sleep mode if it’s not in use currently, so I wiggle my mouse and I’m playing in seconds. Then I feel like some StarCraft and boom, I’m busting Zerg a few seconds later. In game I get an IM from a friend telling me to get my ass in gear, it’s time to kill some zombies. Bam! I’m in the Left 4 Dead lobby ready to play in a few seconds.
On the consoles that set of actions would entail waiting for the console to boot, looking for various DVDs, and waiting for a ton of super long loading screens at each of those steps. I have a limited time to play games, as do most people, and on the PC most of that time can be dedicated to actually playing.
Are these things that much better than, say, Oblivion and the other stuff that was coming out in 2005? None of them even come close to Crysis, which is now going past 3 years as the most advanced game ever made. And since Crysis 2 is going to be consolified (and it’s something to watch - they claim they’ve come up with a dev process that will use PCs to their full potential while still making console versions - if anyone has the technical know how to pull it off, it’s Crytek) there’s a good chance we’re not going to see a game more technically advanced than Crysis until the next batch of consoles in 2014+ when the systems are close enough to develop technically modern PC games again. In the past, games typically had 6 months when they were the most beautiful, realistic, advanced games in the world - now we’re looking at stagnation for most of a decade.
That’s the problem. They’re still way behind - 6-7 years for computers is a huge gap of time. Not only are they using old technology, but they’re getting by with the minimum hardware they can. The xbox 360 has 512mb of total ram, shared between the system and the video card. It has a CPU roughly equivalent to an old single core A64 3000. But they use DirectX and so they can still run the same code that PCs can - at least if you limit yourself to dx9. Which means we’re stuck with systems that are similar enough to develop for both platforms, but where one massively outclasses the other. The similarity between consoles and PCs since the original xbox (where consoles have just become a little shitty old PC in a box) is the problem here. If consoles had different hardware and weren’t basically shitty PCs, we’d have a different sphere of game development, and the existence of consoles wouldn’t make my games worse. This is how it was for many years, and back then I didn’t care about consoles.
See, I don’t approach this subject as some sort of Chevy vs Ford debate or something where people get riled up but ultimately who cares. It’s not a recreational argument. It’s not me trying to prove my preference is better. It’s the fact that consoles are now negatively impacting my favorite hobby, with co-development forcing the dumbing down of my games, that pisses me off. I only care about consoles, and people who use consoles, insofar as they actively harm my hobby.
Yeah, if they buy a $200 best buy computer that barely functions, that’s true. But a lot of people own PCs for general PC tasks that have decent core 2 or i5 processors in them, 4gb of ram, etc. In those cases you only really need to stick a $150 video card in there and suddenly it’s massively better than any console.
You’re exaggerating how much hardware is needed to be cutting edge. ATI has sold 25 million 5xxx series cards, and they’re all advanced enough (except maybe the extremely low end ones) to use all of the fancy new technologies we’ve developed in the last few years. A mid-range general purpose PC (core 2 quad, 4gb ram) that someone might own for general computing plus a $150 video card will be advanced enough to use all the features of an advanced game.
Next gen console hardware is just going to copy what PCs are doing now and use the cheap/low end of it. So develop games for PCs and by the time the consoles catch up with the next batch you can port the games over to them.
I agree that piracy is a problem, but why is piracy discussed so much and second hand markets for console games rarely discussed? They both prevent the original publisher from making money off a person while letting that person enjoy the fruits of their labor.
The money that the original publisher and devs see from digital sales is way higher than retail. If they sell a $50 game in retail, they get something like IIRC $8-10 back to the publisher, who then gives the developer a cut. If they sell it online, they get more like $35. So even if the publisher sells them at 2/3rds off online, they make about the same money per sale, but because of the lower price, they actually sell more of them. So they’d actually make more money selling a game for $15 on steam than $50 at retail.
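To make that arithmetic concrete, here’s a quick back-of-the-envelope sketch. The $8-10 retail figure and the ~$35 digital figure are the poster’s "IIRC" recollections, and the 70% digital share is an assumption derived from them, not a confirmed number:

```python
# Rough comparison of per-copy publisher revenue, retail vs. digital,
# using the unconfirmed figures quoted in the post above.

RETAIL_CUT = 9.00        # ~$8-10 back to the publisher per $50 retail copy
DIGITAL_SHARE = 0.70     # assumption: ~70% of a digital sale reaches the publisher

def publisher_revenue_digital(price, share=DIGITAL_SHARE):
    """Publisher's per-copy revenue for a digital sale at a given price."""
    return price * share

# Full-price digital sale: $50 * 0.70 = $35 per copy
full_price = publisher_revenue_digital(50.00)

# Digital sale at 2/3 off: about $16.67 * 0.70, roughly $11.67 per copy,
# which is still more than the ~$9 a full-price retail copy returns.
discounted = publisher_revenue_digital(50.00 * (1 - 2 / 3))

print(f"Retail full price:  ${RETAIL_CUT:.2f} per copy")
print(f"Digital full price: ${full_price:.2f} per copy")
print(f"Digital 2/3 off:    ${discounted:.2f} per copy")
```

Under these assumptions a deep digital discount still beats a full-price retail sale per copy, which is the point the post is making.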
Steam also allows small studios to become their own publisher - if you make a game and you want to sell, in the past, you’d have to sign up with a major publisher and have them take the majority of the revenues away from you. Now instead steam will take their cut (which IIRC is somewhere around 20-25%, much better deal than you can get from any publisher) and the developer gets the rest of it, not the publisher.
And… people are way more willing to buy games at reduced prices. I’ve bought a lot of games at $5 and $10 that I never would’ve bought at full price - so the creator of the game gets some of my money rather than none. Since a lot of people are in the same boat, they get money from a lot of customers that would’ve otherwise not bought their game. If these sales were not profitable, publishers wouldn’t do them.
So even though I don’t spend much money on individual games, I have 200+ of them and I’ve spread lots of money around to the game developing community - and they got more of it than if they went with retail middlemen.
Yeah, and this is fine by me. This is what consoles should be, and used to be, and if that’s what they were now, I’d be fine with it. But it seems like the console market is dominated not by the casual multiplayer games but stuff that was traditionally PC territory like shooters. I mean, yeah, you can do local multiplayer with shooters, but everyone crammed into their own 240x150 pixel boxes at 20 FPS on a TV with awkward ass controls seems more like torture to me than fun.
No, I’m just defining “cutting edge” more accurately than you. Cutting edge is near the best of the best. It’s essentially the best of the best behind the truly absurd “bleeding edge”. People who buy the best video cards currently available are cutting edge. People who buy those cards and then overclock and liquid cool them are bleeding edge. Anything else is not cutting edge. Now you can argue that you don’t need to be “cutting edge” to be better than a console, and that’s certainly true, but one can also argue that playing games on that hardware is as much holding back PC development as playing games on consoles. (i.e. if you’re going to say “you’re not taking advantage of the platform!” does it matter by HOW MUCH you’re not taking advantage? Especially if what you are producing is still apparently good enough to sell to millions of people.)
I don’t think there’s a solid basis for this assumption, since only one of the three consoles this generation uses hardware that resembles a PC in a meaningful way. Certainly, it strikes me as a bad gamble to bet on. Fine, fine, the Xbox720 will probably resemble a PC, but what PC will it resemble? And why would you need to do anything special to “refine the games” and “create rock solid releases” if that’s what you’re planning on? The point is irrelevant.
I expect it’s the fact that piracy means that one person gets the game somehow and then spreads it to a nigh infinite number of potential pirates, whereas a used sale transfers the game to a single other person. Is it a problem? Well, depends on who you ask (there’s some good logic in the argument that people don’t sell their old games to buy food - they sell their old games to buy more games), but I expect it’s just not on the same order of magnitude. After all, if you sell 150 copies of your game, without piracy, there are, at most, 150 copies of your game out there for people to buy. With piracy, there’s a functionally infinite amount.
I think we need to differentiate things like Steam from “direct” online purchases here, because I suspect the differences are significant. Though I don’t really know for sure - do you have a cite for how much of a cut Steam and its ilk are/are not taking? Otherwise we’re both just making numbers up.
This is true, but again, mostly of value to the “little guy” and not outfits that make AAA titles. EA doesn’t have to worry about getting a publisher.
Don’t get me wrong - I’m not saying they’re not profitable. But there’s a big difference between “Modern Warfare 2 sold X million copies” and “Well, we had a sale on Steam and sold a bunch of copies at 75% off.” The AAA model depends on the former, because heavily discounted sales a ways down the line aren’t going to produce the revenue stream they need when they’re investing tens of millions of dollars in their games.
Again, great for the little guy, not good enough for the blockbuster.
But yeah. I’m not happy with the FPS convergence either. It used to be that the titles that came out on consoles were weird and interesting stuff that I couldn’t get on the PC, frequently in genres that basically didn’t exist over there. Now it’s the same old junk I’m not into on the PC either. (I’m not an FPS player, so seeing them as the primary game type on consoles makes me sad, because I’d rather see games I actually want to play over there. Sadly, I don’t see this changing.)
Functionally, technology has brought us to the point where “top end” games just cost too much to develop to be sold on a single platform.
I wonder if the new usage of online registration has cracked the PC piracy issue?
High-end PCs still offer a gaming experience consoles cannot match. Triple-monitor gaming, anyone? Even ordinary PCs excel at certain types of local multi-player gaming. Playing over the internet is a great leveller, of course.
Yeah - I’m not sure how much of an advantage the PC has here. Consoles have sleep modes as well. And firing up a small, downloaded game from XBL or PSN (or the Wii equivalent - I believe they have one) is only a few pushes of an analog stick away. Unless you have a shortcut to every program you own on your desktop, PCs have got menus to navigate too.
There are no loading screens in PC games? Really? :dubious: I guess I’m playing the wrong games on my PC, because they all have them.
Granted if you have every game installed via Steam directly to your hard drive, you do not have to mess around with putting a DVD into the tray. But do not make this step seem as difficult or onerous as an infomercial makes a routine, mundane, everyday task without the latest $19.95 (plus processing and handling) gadget.
Ok, fine, there are people who cool their $1000 video cards in liquid nitrogen. Who cares? What game has ever required this level of technology? None - not even close.
When I say taking advantage of the platform, do you think maybe I’m talking about the 6 years of hardware development since the release of the xbox 360 in which computational power of GPUs has multiplied several times? Or do you think I’m talking about people inventing games for people who kidnapped Taiwanese kids to hack them together a series of 18 video cards?
The technology in the current generation of consoles wasn’t even current when they were developed in 2004. 512mb combined vram and system ram would’ve been absurdly low end even then. We’ve come so far since then that even the crappiest PC with a real video card will run circles around a console. Cutting edge doesn’t need to be a factor - dead center middle of the road average systems are 4 or 5 generations ahead.
PC games have always been very scalable so that people with high end rigs can run it maxed out and people with low end ones could run it on low/medium settings.
The idea that you need to be absolute bleeding edge in order to keep up with PC game development has never been true.
Yeah, we can exclude the Wii out of this, since it’s not trying to be a shitty PC in a box like the PS3 and 360. I have no problem with the Wii.
But the 360 and PS3? The PS3’s GPU is a crippled GeForce 7800 (half the ROP units IIRC), and the Xbox 360 has basically an ATI X1800 with a few features from the 2xxx series. The Xbox 360 is designed to run recompiled x86 code (I think) - I don’t know about the PS3. They use DDR DRAM. Their bus systems are similar to PCs.
Anyway, the guy’s point is that you can make PC games that are up to date with current technology and then when the next generation of consoles are out, you could release it on those platforms. That happened a lot with the original xbox, with old pc games getting released on xbox.
I’m having trouble finding a definitive cite. Valve is a private company and doesn’t disclose a lot about their business. The wiki page says “Gabe Newell, CEO of Valve, estimated in 2002 that $30 gross profit can be made from a $50 game sold over Steam, much greater than the $7.50 profit made from games sold through retail.” That’s an old quote, so it may be outdated, but it’s in line with the other claims I’ve seen.
This article talks about how sales not only increase the units sold but the actual revenue due to selling more units than the amount is discounted by. It also apparently re-energizes the community and leads to full priced sales of the game after the discounted period.
If digital sales have no sunk cost like retail shelf space, packaging, production, etc. then what does it matter if you’re selling your games at 2/3rds off if you’re selling 4 times more units? You still make more money.
Yep, like I said, I have no problem with the Wii because it’s more in line with what I think a console should be. I wouldn’t care about consoles if they stayed in their own sphere of gaming.
The irony here is that PCs have developed a lot of new ways to save development time recently. DirectX 10 and DirectX 11 were at least half as much about making development easier as they were about advancing the technology. And yet we can’t use these time/resource saving technologies with consoles because they’re stuck in the past.
To add to this, there’s a degree of sunk cost with retail distribution. If all of the production, shipping, shelf space, etc. ends up costing, say, $20, then on a $50 sale you’d make a $30 profit (which then has to be split with the retailer and other middlemen). If you were to discount the game to $25, you wouldn’t be losing half your profit - because of that sunk cost you’d be losing $25 of your $30 profit, leaving a $5 profit. When a retail store marks a game down to be very cheap, it’s not because it’s profitable, but because they’re tired of wasting the shelf space on it and they want to at least recover some of the cost.
Digital distribution has no such limitations. The costs of distribution are negligible. So you can play around with the price/units sold ratio in any way you want. Maybe you discount the game 1/3rd and 50% more people buy it. Profit. Maybe you discount it 90%, and 12 times as many people buy it. Profit. I don’t know what the ideal is, but you aren’t limited by conventional wisdom because you don’t have the same limitations as a retail sale.
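A toy model of the two posts above, with all numbers illustrative assumptions rather than real sales data (the $20 per-copy retail cost and ~25% storefront cut are the figures floated earlier in the thread):

```python
# Toy model of retail vs. digital discount economics.
# All inputs are illustrative assumptions from the thread, not real data.

def retail_profit(price, unit_cost=20.00):
    """Per-copy profit when ~$20 of production/shipping/shelf cost is fixed."""
    return price - unit_cost

def digital_profit(price, store_share=0.25):
    """Per-copy profit when the storefront takes a ~25% cut and
    distribution cost is effectively zero."""
    return price * (1 - store_share)

# Retail: marking a $50 game down to $25 wipes out most of the profit,
# because the $20 per-copy cost doesn't shrink with the price.
print(retail_profit(50.00))  # 30.0
print(retail_profit(25.00))  # 5.0

# Digital: total profit scales freely with units sold at any price point.
full = digital_profit(50.00) * 1000       # 1,000 buyers at full price
deep = digital_profit(5.00) * 12 * 1000   # 12x the buyers at 90% off
print(full, deep)
```

With these numbers, the 90%-off digital sale actually brings in more total profit than the full-price run, which is why the "discount it 90% and 12 times as many people buy it" scenario is a win online but would be a disaster at retail.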
I suspect the current scheme, where they sell it at full price for the people who anxiously waited the game, and then a few months later sell it at a heavy discount where several times more people buy it, is close to optimal.
This might be a hijack, but here’s something I don’t really understand: how is it that my Xbox 360, which is apparently 2004 technology, can easily play modern games like Fallout, Left 4 Dead, etc, but my 2006 iMac, which as I understand it has a relatively decent processor/GPU/ram for the time, struggles to play even stuff running on the old Source engine, which Wikipedia tells me was released in 2004?
FTR, since someone is bound to ask: the system specs are 1.83GHz Intel Core Duo, 2gigs ram, ATI x1600. I know it wasn’t designed to be a cutting edge gaming machine, and I’m not expecting it to play anything recent on high settings, but I’m surprised it can’t keep up with stuff that is older than it is. Especially when the 360 can, apparently with lower specs.