Inspired by this thread in GD in which the contention is made that a console can never be as powerful as a PC because the console hardware is outdated when it hits shelves and cannot be upgraded.
So what, if anything, is stopping console makers from having an upgradable GPU? I’m picturing something that would function much like a Nintendo cartridge (as I assume such technology would be proprietary). You take out the old GPU, slide the new one in, power on, and voila, your “old” console jumps up a generation in graphics. Throw some flash memory on it and the firmware could auto-update, along with any other graphics updates a company might wish to include for older games (pipe dream, I know; who wants to work on old games?). These GPU upgrades don’t need to be top-of-the-line, bleeding-edge hardware. Indeed, they should be rock-solid, tested and approved for full compatibility, which should be easy for NVIDIA/ATI to do since they’re always dealing with the same hardware base.
Modern systems like the Xbox 360 and the PS3 are pretty much computers anyhow, so why not take it a step further and make their hardware upgradable?
Who cares? I play on a console so I DON’T have to worry about that crap.
First, it doesn’t work like that. You need much more than a processor and firmware.
Yes, this exactly - it would break the whole point of being a gaming console rather than a computer.
If I buy a PS2 or Nintendo DS game, I know it’ll work on my system - unless it’s an expansion for something like Guitar Hero and requires a peripheral I don’t have.
Now, imagine if the processors could be changed out… then games would be developed for the new processor, and being ‘PS2’ or ‘DS’ games wouldn’t mean I could definitely play them anymore.
Besides the obvious technical problems of having multiple hardware standards to develop for, test on, etc., it doesn’t make much financial sense.
Add-ons and upgrades are rarely purchased by a significant portion of console owners. The only ones that get serious traction are things that:
- Are orthogonal to games, like adding an HD-DVD drive to an Xbox or a better remote to a PS2 to watch movies.
- Come with the game they’re required for, like special controllers.
Anything else, and most customers won’t buy it. If they don’t buy it, most developers won’t be willing to put the extra effort into developing for it. If enough people really do buy it, you end up with a split market, where some games are really tailored to one set of hardware, and others to another. At that point, you get confusion among your customers.
It’s a much better plan to just make a completely new system, and if you want to have backwards compatibility with the old one, design that in.
Nintendo already tried this with the N64: it was the Expansion Pak. You swapped out the old pak and put in the new one with more RAM. The problem was, it came near the end of the console’s life cycle, and very few companies tried to utilize it. They still had to design for people who didn’t have the Expansion Pak, so it only got us higher-res graphics. So the companies who were good at using the equipment - mainly Nintendo and their second parties - put out a few good games with higher-res graphics. Big deal. It was nowhere near as revolutionary as the Super FX chip was, which brings me to my next point.
For compatibility reasons, developers would go the other way around and make the add-on part of the game rather than the console, like the Super FX chip I mentioned. Nintendo stuck the graphics hardware in the cartridges themselves, but because of that, the carts were very expensive and not many games were made with it.
The Sega Saturn had a RAM cart too. I’m not quite sure how it managed to catch on so well, but I think it was because some killer apps needed it. The thing is, the Saturn was much more popular in Japan, and Capcom’s fighting games were known for being better than their PlayStation counterparts partly because of the RAM cart.
Add-ons and upgrades kind of went the way of the dinosaur after Sega’s disastrous Sega CD and 32X and Nintendo’s botched handling of the 64DD.
I suppose what could happen would be to take your machine in to a business that could gut it and replace several components, while keeping the hard drive (games aren’t increasing their disk-space requirements as fast these days), your settings, the casing, and the motherboard (which, like the HDD, doesn’t need swapping as often). It could be cheaper for consumers, more profitable for the sellers, and it could speed up customer buy-in, ensuring a good harvest.
If the developers changed their development significantly to capitalize on the features and power of the new hardware, they may alienate people who don’t own the newer hardware. If they only make minor changes (like adding better post-processing effects) that don’t fundamentally change the function of the game, it may not be seen as much of an upgrade.
Console users seem to like to brag that being locked into a certain level of hardware for years (in an industry where hardware advances come fast) is an advantage, so I’m not sure there’s even demand for this.