Justification for the exponential increase in Windows "minimum requirements"

This has been bothering me for a while. I say “bothering”, but I haven’t been losing sleep or anything; it just confuses me.

Why does each new version of Windows require so much more “power” than the previous version while adding virtually nothing, at least on the surface? I compiled this list of recommended requirements for recent versions to illustrate the point:

My question is basically this: why are so many more system resources needed for each new version? I struggle to think of anything I can do with XP that I couldn’t do with previous versions that would justify the increase, and Vista further confuses things with a massive 2GB of RAM recommended.

Is it all bells and whistles that are pushing the requirements up or are there significant changes in the “background code” of the operating system that require more resources?

Could we theoretically have a version of Vista that forgoes the pretty façade (giving it the Windows Classic look a la Windows 2000) whilst retaining the improved security and stability of the code, but requiring far fewer resources?

That pretty façade costs almost nothing more in terms of resources than the Windows Classic look. The real cost of running an operating system comes from everything it’s doing under the hood. That’s not to say that XP and Vista aren’t bloated, mind you; you could have an OS that does everything they do that runs on less powerful machines.

Can you please elaborate on what is going on “under the hood”? Specifically, what is going on under the hood in XP that wasn’t there in 98, and what will be going on under the hood in Vista that wasn’t there in XP?

I’d have to disagree with this. Everything I’ve read indicates that you need a high-end graphics card, probably another 512MB of memory, and a somewhat faster processor to run Aero. All that translucency and animation comes at a cost.

Some of the Vista requirements may come from its ability to run 64-bit code, but I’m guessing software bloat and bells and whistles account for a great deal of it. There is also the tendency for core technologies to migrate from application space into operating-system space, e.g. Windows Explorer.

I’m not sure about Vista, but I believe that Windows XP added some big thing where pretty much every Windows object would convert everything shared into XML, and then back again for internal processing. Similarly, most any Windows or MS object can be loaded and called from any scripting language, which requires that everything be in proper COM format. Etc.

MS keeps adding new object formats and standards (like COM or .NET) that supposedly make it easier for the whole system to be viewed as a big Object Oriented Environment where everything can be accessed from any language and by humans too.
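
To make the “callable from any language” claim concrete, here’s a minimal sketch in C of COM late binding - the same mechanism a script engine uses when it executes CreateObject("Scripting.FileSystemObject"). The object and its FileExists method are real; everything else (the probed path, the stripped-down error handling) is just for illustration.

[code]
/* Minimal COM late-binding sketch. Compile as C and link
   ole32.lib, oleaut32.lib and uuid.lib. This is roughly what a
   script engine does under the hood for CreateObject + a method call. */
#define COBJMACROS
#include <windows.h>
#include <objbase.h>
#include <oleauto.h>
#include <stdio.h>

int main(void)
{
    CoInitialize(NULL);

    /* Turn the human-readable ProgID into a class ID via the registry. */
    CLSID clsid;
    if (FAILED(CLSIDFromProgID(L"Scripting.FileSystemObject", &clsid)))
        return 1;

    /* Create the object and ask for its scripting interface. */
    IDispatch *disp = NULL;
    if (FAILED(CoCreateInstance(&clsid, NULL, CLSCTX_INPROC_SERVER,
                                &IID_IDispatch, (void **)&disp)))
        return 1;

    /* Resolve the method name "FileExists" to a dispatch ID at run time. */
    OLECHAR *name = L"FileExists";
    DISPID dispid;
    if (FAILED(IDispatch_GetIDsOfNames(disp, &IID_NULL, &name, 1,
                                       LOCALE_USER_DEFAULT, &dispid)))
        return 1;

    /* Call FileExists("C:\\Windows\\notepad.exe") through Invoke. */
    VARIANT arg, result;
    VariantInit(&arg);
    VariantInit(&result);
    arg.vt = VT_BSTR;
    arg.bstrVal = SysAllocString(L"C:\\Windows\\notepad.exe");
    DISPPARAMS params = { &arg, NULL, 1, 0 };

    if (SUCCEEDED(IDispatch_Invoke(disp, dispid, &IID_NULL,
                                   LOCALE_USER_DEFAULT, DISPATCH_METHOD,
                                   &params, &result, NULL, NULL)))
        printf("FileExists: %s\n", result.boolVal ? "yes" : "no");

    SysFreeString(arg.bstrVal);
    IDispatch_Release(disp);
    CoUninitialize();
    return 0;
}
[/code]

Note how much machinery is involved: every string crosses that boundary as a BSTR, every value as a tagged VARIANT, and every method name is resolved at run time - which gives some idea of where the overhead goes.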

Overall, pretty much all this has accomplished is that

  1. Windows is easy to make viruses and trojans for, because so many components are designed to run scripts (and scripts, of course, get a nice handy interface to anything and everything on the system).
  2. Almost no one except virus writers is actually aware of how much of Windows is scriptable, since they are the only ones who can’t just use the GUI like everyone else. Which means the documentation on Windows scripting stays arcane and out of date, since no one uses it (publicly) or asks about it.
  3. Everything is bloated with capabilities nobody needs.
  4. What functionality there is remains unclear even to most people at Microsoft, so they end up rebuilding similar things (with similar interfaces) that are redundant with the old stuff, while leaving the old stuff in place to preserve backwards compatibility.
  5. They’ve created and recreated various waaaaay overblown techniques for making Object Oriented applications. Given that they’re making the only compilers and script engines that use these objects, there is no great need for this great all-encompassing object that requires: a machine version of the data, an ASCII representation, and an XML version; a Global Resource Identifier, Local Resource Identifier, Universal Resource Identifier, and so on; CAB formatting, with security clearances that no one knows how to set; embeddability in a VB app, regardless of whether it actually has any visual component; and wide characters, even though your compiler doesn’t understand wide characters (so you have to convert everything), most Windows functions don’t understand them, and no code outside of random Windows/COM/.NET functions even uses them (see the sketch after this list); etc.
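
To give a flavour of the wide-character chore from item 5, here’s a minimal sketch in C of the conversion dance an ANSI-era program goes through before it can call one of the W-suffixed Windows APIs. The path is a made-up example.

[code]
/* ANSI -> UTF-16 conversion, needed before calling any W-suffixed API. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    const char *narrow = "C:\\Program Files\\Example";

    /* First call: ask how many wide characters the result needs
       (the -1 means "the input is NUL-terminated, count it all"). */
    int needed = MultiByteToWideChar(CP_ACP, 0, narrow, -1, NULL, 0);
    if (needed == 0 || needed > MAX_PATH)
        return 1;

    /* Second call: perform the actual conversion. */
    wchar_t wide[MAX_PATH];
    MultiByteToWideChar(CP_ACP, 0, narrow, -1, wide, needed);

    wprintf(L"converted: %ls\n", wide);
    return 0;
}
[/code]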

Overall, 90% of the overhead, in terms of both size and run time, of MS/Windows apps goes to fulfilling a bunch of standards that are haphazardly implemented, poorly understood, serve no useful purpose to anyone, and are so ill-documented that you couldn’t use them properly even if you wanted to.

Does this mean I’m going to HAVE to have 2GB of RAM, or can I get by with less?
And what’s the cost of off-the-shelf Vista (when it comes out)?
Do you think it will be worth the extra cost of RAM and the purchase price?
I’m pretty happy with Win2K and 512 MB of RAM.

These answers all seem to miss the point. Microsoft and computer manufacturers have a pretty tight relationship, and it’s still fairly hard work to avoid Windows - at least if you’re using multiple software applications and working with other people who use them. Windows demands more because Microsoft can get away with it, and because it makes you spend more.

I bet people spend more money on Windows while wishing they could avoid it than on any other product.

I can’t, really, because 98 and XP have very little in common under the hood. They’re different operating systems, with the only real connection between them being that they look similar and are both called Windows. I haven’t seen much on Vista’s architecture yet, but I expect it’s closer to XP than XP is to 98.

That hasn’t been my experience running it.

Hardware’s cheap. Most people use Windows; the only real competitors are Macs and Linux, both with tiny market shares by comparison. What impetus is there for reducing the minimum requirements?

I’m glad this question was asked. I’ve been wondering this for quite some time. Windows 95 ran fairly well on my old 486/66 back in the day, whereas Vista requires about 20 times that. By contrast, FreeBSD 4.0 required a 386, while 6.1 (currently the newest version) only requires a 486. (It probably could still run on a 386 if the 386 code hadn’t been removed because so few people run 386s nowadays.) I guess applications such as window managers have increased their requirements by a greater margin, but still nowhere near Windows, despite greater changes. I always figured the code in Windows simply got sloppier, with fewer low-level optimizations - like how practically no one optimizes their code in assembly language nowadays because hardware is so much faster.

The same thought occurred to me, snailboy. I have no cite to back this up, but I have always thought that each iteration of Windows reused much of the code from the old version, with some additions and alterations. Great advances in hardware have happened and continue to happen, and surely code needs to be changed to take full advantage of the new technology. I have always thought that if Microsoft started with a blank piece of paper and built a completely new OS from scratch, it would run twice as well (in terms of ‘smoothness’ and stability) and be far less of a resource hog. Thinking about it, the whole Millennium Bug thing came about because people were still using code designed to run in under 1MB of RAM at a time when 1GB of RAM was nothing unusual. To my mind this is pretty damning evidence that computer programmers are lazy beasts by nature and would rather reuse old code wherever possible than start from scratch and produce code designed to work on today’s far superior technology.

Typically, each new release is optimized for the CPU speed and RAM of what the developers predict will be present in a new low-end computer at the time that the operating system is released. Most operating system sales are made to OEMs who bundle the software with new computers, not to individuals or companies who are upgrading old computers.

Not only do we do that, but there’s no serious argument that we shouldn’t. It has nothing to do with laziness and everything to do with the cost of software development. Going back to your example, Microsoft might be able to start over and write a new OS that worked significantly better than any current version, but it would take ten years to develop and probably be priced out of the consumer market.

Security and stability? Is that really what they’re promising? I was expecting they’d try to wow you with the big round buttons that all end up saying it’s not recognizing your hardware.

Not really.

So long as they kept the interface for all the drivers the same, they could rewrite the kernel in a couple of months. Most likely they have rewritten it several times. What holds them back from fixing things is the goal of preserving backwards compatibility. For each poorly implemented bit of functionality they ever shipped, there is some X percent of all the apps out there that depends on it staying just like that. Thus, they have to keep old interfaces and methods of dealing with data alive, like the Y2K functionality.
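
To illustrate the interface point with a toy sketch (the names and “versions” here are invented purely for illustration): as long as the struct of function pointers below keeps its layout, everything behind it can be rewritten without touching the code that calls through it.

[code]
/* Toy illustration: a fixed interface with swappable implementations. */
#include <stdio.h>

/* The "contract": as long as this struct layout never changes,
   callers built against it keep working. */
typedef struct {
    int  (*open_dev)(const char *name);
    int  (*read_dev)(int handle, void *buf, int len);
    void (*close_dev)(int handle);
} driver_ops;

/* Original implementation. */
static int  open_v1(const char *n)         { printf("v1: open %s\n", n); return 1; }
static int  read_v1(int h, void *b, int l) { (void)h; (void)b; return l; }
static void close_v1(int h)                { (void)h; }

/* A complete rewrite behind the same interface. */
static int  open_v2(const char *n)         { printf("v2: open %s\n", n); return 1; }
static int  read_v2(int h, void *b, int l) { (void)h; (void)b; return l; }
static void close_v2(int h)                { (void)h; }

static const driver_ops kernel_v1 = { open_v1, read_v1, close_v1 };
static const driver_ops kernel_v2 = { open_v2, read_v2, close_v2 };

/* A "driver" written against the interface runs unchanged on either. */
static void use_driver(const driver_ops *ops)
{
    char buf[16];
    int h = ops->open_dev("disk0");
    printf("read %d bytes\n", ops->read_dev(h, buf, sizeof buf));
    ops->close_dev(h);
}

int main(void)
{
    use_driver(&kernel_v1);
    use_driver(&kernel_v2);  /* same caller, rewritten internals */
    return 0;
}
[/code]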

The only reason that Y2K wasn’t a problem is that not just Microsoft but everyone in the world got together and decided to dump the old standard at the same time. But no one company can unilaterally start dumping all the old compatibility. (Or so goes the argument. Personally, I would argue that given MS’s position in the market, they could do just that successfully.)

Much of the (potential) Y2K problem was in mainframe COBOL applications that were written back when disk space, or even tape, was the limiting factor. I’ve seen a few projects that tried to rewrite a code base with core functions dating back to the early seventies - and none of them were very successful. For businesses that still need to service legacy products from decades ago (mostly financial companies, AFAIK) there isn’t really any financially prudent solution other than retaining the legacy code. The only remaining knowledge of the legacy business lies in the code - all the original development and management teams have retired. (Documentation? Hah!)
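
For what it’s worth, the cheap fix those projects usually settled on wasn’t a rewrite but “windowing”: keep the two-digit years in the data and pick a pivot to decide which century each one means. A sketch in C; the pivot of 70 is an arbitrary choice here, real systems picked it per application.

[code]
#include <stdio.h>

/* "Windowing" fix for two-digit years: values below the pivot are
   read as 20xx, values at or above it as 19xx. The pivot (70 here)
   is an arbitrary example; real systems chose it per application. */
static int expand_year(int yy, int pivot)
{
    return (yy < pivot) ? 2000 + yy : 1900 + yy;
}

int main(void)
{
    int samples[] = { 69, 70, 99, 0 };
    for (int i = 0; i < 4; i++)
        printf("%02d -> %d\n", samples[i], expand_year(samples[i], 70));
    /* prints: 69 -> 2069, 70 -> 1970, 99 -> 1999, 00 -> 2000 */
    return 0;
}
[/code]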

A little history of Microsoft…

Windows 3.0 - the first Windows that people actually started using; it ran in 286 protected mode.
Windows 3.1 - minor changes to the above with some 386 enhancements, but still basically running in 286 protected mode.

Windows 95 (which identifies itself as Windows 4.0 - things make a lot more sense when you look at the version numbers) - compatible with older versions of Windows and DOS. Minimum requirements are based on what most people have available, and the thing will barely run on the minimum requirements (a 486 with 8 MB of RAM, IIRC). Microsoft takes a lot of flak for setting the minimum requirements too low, but also takes a lot of flak for making an OS that won’t run on a lot of hardware. You can’t win.

NT 4.0 - the “business” version of Windows. Has a better kernel and better protection, but a lot of the things that make it “better” also completely break compatibility. It also requires a higher-horsepower CPU to do the same thing, since programs can’t directly access hardware to speed things up.

Note at this point that Microsoft has two completely different OS lines that they are maintaining, “Windows” and “NT”.

Windows 98 (which identifies itself as Windows 4.1) - As the version number indicates, it’s really just a minor improvement to 95. The minimum requirements are pretty much the same as for 95, but are listed higher because of all of the problems that Microsoft had with the minimum requirements for 95.

Windows 2000 (which identifies itself as NT 5.0) - At this point Microsoft doesn’t want to maintain two different OS lines. They want to kill off the “Windows” line because it has a reputation for being unstable. Note that a lot of this instability comes from being backwards compatible with DOS. Microsoft says they are “merging” the operating system lines. Unfortunately, most games and a lot of home software don’t run worth a crap on 2000. Halfway through development, Microsoft completely shifts gears: 2000 is going to be for business folks only. Minimum requirements are much higher than NT 4.0’s because of significant changes to the OS, prettier graphics, and a lot of multimedia apps that were thrown in back when 2000 was supposed to be for home use too. Ignore the multimedia crap and 2000 runs just fine on a much smaller machine, although it is much more of a memory hog than NT 4.

Windows ME (which identifies itself as Windows 4.9) - Microsoft hypes this up as the “home” version of Windows 2000, but if you pay attention to the version numbers you realize that it’s only a minor enhancement to Windows 95 and 98. Microsoft didn’t have a “home” version of 2000 (since they intended to have only one version of 2000), so they threw this OS out the door in a real hurry. They took a lot of the features of 2000, hastily ported them to 98, called the thing ME, and shoved it out the door. As a result, it’s a buggy piece of crap. They intentionally crippled things like DOS mode just to get you used to not having DOS, because they intend to force all home users onto NT whether they like it or not. Even though it looks a lot like a new OS, it’s just 98 under the hood. The increased minimum requirements come from the multimedia stuff that is bundled with it. The OS itself takes up pretty much the same resources as Win 98.

Windows XP (which identifies itself as NT 5.1) - This is also hyped as a completely new OS, but if you pay attention to the version numbers, you realize that under the hood it’s basically the same as Windows 2000. “Windows” is officially dead; we’re all running NT from here on out. XP is basically 2000 with a huge facelift, and it’s all of these pretty graphics that result in the increased minimum requirements. XP and 2000 will both run quite happily on 128 MB of RAM, but Microsoft bumps up the minimum requirements on both to ensure that things run smoothly no matter how many windows the user has open. If you turn off the “eye candy”, XP will run on just as small a machine as 2000.
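
Those internal version numbers aren’t folklore; any program can ask for them. Here’s a minimal sketch in C using the Win32 GetVersionEx call, which on the systems above reports 4.0 (95), 4.10 (98), 4.90 (ME), 5.0 (2000) and 5.1 (XP):

[code]
#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* GetVersionEx fills in the internal major.minor version
       the post refers to, plus which OS line you're on. */
    OSVERSIONINFOA info = {0};
    info.dwOSVersionInfoSize = sizeof(info);
    if (!GetVersionExA(&info))
        return 1;

    printf("Windows version %lu.%lu (%s line)\n",
           info.dwMajorVersion, info.dwMinorVersion,
           info.dwPlatformId == VER_PLATFORM_WIN32_NT ? "NT" : "Windows 9x");
    return 0;
}
[/code]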

Microsoft also keeps the minimum specs pretty high because they don’t want to support old hardware. The less stuff they have to support, the easier it is for them. If you have old stuff, you’re screwed. Microsoft’s advice? Buy new stuff. Not only does Bill get more money, but so does everyone else in the industry.

Microsoft seems to be foundering a bit with Vista. They are playing catch-up to Apple, so a lot of the higher system requirements are a result of new multimedia apps. Microsoft is also switching to a graphics system that looks prettier but requires a lot more computing horsepower. Vista is going to be a significant change “under the hood”, so it’s basically like the switch from NT 4 to 2000: a LOT of things are not going to be backwards compatible. This is a notable difference from the switch from 2000 to XP, which was basically a facelift with a few new features thrown in for good measure. One of the key features of Vista was supposed to be a much more secure file system. Partway through development this was completely scrapped, so I’m guessing they found some major security hole that made the whole thing impractical. Microsoft has been rushing around ever since trying to find enough things to stuff into Vista to justify coming out with a new OS. Vista’s code name during development was “Longhorn”, and a lot of folks were calling it “Shorthorn” due to its lack of new features. I personally haven’t had a chance to play with Vista yet. I’m a little underwhelmed by what I’ve heard about it so far. We’ll see what it’s like when it comes out.

The Hitchhiker’s Guide to the Galaxy says of the Sirius Cybernetics Corporation products that ‘it is very easy to be blinded to the essential uselessness of them by the sense of achievement you get from getting them to work at all.’ In other words - and this is the rock-solid principle on which the whole of the Corporation’s Galaxywide success is founded - their fundamental design flaws are completely hidden by their superficial design flaws.
[right]– So Long, And Thanks For All The Fish[/right]

While it’s unlikely that Adams (a lifelong Apple enthusiast) was specifically thinking of the then-barely emergent Microsoft, his words apply to that company like a surgeon’s glove.

Stranger

Barely emergent? In 1985, when SLATFATF was published, IBM-style PCs already dominated the microcomputer market, and they came loaded with MS-DOS; MS was also the largest vendor of Macintosh applications. Granted, Adams must have written most of the book prior to 1984, and probably had IBM in mind, but calling MS “barely emergent” is just silly.

However, the Windows product - with which Microsoft is predominantly associated - was in its infancy in 1985, prior to publication. I doubt Adams had a specific company in mind; he was more likely referring to any number of products and the companies that produced them. The statement could easily, for instance, be applied to the British automotive industry (in its death throes at the time) or to Lucas (“The Prince of Darkness”) electronics and appliances. But it does seem particularly applicable these days to Microsoft.

Stranger