Look, guys. You’re both sort of right. Windows can run some Windows 1.0 apps, but it needs a lot of help. Windows has changed tremendously since Windows 1.0. Windows XP only runs 16-bit apps (and even some Win32, i.e. Win9x-era, apps) because it includes a backwards-compatibility layer. And now Windows 7 requires a full-blown emulation/virtualization environment to run some Windows XP apps.
Windows 1 and 2 apps both require you to convert the resource data into Windows 3.x format. And Windows 1 apps additionally require you to change their header to tell Windows they’re 2.x files (a sketch of that patch is below). Even then, some programs don’t work correctly: PAINT can’t display dialog boxes, and the MS-DOS Executive can’t run at all.
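For the curious, that header change is basically a two-byte patch. Here’s a minimal sketch in Python, assuming the standard MZ/NE executable layout (the e_lfanew field at file offset 0x3C points to the NE header, and the expected-Windows-version word, ne_expver, sits at offset 0x3E within it). The function name is mine, and you’d obviously back up the file first:

import struct
import sys

def mark_as_win2(path):
    """Patch an NE executable's expected Windows version to 2.00."""
    with open(path, "r+b") as f:
        header = f.read(0x40)
        if header[:2] != b"MZ":
            raise ValueError("not an MZ executable")
        # e_lfanew at 0x3C gives the file offset of the NE header
        (ne_offset,) = struct.unpack_from("<I", header, 0x3C)
        f.seek(ne_offset)
        if f.read(2) != b"NE":
            raise ValueError("no NE (16-bit) header found")
        # ne_expver at NE+0x3E: high byte = major, low byte = minor
        f.seek(ne_offset + 0x3E)
        f.write(struct.pack("<H", 0x0200))  # claim "I'm a 2.00 app"

if __name__ == "__main__":
    mark_as_win2(sys.argv[1])

That only gets you past the version check, though; converting the resources to the 3.x format is a separate and much bigger job.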
So, yeah, with a little tweaking, you can get Windows 1 apps to run on XP, and, I assume, using XP Mode on Windows 7. But that’s a far cry from saying they are the same.
Even with your clarification, the statement “between different versions of Windows, you essentially need to rewrite most of the code base” is wildly off the mark.
I was going to say that people don’t want innovation as long as they’ve got something up to the level of Windows XP or better.
That’s an odd thing to say. How did we know we wanted XP when we were using DOS? We didn’t. OS designers keep anticipating future needs, innovating, etc. In 10 years you wouldn’t touch XP with a ten-foot pole, and someone else will say the same thing about Windows 12. Heck, I don’t like using XP now. Running as admin 24/7 isn’t good for anyone, and user-level general use is much, much better in Vista/7 with UAC. The OS world is ever changing, as are the applications world, the hardware world, the networking world, etc.
There is no perfect OS or perfect architecture. Heck, look at how the new multicore chips (and by new I mean four or five years old) surprised everyone. Very few apps do native multithreading, and we’re only starting to see it in the browser world with Chrome and in the gaming world with some recent releases. The SSD revolution has begun, and only Win7 supports the TRIM command. Support for RAM drives has always been poor. The Microsoft people are still trying to figure out how to secure an OS that gives admin rights to end users who will click on anything. We’re still trying to get some kind of standard protocol for 3D glasses in games. We still use aging filesystems like HFS+, NTFS, and ext3. Boot times are still long. Disk-failure prediction is terrible. Etc.
There’s lots of room for improvement over the latest Linux distros, Win7, and OS X. Lots. Win95 users didn’t know they needed a firewall, protected memory, standby, etc.
Maybe Steve Jobs is just really afraid of doing an overly ambitious project.
I mean, that’s not unreasonable. Think of the grandiose Apple OSes that were abandoned one after another during the ’90s before even reaching version 1 (Copland, for instance). Microsoft abandoned an OS too, more recently, in the years between XP and Vista (and of course Vista itself could be called an example).
Lots of stuff is going to change whether Jobs wants it or not. It happened before.
As far as I can see, the arrival of the internet as a general consumer resource forced both Apple and Microsoft to work bloody hard to get their OSes up to scratch. A connected OS basically requires strong protection of processes, both in security/permissions and in the scheduling mechanism. Neither OS 9 nor Windows 9x and earlier was capable of that.
Apple chose a Unix core probably because Unix was thoroughly proven at exactly what was required: reliable, fully multitasking, networked, multi-user systems are what Unix was good at. Unix isn’t rocket science by a long shot, but it sure works, and you can get plenty of developers who understand it. Putting a nice bit of gloss on top is what Apple is good at, and they’re not afraid of forcing developers to follow their lead, for which I give Jobs credit.
What’s interesting right now is many-core machines and distributed systems. In my area of web applications, the interest in getting the best out of both grows every day, and though traditional multi-threaded apps will probably hold out on the desktop for a decade or so, everybody working on servers can see that model won’t last. What’s needed is a real distributed system. I don’t know if that will mean a completely different OS, or more systems like Erlang/OTP built on top of fairly generic, bare-bones OSes (which Unix certainly can be); the sketch below shows the shape of the idea.
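To make that concrete, here’s a toy sketch of the share-nothing, message-passing style that Erlang/OTP is built around, written in Python purely for illustration (nothing here is anyone’s real API). The workers own their state and talk only through queues, which is exactly why the same program shape can move from one many-core box to a cluster:

from multiprocessing import Process, Queue

def worker(inbox, outbox):
    # Each worker owns its state; the only coupling is the message format.
    for msg in iter(inbox.get, None):  # None is the shutdown signal
        outbox.put(msg * msg)

if __name__ == "__main__":
    inbox, outbox = Queue(), Queue()
    procs = [Process(target=worker, args=(inbox, outbox)) for _ in range(4)]
    for p in procs:
        p.start()
    for n in range(20):
        inbox.put(n)
    results = [outbox.get() for _ in range(20)]
    for _ in procs:  # one shutdown message per worker
        inbox.put(None)
    for p in procs:
        p.join()
    print(sorted(results))

The point isn’t the language; it’s that once nothing is shared, “local” and “distributed” stop being different programming models.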
Anyway, choosing a Unix as the base of the OS was absolutely the right decision at the right time for Apple. I know you don’t like OS X, but hell, the sheer amount of stuff that’s been ported from other Unices just because it’s so damn easy is staggering. There’s no way that would have happened if they’d written their own “state of the art” OS in 1999.