Why is Windows so darn complicated?

I have never been in Windows DLL hell despite owning Windows machines for years and putting a variety of software on them. On the other hand, with Linux I have had tons of issues trying to get one program to work, only to find out that I have broken a different program because of incompatible libraries. Recent distributions have gone a long way toward fixing these problems. Ubuntu seems to do this by placing all the software you might want to install in a central database. I don’t know what happens if you want to install something that is not in that central database. I can, however, install lots of stuff on my Windows box that has not gone through some Microsoft approval process.

Great post, and I hate to nitpick (who am I kidding, I love to nitpick), but Windows 3.11 was the first version of Windows that was not at all just a GUI running on top of MS-DOS. Once Windows 3.11 had loaded (admittedly, using DOS for that), if it were possible to somehow remove DOS, Windows would have happily continued to run, because it never called on DOS to do anything.

Earlier versions of Windows were also independent of DOS to varying degrees.

That’s true. My Commodores came with such. Also every ROM routine was described, and virtually all of the processor instructions, and the RAM registers. All of this in the user’s manual! I guess I meant, though, that the architectures were proprietary in the sense that you couldn’t duplicate them without running into legal issues. Remember that the Commodores had their OS in ROM (Microsoft BASIC plus all of the ROM routines). The Atari ST had most of the OS in ROM. The Amiga’s Kickstart was in ROM (to upgrade from Kickstart 1 to 2, I had to swap a chip!). Even the Mac ToolBox was in ROM. That’s not to say the OS loader didn’t apply patches on the fancier machines, but the heart was built into the machine. Many of the chips were proprietary and not easy to duplicate at a reasonable expense (the Amiga in particular!). To clone any of these machines that I’m calling “proprietary” would mean reverse engineering all of the firmware in addition to the proprietary hardware.

The IBM-PC, though, had only a BIOS in ROM. The operating system was third-party, and everything else was commercially available. You could buy MS-DOS off the shelf, or buy reverse-engineered variants such as DR-DOS. I do know that eventually there were some Apple-][ clones. I think the “Laser” was one such knockoff. There were some Mac clones, too. The one I’m thinking of was an Australian laptop, but even so you had to pay for a Mac Plus as well, just for extracting the ROM! Even modern Mac emulators aren’t distributed with a ROM; they all say you’ve got to get a ROM image yourself. And Amiga emulators. And probably ST emulators (never looked into them).

Yeah, I didn’t specify the timing very well. For business throughout the ’80s there was still the Apple-][ (thank you, VisiCalc), but that waned in favor of the IBM and, in certain circles, the Mac. At home, all of the 8- and 16-bit machines were still popular. But due to mismanagement, they never penetrated much into the business world, and as a result they ultimately died out. I had my Commodore-128 through 1990, when I sold it in favor of acquiring a Mac SE at very great personal cost! I later sold the SE in favor of an Amiga 500, but that only lasted until I could get back into a Mac Colour Classic in 1993.

During this time, the Amigas were the de facto standard for video work. I still saw the boot screen after power failures at the cable company as recently as 2001! The Atari STs were still huge in the music industry, but the Mac started to move in on that niche. The Video Toaster people pretty much killed the Amiga when they moved to Windows. Even Commodore had a line of PC-AT compatibles for a while. But in the end, all of these non-business players failed.

It’s interesting to note that computer emulation is a thriving subculture today. While it was impossible at the time to clone a great number of these machines in actual hardware, we can run all of them as virtual machines today with exact hardware compatibility. Even things as hardware-dependent as the Amiga.

To expand on what Dominic Mulligan wrote, this is wrong on several levels. DLLs (or some variant) are used on Mac and Windows and, I’m told, on Linux. It’s an executable code sharing device, nothing more.

Macs had DLLs starting with the PPC, and they were called Code Fragments. OS-X-era Apple has .framework files and dylibs. These are all arguably the same as DLLs (ignoring some subtleties about framework arrangement).

Previous to all this, Macs had INITs, which were in themselves the very definition of DLL Hell. If you installed the wrong INITs or they didn’t play together well, Bad Things happened, from random crashes through nearly bricking your machine via boot crashes.

More here and here.

Also, please understand that many, many software vendors use DLLs/frameworks/whatever as the vehicle for writing plug-ins, e.g. Photoshop “filter” plug-ins. These are DLLs, and people find them extremely useful.
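To make that concrete, here’s a rough sketch in plain C of what a host application does to load a plug-in DLL and call into it, using the Win32 LoadLibrary/GetProcAddress calls. The file name “myfilter.dll” and the “ApplyFilter” entry point are made-up names just for illustration; real plug-in APIs (Photoshop’s included) define their own entry points and discovery rules.

[code]
#include <windows.h>
#include <stdio.h>

/* Signature the host expects the plug-in to export (hypothetical). */
typedef int (*FilterFunc)(const char *imagePath);

int main(void)
{
    /* Explicitly load the plug-in DLL at run time. */
    HMODULE mod = LoadLibraryA("myfilter.dll");
    if (mod == NULL) {
        printf("Couldn't load plug-in: error %lu\n", GetLastError());
        return 1;
    }

    /* Look up the plug-in's entry point by name inside the loaded DLL. */
    FilterFunc apply = (FilterFunc)GetProcAddress(mod, "ApplyFilter");
    if (apply == NULL) {
        printf("Plug-in doesn't export ApplyFilter\n");
    } else {
        apply("photo.tif");   /* call into the plug-in's code */
    }

    FreeLibrary(mod);         /* unload the plug-in when finished */
    return 0;
}
[/code]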

Also, see this timely example here on this board of someone installing a DLL on Mac.

Excellent question, but unfortunately it’s hard to give a straightforward answer without delving into history, business, and a lot of other things. I’ll try to list out some reasons:
[ul]
[li]The modern PC sitting on my desk today is many times more powerful and sophisticated than the enormous mainframes I worked with 30 years ago.[/li]
[li]Despite the complexity of computers and the Internet/WWW/home-LAN environment in which they work, we expect computers to be simpler than our cars, roughly like opening the hood, tossing in a carburetor, and expecting it to work.[/li]
[li]And expecting the carburetor to work even if your car actually has fuel injection and you don’t know it.[/li]
[li]And expecting your car to continue to work if you decide you don’t like the carburetor and simply yank it out.[/li]
[li]Mac OS isn’t that great either.[/li]
[li]But Apple rigorously controls everything from displays to keyboards, so Mac programmers can usually assume they know everything that will happen with the software they build.[/li]
[li]Nonetheless, not all Mac software runs on all Macs! You have to know which Macs are compatible with the Mac software you buy. Some older Macs simply won’t work.[/li]
[li]Whereas your copy of Windows has to be really, really old for that to happen with Windows.[/li]
[li]Windows is a magnificently complex OS that handles networking, security, web connectivity, masses of peripherals, many different programming languages, and (roughly) tens of thousands of programs without a lot of hiccups.[/li]
[li]If you tried to do what we do on a typical Windows PC with the typical minicomputer of, say, 25 years ago, it would never work.[/li]
[li]The problem with Windows may be that it allows you too many options. I dare you to try to uninstall a key piece of the OS from a Mac without problems.[/li]
[li]Not to mention trying to connect up a piece of hardware that Apple didn’t make.[/li]
[li]Don’t get me wrong. I am not a big Windows fan.[/li]
[li]I’m not a Mac fan, or even a Linux fan.[/li]
[li]The idea that you even need to worry about an OS is frustrating to me.[/li]
[li]Where is my jet pack? Where’s my flying car? Where’s my house of tomorrow? Where’s the computer like the one in Star Trek?[/li]
[li]You shouldn’t need to know anything about computers in order to get the answer you want. Aren’t computers supposed to be smart?[/li]
[li]Well, not yet.[/li]
[/ul]

Actually, let me correct myself here: 68k Macs had Code Resources, which were generally stashed inside the application binary, but could also roam free as INITs, CDEFs, LDEFs or just custom files that some programs (e.g. Photoshop filters again) would load to add functionality. Code Fragments were eventually made available for 68k Macs, but PPC was generally eclipsing 68k desktops by this time.

Okay… the reason why OS X and Windows are different. Lemme see if I can’t take a crack at it and help you make some sense of it.

Firstly, you ask the question as if you are fed up with Windows. You paint Windows in a negative light. That’s fine, I guess, but I’m a long-term Windows user who is a Mac convert. But I also understand why I liked Windows back in those days.

First of all, there is the legacy support that Windows must endure. The downside of being a monopoly is that they are constantly saddled with their previous design flaws and must continue to support everything. I’ve read a few comments in the old Windows 2000 code that was leaked a few years ago, and it was filled with mentions of strange hacks to get various Microsoft products working. Apparently it isn’t a straightforward process. Personally I think that Microsoft should do what Apple has done and completely cut the cord with the current Windows code: design an OS from the ground up with the best tech and ideas. This would make all of their old stuff not work, but they could simply provide virtualization technology to keep everything running. It’s Microsoft; I am sure they could. VMware makes very good tech for running Windows programs at near-native speeds on a Mac, so I’m sure it could be done.

Secondly, there is a difference in design philosophy. On a Windows box, you notice this a lot. An example: in Windows, you are notified when you plug in an external HD or an iPod. There is a little pop-up balloon. On a Mac, you don’t get that; you simply see an icon appear on the desktop. On a Windows box you get all kinds of notifications; on a Mac there’s much less of that.

The end result is that on a Windows box you’re treated as a much more savvy user, while on a Mac you’re treated like an idiot, from a UI point of view. On Windows, the emphasis is on the computer nerd who is in complete control of his system. On a Mac, the aim is to treat you well and make you happy without that sense of control.

Prior to the PPC (and I’m not sure if this is due to the PPC or the resultant Mac OS upgrades), there was no such thing as dynamic linking on the Mac. Windows DLLs have always been dynamically linked, and shared libraries/frameworks/rose-by-any-other-name are all dynamically linked. Dynamic linking is essentially leaving a blank spot in your final, linked program, with the promise to fill it in at run time. If something goes wrong with dynamic linking, then the thing that is responsible for filling in that blank – the OS – can deal with it gracefully.
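If it helps to see that “blank spot” idea in code, here’s a minimal sketch using the POSIX dlopen interface (available on OS X and Linux). The library name “libexample.dylib” and the “do_work” symbol are invented for illustration; the point is that the lookup happens at run time, and a failure is something the loader and the program can handle gracefully rather than the program dying outright.

[code]
#include <dlfcn.h>
#include <stdio.h>

int main(void)
{
    /* Ask the dynamic loader to find and map the shared library now,
     * at run time, rather than baking an address in at link time. */
    void *lib = dlopen("libexample.dylib", RTLD_LAZY);
    if (lib == NULL) {
        /* The loader failed; the program gets to decide what to do. */
        fprintf(stderr, "Couldn't load library: %s\n", dlerror());
        return 1;
    }

    /* Fill in the "blank spot": resolve a symbol by name. */
    int (*do_work)(int) = (int (*)(int))dlsym(lib, "do_work");
    if (do_work != NULL)
        printf("do_work(42) returned %d\n", do_work(42));

    dlclose(lib);
    return 0;
}
[/code]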

Prior to that, though, we had the processor trap and stack pushing. Because there was no real dynamic linking, calls to operating system APIs were done in a more direct manner. Take another step back: on the ROM-based machines, you always knew that function y lived at ROM address $x, so you could use that routine without writing it yourself (the whole point of libraries/DLLs). But on newer machines where the OS was only partially in ROM, or the ROM was patched during the OS load, these addresses were constantly changing, so you couldn’t depend on knowing the hardware address. So they used processor traps. To call a Mac ToolBox routine, you’d prepare by pushing the relevant data into a known memory location, a processor register, or the processor stack, and then trigger an “illegal” opcode. A 16-bit opcode word can encode over 65 thousand potential instructions, but the Mac’s processor defined fewer than 100 of them. By intentionally making the processor fault, the OS could pick up on the fake instruction and hand control to the right library routine, as if it were dynamically linked.

It depends on your definition of DLL. On the Mac, from very early on, you could certainly load a code resource from a file and jump to its entry point. Witness XCMDs in HyperCard, ca. 1987. If you mean linking to a stub library as the essence of DLL-dom, please note that code resources, code fragments, Windows DLLs and other things all allow some way to manually load the code and call an entry point inside it. Witness, again, Photoshop filters – Photoshop doesn’t link to their export library, and instead loads them through a different discovery mechanism; yet the filter code still lives in DLL files. So I guess it’s a gray area.

Agree with your comments on trap patching, which was a lot of the essence of what INITs did at startup time.

This conclusion is not supported by your own evidence. How are constant, intrusive messages and reminders of benefit to the “savvy” user?

Which OS is more “user-friendly” is debatable. Windows presents a bunch more options, and makes them all visible. Mac presents fewer options and does more things behind the scenes. UN*X variations are the most transparent, and perhaps that leads them to be the hardest to learn.

DLLs are a great feature, but I have never figured out why MS made it impossible to unload them during runtime. For that reason, when you make a change that requires a new DLL, you have to reboot.

I have to agree with this – those balloon popups on Windows saying that my wireless network is found, there are updates available, blahblahblah are annoying, and seem targeted at the beginner. Mac wisely deprecated Balloon Help in favor of Tooltips, which I’m pretty sure was a Windows innovation (although I don’t see any googleable support for this, so take that for what it’s worth). I wish Windows would in turn nuke these silly balloons.

That’s only true if the DLL is currently loaded into a running process (usually as part of the OS or drivers); you need to restart so that the old version of the DLL can be unloaded and replaced with the new one.[/nitpick]

I should also add that pre-OS X Mac software was the easiest in the world to pirate. Drag the programme folder to disk, voilà!

Not that I ever did, mind you.

Gotta agree, too. They just piss me off and get in the way. That’s definitely a feature aimed at the clueless. “Savvy” users don’t need such constant reassurance.

Don’t forget that there’s a Unix hiding under the Mac OS. So which court does this ball fall into? If you stay in the GUI, it’s still Unix. I’m not really saying anything about the Mac OS, just pointing out that Unix doesn’t have to be arcane.

These “illegal” or “fake” instructions were the A-line opcodes of the 68K processor family — and they weren’t exactly illegal or fake. They were designed specifically to act as software interrupts, like Intel’s “INT” instruction. If the upper 4 bits of a 16-bit opcode were “1010”, then the lower 12 bits were treated as an index into a special vector table in low RAM, which told the CPU where to go next. It was up to the OS to set up these vectors beforehand to do something meaningful.

If the 68K CPU encountered a genuine illegal instruction, it jumped to a different vector, completely apart from all the A-line traps. (On the old MacOS, this resulted in a “bomb” box on the screen, and a message seemingly tailored to scare the bejeezus out of novice users. The bomb icon probably didn’t help matters.)
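For anyone who wants to see the dispatch side of that in concrete terms, here’s a toy sketch in C. It is not real 68K or Mac OS code, and the trap number is arbitrary; it only illustrates the principle that the low 12 bits of an $Axxx word index a table of routine addresses the OS filled in beforehand. (It’s also why INITs could patch traps: just overwrite a table entry.)

[code]
#include <stdio.h>
#include <stdint.h>

typedef void (*TrapHandler)(void);

/* One slot for each possible A-line trap number (the low 12 bits). */
static TrapHandler trap_table[4096];

static void fake_draw_string(void) { printf("pretend ToolBox routine\n"); }

/* Roughly what happens when execution hits an opcode word whose top
 * four bits are 1010: the low 12 bits pick a vector out of the table. */
static void dispatch_a_line(uint16_t opcode)
{
    if ((opcode & 0xF000) != 0xA000)
        return;                          /* not an A-line word */
    uint16_t index = opcode & 0x0FFF;    /* low 12 bits select the slot */
    if (trap_table[index] != NULL)
        trap_table[index]();             /* jump through the vector */
}

int main(void)
{
    /* The OS (or a trap-patching INIT) installs the routine's current
     * address; callers never need to know where it actually lives. */
    trap_table[0x123] = fake_draw_string;  /* trap number chosen arbitrarily */
    dispatch_a_line(0xA123);               /* what executing $A123 amounts to */
    return 0;
}
[/code]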

By the by, the problem with the OP’s system was Norton, not Windows. Norton is a disaster now, and has been for years. I dunno what happened to it. A decade ago, they were the king and really put things through the wringer. Nowadays, their code isn’t exactly ‘buggy’; it merely interferes with everything, grabs ridiculous amounts of system resources, and won’t work with most of the programs it needs to. And even if you can uninstall it, it leaves crap on your system. And they charge you $50 for a patch that doesn’t help… every year.

Yep, all this is true, and has been for several years, which is a shame – it used to be good stuff (or reasonably good). Run very far and fast from Norton.

I’d like to add a warning about McAfee, too – they stepped up when Norton faltered, and ended up sucking just as much after a while. If you want your machine to run as slowly as possible, install McAfee.

I say install them both, and let 'em fight it out.