Why is Windows so darn complicated?

I’m full of turkey. Please don’t go putting words in my mouth. I appreciate the answers written out above by dopers.

Here’s another question. Why do people get so defensive about this?

But that is predicated on the assumption that Windows *is in fact* more complicated than the Mac. I don’t see evidence for that. We cannot come to that conclusion unless we are allowed to post examples where Macs are deficient, and then the thread devolves into an OS war.

Norton doesn’t play nice with much of anything, including OSes, ferchrisakes. Get AVG instead, if you really want to put up with that kind of program.

Another reason that things are more complicated in Windows is that it is more oriented to centralised, automated configuration. Some kind of standardised central repository of per-computer, as well as per-user, configuration makes this much easier. In Windows, it’s the registry. I’m not saying that the Windows registry is a great piece of design, but I can see the motivation behind it.
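To make that concrete, here’s a minimal sketch of the per-machine/per-user split, using Python’s standard winreg module (Windows only; the keys read here are standard, well-known ones):

```python
# Minimal sketch: reading per-machine vs. per-user settings from the
# Windows registry with Python's standard winreg module (Windows only).
import winreg

# Per-machine configuration lives under HKEY_LOCAL_MACHINE.
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                    r"SOFTWARE\Microsoft\Windows NT\CurrentVersion") as key:
    product, _ = winreg.QueryValueEx(key, "ProductName")
    print("Windows edition (per-machine):", product)

# Per-user configuration lives under HKEY_CURRENT_USER.
with winreg.OpenKey(winreg.HKEY_CURRENT_USER,
                    r"Control Panel\Desktop") as key:
    wallpaper, _ = winreg.QueryValueEx(key, "Wallpaper")
    print("Wallpaper (per-user):", wallpaper)
```

Group Policy, administration tools, and installers all work against this one hierarchy, which is a big part of the motivation behind it.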

How is centralised administration, “Group Policy” in the Windows world, achieved in Mac-land?

Hear, hear. But you must have written your program the right way, with a single, knowledgeable author, instead of the Microsoft way, with a vast staff who don’t know what they’re doing individually or collectively.

It depends upon how you define “complicated.” Are you talking about complication on the code level, or on the user level? I think as far as the OP is concerned, a solution on the user level would be fine, even if it complicated things on the code level.

Presently, on the coding level, my understanding is that Windows Vista is more complicated than Mac OS X, largely because of the DRM machinery it includes to deal with HD DVD/Blu-ray content.

Based on the examples in the OP, I presume we are talking about the user level.

FWIW, Linux is FAR worse when it comes to program installation. Sure, it’s easy to download a program from some central repository, but given the Linux directory hierarchy, where there’s really no central “installed programs” folder, files are scattered all over the place. Directories like /usr, /bin, /lib, and so on - sure, they each serve a certain purpose, but the scattering makes it nearly impossible for all but the most neckbearded of penguinheads to uninstall a program by hand.

So you are basically asking why Macs and Windows behave differently with installs and uninstalls? Simple: THEY ARE DIFFERENT OSs.

Depends - if it’s installed via a package manager such as Synaptic or YaST, uninstalling it is easier and cleaner than it is in Windows (although not as simple as on a Mac) - the package manager handles the removal of all the stuff that went to all those different places. But yes, if it’s one of those installations where you have to muddle through editing configuration files, compiling binaries, etc., there’s often not even any such thing as a documented uninstall procedure.
Fortunately, there’s almost no need for fussy installs like that now, for desktop applications in the most popular Linux distributions.
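To make the package-manager point concrete - a hypothetical sketch, assuming a Debian-based distro (the package name “somepackage” is made up; Synaptic is essentially a front-end for this same apt/dpkg machinery):

```python
# Hypothetical sketch, assuming a Debian-based system (dpkg/apt-get).
# Synaptic is essentially a GUI front-end over this same machinery.
import subprocess

# List every file the package scattered across /usr, /bin, /lib, etc.
subprocess.run(["dpkg", "-L", "somepackage"], check=True)

# Then let the package manager clean all of those files up in one step,
# instead of hunting them down by hand.
subprocess.run(["sudo", "apt-get", "remove", "somepackage"], check=True)
```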

The history of Windows versus Macintosh is that Windows was an OS that worked on open hardware systems, whereas Macintoshes tightly regulated what hardware could be used. Similarly, the guiding goal with Macs is to predetermine what you want to do and only allow you to do it in their predetermined way, whereas Windows lets you swap and modify and configure things.

Why? Well, the reason you want more choice in hardware is that it lets you comparison-shop on price and features, allowing you to get the features you need as cheaply as you can get them. Of course, this means that Windows has to try to run even if you’ve got substandard/buggy hardware installed. And the reason you want more configuration is that it makes things easier for software developers, giving them more ways, and easier ways, to let users deal with their product. But those ways won’t be as standardised, of course. From the end-user’s end, it means they can pay less for the software because the developer didn’t have to work as hard to make it. And for that same reason, there will be more software for that platform, since it’s easier and cheaper to develop for.

So ultimately, with Windows you get more applications, with less shine to them but more ways to interact with and configure them; more hardware but less stability; and a lower cost. The world chose to go with Microsoft way back when because, given the choice between user-friendliness and cost, most people would rather have the cheaper system.

I’ve not had any hands-on experience with it, but Apple calls it Managed Preferences and/or Workgroup Manager. Thanks to what Apple calls Open Directory, Macs also can authenticate against Active Directory or LDAP without much difficulty.

Just to muddy the original issue a bit - I’ll submit that OS X may actually be *more* complicated than Windows. Not in the user experience, but internally.

If you consider a continuum with a PC running MS-DOS at one end - technologically simple, but requiring a very savvy user - and a Mac at the other end, where “It just works!”, you’ll find a huge amount of complex programming behind the scenes to make the Mac end work.

There are good reasons for using dynamically linked libraries, which is why they exist on nearly every platform, in one form or another.

Also, the aforementioned DLL hell isn’t exclusively a Windows phenomenon; Linux suffers from it, too.
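To illustrate - here’s a minimal sketch of the same runtime-linking mechanism on both platforms, using Python’s ctypes (which library actually gets loaded varies by system):

```python
# Minimal sketch: loading a shared library at runtime with ctypes.
# The same mechanism underlies .dll files on Windows and .so files on
# Linux, which is why "DLL hell" has a Linux cousin.
import ctypes
import ctypes.util
import sys

if sys.platform == "win32":
    libc = ctypes.CDLL("msvcrt")  # the Microsoft C runtime DLL
else:
    # e.g. libc.so.6 on Linux, resolved through the dynamic linker
    libc = ctypes.CDLL(ctypes.util.find_library("c"))

# Either way, the symbol is resolved at load time, not compile time.
print(libc.abs(-42))  # prints 42
```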

[Moderating]

Certainly we would like to avoid this becoming a religious debate. However, a number of posters have given interesting comparisons of the different approaches used by the two systems that result in some of the differences noted by the OP. If you want to post factual information of that nature on the subject, go right ahead. But simply complaining about the question itself doesn’t strike me as very useful.

Colibri
General Questions Moderator

My understanding is that Windows took it to new extremes, however. I am fully willing to admit that I might be wrong on this matter.

A large part of the reason that installing/uninstalling an application on Windows is complex is that Windows has many features that tie application functionality to other features of the operating system. You’ve got your program group in the Start menu. You can right-click on a file and perform operations on it using the program you installed, even though you aren’t explicitly opening the program. You can drag the file onto an icon to perform an operation using the program, again without having explicitly opened the application.

The way Microsoft made this happen was to require lots of entries in the registry. You need an installer to write these entries. You need an uninstaller to remove them. It has been a long time since I wrote software for the Mac, but what I recall is that each application maintains this sort of information inside itself and in simple preference files. Microsoft’s approach has some advantages, like allowing you to maintain multiple user accounts on a computer without the applications having to be aware that there is more than one user. However, the downsides outweigh the advantages. The central registry model sucks to program for and sucks for the user. But we’re stuck with it until Microsoft decides to create a new operating system that isn’t based on it.
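To give a feel for what that wiring looks like, here’s a hypothetical sketch using Python’s standard winreg module. The “.zzz” extension, the “ExampleApp” ProgID, and the path are all made up; per-user associations go under HKEY_CURRENT_USER\Software\Classes, so this doesn’t need admin rights:

```python
# Hypothetical sketch of the kind of registry wiring an installer does
# to hook an application into the shell. The ".zzz" extension, the
# "ExampleApp" ProgID, and the path are all made up.
import winreg

CLASSES = r"Software\Classes"  # per-user associations live under HKCU

# 1. Map the file extension to a ProgID.
with winreg.CreateKey(winreg.HKEY_CURRENT_USER, CLASSES + r"\.zzz") as key:
    # None names the (Default) value of the key.
    winreg.SetValueEx(key, None, 0, winreg.REG_SZ, "ExampleApp.Document")

# 2. Give the ProgID an "open" verb -- this is what makes double-click
#    and the right-click menu work without the program already running.
cmd = CLASSES + r"\ExampleApp.Document\shell\open\command"
with winreg.CreateKey(winreg.HKEY_CURRENT_USER, cmd) as key:
    winreg.SetValueEx(key, None, 0, winreg.REG_SZ,
                      r'"C:\Program Files\ExampleApp\app.exe" "%1"')

# An uninstaller has to find and delete every one of these keys again,
# which is exactly why Windows installs and uninstalls are complex.
```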

I’d wager that there are a lot of discussions about this topic in Redmond, but nobody is willing to be the one who bets his ass on such a major change to the OS. Breaking backward compatibility is not done lightly. There are ways to do it without breaking every existing Windows application, but they’re hacks. For example, Windows Vista intercepts many registry writes and redirects them to other locations: most applications can’t write to HKEY_LOCAL_MACHINE on Vista, and attempting to write there generally results in the entries being written to an area in HKEY_CURRENT_USER instead. They could theoretically redirect all registry writes and reads to preference files that live in the same directory as the application’s files, but it would be a nightmare. So we’re stuck with the registry until they decide to be bold and start over.

Another thing is that Windows is actually made up of lots of subcomponents, some of which are used by other programs. For instance, when most programs want to render HTML, they use the IE renderer component. The actual core of Windows, the NT kernel, is a very svelte and nice piece of code. It’s just that when you load all of the components on top of that, demand backwards compatibility, and add the requirement that as much complexity as possible be hidden from the user but exposed to the programmer, you start getting into strange territory.

Of course, none of this applies to the driver model, which is a truly horrifying thing.

There are a lot of good comments above, but a complete answer would (and probably does) fill an entire book.

I grew up with both of these operating systems and consider myself an expert in both of them (except Vista, which I don’t use and can’t speak credibly about), so here’s my perspective:

The Mac was engineered to be an easy-to-use computer. Initially the Mac was designed in the era when everything was proprietary and non-interoperable. Others from the era were the IBM-PC, the Ataris, the Commodores, Acorns, Kaypros, Tandys, Texas Instruments, and a bunch of weird European stuff.

The IBM-PC was still mostly safe from cloners at this time, as it was still new and mostly everyone was happy pushing their own technology. IBM did something no one else did, though. In order to rush to market, they used off-the-shelf, well-documented technologies for everything and purchased their operating system – DOS – from a third party, Microsoft, hence eventually MS-DOS. MS-DOS itself had been written by yet another company (Seattle Computer Products) and was heavily based on CP/M, something already found in many corporate environments. With the IBM name, it’s natural that companies started gravitating towards IBM for “serious” work.

Now that the currents were flowing towards IBM, other companies (notably Compaq) realized that with off-the-shelf, well-documented hardware, a PC could be cloned. The only proprietary element was the BIOS, which even today is an archaic program fairly easily replicated. And PC clones exploded. That’s how we used to refer to them - “PC clones,” not Winboxen. From that, we’ve arrived at the current “PC” meaning a Windows box, so a “Mac” isn’t a “PC” (even though, of course, it is).

Apple wasn’t sitting on its haunches. The Lisa flopped, but the original Mac became a hit. It was everything that a PC wasn’t. PCs were still DOS-based. You needed a lot of training. You weren’t just going to blithely put one on the receptionist’s desk. The Mac booted up to a training program that showed you how to use a mouse, how to point and click, and how to do everything you could within the scope of the then-limited operating system. It truly was revolutionary. (Yeah, yeah, Xerox PARC and all that – it was still revolutionary in the sense that it was actually brought to market.)

By this time, though, everyone but Apple, IBM, and the early cloners was gone. IBM was IBM. It was rich. Apple was loaded with the revenues from the Apple-][’s. Commodore and Atari had been well off, but had bad management and died out. This pretty much left IBM and Apple as the only platforms in the game.

So now we have two platforms – the PC and the Mac. How did they get to be so different? You can see that their very roots are different. The Mac’s graphical user interface merely progressed. When Microsoft decided to copy the Mac’s GUI, they still had the problem of DOS to consider and to support. Windows versions through 3.11 were still DOS programs, graphical shells atop the “real” operating system.

Macs and Windows both had their problems along the way.

The Mac was limited by its early-80s software design philosophy. It was never designed for multiple programs to run at the same time, because with the memory prices and hardware speeds of the day, who could have figured? So the OS was modernized gradually, breaking only a small percentage of programs with each upgrade rather than all of them. This was tolerable, as software vendors would eventually update. But it was a lot harder to adapt the very core of the OS to new technologies, such as memory protection and pre-emptive multitasking, because the OS was never designed for them. Nevertheless, for a whole lot of reasons filling other books, Apple finally released Mac OS X. It was a reset of sorts. A fresh start. It literally broke 100% of the software written for the Mac prior to it. Except… included in the operating system was a virtual machine called Classic, which allowed most pre-Mac OS X software to execute thinking it was running on Mac OS 9. For “native” programs, vendors could adapt their legacy code to “Carbon,” a subset of the Classic API that played well with a modern OS.

Windows wasn’t limited in the same ways as the early Mac. For example, through some processor trickery it was possible to execute old, cooperatively multitasking programs under a preemptive scheduling system. It offered a basic memory protection scheme. It could fake processor execution modes to allow legacy stuff to run, as long as the API was still supported. Part of Windows’ complexity today is the continued support for most of the legacy APIs. Although NT (including XP and Vista) was a “fresh start,” it was still progressive in nature at the API level, not revolutionary like Mac OS X.

So where do we find ourselves now? The Mac OS had a complete re-start back in 2001, but with two decades of user interface experience to make it correct from the start. Windows’ restart wasn’t at the API level, and still supports two decades of legacy programs. If I want to play Warcraft II on my Intel Mac, I have to run it under WINE or in Parallels under Windows.

Maybe the next generation of Windows could use a re-start, i.e., literally segregate the “classic” virtual machine from a new Windows core. You’d probably end up with Windows running atop Linux, which takes us back to the Mac, which, in a lot of ways, is like running the Mac OS atop Linux - except of course it’s Darwin, a BSD Unix variant. Now it becomes apparent that the “new” Mac OS doesn’t really date from 2001 at all, but from the early 1970s, with its roots in true multi-user machines. Windows was kind of cobbled together along the way in that respect, as “personal computers” were just that – personal – with no history of multi-user computing, only tacked-on fixes.

I think that at the core level, the Mac OS is probably a little simpler than NT. But I’m not a kernel engineer, so take that opinion for what it’s worth. The next level up is all the low-level APIs. The Mac is probably more complex here, in that it supports the whole POSIX standard. Going up from there, the higher-level APIs are probably richer in Mac OS X, and not as convoluted as in Windows. Remember that Windows supports a whole lot of legacy APIs. With the Mac’s 2001 reset, there’s Carbon and Cocoa, and that’s it.

Then there’s the whole vendor angle, as RJKugly mentions. Vendors deliver what people expect. If there are no high expectations on the Windows platform, then you get what you get, like Windows installers and files scattered all over the place. When people expect a drag-and-drop install on a Mac, even the mighty Microsoft delivers Office for the Mac as a drag-and-drop install (really!). Although that’s been going out the window lately - even Apple software is typically installed with an installer these days, and files end up all over the place. This mostly has to do with the Mac being a multi-user machine, though, not with historical baggage like on the PC.

I have a stupid little program for Windows. It uses an installer because back when I built it, everything “serious” needed an installer. It uses the registry, because that’s what’s expected. It uses shared libraries (DLLs) because that’s what’s expected. In retrospect they were all stupid decisions on my part. I should have done it the Mac way - but then again, would novice users want to drag-and-drop, or are they more comfortable with the “normal” Windows way? I also have a stupid little program for the Mac, done the Mac way. Completely self-contained, drag-and-drop, and the preferences file is exactly where Mac users know to expect it, labeled in the Apple way (com.b***.tidy) for easy trashing. I could have built an installer to look like a heavyweight, but that would just piss off most Mac users.
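For comparison, here’s a minimal sketch of “the Mac way” - a single self-describing preferences file, written with Python’s standard plistlib. The reverse-DNS name “com.example.tidy” and the keys are made up, and it assumes macOS, where ~/Library/Preferences exists:

```python
# Minimal sketch of "the Mac way": one self-describing preferences file
# in ~/Library/Preferences, named in reverse-DNS style. The identifier
# "com.example.tidy" and the keys are hypothetical; assumes macOS.
import pathlib
import plistlib

prefs = {"windowWidth": 800, "recentFiles": [], "checkForUpdates": True}

path = (pathlib.Path.home() / "Library" / "Preferences"
        / "com.example.tidy.plist")
with open(path, "wb") as f:
    plistlib.dump(prefs, f)

# Uninstalling is: drag the app to the Trash, delete this one file. Done.
```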

Sorry to be so long.

You needn’t be. That was an interesting post. Thanks.

Still, you posted an excellent summary of the situation — though I have a few bones to pick with your history.

I wouldn’t have characterized the era that way. The Apple II, Apple’s first hit, was one of the most open architectures of the day. Many computers from the 8- and 16-bit era came with ROM listings printed in full in the reference manuals. The slots in the Apple II and IBM PC had well-documented bus protocols, and thousands of third-party hardware products were made to fit them.

The closed architecture of the first black-and-white Macs was a departure for Apple. In fact this was one of the big complaints levelled against the machine at the time: you couldn’t just open up the Mac and start tinkering with it. You couldn’t write programs for it without buying expensive software development tools.

You’re certainly right though about the non-interoperability of those early machines.

Commodore 64 and Amiga fans are probably seething right now. I would say those “gone” machines were still a significant presence in the North American home market (though not in business) through all of the 1980s. IBM PCs and their clones were still lousy for graphics, sound, games, and user interfaces. These were dry, joyless, workaday business machines almost entirely. (Notwithstanding the mirthful Charlie Chaplin impersonator that IBM picked for its mascot.)

That all began to change some with Windows 3, and a lot with Windows 95 — after which the end times were nigh for all the quirky alternative machines like Commodore and Atari.

Well anyway, great post.