I'm curious. Why do they "Strongly recommend you close all running programs"...

… before installing something?
Surely the OS (in this case Windows) is robust enough to cope with installing a program (essentially moving files about and creating new entries in the registry) and running another program at the same time.
So why do they recommend I close all running apps? (sometimes ‘strongly’)
And does that refer to TSRs (assuming we still call them that: those programs that run in the background without any visual presence except for maybe an icon in the system tray)?

Is it just over-cautiousness? Acting on the possibility that one of the running programs will have a fatal crash causing the install to be aborted?

I always thought that it’s a hugely simplified rule of thumb that frees the software company from responsibility for other programs.

Even if the installation doesn’t require anything that a little common sense wouldn’t suggest anyway, I am sure users would find a way to sabotage their own installations.
If a user tries to install a program and a different program writes to the same directory at the same time, the potential for disaster should be obvious, but with the recommendation to close everything you can blame this on the user, even if he doesn’t know what he is doing.

I’m pretty sure it’s because often there are .dll files shared across applications, and the installation may need to overwrite an older .dll with a newer version. If the older .dll file is in use by a running app, the installation won’t be able to overwrite it.

That makes perfect sense… but can’t installation software tell the OS to overwrite the .dll when it’s not in use? Such as when it’s a .dll that’s in use all the time and Windows needs to reboot for the new application to work?

In other words, there’s obviously a facility built into the OS to handle the .dll-in-use issue.

I’m getting even further into the realm of speculation here, but I think the .dll is marked as ‘open’ for the life of the program while it is running, not just as it is accessed. While something has it open, the OS isn’t going to let anything overwrite it.

But even if it wasn’t, it would be a bad idea for it to change in the middle of a program execution, even if it was replaced while not being accessed. Imagine that a program calls fubar.dll because it needs a particular function in it, but then when it calls it again later that function does something different than the last time it was called, because it’s a newer version of the .dll.

Again, I’m only making some somewhat educated speculations here, so corrections are welcome.
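
To make that concrete, here’s a minimal sketch (Windows, C++) using the hypothetical fubar.dll from above (fubar-new.dll is an equally made-up name for the replacement copy): while the DLL is loaded, an attempt to copy over it fails, and once it’s unloaded the same copy succeeds.

```cpp
#include <windows.h>
#include <stdio.h>

int main() {
    // Load the DLL the way an application would; the file is now mapped into
    // this process and stays locked until FreeLibrary (or process exit).
    HMODULE h = LoadLibraryA("fubar.dll");
    if (!h) {
        printf("couldn't load fubar.dll (error %lu)\n", GetLastError());
        return 1;
    }

    // What an installer sees while the DLL is in use: the overwrite fails,
    // typically with ERROR_SHARING_VIOLATION.
    if (!CopyFileA("fubar-new.dll", "fubar.dll", FALSE)) {
        printf("overwrite while loaded failed, error %lu\n", GetLastError());
    }

    FreeLibrary(h);  // no process has it loaded any more...

    // ...so now the same copy goes through.
    if (CopyFileA("fubar-new.dll", "fubar.dll", FALSE)) {
        printf("overwrite after unload succeeded\n");
    }
    return 0;
}
```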

Honestly, the worst that happens is usually that the install fails because another program was open and copying a file failed. But that virtually never happens, in my experience. I will say that if you’re installing a new version of a program, you probably do need to close the old one.

Another place you see this same problem is with Windows Update. It wants you to reboot because the files are in use by Windows itself. In this case, the OS actually has a little bit of code that runs at bootup that copies over the queued files from a staging area before the main part of Windows starts up.

I can think of two reasons off the top of my head:

  1. People lost their work when they hadn’t saved and the system rebooted - MS got tired of people bitching.
  2. In early versions of Windows there was a chance of running out of resources, which could cause the install to fail.

Dunno if either of those are the real reasons - but they are viable reasons.

Well, others have pretty much nailed it: it is first and foremost a safety measure. Although the possibility of something getting screwed up is small, it is still a possibility. Just as grandma made you wait 20 minutes after eating before getting into the pool, the software asks you to close all running programs.

It is possible that some application might have locked a shared resource or a *.DLL, in which case your open program might crash (or more likely, the installation of the new app will fail).

In any case, mostly it’s to prevent tech support calls.

From a more experienced perspective: there is a command in Windows that tells Windows to copy a file after reboot. Anyone writing an application (such as an installation tool like MSI) can use this command. Typically an installer will try to copy all the files needed using a simple copy command, but in many cases another application will be using a file. Windows will then fail the copy command for that file, and the installer will use the special copy command to tell Windows to copy that file after rebooting. When the installer tells you to reboot, it’s usually because it had to use the copy after reboot command on one or more files.

Of course, the less you do while installing, the fewer problems you can have. Installing can also seriously slow down the system (it takes almost no CPU power, but it keeps the hard disk busy, which will noticeably slow down large applications).

For those of you who are curious, the Windows commands are CopyFile for a simple file copy, and MoveFileEx with the MOVEFILE_DELAY_UNTIL_REBOOT flag to copy the file after reboot. These commands are part of the Win32 API (Application Programming Interface), and are used when programming with C or C++ (and can be used with other languages in a circuitous manner).
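
Putting those two calls together, an installer’s logic for a single file might look roughly like this minimal sketch (the paths are made up, and real installers such as MSI do considerably more bookkeeping):

```cpp
#include <windows.h>
#include <stdio.h>

int main() {
    // Hypothetical paths: the file the installer shipped and the possibly in-use target.
    const char* newFile = "newversion\\shared.dll";
    const char* target  = "C:\\Program Files\\SomeApp\\shared.dll";

    // First try a plain copy (FALSE = overwrite the target if it already exists).
    if (CopyFileA(newFile, target, FALSE)) {
        printf("copied immediately, no reboot needed\n");
        return 0;
    }

    // The copy failed -- typically ERROR_SHARING_VIOLATION because the DLL is
    // loaded by some running program -- so queue the replacement for the next boot.
    // This records the operation in the registry and requires administrator rights;
    // Windows performs it early in startup, before anything can lock the file again.
    if (MoveFileExA(newFile, target,
                    MOVEFILE_DELAY_UNTIL_REBOOT | MOVEFILE_REPLACE_EXISTING)) {
        printf("queued for reboot -- the installer should now ask you to restart\n");
    } else {
        printf("MoveFileEx failed, error %lu\n", GetLastError());
    }
    return 0;
}
```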

In most cases that’s exactly right. It’s possible for an app to load a DLL, use the function it needs and then unload the DLL so the app keeps running but the DLL is no longer locked. However, in most cases the DLLs will be locked for the entire life of the app.
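
For the record, that load-call-unload pattern looks something like this minimal sketch (helper.dll and its DoWork export are made-up names); the DLL file is only locked between LoadLibrary and FreeLibrary:

```cpp
#include <windows.h>
#include <stdio.h>

// Assumed signature of a function exported by the hypothetical helper.dll.
typedef int (*DoWorkFn)(int);

int main() {
    HMODULE h = LoadLibraryA("helper.dll");   // helper.dll is locked from here...
    if (!h) {
        printf("helper.dll not found\n");
        return 1;
    }

    DoWorkFn doWork = (DoWorkFn)GetProcAddress(h, "DoWork");
    if (doWork) {
        printf("DoWork(42) = %d\n", doWork(42));
    }

    FreeLibrary(h);   // ...until here; after this an installer could replace the file
    return 0;
}
```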

The whole point of using DLLs instead of putting all that code in monolithic applications is that there are a lot of things different apps can share, so we get consistent behavior and smaller apps. Most of the DLLs third-party apps will use are either their own (in which case you might have an earlier copy if you’re installing a new version over an old) or a Windows system DLL. In either case, the OS won’t permit a DLL to be modified while it’s in use, so you have to shut down anything that might be using it before installing the new copy. The copy-after-reboot is used to install DLLs that are in use by Windows itself or by services which the normal user can’t conveniently shut down.

This should never happen. New DLLs with the same name (file and object) should be completely backward compatible. They might provide extended functionality, but they shouldn’t break or change existing functions. Obviously that’s not true in all cases, but that’s the way it’s supposed to work. When a proper installer finds an existing copy of a DLL it wants to install, it should compare version numbers and only replace the existing copy if the installer’s copy is newer. That way you don’t end up degrading system DLLs by installing older apps, and the app that used the old DLL should work fine with the newer version.
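
The thread doesn’t spell out how that version check is done, but one common way is the Win32 version-info functions; here’s a minimal sketch under that assumption (made-up file names, link against version.lib):

```cpp
#include <windows.h>
#include <stdio.h>
#pragma comment(lib, "version.lib")

// Read a file's 64-bit version number from its version resource (0 if it has none).
ULONGLONG FileVersion(const char* path) {
    DWORD unused = 0;
    DWORD size = GetFileVersionInfoSizeA(path, &unused);
    if (size == 0) return 0;

    char* data = new char[size];
    ULONGLONG version = 0;
    if (GetFileVersionInfoA(path, 0, size, data)) {
        VS_FIXEDFILEINFO* info = nullptr;
        UINT len = 0;
        if (VerQueryValueA(data, "\\", (LPVOID*)&info, &len) && info) {
            version = ((ULONGLONG)info->dwFileVersionMS << 32) | info->dwFileVersionLS;
        }
    }
    delete[] data;
    return version;
}

int main() {
    // Only overwrite the installed copy if the copy we ship is strictly newer.
    if (FileVersion("install\\shared.dll") > FileVersion("C:\\Windows\\System32\\shared.dll"))
        printf("ours is newer: install it\n");
    else
        printf("same or older: leave the existing copy alone\n");
    return 0;
}
```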

If you really do something different in a DLL function, you should use a new name. This is evident in a lot of system DLLs like MSXML (Microsoft’s XML functions). There are at least five versions and at least three of them are not compatible, so they have object names like MSXML, MSXML2, etc., and distinct DLL filenames so installing one doesn’t overwrite and break another.

If all this sounds like the system is built on a house of cards, welcome to what developers fondly refer to as DLL-Hell.

Thanks for the clarifications micco, especially about the .dll files needing to be backward compatible. In my example where the functions in the .dll change (in a non-backward-compatible manner), even if the running app doesn’t get clobbered while the new .dll is being installed, it still isn’t going to run correctly the next time it’s run.

Which shows you the lengths MS has to go to in order to get their own software to work. The whole business is terribly screwed up. A reasonable OS has no trouble replacing libraries or binaries at all - even if the program you are installing is running at the time.

On Unix-type systems, I have had to replace running services (Samba comes to mind) on production systems without stopping the whole system. Compile and install the new version, do a service restart, and the system swaps over to the new version with just a few seconds of downtime - the users don’t even notice that the server was down for a few seconds since everything picks up again right where it left off.

Windows keeps every library that a program needs open until that program is ended. Unix systems don’t. This is one reason you don’t have to reboot a Linux computer just to install a driver. The other reason is that Unix type systems provide good mechanisms for loading and unloading drivers while the system is running. Windows XP has finally caught up in this regard, but XP still can’t replace files that are held open by running programs or services.
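
To illustrate the difference, here’s a minimal POSIX sketch (the file names are made up): because a Unix directory entry is separate from the file’s contents (the inode), rename() can drop a new library into place while the old copy is still open, and the old copy is only reclaimed once its last user closes it.

```cpp
#include <fcntl.h>
#include <unistd.h>
#include <stdio.h>

int main() {
    // A running program holds the old library open (stand-in for a mapped .so).
    int fd = open("libfoo.so", O_RDONLY);
    if (fd < 0) { perror("open libfoo.so"); return 1; }

    // The "installer" swaps in the new version. This succeeds even though the
    // old file is open, because rename() only changes the directory entry.
    if (rename("libfoo.so.new", "libfoo.so") != 0) { perror("rename"); return 1; }

    // The running program keeps reading the *old* contents through its descriptor;
    // anything started from now on sees the new file under the same name.
    char buf[16];
    ssize_t n = read(fd, buf, sizeof buf);
    printf("old copy still readable through the open fd: %zd bytes\n", n);

    close(fd);   // the old inode is actually freed once the last reference goes away
    return 0;
}
```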

Do you mean that the operating system itself keeps the DLLs open? micco said earlier that “in most cases the DLLs will be locked for the entire life of the app”, which makes it sound almost like a cultural thing - perhaps Windows developers are just used to doing it that way?

(Straight question, btw, I’m not trying to induce a Linux vs. Windows argument.)

Some of both, I think.

Windows itself keeps DLLs open for its own use, and some (maybe most) programs do too.

A normal program under Linux doesn’t keep hold of its libraries or its own binary. But then again, just today I encountered an exception to the rule: a program that was ported to Linux from Windows. It holds its own binary open while running. It was also created using a compiler other than gcc - the standard under Linux. That other compiler - Kylix - comes from Borland, and if I’m not mistaken it’s descended from their Delphi tools for Windows.

Some of the difference is probably in the default way of handling things in the compilers and some of it is probably in the way the programmers do things.

Note that the “Please quit all running programs” bit is primarily a Windows issue – most other operating systems will handle installing new programs while others are running without any problem.