Disk defragmentation - c'mon, be honest with me now

Here’s a defrag tool I have often used:

1. Back up every file from a partition or disk onto other media (file backup, not image).
2. Reformat the original partition or disk.
3. Restore every file from the backup media.

As you restore, all files will be written contiguously, i.e., defragged. And, as a bonus, you have a complete backup.
Now I know there are many smart people reading this who will say, “Yes, but you forgot…”. No, I didn’t. There are many gotchas in this scheme and it won’t work in all cases. However, I have found it very useful in the past, and in DOS days, it was my primary defrag tool.

It works really well if:

- The partition or drive contains only data files, not system files that may not be copyable if in use, and
- None of the files or space allocations must be restored to a particular track or cylinder to work properly, and
- The storage allocation scheme writes in sequence, like FAT, without leaving holes.

It also won’t super-optimize the way the better defrag schemes do by determining which files are accessed most frequently or in a particular sequence, then storing them close together.
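For the scripting-inclined, here’s a minimal Python sketch of that copy-off/copy-back idea. The mount points are made up, and the reformat is deliberately left as a manual step outside the script:

```python
import shutil

# Hypothetical mount points -- substitute your own.
DATA = "/mnt/data"           # the fragmented partition
BACKUP = "/mnt/backup/data"  # other media, big enough to hold everything

# Step 1: file-level backup, not an image. An image would faithfully
# copy the fragmentation you are trying to get rid of.
shutil.copytree(DATA, BACKUP, symlinks=True)

# Step 2: reformat the original partition -- done by hand, after
# unmounting it. Deliberately not automated here.
input(f"Reformat and remount {DATA}, then press Enter...")

# Step 3: restore. Each file is written onto a fresh, empty filesystem,
# one after another, so it lands contiguously: a de facto defrag.
shutil.copytree(BACKUP, DATA, dirs_exist_ok=True)
```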

That’s not really a file fragmentation issue. I can’t think of any file system that was designed to support the resizing of a partition. That’s usually something that some clever programmer may write at a later date, if there is a need for it, and the file system’s structure doesn’t make it too difficult.

Well, Boot Camp Assistant is supposed to resize the partition without problems. See Apple’s official site; from there, you can obtain the instructions for setting up the new partition using Boot Camp Assistant, which state:

So it is a separate utility, written later.

No. I’m criticizing the fact MS can’t get things like basic permissions right (an application should never have to run as Administrator), can’t adopt a coherent versioning system for libraries (installing a newer or an older version of the same DLL should not break the existing version, and it should not be hidden in an application-specific directory), and can’t produce a good command-line interface so admins can fix their systems. (Rebooting is an old canard, but it’s valid: No OS family has ever been as reboot-happy as MS-Windows. It just isn’t necessary, and every other OS in current commercial use is an existence proof of that.)

I never understood why MS didn’t want versioning in DLLs, but I do remember reading that Gates was very much against it.

Versioning seems like a simple, clean solution to me.
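It is roughly what Unix sonames have done for decades: an app records the major version it was built against, and the loader serves the newest installed release of that major, so different majors coexist side by side. A toy Python sketch of that lookup rule (the library table is invented for illustration):

```python
# Toy model of soname-style versioning: an app links against a major
# version; the loader serves the newest installed release of that major.
# Installing libgfx 2.x never disturbs apps built against 1.x.
INSTALLED = {  # invented example data
    "libgfx": [(1, 0, 3), (1, 2, 0), (2, 0, 1)],
}

def resolve(name, wanted_major):
    """Return the newest installed version with the requested major number."""
    candidates = [v for v in INSTALLED.get(name, []) if v[0] == wanted_major]
    if not candidates:
        raise LookupError(f"no {name} with major version {wanted_major}")
    return max(candidates)

print(resolve("libgfx", 1))  # (1, 2, 0): newest release a 1.x app can use
print(resolve("libgfx", 2))  # (2, 0, 1): a 2.x app gets 2.x, side by side
```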

hfsdebug. It doesn’t defrag, as such, but it will show you how much fragmentation you have and let you explore other aspects of your filesystem.

This is what I got when I ran it just now:
“Out of 500469 non-zero data forks total, 497223 (99.35 %) have no fragmentation.
Out of 11405 non-zero resource forks total, 11229 (98.46 %) have no fragmentation.”

I’ve been using this disk for over 3 years without a wipe, reinstall, or any defragging other than what the regular system does automatically in the background, and I’ve got about 5 GB of 80 GB free (93% full). So, you can see that OS X handles file fragmentation very well.
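The arithmetic behind those summary lines is simple: a fork counts as unfragmented if all of its data sits in one contiguous extent. A quick Python sketch with invented extent counts:

```python
# hfsdebug-style summary: a fork is unfragmented if all of its data
# occupies a single contiguous extent. Counts invented for illustration.
extent_counts = [1, 1, 1, 3, 1, 2, 1, 1]

total = len(extent_counts)
contiguous = sum(1 for n in extent_counts if n == 1)
print(f"Out of {total} non-zero data forks total, {contiguous} "
      f"({100 * contiguous / total:.2f} %) have no fragmentation.")
```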

The problem with Boot Camp is that disk fragmentation suddenly becomes an issue, when under normal use it doesn’t have much effect, if any, on performance or file corruption. There is one universal (works on both PowerPC and Intel) utility I’m aware of that will deal with disk fragmentation and optimization: iDefrag. I can’t vouch for its effectiveness myself since I’ve never had the need to use it, but judging from comments and reviews it seems to work pretty well. It costs about $40 to optimize, but you can run it to see the extent of fragmentation for free.

Obviously, you need to back up before using it, unless you want to take a chance on losing your data. Rarely a problem with any modern disk tools, but always a possibility.

I doubt that David Cutler worked much on admin-level stuff like that. He’s more of a low-level kernel designer, isn’t he? That’s like criticising Linus Torvalds because you don’t like Bash.

You guys got me curious. I use Linux and the Reiser filesystem and have never worried about fragmentation. I use Slackware 10.2 which I installed in Oct 2005. So I’ve gone without defragging for about 2.5 years.

After a little looking around, I found that they don’t even make a defrag tool for ReiserFS, although there is one generic tool that supposedly works. But before unleashing something like that on my system, I wanted to know how badly it was fragmented. I found a couple of scripts to check the amount of fragmentation, and they both agree that my system is 4% fragmented.
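For anyone wanting to run the same kind of check: a rough Python equivalent can be built on filefrag (from e2fsprogs; it works on most Linux filesystems, not just ext*), counting files that occupy more than one extent. A sketch, assuming filefrag is installed and you have read access to the tree:

```python
import os
import subprocess

def fragmented_percent(root):
    """Run filefrag on every regular file under root and report the
    percentage stored in more than one extent."""
    total = fragmented = 0
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if os.path.islink(path) or not os.path.isfile(path):
                continue                    # skip symlinks and specials
            out = subprocess.run(["filefrag", path],
                                 capture_output=True, text=True).stdout
            # filefrag prints e.g. "/path/file: 3 extents found"
            try:
                extents = int(out.rsplit(":", 1)[1].split()[0])
            except (IndexError, ValueError):
                continue                    # unreadable file, odd output
            total += 1
            if extents > 1:
                fragmented += 1
    return 100.0 * fragmented / total if total else 0.0

print(f"{fragmented_percent('/home'):.1f}% of files are fragmented")
```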

They are. I’m certain the Unix world was using them at least a decade before Windows 95 came out. If you have the ability to make links in the filesystem (so that opening file foo actually opens file bar) this can be made invisible to the applications involved. (Windows has had the ability to do the equivalent of Unix soft-links ever since Windows 95, and possibly before. That’s all you need.)
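That transparency is easy to demonstrate with a soft-link. A quick Python sketch (POSIX; creates and removes two files in the current directory):

```python
import os

# Put the real data in "bar", then make "foo" a soft-link pointing at it.
with open("bar", "w") as f:
    f.write("the actual contents\n")
os.symlink("bar", "foo")

# The application just opens "foo"; the filesystem quietly hands it "bar".
with open("foo") as f:
    print(f.read())          # -> the actual contents

os.remove("foo")             # removing the link leaves "bar" untouched
os.remove("bar")
```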

Fair enough.

Byzz: Don’t worry about it, and don’t bother with the defragging software. reiserfs is a modern, well-designed filesystem and doesn’t need it.

Vast oversimplification and probably unfair generalisation ahead:

- ‘Serious’ journalling filesystems don’t tend to suffer so badly from fragmentation, either because their design minimises it, or because they contain inbuilt routines to manage it on the fly, or because they’re just capable of coping better when it does happen.

I’m sure it would be possible to create a journalling filesystem that omits to do any of these things - but it wouldn’t prevail against those that do.
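One of the simplest design levers is just where new data gets placed. A cartoon comparison in Python, not any real filesystem’s algorithm: FAT-style first-fit scatters a file across the first holes it finds, while allocating out of the largest contiguous free run keeps it in one extent:

```python
def free_runs(bitmap):
    """Yield (start, length) for each run of free (False) blocks."""
    start = None
    for i, used in enumerate(bitmap + [True]):  # sentinel closes the last run
        if not used and start is None:
            start = i
        elif used and start is not None:
            yield start, i - start
            start = None

def first_fit(bitmap, nblocks):
    """FAT-style: take free blocks in disk order, fragmenting across holes."""
    got = []
    for i, used in enumerate(bitmap):
        if not used:
            got.append(i)
            if len(got) == nblocks:
                break
    return got

def largest_run(bitmap, nblocks):
    """Allocate out of the biggest contiguous free region."""
    start, length = max(free_runs(bitmap), key=lambda r: r[1])
    return list(range(start, start + min(nblocks, length)))

# A disk with two small holes and one big free region at the end.
disk = [True, False, True, False, True] + [False] * 5
print(first_fit(disk, 4))    # [1, 3, 5, 6] -- file split across three places
print(largest_run(disk, 4))  # [5, 6, 7, 8] -- one contiguous extent
```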

It’s more complicated than that. Windows doesn’t support having multiple versions of a DLL in memory. If application Foo uses GFX.DLL 1.0 and application Bar uses GFX.DLL 1.1, and the user runs application Foo, GFX.DLL 1.0 is loaded into memory. If the user then runs application Bar, it uses the old version of the DLL that is already in memory, not the updated version that was supplied with the application.
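A toy Python model of that “first one loaded wins” cache (names and versions invented):

```python
# Cache keyed by the DLL's bare file name, ignoring its version -- a toy
# model of the load behaviour described above, not the actual Windows loader.
loaded = {}

def load_dll(name, version):
    if name in loaded:
        return loaded[name]   # already in memory: reuse it, version unchecked
    loaded[name] = version
    return version

print(load_dll("GFX.DLL", "1.0"))  # Foo starts first: 1.0 is loaded
print(load_dll("GFX.DLL", "1.1"))  # Bar asks for 1.1 but gets 1.0
```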

I read about this a bit after I posted that, and I really cannot understand the kind of mental derangement required to design the system that way. The people behind that move not only ignored decades of best practice, they invented something that actively hampered any future effort to move Windows in the right direction. Even memory concerns don’t justify it given the existence of VM and swap. (No, I don’t blame Cutler for any of this. Redmond doesn’t have drugs powerful enough to make this seem like a good idea to the person responsible for the well-designed-albeit-clunky VMS.)