I work with a lot of very large .wav files. Up until a few months ago, I was using a pair of FAT-32 drives. I found the Windows defrag utility to be a bunch of crap. I could leave it running for 24 hours and it would just move the files around without seeming to accomplish much. So I bought Diskeeper, which I have to say is a much better defrag program. For FAT-32 it worked very well, and when writing CDs, the buffer level would stay at 100% for the whole write. If the HD hadn’t been defragmented in a while, it would take longer to write the disc, at slower speeds, and the buffer would sometimes nearly empty out (thank Og for buffer underrun protection!).
Now both my drives are NTFS and they aren’t as full as they used to be. I can’t say I see a noticeable difference in performance from a freshly defragged drive anymore, but I get some sense of security out of knowing that all my files are lined up contiguously, with all the empty space at the end. The drives still get heavily fragmented with all the recording and deleting, so I still use Diskeeper regularly.
It depends heavily on the type of applications you run on the computer. With Windows NT and above, the disk can become badly fragmented if you install, uninstall, and upgrade lots of large software packages. This affects not only files but also directories and the Master File Table. For I/O-intensive tasks you can often see a noticeable slowdown, because the drive spends much of its time moving the heads from one cylinder to another to collect all the fragments of a file. You may not notice it if you have plenty of memory and only occasionally read or write files.
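To put rough numbers on that seek overhead, here’s a back-of-the-envelope sketch in Python. The seek time and transfer rate are assumed figures for illustration only, not measurements of any particular drive:

```
# Rough model of why fragmentation slows reads. All figures below are
# illustrative assumptions, not measurements of any real drive.

AVG_SEEK_MS = 9.0          # assumed average seek + rotational delay per fragment
TRANSFER_MB_PER_S = 50.0   # assumed sustained transfer rate

def read_time_ms(file_mb, fragments):
    """Estimate the time to read a file split into `fragments` pieces."""
    seek_cost = fragments * AVG_SEEK_MS
    transfer_cost = (file_mb / TRANSFER_MB_PER_S) * 1000.0
    return seek_cost + transfer_cost

# A 600 MB .wav file, contiguous vs. badly fragmented:
print(read_time_ms(600, 1))      # about 12 seconds
print(read_time_ms(600, 2000))   # about 30 seconds -- the seeks dominate
```

Under those assumptions the same file takes more than twice as long to read once it’s in a couple of thousand pieces, which matches the kind of slowdown described above.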
It’s actually hard to defrag a disk that is 99% full, because the process of moving the fragments around and putting them in order requires some free disk space. Windows defrag recommends about 15%, and even commercial defrag programs recommend keeping some empty space on the disk.
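If you want to check whether a drive has enough working room before kicking off a defrag, a quick sketch like this will do it (it assumes Python 3.3 or later for shutil.disk_usage, and the 15% threshold is just the figure quoted above):

```
# Quick free-space check before defragging (shutil.disk_usage needs Python 3.3+).
import shutil

def free_space_percent(path):
    usage = shutil.disk_usage(path)
    return usage.free / usage.total * 100.0

pct = free_space_percent("C:\\")
if pct < 15.0:
    print("Only %.1f%% free -- the defragmenter may not have room to work." % pct)
else:
    print("%.1f%% free -- that should be enough working space." % pct)
```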
Just for those who aren’t aware, the built-in defragmenting tool in Windows is a stripped-down version of Diskeeper. I’m sure the pay-for version has more bells and whistles that make it a more desirable product to use.
While the current version of Diskeeper (10) isn’t free, version 7.0 of Diskeeper Lite is free, and bundled here is Diskeeper Lite 9!
In the Readme_EN.txt and in the License Agreement I didn’t find any rules prohibiting downloading and using this program (on a single computer), so I’m assuming there aren’t any.
It has been long enough since I added a couple of memory chips that I had to go looking to find out how much I had before and how much I have now. This Dell 4500 started with 128 Meg and it now has 256 Meg, so I must have added 128 Meg in two chips, making each 64 Meg instead of 16 Meg. It should be obvious that I don’t know my hardware.
I think this one has a Pentium 4. The Compaq I had before it was a Pentium 3, I think. Before that it was a Commodore 64! We even had a VIC-20 back in the early ’80s. Back then, 16K was a big deal.
Thanks for your concern and your help, SenorBeef and rbroome and Carnac the Magnificent!
If you go to this Dell site, you can enter your Service Tag Number (usually located on the back of the tower, but sometimes on a sticker on the side) and view all the original specifications of the computer when it left the factory.
But doesn’t the defrag process itself cause wear and tear? You do it twice a month, and during the defrag the drive head has to run around non-stop. Doesn’t that add up to more than the savings you achieve?
All disk usage causes “wear and tear.” Running a defrag can save time and effort in the long run, though; it’s just reorganizing the data for efficiency. It moves the most-used data (and sometimes the swap file) to contiguous portions of the disk, and sometimes to the faster parts of it, so that the next million trips to the disk for data are short trips.
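A toy model makes the “short trips” point concrete. The block numbers below are invented purely for illustration; the sketch just totals how far the head travels for the same access pattern before and after the data is laid out together:

```
# Toy model of head travel: the same eight reads against scattered blocks
# versus the same data laid out together near the start of the disk.
# The block numbers are invented purely for illustration.

def head_travel(positions, start=0):
    """Total distance the head moves to visit each block in order."""
    travel, current = 0, start
    for pos in positions:
        travel += abs(pos - current)
        current = pos
    return travel

fragmented = [12000, 450, 88000, 3100, 67000, 920, 54000, 210]   # scattered
contiguous = list(range(200, 208))                                # after defrag

print(head_travel(fragmented))   # long trips on every access
print(head_travel(contiguous))   # short trips once the data is together
```

The one-time cost of the defrag pass is the same head movement you’d otherwise pay out slowly over the next million reads, which is the trade-off being described here.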
Someone tell me now, aren’t disk platters always spinning?
[Defragmentation is also dangerous: if there’s a power blip at the wrong moment, or a software crash, or something, then you can potentially kiss all your files goodbye - or worse, have a file which is apparently OK until you come to access it.]
I always worried about this too, but I’ve been reassured by some techs that during a defrag, Windows first copies data to another place on the drive before erasing it from its original spot. That way, after a power outage, all the data is still there, even though RAM was cleared.
Umm… no. That sounds like the techs are mixing up NTFS and the defrag tool. NTFS does a read after it writes a sector to disk to confirm that the data has been written correctly; FAT and FAT32 don’t. But that data may well still be in the disk’s onboard cache, possibly generating a false response. Your defragmentation tool may do the copy/replace, but it’s not a function of Windows. You’re still vulnerable.
Write caching isn’t a problem if the disk drives and device drivers are configured properly. For IDE drives, you can use Device Manager to set the caching behavior. NTFS also has special code to support the functions needed by disk defragmentation utilities.
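For what it’s worth, the “copy first, delete only after the copy is safely on disk” idea the techs described looks something like the sketch below. It’s a file-level illustration of the principle only (a real defragmenter relocates raw clusters, not whole files), and the function name safe_relocate is made up here. Note that os.fsync flushes the operating system’s buffers; whether the drive’s own onboard cache honors the flush depends on the hardware and the driver settings mentioned above:

```
# File-level sketch of "copy first, delete only after the copy is on disk".
# A real defragmenter relocates raw clusters, not whole files; this just
# illustrates the ordering that keeps the data safe through a power blip.
import os

def safe_relocate(src, dst, chunk=1024 * 1024):
    with open(src, "rb") as fin, open(dst, "wb") as fout:
        while True:
            block = fin.read(chunk)
            if not block:
                break
            fout.write(block)
        fout.flush()
        os.fsync(fout.fileno())   # force the copy out of the OS write cache
    os.remove(src)                # only now is the original location released
```

If the power dies before the os.remove line runs, the original is still intact; the worst case is a stray extra copy, not lost data.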