The kind of work I do on my home PC involves a lot of disk activity, and my drive (NTFS on WinXP) does get significantly fragmented over time. However, by about every third time it gets bad enough to need a defrag, I find Windows has also become so crufty and bogged down that it's time for a bare-metal reinstall (I'm due for one now, but have been putting it off).
Not surprising: 2000 and XP are both NT with a new coat of paint.
Mangetout brings something else up, namely that it’s difficult to relate the effect of disk fragmentation to performance because many other aspects of the system also have a significant impact on performance. This is important to keep in mind in general when dealing with complex systems.
Novell brought out some docs in the 1980s: white papers on their internal structures and procedures. Definitely designed for geeks and those who would appreciate how much good design and care had gone into their software, even though they may not have originated the concepts. To those of us who were equally familiar with the internals of CP/M, DOS, and budding Windows, Novell's implementations were more sophisticated and far ahead of the game.
To this day, the latest Windows lacks some of Netware's strong points, and I miss that. Netware long ago discarded the notion that a newly installed program required a reboot (how could you tolerate that with 200 users connected?), but Windows still hasn't got the word. My Netware servers would run forever (many months, at least!) unless turned off, but it's rare to have a Windows machine run for a full day without needing a reboot.
I’m glad to hear Netware is alive and well, but I never encounter it anymore in my daily computing rounds.
I do a lot of downloading/uploading via BitTorrent. I delete files as large as 4.7 gigs and replace them with other files about the same size. When I do this a lot, my system starts to bog down, and a defrag will help things a LOT. This is just my experience.
Cite?
I find it hard to believe that FAT is so poorly designed that failing to defragment it will actually result in corruption.
I meant common practice in the real OS world, as opposed to the very limited microcomputer OS world of that era. CP/M, DOS, and Windows were obsolete when they were first introduced (as was the hardware, really, but minicomputer OSes of the past had done more with less: see RSTS/E, Seventh Edition Unix, and ITS for examples).
Windows is hardly the standard by which real OSes are judged.
Exactly. Real OSes never required that. It was invented by MS, so far as I know.
Again, common industry practice long before Windows. Uptime is commonly measured in years outside of the desktop world, and has been for decades.
Linux and the open-source BSDs really took a bite out of its market share in the 1990s.
I could see how severe fragmentation could indirectly cause corruption: not at the level of drive management, but higher up, in the OS. On a highly fragmented disk, especially a fairly full one, any disk operation is going to be quite intensive and slow. A sustained bout of disk activity could leave jobs backing up until the OS considers them to have timed out, at which point the system may become unstable or just appear to hang, and the outcome of that could be a shutdown with the filesystem in an undetermined state.
That might seem a bit far-fetched, but on an older system with less speed and fewer resources, I don’t think it’s impossible.
… aaaand tonight it hung on shutdown. So I don’t know.
But it is the standard used by most of the world for common computing. It’s the one I have to put up with in supporting the average Joe Home User.
I agree with everything you say; I was just trying to point out that Windows, touted as the Latest and Greatest by you-know-who and forced on the public by many corporate policies, is far from advanced technology in many ways.
A metric hour??? Are you saying 100 minutes? What the hell is a metric hour and who uses metric time? Was this just a cute expression?
I remember well the moment we switched to metric time. 80 past 2 on April 47th.
Yeah, of course it is. It’s, by far, the slowest component in your system when data is needed from it.
What benchmarks are you talking about? Synthetic benchmarks designed to test hard drive speed probably pick an open/unfragmented area of the drive so that drive fragmentation doesn't affect the results. Benchmarks that test other components like the CPU, RAM, etc. would have no reason to involve the hard drives.
Sure, of course. As has been covered, NTFS is better about fragmentation than FAT/FAT32 are, but any fragmentation still leads to more hard drive seeks, and more hard drive seeks lead to longer times to complete operations. It may not make a massive difference, but it does improve things.
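To put some very rough numbers on it (ballpark figures, not measurements): a desktop drive of that era needs something like 10 ms per seek, so a file broken into 500 fragments can cost on the order of 500 × 10 ms, i.e. roughly 5 extra seconds of pure head movement to read end to end, versus essentially nothing for a contiguous file.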
Why not just set a defragment program to automatically run while you’re sleeping? That way you’ll enjoy the performance boost without having it inconvenience you.
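On XP the built-in tools are enough for this; something along these lines should work from an admin command prompt (the task name and the 3:00 AM start time are just examples, and the exact /st time format can vary between Windows versions, so check schtasks /? first):

schtasks /create /tn "Nightly Defrag" /tr "defrag.exe C:" /sc daily /st 03:00:00

defrag.exe ships with XP, so there's nothing extra to install.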
I’m curious as to why you think so.
That's not true. Windows 2000 was substantially rewritten, building on the NT 4 core code and design philosophy. It is an NT operating system (internally it's NT 5.0, and XP is 5.1), but it's way more than minor tweaking. XP is pretty much a new coat of paint on 2000, though: add a cartoony interface and some idiot-proofing, break a few things, and boom, you have a new OS.
I'm a big fan of 2000. I always thought Microsoft had finally designed a very good OS and were so confused that they didn't know what to do with themselves, so they quickly screwed some of it up and made it crappier, and hence XP was born.
Like others have said, with more modern file systems (say, NTFS) it doesn't occur as frequently, and file access is actually faster by design than it is with FAT.
That being said, disk I/O is still a monstrous bottleneck on most PCs; processor and memory speeds are so much faster than disk that it's not funny.
That’s not to say that there aren’t applications that are processor or memory bound, but disk-bound is as common as ever.
What has happened is that processor, memory, and disk speeds have gotten WAY out ahead of file sizes, even with the huge growth in file sizes. Moore's law doesn't apply to file size.
I kind of agree with that, except NT 3.51 was very good and stable and then they ballsed it up by moving the video drivers into kernel space in NT4, for performance reasons. Like servers need fast graphics. But I guess it enabled them to sell essentially the same OS in server versions and home versions. Windows 2000 retained that home-user-oriented feature in the server-oriented editions, but as long as the video driver was stable I suppose it was OK, and it did make for a versatile OS.
This used to be an official Windows tool, but they’ve since taken it off their website, probably as a part of their Less Support for XP initiative.
Anyways, it analyzes your Windows boot-up and then defragments/optimizes the arrangement of those parts of your hard drive used during booting to speed it up. Let me tell you, it works.
Now, Steve Ballmer is the only Official Windows Tool.
:: ducks chair ::
Musicat: Yeah, we were agreeing with each other pretty loudly.
SenorBeef: Thanks for the info. (WNT was still a castrated clone of VMS, though. I don’t know what they did to Cutler when they hired him away from DEC but the results were not pretty.)
Cutler worked on both, but it is overstating things to describe Windows NT as a clone of VMS. Anyway, criticism of NT and its descendants seems to be directed more at the superficial stuff that accumulated later, rather than the core design of the OS.
OK, so NTFS is pretty decent at working around fragmentation. How about HFS+?
AFAIK, there is no native defrag tool packaged with Mac OS X by Apple. (There might be a command-line utility somewhere in the dev toolkit, but, if so, I don't know what it is.)
I encountered a situation where I needed to defrag my Mac’s disk and couldn’t.
I was shrinking the boot partition on my MacBook Pro so that I could create a second partition and install Windows XP. The Apple-recommended and Apple-supplied tool for this is Boot Camp Assistant (BCA). At one point, BCA presents you with a diagram of your hard drive and requests you to choose a size for the new partition.
I did this, and received an error message that the new partition could not be created because files were in the way. There was more than enough raw space in the boot partition for all my Mac OS X stuff even after the shrinkage.
A bit of googling revealed that this has happened to other people, and that there are third-party defrag utilities available for low, low prices. Apparently OS X and the HFS+ filesystem defragment files but not free space (I may be misunderstanding that), so it was unable to push things aside to clear contiguous space on the drive for the second partition.
I ended up reinstalling Mac OS X and recreating the boot partition from scratch, and then adding the second partition before I restored my data back to the main partition.
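(Side note for anyone who hits the same wall: the live HFS+ resize that Boot Camp Assistant attempts can apparently also be tried by hand with diskutil, something like

diskutil resizeVolume disk0s2 60G

where the disk0s2 identifier and the 60G size are just examples; diskutil list shows your own volumes. I'd guess it would have tripped over the same immovable files in my case, but it may help someone who only needs to shrink a volume.)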