Networking gurus - Guidance on subdividing folders?

The company I work for stores a huge number of files in a single folder on our file server – more than 17,000. They’re Word and text files, and none of them is larger than 100 KB. It’s my feeling that continuing to store everything in that one folder (with more being added daily) is asking for trouble.

Is there any kind of consensus on the number of files that should be kept in one folder? Or is the size of the files a more important consideration? Would subdividing the folder a thousand files at a time be advisable?

How many files a single folder can handle probably depends on the filesystem type. Is it FAT32, NTFS, or something else?

From a usage standpoint, it just seems like a good idea to try to organize it. I can’t see how you’d be able to find anything, and waiting for Explorer to build the list of 17,000 files in the Open dialog is probably a pain in the neck.
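If you do go with the “a thousand files at a time” idea from the OP, the mechanical part is easy to script. Here’s a minimal sketch in Python – the `batch_NNN` subfolder names and the `subdivide` function are made up for illustration, not any standard tool:

```python
# Sketch: move every file in one big folder into numbered subfolders
# ("batch_000", "batch_001", ...) of at most `per_folder` files each.
import os
import shutil

def subdivide(folder: str, per_folder: int = 1000) -> int:
    """Move files in `folder` into batch_NNN subfolders.

    Returns the number of subfolders created. Existing subfolders are
    skipped (only plain files are moved), so re-running is harmless.
    """
    files = sorted(
        name for name in os.listdir(folder)
        if os.path.isfile(os.path.join(folder, name))
    )
    batches = 0
    for i in range(0, len(files), per_folder):
        dest = os.path.join(folder, f"batch_{batches:03d}")
        os.makedirs(dest, exist_ok=True)
        for name in files[i:i + per_folder]:
            shutil.move(os.path.join(folder, name),
                        os.path.join(dest, name))
        batches += 1
    return batches
```

Sorting first means the split is alphabetical, which at least keeps related names together; splitting by date, department, or first letter might make more sense for finding things again.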

Here’s what Microsoft has to say about it.

and

Just realized that the article linked from the first article has even more pertinent info.

It’s NTFS. And the server is not even at 10% of its storage capacity.

I’m a pretty savvy user when it comes to Word and Excel. I like to think I’m pretty competent when it comes to basic file maintenance. But my knowledge of operating systems and networking is so badly out of date that I genuinely have no idea how much is too much. I’ve recommended to my boss – purely from a records management standpoint – that the folder be subdivided. I thought it prudent, though, to ask those more knowledgeable than myself whether I’m seeing danger where none exists.

Another FYI:

If many of those files are rarely used, use Windows’ Remote Storage to archive 'em off to another drive. The file names will stay in the original folder(s), and for all the world it looks like they’re there, but the data itself is somewhere else.

From the user’s standpoint, it might take a few seconds to retrieve the actual data when they open the file, but if they rarely use the files it’s not a big deal. Files that are frequently used won’t be archived, so that’s not an issue.

It’ll help a little bit with performance if you do have to move 'em around, and also allows you to keep a gazillion million files on a leeetle tiny drive. Not sure if this is relevant to you or not, but I figure I’d throw it out there as an idea.
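Whether you archive via Remote Storage or just move old stuff by hand, you first need to know which files are actually “rarely used.” A rough sketch for that, assuming last-access time is good enough as a signal (NTFS often updates it lazily or not at all, so treat the results as approximate – and `stale_files` here is just an illustrative name, not a real utility):

```python
# Sketch: list files in a folder that haven't been read in `days` days,
# as candidates for archiving. Uses the last-access timestamp (atime),
# which NTFS may update lazily or have disabled entirely.
import os
import time

def stale_files(folder: str, days: int = 365) -> list[str]:
    cutoff = time.time() - days * 86400
    return sorted(
        entry.name
        for entry in os.scandir(folder)
        if entry.is_file() and entry.stat().st_atime < cutoff
    )
```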

What OS are you running on the server? Is the folder only accessed by something on the server itself or are people going to browse it? Some of my colleagues work with 2 million files in a directory, but then that’s only used by a process specific to the server and hidden from the users.

Assuming that the folder is on a Windows server and that users are going to browse it, a good rule of thumb from a usability POV is that anything more than a few screenfuls is to be avoided.

Others have covered the limitations of NTFS so I won’t repeat those. In the unlikely event that your server is running HPFS or possibly early (3.x) versions of NTFS, I recall that there’s a breakpoint at about the 17,000 file mark, but can’t provide a cite.