Files per sub-directory

Evening Experts,

I am using WinXP with NTFS. Please let me know the maximum number of files I can have in a sub-directory. So far I have 1,150 files in one sub-directory, occupying 126 MB; I just want to know whether there is a limit.

Patrick
sda100Commented:
The maximum number of files in each directory is governed by the file system rather than the OS.

You can find a lot of info here:
http://www.pcguide.com/ref/hdd/file/fatRoot-c.html

Essentially, there is no practical limit for NTFS (FAT32 caps a single directory at 65,534 entries), but from experience, once you get up around the 100,000 mark Windows Explorer starts to die when it tries to open all the files to read the summary data and such. In that case, use a command window.
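As a rough illustration, a short Python sketch like the one below counts a huge directory without reading any per-file metadata, which is the part that makes Explorer crawl; the folder path is only a placeholder.

import os

BIG_DIR = r"C:\data\big_folder"  # placeholder path; substitute your own

def count_entries(path):
    """Count files and subdirectories without opening or stat-ing any of them."""
    files = dirs = 0
    with os.scandir(path) as entries:
        for entry in entries:
            if entry.is_dir(follow_symlinks=False):
                dirs += 1
            else:
                files += 1
    return files, dirs

if __name__ == "__main__":
    files, dirs = count_entries(BIG_DIR)
    print(f"{files} files, {dirs} subdirectories in {BIG_DIR}")

A plain "dir /b" in a command window lists the names with no scripting at all and none of Explorer's overhead.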

Steve :)
scrathcyboyCommented:
The only significant hard limit of that kind is on the ROOT directory of a FAT16 volume (512 entries in C:\), so you want to keep extraneous files out of the root and leave those entries for the subdirectories under it. Otherwise you can have upwards of 10,000 files in a directory, but this is not efficient; large directories take longer to enumerate and sort. If you can keep the files in any one directory to under 500, that is the best optimization of the file system, whether NTFS or anything else.
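To make the "keep each directory small" idea concrete, here is a rough sketch (the paths are made up for illustration) of bucketing a flat folder into subdirectories by the first character of each file name, so no single directory grows too large.

import os
import shutil

SOURCE = r"C:\data\flat_folder"   # placeholder source folder
TARGET = r"C:\data\bucketed"      # placeholder target folder

def bucket_by_prefix(source, target):
    """Move each file into a subdirectory named after its first character."""
    for name in os.listdir(source):
        src_path = os.path.join(source, name)
        if not os.path.isfile(src_path):
            continue
        first = name[0].upper()
        prefix = first if first.isalnum() else "_misc"  # avoid awkward folder names
        bucket = os.path.join(target, prefix)
        os.makedirs(bucket, exist_ok=True)
        shutil.move(src_path, os.path.join(bucket, name))

if __name__ == "__main__":
    bucket_by_prefix(SOURCE, TARGET)

With a roughly even spread of names, this keeps each bucket to a few hundred entries even when the original folder held tens of thousands of files.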
kode99Commented:
There is no per-directory limit beyond the limit on the total number of files an NTFS volume can hold, which you will never hit due to hardware limitations: 2^32 less one file, just under 4.3 billion.

In practice, as mentioned, browsing a directory with massive numbers of files can be quite frustrating, as there will be a delay while Explorer scans the directory and displays the files. So long that it will look like it has crashed if there are enough files. If you need to browse it over a network, this problem is magnified.

I find that between 5,000 and 10,000 files is about as much as you want to put into a directory that gets browsed. If it is accessed over a network, probably half that or less. It does depend on your hardware; snappy systems with faster drives do better.

If you are not browsing or scanning, there seem to be no real issues. I have an application that deals with hundreds of thousands of files, and it was fine to put them all in a single directory, as long as all access was done directly, without the need to scan the directory.
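A small sketch of that direct-access pattern (the store path and the file naming are just examples): the path for each record is built from a known key, so the directory never has to be listed at all.

import os

STORE = r"C:\data\record_store"  # example storage directory

def record_path(key):
    """Build the file path straight from the key; no directory scan needed."""
    return os.path.join(STORE, key + ".dat")

def write_record(key, data):
    os.makedirs(STORE, exist_ok=True)
    with open(record_path(key), "wb") as f:
        f.write(data)

def read_record(key):
    with open(record_path(key), "rb") as f:
        return f.read()

if __name__ == "__main__":
    write_record("order-12345", b"example payload")
    print(read_record("order-12345"))

Reads and writes stay fast no matter how many files pile up, because nothing ever enumerates the directory.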

It is usually easier to organize files in smaller quantities with meaningful directory names so humans can find things faster.

You can check out the specs for NTFS here if you are interested:
http://technet2.microsoft.com/WindowsServer/en/Library/81cc8a8a-bd32-4786-a849-03245d68d8e41033.mspx

patrickabAuthor Commented:
Thank you all for your contributions - most helpful.