Solved

Ideal max files/directories per directory

Posted on 2004-08-04
4
985 Views
Last Modified: 2013-12-15
Hi guys,

I'm using Red Hat Enterprise and would like to know what the ideal maximum number of files/directories per directory is. Currently we have a directory with over 7000 directories in it. For various reasons we now have to restructure this directory, and an important factor in deciding how to do this will be the ideal number of files/directories per directory.

A few years ago I was doing a similar project but on a Sun Solaris system, and the sys admin there told me the ideal max inodes per directory was about 200.

If anyone knows what the figure would be for Red Hat Enterprise, your advice would be very much appreciated.

Rangi
Question by:rangi500
4 Comments
 
LVL 3

Expert Comment

by:pYrania
ID: 11714085
I don't know where that sysadmin got his information, but there is no such limit.

I've seen well-performing filesystems with far more files and directories inside a single directory.
 
LVL 40

Expert Comment

by:jlevie
ID: 11714728
I don't know if there is an "ideal number" of nodes within a directory for Linux or Solaris, but 7000 in one directory seems like a lot. I suspect that the "ideal number" is largely determined by the way those nodes are used. An application that only accesses the directory at startup wouldn't suffer as much as one that is continuously accessing the directory. For the latter kind of app, breaking the single large directory into some sort of hashed structure would be a big help.
 
LVL 2

Author Comment

by:rangi500
ID: 11724630
Thanks for your feedback guys.

pYrania, apparently there was an 'optimal' number of files per directory - perhaps a 'most efficient' number?

jlevie, yes we are planning to create a better structure, possibly hashed, but one of the main things we need to know to make that decision is how many directories to put in each directory. If 7000 sounds like a lot, what sounds like "not a lot" - 1000? 100?

Some facts that may or may not be relevant:

   - the folders are read from much more than they are written to
   - the folders make up about 500GB of data
   - our sys admin has said the 7000 dirs in one dir may be responsible for crashing our backup software

Thanks,

Rangi
 
LVL 40

Accepted Solution

by:
jlevie earned 250 total points
ID: 11732856
It probably doesn't matter much that the access is mostly read, nor should the total data size matter, with respect to directory size. What matters most is the number of file opens per unit of time. A good hash scheme reduces the time required to locate a file for reading and, as a bonus, reduces the directory overhead associated with file creation.
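To illustrate the kind of hashed layout described above, here's a minimal sketch in Python. The choice of MD5, two levels, and two hex characters per level are illustrative assumptions, not anything specified in this thread; any stable hash with enough fan-out for your file count would do:

```python
import hashlib
from pathlib import Path

def hashed_path(root, name, levels=2, width=2):
    """Map a file name to a nested subdirectory path.

    With levels=2 and width=2 there are 256 subdirectories per
    level (65536 leaves), so even ~7000 entries spread out to a
    tiny number of entries per directory on average.
    """
    # Hash the name so entries distribute evenly and the same
    # name always maps to the same leaf directory.
    digest = hashlib.md5(name.encode("utf-8")).hexdigest()
    parts = [digest[i * width:(i + 1) * width] for i in range(levels)]
    return Path(root).joinpath(*parts, name)
```

For example, `hashed_path("/data", "report.txt")` yields a path of the form `/data/xx/yy/report.txt`, where `xx` and `yy` are the first two pairs of hex digits of the digest. Because the mapping is deterministic, lookups never need to scan the big directory - you recompute the hash and go straight to the leaf.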

> - our sys admin has said the 7000 dirs in one dir may be responsible for crashing our backup software

I don't know what you are using for backups, but I've never had any problems with Solaris ufsdump or Legato Networker with far larger directories than that. I discourage people from creating really big directories, but sometimes they still do ('til I find out about it).
