ls *

"ls *" or any shell command with '*' gives me an error output when executed in a directory where I have over 2000 files, because
the resulting parameter list is too long.
How can I change the maximum value of this in my shell?
chris_calabrese Commented:
This is not a shell limitation but rather a kernel limitation on how much memory can be allocated for the argument list when executing a program.  It's usually 10k-20k, depending on the Unix flavor, etc.
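On most modern systems you can query that kernel limit directly with the POSIX getconf utility (the value varies widely by OS, and on current systems is often far larger than the 10k-20k mentioned above):

```shell
# ARG_MAX is the kernel's ceiling on the combined size of the
# argument list plus environment passed to exec(); 'ls *' fails
# when the shell-expanded glob exceeds it.
getconf ARG_MAX
```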

As ozo suggested, the best way to solve this problem is to simply use 'ls' instead of 'ls *'.  Similarly, for other programs you can use things like
  ls | xargs myprogram
  find . -exec myprogram '{}' ';'

Not to mention that using '*' in a directory writable by others is a security risk, because it's conceivable that a malicious user may have dropped a file with a name like '; rm -rf /'.
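A small demonstration of why the xargs form works (the directory name and file count below are invented for the demo, and GNU/BSD seq is assumed): xargs reads names from stdin and runs the command in batches, so no single invocation exceeds the kernel limit.

```shell
# Create a throwaway directory with 3000 files.
mkdir -p /tmp/manyfiles
cd /tmp/manyfiles
touch $(seq -f 'file%04g' 1 3000)

# 'ls *' could overflow the argument list; 'ls | xargs ...' never will,
# because xargs splits the input into several invocations.
# Forcing batches of 500 names makes the splitting visible:
ls | xargs -n 500 echo | wc -l    # 3000 names / 500 per batch = 6 lines
```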
Comment. There are a lot of reasons to treat this as a warning rather than a problem. You might fix it for a few commands, but others will still have trouble. The operating system itself doesn't handle directories that full very well.

This is why an ISP with over 1000 customers often shifts to a subdirectory home structure, from /home/gp1628 to /home/g/gp1628,
and really large ISPs sometimes go deeper than that. It's much more efficient that way. You can make the system operate with that many files, but it's usually not worth the effort.
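The mapping itself is trivial to script; here is a sketch (the username and /home layout are hypothetical):

```shell
# Map a username to a one-level hashed home directory,
# e.g. gp1628 -> /home/g/gp1628.
user=gp1628
first=$(printf '%s' "$user" | cut -c1)   # first character of the name
echo "/home/$first/$user"                # prints /home/g/gp1628
```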

Anytime I'm stuck with that many files in a directory I tend to do much the same thing: I work with all a*, then all b*, and so on. Or I break it down by number of characters, such as ls ???, then ls ????, then ls ?????.
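That per-prefix approach is easy to script; a minimal sketch (the directory and file names are invented for the demo):

```shell
# Demo setup: a directory with a handful of files.
mkdir -p /tmp/chunked
cd /tmp/chunked
touch apple avocado banana cherry

# Walk the directory in per-prefix chunks, so no single glob
# expansion comes near the argument-list limit.
for prefix in a b c; do
    ls "$prefix"* 2>/dev/null    # each expansion covers one prefix only
done
```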

Eric98Author Commented:
I cannot break down my list of files: they are the input to a tool that requires all of them at once, in one and the same directory.
In other words, my problem remains open ...

It sounds like you are using csh as your command-line shell. csh has limitations that vary depending on the operating system.

For example, on Solaris 7, csh accepts no more than 1706 arguments, each argument must be no more than 1024 characters long, and the total argument list must be less than 1M. On HP-UX 10.20, the limit is 10240 characters for the argument list. You can determine the limits for your system by reading "man csh" and looking for the section near the end labeled "WARNINGS" or "NOTES".

You are running into the limits because the shell expands the wildcard "*" to match all files in the directory, and the result exceeds one of the limits described above.

There is no way to adjust the limits with the csh provided on your system. Your choices are:

1. Use another shell that does not impose its own limits. Bourne shell (/bin/sh) and Korn shell (/bin/ksh) are bound only by the kernel limit, and will typically handle 'ls *' in a directory with more than 2000 files.

2. Use the find command:

"find . -name "*" -prune -type f -print" will list all the files match "*" in the current directory. The output will look similar to "ls -1 *".

3. Write your own version of csh.

You can probably find source code somewhere, and modify csh to raise or eliminate the limitations.
Can you do `ls` instead of `ls *`?
or `ls | xargs ls`
Oh yeah, and this is also not usually why people go for the /home/g/gp1628 type naming convention; rather, it's because many systems also have a limit on the number of files per directory, or because performance on directory operations gets really bad in directories with lots of entries.
Eric98Author Commented:
I am very surprised one cannot change the limit itself, but the xargs solution helps out.
Thanks to all
Eric98Author Commented:
ozo deserves the points too; I have no idea how to proceed ...
That happens a lot.

you can create another question that says "for OZO" and give it points.
When he answers, accept it.
