Solved

ls *

Posted on 2000-03-14
768 Views
Last Modified: 2010-04-21
"ls *" or any shell command with '*' gives me an error output when executed in a directory where I have over 2000 files, because
the resulting parameter list is too long.
How can I change the maximum value of this in my shell?
Question by:Eric98
9 Comments
 
LVL 2

Expert Comment

by:GP1628
ID: 2616612
There are a lot of reasons to take it as a warning rather than a problem. You might fix it for a few commands, but others will still have problems. The operating system itself doesn't like handling directories that are that full.

This is why an ISP that gets over 1000 customers often shifts to a subdirectory home structure, from /home/gp1628 to /home/g/gp1628, and really large ISPs sometimes go deeper than that. It's much more efficient that way. You can make the system operate with that many files, but it's usually not worth the effort.

Anytime I'm stuck with that many files in a directory I tend to do kind of the same thing: I work with all a*, then all b*, etc. Or I break it down by number of characters, such as  ls ???   ls ????   ls ?????
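For example, a rough Bourne-shell sketch of that prefix-by-prefix approach (assuming the filenames start with lowercase letters; adjust the list for digits or uppercase):

  # run ls on one leading letter at a time, so each wildcard
  # expansion covers only a fraction of the directory
  for l in a b c d e f g h i j k l m n o p q r s t u v w x y z
  do
      ls ${l}* 2>/dev/null
  done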

 

Author Comment

by:Eric98
ID: 2616821
I cannot break down my list of files: they are the input of a tool that needs to find all the files at once, in one and the same directory.
In other words, my problem remains open ...
 
LVL 2

Expert Comment

by:dirc
ID: 2616830
It sounds like you are using csh for your command-line shell. csh has limitations that vary depending on the operating system.

For example, on Solaris 7, csh accepts no more than 1706 arguments, each argument must be no more than 1024 characters long, and the total argument list must be less than 1MB. On HP-UX 10.20, the limit is 10240 characters for the argument list. You can determine the limits on your system by reading "man csh" and looking for the section near the end labeled "WARNINGS" or "NOTES".

You are running into these limits because the shell expands the wildcard "*" to all the file names in the directory, and the result exceeds one of the limits described above.

There is no way to adjust the limits with the csh provided on your system. Your choices are:

1. Use another shell that does not have the same limitations. Bourne shell (/bin/sh) and Korn shell (/bin/ksh) will handle 'ls *' fine in a directory with more than 2000 files (see the one-liner after this list).

2. Use the find command:

"find . ! -name . -prune -type f -print" will list all the files in the current directory without descending into subdirectories. The output will look similar to "ls -1 *".

3. Write your own version of csh.

You can probably find source code somewhere and modify csh to raise or eliminate the limitations.
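
For what it's worth, with option 1 you don't even have to change your login shell. A quick sketch, assuming /bin/sh is present (and that the expanded list still fits within the kernel's own argument-size limit):

  /bin/sh -c 'ls *'

runs just that one command under the Bourne shell while you stay in csh; the single quotes keep csh from trying to expand the '*' itself.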
 
LVL 84

Expert Comment

by:ozo
ID: 2617273
Can you do `ls` instead of `ls *`?
or `ls | xargs ls`
 
LVL 14

Accepted Solution

by:chris_calabrese (earned 50 total points)
ID: 2617374
This is not a shell limitation but rather a limitation in the kernel on how much memory can be allocated for the argument list when executing a program.  It's usually 10k-20k depending on the Unix flavor.
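
If your system provides the POSIX getconf utility, you can at least see the kernel's limit directly (this only reports it, it doesn't change it):

  getconf ARG_MAX    # maximum bytes of arguments plus environment for exec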

As ozo suggested, the best way to solve this problem is to simply use 'ls' instead of 'ls *'.  Similarly, for other programs you can use things like
  ls | xargs myprogram
or
  find . -exec myprogram '{}' ';'
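
For instance, with grep standing in for whatever tool reads the files (the pattern is just a placeholder):

  ls | xargs grep -l somepattern

xargs reads the file names from its standard input and runs grep repeatedly, each time with as many names as fit under the argument-list limit, so no single invocation is too long.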

Not to mention that using '*' is a security risk in a directory writable by others, because it's conceivable that a malicious user has dropped a file with a name like '; rm -rf /'.
 
LVL 14

Expert Comment

by:chris_calabrese
ID: 2617385
Oh yeah, and this is also not usually why people go for the /home/g/gp1628 type naming convention; rather, it's because many systems have a limit on the number of files per directory, or because performance of directory operations gets really bad in directories with lots of entries.
 

Author Comment

by:Eric98
ID: 2618873
I am very surprised one cannot change the limit itself, but the xargs solution helps out.
Thanks to all
 

Author Comment

by:Eric98
ID: 2618879
ozo deserves the points too, but I have no idea how to proceed ...
 
LVL 2

Expert Comment

by:GP1628
ID: 2619778
That happens a lot.

You can create another question that says "for ozo" and give it points.
When he answers, accept it.

