• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 1178
  • Last Modified:

too many open files

I have a script that used to run perfectly on a SCO Unix box. Now I have put it on a Solaris box and it is experiencing some issues:

#ident  "@(#)root       1.21    04/03/23 SMI"
#
# The root crontab should be used to perform accounting data collection.
#
#
10 3 * * * /usr/sbin/logadm
15 3 * * 0 /usr/lib/fs/nfs/nfsfind
30 3 * * * [ -x /usr/lib/gss/gsscred_clean ] && /usr/lib/gss/gsscred_clean
#10 3 * * * /usr/lib/krb5/kprop_script ___slave_kdcs___
00 4 * * * /bin/ksh -c "COINSBASE=/u/coins; export COINSBASE; /u/coins/bin/logcron >/dev/null 2>&1"
00 7 * * * /home/scripts/coins/proc.120309 > /dev/null 2>&1
13 0 * * * /usr/lib/patch/swupAuto > /dev/null 2>&1
root@oiler # cd /home/scripts/coins/
root@oiler # cat proc.120309
#!/bin/sh
#set -x
#Last Modified 08/13/2004
#
#This script goes through the directories defined in the file proc_dir_list
#and prints out each file and date contained in those directories.
#
#IMPORTANT - This script assumes several things
# 1 - In the "BAK" files, the first 3 lines contain the REPORT ID, RUN DATE,
# and PROCESS DATE.
#
# 2 - Each BAK file ONLY contains one report.
#
# 3 - If the data directories change, the program needs to change to reflect
# this.

day=`date +%j%Y`
dir="/home/scripts/coins/proc_reports"
#rec="steve.wemer@unisys.com cynthia.kline@unisys.com kenneth.barnard@unisys.com gabriel.ramirez@unisys.com michael.schwartz@unisys.com valeriy.saliyev@unisys.com michael.pittman@unisys.com"

rec="simon.cheng@unisys.com kenneth.barnard@unisys.com"

echo "\n******  `date`  ******\n" >> $dir/$day.proc

# The echo is necessary because nawk will look for input otherwise.
# The way around it is to put all the code in the Begin section, but I prefer
# the format it currently is in.

echo " " |nawk 'BEGIN { FS="\t"
        while (getline < "/home/scripts/coins/proc_dir_list" > 0) {
        if ($0 ~ /^#/ || $0 ~ /^$/)
                continue
        #Load directory listing array
        ++sizeofarray
        dir[sizeofarray]=$1  #Contains directory name
        desc[sizeofarray]=$2 #Contains Desc. of directory
        }
        close("/home/scripts/coins/proc_dir_list")
} # End for BEGIN

#MAIN
{
        FS="/"
        OFS=""

        #The for loop goes through the dir array getting each directory name
        #It prints the directory name and desc. first
        #Goes through each file that "find" prints out

        for (i=1;i <= sizeofarray;i++)
        {
                print "\nFiles in " dir[i] ":\n" desc[i] "\n"
                #printf("\tSub-directories\t%-21s\tDates\n","\tFile Names") orig
                printf("%-26s%-27s%s","Sub-directories","File Names","Dates\n")
                holdersize=split(dir[i],holder,"/") - 1 #added
                sub_start=holdersize + 2 #added

                while ("find "'dir[i]'" -type f -print" |getline > 0)
                {
                        FS="/"
                        "ls -l " $0 |getline
                        sub_end=(NF - 1) #added
                        for (t=sub_start; t <= sub_end;t++) #added
                                sub_dir=sub_dir"/"$t #added
                        #sub_dir=(NF - 1) orig

                  file_date() #Function call
                  rpt_info() #Function call
                  sub_dir="" #added

                } #End of while loop

        } #End of for loop

}

function file_date() {

        #Tests if the record is in the ls -l format.
        #This allows me to get the date of the file.

                if ($0 ~ /^-/)
                {
                        split($1,date," ")
                        # Test is a little backwards.
                        # Returns a 1 for true and 0 for false. #
                        #Prints the sub directory and then the
                        #last field in record "$NF" ###
                        #printf ("\t\./%-10s\t%-20s",sub_dir,$NF) orig
                        printf ("\.%-25s%-27s",sub_dir,$NF)
                        printf ("%s %s, %s\n",date[6],date[7],date[8])

                }
} #END of function file_date

function rpt_info() {

                        # The following is the code to print out the REPORT ID
                        # , RUN DATE, and PROCESSED for backup directories
                        if (dir[i] ~ /.*backup/)
                        {
                          file='dir[i]'sub_dir"/"$NF

                          while ("head -3 " file |getline) #Gets the top 3 lines
                          {
                                gsub(/^. /,"") #Subs any leading spaces at
                                                #the front of the line.
                                FS = " " #Sets the field separator to a space

                        #The if statement checks for the words REPORT, RUN or
                        #PROCESSED.
                        #This is to get just the three lines I want.

                                if ($1 ~ /^REPORT/ || $1 ~ /^RUN/ || $1 ~ /^.*PROCESSED.*/)
                                {

                        #This if is to not get any more data than is wanted for
                        #the PROCESSED line.
                                        if ($1 !~ /^.*PROCESSED/)
                                          printf ("\t\t\t\t%s %s %s\n",$1,$2,$3)
                                        else
                                          printf ("\t\t\t\t%s %s\n",$1,$2)
                                }
                          } #End of while loop
                        } #End of if statement
        close("head -3 " file)
}' >> $dir/$day.proc   #End of function and nawk processing

#echo "\n*** Files backed up after processing (BAK). Any that do not have the current date is deleted ***\n" >> $dir/$day.proc

#cat $dir/deleted.proc |sort +0n -t >> $dir/$day.proc

mailx -s "Oiler Coinserv Report" $rec < $dir/$day.proc


I got this error:

nawk: ls -l /u4/coins/spool/ccrp/yd090r03.09365 makes too many open files
 input record number 1
 source line number 33

It output part of the data, but not all of it.
Brian Utterback, Principal Software Engineer, commented:
nawk has a limit on how many files it is willing to keep track of, and you have reached that limit. You need to close the files after you are done with them.
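A minimal sketch of the pattern being described (using awk here; on Solaris the same applies to nawk): every distinct command string piped into getline counts as one open file until close() is called with that same string.

```shell
awk 'BEGIN {
    for (i = 1; i <= 100; i++) {
        cmd = "echo file" i     # a new, distinct command string each iteration
        cmd | getline line      # opens a pipe; counts toward the open-file limit
        print line
        close(cmd)              # releases the slot; without this, an awk that
    }                           # never reclaims pipes eventually reports
}'                              # "makes too many open files"
```

This prints file1 through file100; drop the close() call in an awk with the Solaris bug and the loop dies partway through instead.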
USTTN-LAN (Author) commented:
I did close the file:

close ("head -3" file)
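Worth noting as general awk semantics (independent of the Solaris bug): close() releases a pipe only when its argument matches the string used to open it byte for byte, including spaces. A small self-contained illustration:

```shell
awk '
BEGIN {
    cmd = "echo a; echo b; echo c"
    cmd | getline x; print x              # opens the pipe, reads "a"
    close("echo a;echo b;echo c")         # near miss: different string,
                                          # closes nothing
    cmd | getline x; print x              # same pipe still open: reads "b"
    close(cmd)                            # exact match: pipe really closed
    cmd | getline x; print x              # fresh pipe: reads "a" again
}' 2>/dev/null
```

The output is a, b, a: the mismatched close() silently does nothing, so the second getline keeps reading from the still-open pipe, while the exact-match close() forces the third getline to start a fresh one.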
Rowley commented:
What platform & version of Solaris are you running?
USTTN-LAN (Author) commented:
I am running Solaris 10 on a Sun Fire V490. Thank you.
Rowley commented:
What release? cat /etc/release, uname -a, etc.
Brian Utterback, Principal Software Engineer, commented:
This is a bug in the Solaris version of nawk:

5064228 *nawk* does not close pipes

There doesn't appear to be a workaround.

You might be able to get it to work in gawk, but gawk isn't a drop-in replacement. If I were you, I would rewrite the script in Perl.
USTTN-LAN (Author) commented:
SunOS oiler 5.10 Generic_141414-08 sun4u sparc SUNW,Sun-Fire-V490

Brian Utterback, Principal Software Engineer, commented:
The version of Solaris doesn't matter. The problem has existed since nawk was introduced and continues to exist in the latest version.
Rowley commented:
I thought it might have been a stdio issue, but you are right (are you ever wrong? ;)).

http://bugs.opensolaris.org/bugdatabase/view_bug.do;jsessionid=2aaa24f118a6b13b82bf8e4a68e9?bug_id=5064228
Brian Utterback, Principal Software Engineer, commented:
I looked at the source of nawk and I see how you can get around the problem. You must explicitly close all the pipes when you are done with them. Specifically, the pipe causing your main problem is the "ls -l" pipe, and you must close it with the full string you used to open it. Since you opened it with $0 set to the filename, but getline then resets $0, you need to save the previous $0 first.

Change this:

FS="/"
"ls -l " $0 |getline
sub_end=(NF - 1) #added

to this:

FS="/"
fname = $0
"ls -l " $0 |getline
close("ls -l " fname)
sub_end=(NF - 1) #added

Give that a try.
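For illustration, the same close-every-pipe pattern in a self-contained, runnable form (the temporary directory and file names here are placeholders, not part of the original script): every pipe opened inside the loop is closed with the identical string before the next iteration, so the count of open pipes stays constant no matter how many files find returns.

```shell
dir=$(mktemp -d)
touch "$dir/one" "$dir/two" "$dir/three"

awk -v d="$dir" 'BEGIN {
    findcmd = "find " d " -type f -print"
    while ((findcmd | getline path) > 0) {
        lscmd = "ls -l " path    # a distinct command string per file
        lscmd | getline listing  # one pipe per file...
        close(lscmd)             # ...closed immediately, with the exact string
        n++
    }
    close(findcmd)               # close the outer find pipe as well
    print n " files listed"
}'

rm -rf "$dir"
```

With three files created, this prints "3 files listed"; the original script opened one new "ls -l" pipe per file and never closed any of them.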
                       
USTTN-LAN (Author) commented:
Thank you very much for your help.
