Solved

Threading

Posted on 2008-10-27
Medium Priority
215 Views
Last Modified: 2013-11-13
I need a little assistance with the included routine. The routine takes an array of filenames, where each file is a gzipped text (log) file. Each file is unzipped into memory and searched for a list of terms. With 5 files this is fairly quick, but with 100 files I run into resource issues. A file can be on the order of 20 MB uncompressed.

It appears that this routine opens and searches all the files in the array in parallel. What I need is to control the processing: a configurable value that sets the number of files processed at a time. If it is set to 5, then 5 files are searched, and when one completes, a new file is taken from the array and searched, so there are always 5 files being processed until the array is empty.
my $in_num = "23";    # a constant
my @terms;
$terms[4] = qr/(1\.1\.1\.1|help|perl)/;    # search pattern
my ($pid, $tfile, @childs);
my $Tmpfolder = "./tmp/";
my $jobs2run = scalar @datfile;
for my $i (0 .. $jobs2run - 1) {
    $pid = fork();
    if ($pid) {    # parent: remember the child's pid
        push(@childs, $pid);
    }
    elsif ($pid == 0) {    # child: search one file
        print LOG "Forking child for IN num: $in_num - $i\n";
        $tfile = $in_num . "_" . $i . ".tmp";
        open(TMPRES, ">$Tmpfolder/$tfile") or die "open $tfile: $!";
        open(F, "gunzip -c $datfile[$i] |") or die "gunzip: $!";
        foreach my $line (<F>) {
            if ($line =~ m/$terms[4]/) { print TMPRES $datfile[$i], "::", $line, "\n" }
        }
        close F;
        print LOG "closed $datfile[$i]\t closing $Tmpfolder/$tfile\t$i\n";
        close TMPRES;
        exit 0;    # end child
    }
    else {
        print LOG "couldn't fork: $!\n";
    }
}
foreach (@childs) {
    waitpid($_, 0);
}

Question by:mouse050297
6 Comments
 
LVL 39

Accepted Solution

by: Adam314 earned 500 total points
ID: 22817361

my $MaxAllowedInParallel = 5;    # configurable: how many files to search at a time

my $in_num = "23";    # a constant
my @terms;
$terms[4] = qr/(1\.1\.1\.1|help|perl)/;    # search pattern
my ($pid, $tfile);
my $Tmpfolder = "./tmp/";
my $jobs2run = scalar @datfile;
my %childs;
for my $i (0 .. $jobs2run - 1) {
    if (keys %childs >= $MaxAllowedInParallel) {
        # at the limit: wait for any child to finish before forking another
        my $finishedpid = wait();
        delete $childs{$finishedpid};
    }
    $pid = fork();
    if ($pid) {    # parent: remember the child's pid
        $childs{$pid} = 1;
    }
    elsif ($pid == 0) {    # child: search one file
        print LOG "Forking child for IN num: $in_num - $i\n";
        $tfile = $in_num . "_" . $i . ".tmp";
        open(TMPRES, ">$Tmpfolder/$tfile") or die "open $tfile: $!";
        open(F, "gunzip -c $datfile[$i] |") or die "gunzip: $!";
        foreach my $line (<F>) {
            if ($line =~ m/$terms[4]/) { print TMPRES $datfile[$i], "::", $line, "\n" }
        }
        close F;
        print LOG "closed $datfile[$i]\t closing $Tmpfolder/$tfile\t$i\n";
        close TMPRES;
        exit 0;    # end child
    }
    else {
        print LOG "couldn't fork: $!\n";
    }
}

# reap the children still running when the loop ends
while (keys %childs) {
    my $finishedpid = wait();
    delete $childs{$finishedpid};
}
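
If installing a CPAN module is an option, Parallel::ForkManager handles this same throttling for you. A minimal sketch of the loop, with the per-file search body elided:

use Parallel::ForkManager;

my $pm = Parallel::ForkManager->new(5);    # at most 5 children at once
foreach my $i (0 .. $#datfile) {
    $pm->start and next;    # parent: forks a child, then moves to the next file
    # ... child: gunzip and search $datfile[$i] as above ...
    $pm->finish;            # child exits; the parent reaps it as slots free up
}
$pm->wait_all_children;     # block until every child has finished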


Author Comment

by:mouse050297
ID: 22833705
It appears that when I run this with a large number of large files, system memory becomes exhausted. Any tips to alleviate this? Setting $MaxAllowedInParallel=2 has the same result; it just takes longer to exhaust memory.
 
LVL 39

Expert Comment

by:Adam314
ID: 22836694
Can you confirm whether it is the large number of files, or the large files themselves, that causes the problem?

Author Comment

by:mouse050297
ID: 22846402
The routine makes a tmp log file for each archive file it searches. It exhausts all system memory at the same point whether I use $MaxAllowedInParallel = 1 or $MaxAllowedInParallel = 5. Say, for example, that it processes 32 files out of 100; that 32nd file is 30 MB uncompressed. The system has 8 GB of memory. During processing, memory use only ever increases; I never see it released. I can confirm that it is the number of files, together with their cumulative size, that causes the problem.
 
LVL 39

Expert Comment

by:Adam314
ID: 22847554
The read loop in your child:
    foreach my $line (<F>) {
        if ($line =~ m/$terms[4]/) {print TMPRES $datfile[$i],"::",$line,"\n"}
    }
might be better as:
    while (my $line = <F>) {
        if ($line =~ m/$terms[4]/) {print TMPRES $datfile[$i],"::",$line,"\n"}
    }
The foreach evaluates <F> in list context, so each child slurps the entire uncompressed file into memory before the loop even starts; the while form reads one line at a time.

Another possibility is that both the parent and the child are using the LOG filehandle. I don't think this is the problem, but you could try not having the child use it - if the child needs to log, have it open its own log file (maybe use flock so multiple children aren't writing at the same time).
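
A minimal sketch of that idea, assuming each child appends to one shared file (the file name and message here are just illustrative):

use Fcntl qw(:flock);

# in the child, instead of printing to the parent's LOG handle:
open(my $childlog, '>>', "$Tmpfolder/children.log") or die "open log: $!";
flock($childlog, LOCK_EX) or die "flock: $!";   # block until we have exclusive access
print $childlog "child $$: finished $datfile[$i]\n";
flock($childlog, LOCK_UN);
close($childlog);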
 

Author Comment

by:mouse050297
ID: 22852483
Changing the read loop from a 'foreach' to a 'while' statement was a great success.
