Killing child processes without waiting for the child to finish

Code is roughly:

open(FILE, "+<file");
flock (FILE,2);
#read data from the file
spawn_child();
#write to the file
close (FILE);

sub REAPER {
   my $wait_pid = wait;
}

sub spawn_child() {
   $SIG{CHLD} = \&REAPER;
   my $pid = fork();
   if (!$pid) {
      exec ("perl script");
      exit; #redundant, I know
   }
   return;
}

The issue is that "script" contains an open to "file", which is currently locked by the parent process. I can't change the order of the main logic (i.e. can't write to the file before I spawn the child). The way it's written now, the parent seems to be waiting for the child to finish executing before it continues on with the logic, creating a deadlock (child is waiting for "file" to be unlocked, parent is waiting for child to finish before unlocking "file"). I need the parent to continue on without the child finishing (the parent also needs to continue to spawn children while the child processes run their course, so waiting for each child to finish is unacceptable).
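For reference, the standard way to let the parent continue while still reaping is a SIGCHLD handler that calls waitpid with WNOHANG, so it collects every child that has already exited without ever blocking on a running one. A minimal sketch (the command passed to spawn_child is a placeholder):

```perl
use strict;
use warnings;
use POSIX ":sys_wait_h";   # exports WNOHANG

# Reap every child that has already exited; never block on a running one.
sub REAPER {
    while ((my $pid = waitpid(-1, WNOHANG)) > 0) {
        # child $pid reaped; its exit status is in $?
    }
    $SIG{CHLD} = \&REAPER;   # re-arm for platforms with unreliable signals
}
$SIG{CHLD} = \&REAPER;

sub spawn_child {
    my @cmd = @_;
    defined(my $pid = fork()) or die "fork failed: $!";
    if ($pid == 0) {                  # child
        exec(@cmd) or die "exec failed: $!";
    }
    return $pid;                      # parent continues immediately
}
```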

I've tried running the child as a background process, but this seems to cause "script" to never come out of sleep() commands (/boggle). Any other suggestions?
Asdf asked:
 
ozo commented:
flock (FILE,2);
#read data from the file
flock(FILE,8);
spawn_child();
flock (FILE,2);
#write to the file
close (FILE);
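As an aside, the magic numbers here are the flock constants: 2 is LOCK_EX and 8 is LOCK_UN. The same unlock-before-spawn pattern reads better with the named constants from Fcntl (the temporary file name below is mine):

```perl
use strict;
use warnings;
use Fcntl qw(:flock);   # LOCK_SH=1, LOCK_EX=2, LOCK_NB=4, LOCK_UN=8

open(my $fh, '+>', 'locktest.tmp') or die "open: $!";
flock($fh, LOCK_EX) or die "lock: $!";     # same as flock(FILE, 2)
# ... read data from the file ...
flock($fh, LOCK_UN) or die "unlock: $!";   # same as flock(FILE, 8)
# spawn_child() would go here; the child can now take the lock
flock($fh, LOCK_EX) or die "relock: $!";
# ... write to the file ...
close $fh;
unlink 'locktest.tmp';
```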
 
Adam314 commented:
what are you trying to do?  (high-level)
 
Asdf (author) commented:
Adam314:

Parent spawns X children, running simultaneously. Child and parent communicate through a "status" file. When the child finishes, parent will read that status and spawn a new child.

ozo's solution does solve the file locking deadlock, but the parent still waits for the child to finish executing before moving forward in the logic, meaning I can only have a single child running at once.

What if the child never died on its own, and it was up to the parent to kill the child once the child set its status to "done"? As written now, we'd be stuck because the parent would wait indefinitely for the child to die. How can I make the parent stop waiting for the child to finish, but still reap the child? (Reaping is necessary because cygwin only allows 256 children from a single process, and I need to exceed that number over the lifetime of my run.)
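One possible shape for that kill-when-done idea, sketched here with an assumed per-child status file named "status.&lt;pid&gt;" (that naming convention is mine, not from the original code):

```perl
use strict;
use warnings;
use POSIX ":sys_wait_h";

# Hypothetical sketch: returns 1 if the child reported "done" and was
# killed and reaped, 0 if it is still working (or has no status yet).
sub reap_if_done {
    my ($pid) = @_;
    open(my $fh, '<', "status.$pid") or return 0;
    chomp(my $status = <$fh> // '');
    close $fh;
    return 0 unless $status eq 'done';
    kill 'TERM', $pid;   # ask the child to exit
    waitpid($pid, 0);    # reap it, so cygwin's 256-child limit isn't hit
    return 1;
}
```

The parent can call this in its main loop over all outstanding child pids; each call returns immediately for children that are still working.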

 
Asdf (author) commented:
Another point of interest:

Speckled throughout my code I have various qx!echo "something" >> log.txt!; lines, mostly for debugging purposes. It seems that the parent is "freezing" when it finishes executing the first qx! after the fork(). I'm thinking these qx!'s are the issue, not my method of reaping. I tested this with some basic sample code and saw the same thing:

foreach my $num (1..3) {
   open (T, ">>txt");
   flock (T, 8);
   &spawn();
   flock (T, 2);
   print T "$num parent\n";
   close (T);
}

sub REAPER {
   my $pid = wait;   # blocks here until some child exits
}

sub spawn {
   $SIG{CHLD} = \&REAPER;
   my $pid = fork();
   if (!$pid) {
      exec ("perl sleeper");
   }
   qx!ls!;
}

the code in sleeper:

sleep(4);
open (T, ">>txt");
flock (T, 2);
print T "sleeper\n";

Resulting "txt":
sleeper
1 parent
sleeper
2 parent
sleeper
3 parent

If you comment out the qx!, the result is:
1 parent
2 parent
3 parent
sleeper
sleeper
sleeper

Is this expected behavior? If so, why, and is there a way around it?
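A likely explanation: qx// forks its own child and waits for its status. If qx// collects that status itself, the blocking wait() inside REAPER then has no exited child to report and sits there until the sleeper finishes, which matches the freeze described above. Two ways around it, sketched (quiet_qx is a made-up name):

```perl
use strict;
use warnings;
use POSIX ":sys_wait_h";

# Option 1: make the handler non-blocking, so it can never hang waiting
# for a child whose status qx// has already collected.
$SIG{CHLD} = sub { 1 while waitpid(-1, WNOHANG) > 0 };

# Option 2: suspend the handler for the duration of the backtick call so
# qx// can reap its own child undisturbed.
sub quiet_qx {
    my ($cmd) = @_;
    local $SIG{CHLD} = 'DEFAULT';   # restored automatically on return
    return qx{$cmd};
}
```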
 
kblack05 commented:
I believe adding the line

next;

to the parent code block, at the point where it should skip past the child, will work.
 
kblack05 commented:
Also, try this method to capture your errors.

To obtain useful error messages, add the following snippet to your script, just beneath the shebang line (the first line of the script; usually #!/usr/local/bin/perl or #!/usr/bin/perl):

BEGIN {
open (STDERR, ">/path/to/somewhere/error.txt");
}

Question has a verified solution.