style-sheets asked:
stop bash if php script halted

Hi,

I have a PHP script that calls a bash script via the shell_exec() function:

<?php
	set_time_limit(0);

	$file_name = "/home/full/path/file.txt";
	// escapeshellarg() quotes the path safely for the shell
	shell_exec('bash bash_script.sh ' . escapeshellarg($file_name));
?>



The bash script called looks like this:

#!/bin/bash
file_name="$1"        # full path passed in from the PHP script

# ======================================================= #
i=0
while read LINE
do
	# ${file_size} is set elsewhere in the original script
	bash child_script.sh "${i}" "${file_size}" < /dev/null &
	i=$((i+1))
done < "${file_name}"
# ======================================================= #

# Wait until all jobs are done
FAIL=0
for job in $(jobs -p)
do
    # echo $job
    wait "$job" || FAIL=$((FAIL+1))
done



As you can see, this bash script calls another "child" bash script. I'm basically simulating multi-threading by using &.

It works super well (execution speed is insanely fast), except that if the user hits the ESCAPE key or stops the PHP script, these child bash scripts aren't killed and become zombie processes.

This is badly affecting my (dedicated) server. Is there a way to prevent this?
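One common way to avoid leaving children behind (a sketch, not from the thread; the signal list, the pids array, and the sleep stand-in are my own assumptions) is to have the parent bash script record the PIDs it spawns and trap termination so it can kill them on the way out:

```shell
#!/bin/bash
# Sketch: record child PIDs and kill them if this script terminates,
# so an interrupted request doesn't leave children behind.
# "sleep 30" stands in for: bash child_script.sh "$i" ... &

pids=()

cleanup() {
    for pid in "${pids[@]}"; do
        kill "$pid" 2>/dev/null
    done
    wait 2>/dev/null    # reap the killed children so none linger as zombies
}
# Run cleanup on normal exit and on common termination signals
trap cleanup EXIT HUP INT TERM

for i in 0 1 2; do
    sleep 30 &
    pids+=("$!")
done

# ... the normal wait-for-jobs loop would go here ...
```

When the script exits (normally or via SIGHUP/SIGINT/SIGTERM), cleanup kills and reaps every recorded child; whether this helps here depends on the PHP side actually delivering a signal to the parent script when the request is aborted.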
burnsj2

You could have the PHP script touch a file or update a database record every X seconds. Then have the bash processes check the file or database (via another PHP script) to see whether the file or record has been touched/updated in the last X+Y seconds.

I'm not sure what your script is doing, but I usually try to avoid having user processes started by a web browser execute system-level scripts. Instead I have the user process insert into a database queue. The queue is then checked by the user process to see if the task has been completed. That way you can throttle the speed of execution of the queue and avoid bogging down the server. The user would just have to wait longer if there is a high volume of traffic.
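As a rough sketch of the heartbeat idea above (the file path and the 30-second threshold are placeholders, not from the thread), the bash side could compare the heartbeat file's modification time against the current time and bail out when it goes stale:

```shell
#!/bin/bash
# Heartbeat check, meant to run periodically inside the bash script.
# The PHP side would `touch` this file every X seconds while the user
# is still connected; the path and threshold here are illustrative.

HEARTBEAT=/tmp/job.heartbeat
MAX_AGE=30   # seconds: the "X+Y" window from the suggestion above

touch "$HEARTBEAT"   # simulate the PHP side for this demo

mtime=$(stat -c %Y "$HEARTBEAT" 2>/dev/null || echo 0)
age=$(( $(date +%s) - mtime ))

if [ "$age" -gt "$MAX_AGE" ]; then
    echo "heartbeat stale (${age}s old) - stopping"
    exit 1
fi
echo "heartbeat fresh (${age}s old)"
```

Note that `stat -c %Y` is GNU coreutils syntax; on BSD systems the equivalent is `stat -f %m`.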
Duncan Roe
I think you may have a more complex problem here. Processes only become zombies if their parent process is still alive and does not wait for them. If the parent has gone, init (process 1) inherits the process and waits for it.

A possibility is that the process in which the php script runs (i.e. the web server) is still running but has no reason to wait for any child process. In that case, only jobs actually started by the php script (child jobs) should end up as zombies. I don't know php yet - is it possible to put cleanup code in your script to kill the started job on termination?

Another possibility is that somehow the child job has been stopped owing to receiving a signal and so is no longer in control. If that is happening, ps axfu should show you jobs in state T (stopped). In this case, the processes it started (grandchild jobs) will become zombies. It should be possible to include a trap command in the script to prevent it from being stopped. You can get some diagnostics from your script by heading it with an exec to log its output somewhere, but this might disturb the problem scenario (e.g. if SIGTTOU is stopping the script).

Please post the output from ps axfu initially.
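If stop signals turn out to be the culprit, the trap and the logging exec mentioned above might look like this (the choice of signals and the log path are guesses, not confirmed by the thread):

```shell
#!/bin/bash
# Ignore terminal stop signals so a background job isn't suspended
# (state T in ps axfu) when it reads from or writes to the terminal.
trap '' TTIN TTOU

# Diagnostics as suggested: send all further output to a log file.
# (Log path is illustrative; note this may itself change the scenario.)
exec >> /tmp/bash_script.log 2>&1
echo "script $$ started at $(date)"
```

`trap ''` sets the signals to be ignored rather than handled, so the kernel never stops the script on SIGTTIN/SIGTTOU.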
You can always check for the presence of the PHP file in the list of processes and kill the shell scripts if it's not there. Put your check in the child bash script with:

ps -fax | grep example.php

If it's not present, then kill all the bash scripts you are executing.

Hope it helps
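Tidied up, that check might look like the following inside the child script (a sketch: "example.php" is the placeholder name from the suggestion, and the `parent_alive` helper is my own):

```shell
#!/bin/bash
# Sketch of the check above, to run periodically inside child_script.sh.
# grep -v grep keeps the pipeline from matching its own grep process.

parent_alive() {
    ps -fax | grep -v grep | grep -q "$1"
}

if parent_alive "example.php"; then
    echo "parent PHP request still running"
else
    echo "parent PHP request gone - would clean up here"
    # In the real script: kill 0   (kills this whole process group)
fi
```

As the asker points out below, matching on the script name alone is fragile when many users run the same script at once; the pattern would need to include something request-specific (such as the file argument).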
style-sheets (ASKER)
The script will be run by many users simultaneously, so I can't catch this using ps (I think), as some of these processes will be valid (and I can only tell which process I should kill by looking at each process's command-line arguments).

I think I can deal with the child processes *IF* I find a way to catch the following situations (so that I kill the bash script and finish gracefully):

User closes the browser
User restarts his/her machine (or the system crashes)
User cancels the request by pressing the ESC key

I can catch the ESC key using javascript/jquery, and eventually even the user closing the browser (and call a jQuery POST to clean up the mess), but there is no guarantee if the user's machine crashes/reboots.
Maybe I need more coffee, but does try/catch in PHP 5.3 provide this feature?
Sounds promising, especially if try/catch has a finally clause (like Visual Basic). Meanwhile, please post the output from the shell command ps axfu when there are zombies present.
Check out register_shutdown_function ... might help ...
http://php.net/manual/en/function.register-shutdown-function.php
Thanks for the advice, guys. I just realized that the issue only happens if I use FlashGet; I tested over and over and couldn't reproduce the problem with Google Chrome.

The PHP script serves files dynamically (each file is retrieved and processed by the bash script in a special manner that can't be done in PHP - i.e. using multi-threading).

So the question is now: is there a reliable way to force anyone who calls my script (i.e. downloads the file served) to use only one thread (especially those using FlashGet / download managers that open multiple connections per download)?

@duncan_roe I'll post the output of ps axfu shortly

@kshna Thanks for the reference, I'll check it out

Thanks!
ASKER CERTIFIED SOLUTION
Ray Paseur (United States of America)

This solution is only available to Experts Exchange members.