Ray Paseur

asked on

A script that restarts itself repeatedly

I have a queue of work from external sources, and I need to regulate the speed with which units of work are released from the queue (lots of work comes in overnight, and I want to avoid dumping it all into the workplace at once).  

The strategy I've been trying includes a "qManager" that releases one work unit, sleeps for 20 seconds, and then restarts itself by making a cURL request to its own URL.  However, I am finding that the cURL call hangs waiting for the restarted task to end if I use GET, and returns FALSE if I use POST.

What is a good way to (1) restart the qManager and (2) get a signal back that it restarted successfully, without waiting until the end of the restarted script?

Thanks and regards, ~Ray
Beverley Portlock

Does it have to be every 20 seconds Ray? What about every minute and then a simple cronjob would do it?
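For reference, cron's finest granularity is one minute, but a 20-second cadence can be approximated with staggered entries. A minimal sketch, assuming a standard crontab; the script path is hypothetical:

```shell
# Hypothetical crontab entries: cron fires at most once per minute, so a
# 20-second spacing needs three entries offset by sleeps.
* * * * * /usr/local/bin/release-one-unit.sh
* * * * * sleep 20; /usr/local/bin/release-one-unit.sh
* * * * * sleep 40; /usr/local/bin/release-one-unit.sh
```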
Ray Paseur

ASKER

Hi, Brian - please trust me on this - it has to be 20 seconds.  Actually 10 seconds would be better.  I would love a CRON solution but it's not in the cards here.

When I restart the script with CURL, I use these options (see snippet), and the frequent error is CURL error #28, which says there was a timeout.  It seems to happen after several calls to CURL.  The symptom may be a false positive - it seems to restart the script correctly, but CURL still returns FALSE.  Wonder if there is a cURL option or a PHP set_time_limit() I need to set to bring the timer back to zero?

// TRANSFER CONTROL WITH A CURL POST
function curl_post($uri, $timeout=15)
{
    $chr = curl_init($uri);
    curl_setopt($chr, CURLOPT_HEADER,         FALSE);
    curl_setopt($chr, CURLOPT_POST,           TRUE);
    curl_setopt($chr, CURLOPT_POSTFIELDS,     "x=y");
    curl_setopt($chr, CURLOPT_TIMEOUT,        $timeout);
    curl_setopt($chr, CURLOPT_RETURNTRANSFER, TRUE);
    $xyz = curl_exec($chr);
    $err = curl_errno($chr);
    $inf = curl_getinfo($chr);
    curl_close($chr);
    if ($xyz === FALSE) warning_RAY("CURL POST FAIL: $uri TIMEOUT=$timeout, CURL_ERRNO=$err", $inf);
    return $xyz;
}
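The underlying problem is that the request blocks until the restarted script finishes. A shell sketch of the fire-and-forget alternative (names and timings here are illustrative, not from the thread): launch the long-running child in the background so the parent returns at once instead of waiting, which is what the GET-based cURL call ends up doing.

```shell
#!/bin/sh
# Illustrative only: backgrounding a long-running child lets the caller
# return immediately instead of blocking until the child completes.
slow_child() {
    sleep 2                      # stands in for the restarted qManager run
}

start=$(date +%s)
slow_child &                     # launch it and do NOT wait
elapsed=$(( $(date +%s) - start ))
echo "parent returned after ${elapsed}s"
```

The same idea applied to the cURL approach would mean treating a deliberately short timeout as "request dispatched" rather than as a failure, which may explain why error #28 looked like a false positive.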


Actually, thinking about it, why not write a little bash script that simply calls LYNX or the PHP command-line interface to run a script that releases your job? Then use "sleep 20" and send it round the loop again, like this:

#!/bin/bash
while true
do
     lynx -dump "http://www.example.com/rays-job-release.php?password=my-secret" > /dev/null

     sleep 20
done
ASKER CERTIFIED SOLUTION
Beverley Portlock

This solution is only available to members of Experts Exchange.
Thanks, I take your point about scheduling being an OS job, and I think this will move us forward.

Stochastic Arrival, Deterministic Departure!

Best, ~Ray
OK - Mail me if you have any problems and I'll take a whack at it.

Obviously you would need to install runJob.sh into the machine's start-up scripts, and you will need to handle the situation where no work units are available for distribution. Other than that, the only "biggy" is to ensure that the start interval is long enough for the scripts to complete; otherwise copies will accumulate, and the number executing at once will be roughly the rate of calls times the amount by which the run time exceeds the interval.
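One way to handle that overlap risk is a non-blocking lock around each cycle. This is a self-contained demo of the idea using util-linux flock(1); the lock-file path and the timings are made up for the demonstration, not part of the original script:

```shell
#!/bin/sh
# Demo: guard each cycle with flock(1) so slow runs can never overlap.
LOCK=/tmp/qmanager_demo.lock     # hypothetical lock-file path

# First "cycle" grabs the lock and holds it for a couple of seconds.
flock -n "$LOCK" sleep 2 &
sleep 1                          # give it time to acquire the lock

# A second "cycle" arriving while the first still runs is simply skipped.
if flock -n "$LOCK" true; then
    result="overlap allowed"
else
    result="overlap prevented"
fi
echo "$result"
wait
```

In the real loop, the flock call would wrap the lynx (or php) command, so a slow release script costs you one skipped cycle instead of a pile-up of concurrent copies.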
Yeah - zero risk from long-running jobs - nothing could take more than a fraction of a second.  Thanks for the help, Brian!