bennybutler

asked on

crond running multiple copies

I have crontab running a wget every night at 8pm, with the following command:

* 20 * * * wget http://www.domain.com/admin/dbcleanup.php > /dev/null 2>&1
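For context, a crontab line has five time fields — minute, hour, day-of-month, month, day-of-week — followed by the command, and `*` matches every value of a field. So `* 20 * * *` matches every minute from 20:00 to 20:59 (60 firings per night), while pinning the minute field limits it to one. A sketch with the same command and redirection:

```shell
# crontab fields: minute hour day-of-month month day-of-week command
# "* 20 * * *" matches every minute of the 20:00 hour (60 firings);
# "0 20 * * *" matches only at 20:00 sharp, once per day.
0 20 * * * wget http://www.domain.com/admin/dbcleanup.php > /dev/null 2>&1
```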

Or at least, that's what I expect it to do....

Well, every night I look at the machine around 8:30 and there are 15 or so copies of the above command running.

FYI, the script takes about 90 seconds to complete.

Am I invoking the script incorrectly in crontab?
thanks
sunnycoder

Hi bennybutler,

The cron entry is correct .... the multiple instances you see are threads spawned by wget.

Cheers!
sunnycoder
bennybutler

ASKER
Ouch! All this time...

Strange, I never noticed it doing that before when I tested from the command line.

So, would

curl --get http://www.dailycommunicator.com/admin/dbcleanup.php > /dev/null 2>&1

get the job done without spawning? I just need one copy to fire up and run for a max of 2 minutes. The way it's working now, with so many copies running, it takes hours for them all to finish.

Thanks ,
>FYI, the scriipt does take about 90 seconds to complete
>I just need one copy to fire up, and run for a max of 2 minutes.
Seems like the current solution fits your needs!!!

The number of copies should not matter here. They would actually make your script run faster. Network and disk access are typically a lot slower than CPU processing ... If you use a single-threaded application, it blocks for the completion of each network and disk access. As it is now, if one thread blocks on a resource, another thread can continue execution. Also, all threads can download in parallel rather than one at a time.

Cheers!
sunnycoder
Normally yes, but I failed to mention that the objective here is NOT to get the output of dbcleanup.php. The output is simply 'complete'.

I just need the script triggered.  The script runs a number of mysql delete functions, such as

delete from log where date < date_sub(curdate, interval 30 day)

Yeah, that's wrong, but you get the idea. The problem I'm having is that I end up with 10-15 of those running, all competing for the server. MySQL locks the table for each delete, so they all line up and try to run one after another, and the longer it runs, the more copies start up...

I just need a single instance to run, and no more. So should I stick with wget and some flag, or switch to curl? Hell, even ab would do it; I just need to make sure it's only kicked off once a night.
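One common way to enforce a single instance, whichever HTTP client ends up being used, is to wrap the cron command in util-linux's `flock -n`, which exits immediately with a non-zero status if another process already holds the lock. A sketch, assuming `flock` is available on the box (the lock-file path is illustrative):

```shell
# Any overlapping firing exits at once instead of piling up; the lock
# is released automatically when wget exits, even if it is killed.
0 20 * * * flock -n /tmp/dbcleanup.lock wget http://www.domain.com/admin/dbcleanup.php > /dev/null 2>&1
```

Because the lock is tied to the process rather than a stale pid file, a crashed run never wedges the next night's job.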
ASKER CERTIFIED SOLUTION
sunnycoder

This solution is only available to members.
I'm going to try my hand with curl first. I know it's much more 'integration friendly', as I've used it to process credit cards with authorize.net. I think if its default tendency was to spawn copies of itself when I'm charging a credit card, then I'd know it already.

I am interested in that command line browser though, I've never seen that one.

Thanks for the help!
Ooops
> wget has no switch to make it run in non-interactive mode an
should have been
wget has no switch to make it run as a single thread ...

Thanks :)