Solved

Cron Job Queue?

Posted on 2008-06-14
19
1,169 Views
Last Modified: 2012-05-05
I wanted to know how to queue cron jobs on a general Linux-based system. I have all the crons in a nice little file, but if too many run at once, they start getting bogged down, and it lags the server, resulting in me having to restart after some time.

I wanted to know if there's a way to have it run, say, 3 crons at a time max and queue the rest. Possibly another PHP script out there that could do it? (Searched with no luck.) Or maybe the general cron application itself can do it; I'm not really sure. But I need to figure out a way so that it doesn't eat my server alive! Looking for any suggestions, really.

Thanks!
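The "3 crons at a time max, queue the rest" idea can be sketched in PHP itself: a single cron entry runs a dispatcher that works through the job list but keeps at most $max child processes alive at once. This is only an illustrative sketch, not something from the thread; the `sleep 0` commands are stand-ins for the real per-site invocations (e.g. `php /home/cars/public_html/content/uf.php`).

```php
<?php
// Dispatcher sketch: run at most $max jobs concurrently, queue the rest.
$jobs = array_fill(0, 5, 'sleep 0');   // 5 dummy jobs for the demo
$max  = 3;                             // "3 crons at a time max"

$running = array();
while ($jobs || $running) {
    // Launch queued jobs until the concurrency cap is reached.
    while ($jobs && count($running) < $max) {
        $pipes = array();
        $running[] = proc_open(array_shift($jobs), array(), $pipes);
    }
    // Reap any children that have finished, freeing slots for queued jobs.
    foreach ($running as $i => $proc) {
        if (!proc_get_status($proc)['running']) {
            proc_close($proc);
            unset($running[$i]);
        }
    }
    usleep(100000);   // poll 10x per second
}
```

Scheduling this one dispatcher from cron replaces the many staggered entries: the queue discipline lives in the script instead of the crontab.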
Question by:Valleriani
19 Comments
 
LVL 4

Expert Comment

by:afzz
You can schedule a cron job to run at any time you choose, down to the minute. Please post details of how you currently have your cron jobs set up.
 
LVL 2

Expert Comment

by:jml948
I would say you should test how long each of your cron jobs takes to run. This may be a bit tedious, but in the long run it can help you work out how to manage the cron jobs in the following terms:

(1) determine the maximum number of instances that can simultaneously execute the cron jobs without causing excessive queuing, and (2) create an execution plan to allow no more than the maximum number of instances to be running these cron jobs at the same time.

Once you determine this, you can decide when to run each cron job so it won't bog down your system to the point of being unbearable. You could also consider removing some of the jobs that aren't exactly "important" to run; that's only a suggestion. If all else fails, you could consider upgrading your memory and perhaps your processor. But I would suggest getting an estimate of how long each cron job takes and then working that into a schedule that behaves like a queue: once one job finishes, the next can run at full speed, avoiding a complete bog-down.

Good Luck.
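One low-effort way (a sketch, not from the thread; the log path is illustrative) to collect the per-job timings suggested above is to wrap each crontab entry in GNU time, appending elapsed-time records to a shared log:

```
0,30 * * * * /usr/bin/time -a -o /var/log/uf-times.log php /home/cars/public_html/content/uf.php
```

A few days of that log shows which sites' runs are slow and whether runs overlap the next scheduled slot.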
 
LVL 108

Expert Comment

by:Ray Paseur
Valleriani, tell us a little more, like what your cron jobs are doing.  Are there any functions that could be combined, or queries that could be optimized?  Do the cron jobs need to run as frequently as they do now?  Perhaps a lighter work schedule will help (like afzz says, show us your cron schedule).  We will probably have more suggestions once we have a little more information.  Best, ~Ray
 
LVL 7

Author Comment

by:Valleriani
I have 70 websites, each with a cron running every 30 minutes, all spread out. I'll show some examples:

0,30 * * * * php /home/cars/public_html/content/uf.php
5,35 * * * * php /home/chicago/public_html/content/uf.php
10,40 * * * * php /home/business/public_html/content/uf.php
15,45 * * * * php /home/enterta/public_html/content/uf.php
20,50 * * * * php /home/estatec/public_html/content/uf.php
25,55 * * * * php /home/gadget/public_html/content/uf.php

Then I repeat that pattern with all the sites, basically so they're more spread out. What they do is pull news/info from our advertising sites (we are an advertising company, not the e-mail type). They're normally fine, but sometimes they don't finish on time; a run may take a while, creating a buildup slowly. My boss does state he wants them run regularly; I can probably move it up to 1 hour, etc.

It would be hard to combine them because each site deals with specific sites; some are the same, some aren't, it really depends. I'd really like the crons to finish rather than halting them, though; that's why I was thinking of some sort of cron queuer.
As for the run time, it's pretty random: sometimes they're fine, other times they just go plain slow.
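For the overlap problem specifically, one standard crontab-level guard (not mentioned in the thread) is `flock(1)` from util-linux: each entry takes a per-site lock, and `-n` makes a run exit immediately if the previous run for that site is still holding it, so copies of the same job never stack up. Lock-file paths here are illustrative:

```
0,30 * * * * flock -n /tmp/uf-cars.lock php /home/cars/public_html/content/uf.php
5,35 * * * * flock -n /tmp/uf-chicago.lock php /home/chicago/public_html/content/uf.php
```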
 
LVL 29

Expert Comment

by:fibo
<<They're normally fine but sometimes they don't finish on time>>
What exactly is happening? Are they breaking the CPU limit given to a PHP job?

If this is the case, you might consider 3 solutions:
a - increasing the site-wide limit on running time, presumably not the best solution
b - running your jobs more frequently, so there is less risk of too much material accumulating per run
c - redesigning uf.php so that it keeps some kind of "memory" of the last successful action and then proceeds from that point on; that would probably be the most complex but most effective strategy, since it would somehow "smooth" the processing over the whole day
 
LVL 7

Author Comment

by:Valleriani
For c, what exactly do you mean there? I sort of understand it, but then I don't.

They sometimes take longer than they should, whether it be more material or the other sites being slow, etc. Eventually it honestly seems like they start overlapping each other and almost all the crons take ages, probably because the more that are running, the slower it becomes, and slower, and slower.

I've tried running them every 15 minutes, but I felt things bogged down a lot quicker that way. Spreading them out seemed to help, but eventually it gets bogged down anyway.
 
LVL 29

Expert Comment

by:fibo
I presume that uf.php explores urls, presumably from some sort of list.

The idea would be to check a given number of urls each time, say 30. Next time you check the next 30, etc.
Or, each time you have checked a url, you tag it as such. Then when/if the script crashes, the next run will resume in the list at the precise place where it stopped.

The catch is that you need another process to add elements to the list, e.g. in a database.
If you have your list in a database, you might have a table with something like:
url
date initially added
last checked date (empty or similar when first added)
additional info
Then you read this table sorted by last-checked date, check the urls one after the other, and after doing each one, change its "last checked date" to the current timestamp.
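A minimal sketch of that "last checked" rotation, using an in-memory SQLite database so it runs standalone; the real script would point PDO at the existing MySQL database instead. Table and column names here are hypothetical:

```php
<?php
// Hypothetical schema: url, added_at, last_checked (NULL until first check).
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec("CREATE TABLE urls (
    url TEXT PRIMARY KEY,
    added_at TEXT,
    last_checked TEXT
)");
$pdo->exec("INSERT INTO urls (url, added_at) VALUES
    ('http://example.com/a', datetime('now')),
    ('http://example.com/b', datetime('now')),
    ('http://example.com/c', datetime('now'))");

// One cron run: take the N least-recently-checked urls (never-checked first),
// process each, then stamp it so the next run resumes further down the list.
$batch = $pdo->query(
    "SELECT url FROM urls
     ORDER BY last_checked IS NOT NULL, last_checked ASC
     LIMIT 2"
)->fetchAll(PDO::FETCH_COLUMN);

$stamp = $pdo->prepare("UPDATE urls SET last_checked = datetime('now') WHERE url = ?");
foreach ($batch as $url) {
    // ... fetch and parse $url here ...
    $stamp->execute(array($url));   // tag it as checked
}
```

Each run picks up where the previous one left off, so the load is smoothed across runs rather than done all at once.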

 
LVL 7

Author Comment

by:Valleriani
At the moment it does sort of do that; it will continue where it left off if it 'stopped' somehow. But it doesn't have any sort of limit, mainly because the boss generally wants it all updated at least every hour.

He calls it his 'advertising idea' to always be up to date. Meh, gotta love bosses, right? It's a shame crons can't be done the same sort of way: if it 'checked' this last time, then go on to the next. Something like that.

But even if I do it the way you're talking about, sometimes when too many sites are running uf.php it will bog down no matter what, sometimes just for a minute, but it can get annoying. It isn't good when the sites go down 1-2 minutes every 30 minutes because too many sites are using uf.php, which is of course different per site. So I think even if we do it that way it will still have downtime, just in short bursts.

Is there any way to restrict the memory usage of this uf.php, perhaps? Make it run slower, but not use a large amount of RAM? Not sure... That's why I wanted a cron queuer, where it would simply run the crons from a queue list, maybe 5 crons going at max, and it could only queue the same uf.php file once, just in case it DOES end up looping somehow. Would this even be possible to make?

The scripts do get done, however; I haven't really seen them crash. It's mostly getting data from MySQL tables and some webpages, etc. So uf.php uses MySQL/PHP, and it has that fun memory usage going on there.

What a nightmare!
 
LVL 7

Author Comment

by:Valleriani
If a cron job is set to run, let's say, every minute, the SAME cron job, would cron launch it again even if that 'cron' was already running? Just another question.
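For the record: cron itself does no such check; it starts a fresh process on schedule whether or not the previous run has finished. A common in-script guard (a sketch, not from the thread; the lock path is illustrative) is an exclusive non-blocking lock at the top of uf.php, so a second copy exits immediately:

```php
<?php
// If another copy of this script still holds the lock, bail out
// instead of piling up a second run.
$lock = fopen(sys_get_temp_dir() . '/uf-demo.lock', 'c');
$got  = flock($lock, LOCK_EX | LOCK_NB);
if (!$got) {
    exit(0);   // previous run still in progress; skip this round
}

// ... the normal uf.php work would happen here ...

flock($lock, LOCK_UN);   // release so the next scheduled run can proceed
fclose($lock);
```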
 
LVL 29

Expert Comment

by:fibo
Uh uh... I had not understood that the problem was that the sites would go down because of the load!

As far as I know, you might end up with several cron jobs running simultaneously, and that would probably increase the server load.

However, if your sites are going down frequently, you need to be sure of the problem source.
- The load created by one or several uf.php jobs running simultaneously is high on your list.
- BUT you need to check that in the logs... it might also come from some unusual web activity (eg, some presumed normal queries which in fact crash the system) or from some hack or hack attempt.

Hack attempts would, most of the time, exhibit queries of the type index.php?....http://...
They might also translate as additional files appearing in your root directory or some sub-directories (if you are running a standard product, look at directories such as cache and multimedia library).

You may also want to think of some way to detect simultaneous uf jobs running. That's not very easy if a run ends with a "crash"; otherwise, you might write "ended OK" somewhere when a run ends gracefully.
 
LVL 29

Expert Comment

by:fibo
An additional thought, looking back at the cron list: since all the jobs run on the same (Unix) server, overloads might come, as you hinted, from uf running for several different (web) "servers" (i.e. cars, chicago, etc).

Would it be possible for you to run just ONE super-uf job, which would work through the several lists (cars, chicago...) in round-robin fashion? This would ensure that only one job is running, and the round robin would spread the load when necessary.
 
LVL 7

Author Comment

by:Valleriani
I never really thought about combining them; it wasn't my script, but I shall see if I can work it into some GLOBAL script. Not sure. Hmm...

Is it possible to create, let's say, a GLOBAL-UF.PHP which runs all these little uf.php files? Like, I call global-uf.php and it goes out to all the uf.php's, maybe with some sort of way to make it do only one or two at a time (I'm sure PHP can do that, maybe?), and just have it basically 'run' the other PHP files?

The issue, I think, is that uf.php is attached to many, many files there and calls out to other scripts, etc. uf.php is just the external file for triggering the update non-manually.
 
LVL 29

Expert Comment

by:fibo
Yes, your idea seems a good one: have some "master" which manages the small scripts, tracking each one's progress.
 
LVL 7

Author Comment

by:Valleriani
Ah okay, is there any specific way I should call these PHP files? I'll probably use some variables to track what's running and what isn't, etc. Possibly a timeout (if possible in PHP) after a certain limit, just in case; I wanna be safe rather than sorry.
 
LVL 7

Author Comment

by:Valleriani
Would there be any examples of running scripts like this online? I seem to be having some trouble.
 
LVL 29

Accepted Solution

by:
fibo earned 450 total points
I'm not sure what the best way is for your problem.
You might open a browser page with JavaScript enabled in the browser.

Your page will load and run the PHP script. Once the run is over, the page is "complete" and does not crash against PHP's run-time limit.
Then your JavaScript waits, say, 5 minutes (this happens on the client, not on the server), then reloads the page, which in effect re-triggers the script.

So that you can see the run is alive, you might display the times and the number of searches over the last hour.
 
LVL 7

Author Comment

by:Valleriani
I could make it add to a log, that's no issue. I mean, to call a PHP script from within a PHP script, how would that be handled?
 
LVL 7

Author Comment

by:Valleriani
Er rather, to execute a script.
 
LVL 108

Assisted Solution

by:Ray Paseur
Ray Paseur earned 50 total points
To execute a script from within a script, you can use include() or include_once().
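A minimal sketch of that include() approach: a master cron script pulls in each site's uf.php one at a time, so the jobs run sequentially instead of all at once. The real paths would be /home/&lt;site&gt;/public_html/content/uf.php; here two tiny stand-in scripts are generated in a temp directory so the sketch runs standalone.

```php
<?php
// Demo setup: generate two stand-in "uf.php" scripts (the real ones
// already exist under each site's public_html/content directory).
$dir = sys_get_temp_dir() . '/uf_demo';
if (!is_dir($dir)) {
    mkdir($dir);
}
file_put_contents("$dir/uf_cars.php", '<?php $ran[] = "cars";');
file_put_contents("$dir/uf_chicago.php", '<?php $ran[] = "chicago";');

// The master loop: each include runs to completion before the next starts.
$ran = array();
foreach (array('cars', 'chicago') as $site) {
    $script = "$dir/uf_$site.php";   // real: /home/$site/public_html/content/uf.php
    if (is_file($script)) {
        include $script;
    }
}
```

Because the sites run one after the other, the server never sees more than one uf.php's worth of load at a time; the trade-off is that a slow site delays the ones behind it.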

