• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 607

PHP dealing with high load spikes.

Hi,
We have several servers, each with around 50-80 sites on them. Each site runs an instance of our large ecommerce system.
The problem we are getting is dealing with load on the server, but this seems mainly to be because of backups.
The backups start very early in the morning, around 4am, but some days they have continued running right up until 1 or 2pm. At random times during the backups, the load will spike and the application will not be able to run, but there are other times when the load is really high and the app can sort of still run, except 1. it's slow, and 2. it's probably not helping, as it may be contributing to such high loads. Load can spike from 3.4-5 right up to 500!
Aside from any ideas on how to fix the load issue from the backups, does anyone have ideas about how the application should handle this? E.g. should we try to detect the load at the start of the PHP application and get it to sleep for a bit until things improve, or exit/redirect to a static error page?
I realize that dealing with it at the PHP app level itself seems like avoiding the problem, but I think we may always get load spikes anyway, e.g. from spikes in visitors at random times, so this would help in all cases.
Thanks if anyone has any ideas.
stilliardAsked:
3 Solutions
 
slyongCommented:
Hi,

I think you already understand the disadvantages of handling this situation in PHP itself. However, if you want a script that checks the server load and bails out, you might want to try:

<?php
// sys_getloadavg() returns the 1, 5 and 15 minute load averages.
$load = sys_getloadavg();
if ($load[0] > 80) {
    // Refuse the request while the 1-minute load average is above the cut-off.
    header('HTTP/1.1 503 Too busy, try again later');
    die('Server too busy. Please try again later.');
}
?>



ref: http://de.php.net/manual/en/function.sys-getloadavg.php
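One thing to keep in mind: a raw load-average threshold like 80 means very different things on a 2-core box and a 32-core box. Dividing by the number of cores gives a per-core figure that travels better between servers. A rough sketch, assuming a Linux host where /proc/cpuinfo is readable; the cpu_core_count() helper name and the 2.0 per-core cut-off are just illustrative assumptions, not tested values:

<?php
// Hypothetical helper: count CPU cores so the load threshold scales with the box.
// Assumes Linux; falls back to 1 core if /proc/cpuinfo is unreadable.
function cpu_core_count() {
    $cpuinfo = @file_get_contents('/proc/cpuinfo');
    $cores = $cpuinfo ? preg_match_all('/^processor\s*:/m', $cpuinfo) : 0;
    return max(1, (int) $cores);
}

$load = sys_getloadavg();
$perCore = $load[0] / cpu_core_count();

// 2.0 runnable tasks per core is an illustrative cut-off, not a magic number.
if ($perCore > 2.0) {
    header('HTTP/1.1 503 Service Unavailable');
    header('Retry-After: 60'); // hint to well-behaved clients and crawlers
    die('Server too busy. Please try again later.');
}
?>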
 
slyongCommented:
Sorry, another way to do it is to set a timeout and use register_shutdown_function() to make PHP redirect somewhere:

<?php
// Give the request a hard time limit (1 second here, for demonstration).
set_time_limit(1);

// Runs when the script terminates, including after a fatal timeout error.
function shutdown () {
    if ( ! defined('FINISHED')) {
        header('Location: /timeout');
    }
}

register_shutdown_function('shutdown');

// Simulate a request that never finishes in time.
while (true) {}

// Only reached if the work completed before the limit.
define('FINISHED', 1);



ref: http://stackoverflow.com/questions/6759735/can-i-configure-apache-to-redirect-at-a-specified-timeout
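One caveat with this approach: header() in the shutdown function only works if nothing has been sent to the browser yet, and the fatal "maximum execution time exceeded" message itself can be enough to break it when display_errors is on. Wrapping the request in an output buffer and discarding it on timeout should avoid that. A rough sketch (the /timeout path is just a placeholder):

<?php
set_time_limit(1);
ob_start(); // buffer all output so nothing reaches the browser early

function shutdown () {
    if ( ! defined('FINISHED') && ! headers_sent()) {
        ob_end_clean(); // throw away any partial output / error text
        header('Location: /timeout');
    }
}

register_shutdown_function('shutdown');

while (true) {} // stand-in for the real work

define('FINISHED', 1);
ob_end_flush(); // normal completion: send the buffered page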
 
ray-solomonCommented:
Some random insights/tips that may help.

I also have several servers as big as yours. The process of backing up sites to tar.gz files is very CPU-intensive, and MySQL consumes a lot of memory while databases are being dumped. This is normal. So during the backup process I make sure no other script on any of our sites is running any big MySQL queries.

I also adjust my backups to start late at night, at 10pm, when traffic is lower, because they take about 9 hours to complete. Better than during the day.

Make sure you are not actually running out of RAM and hitting swap during the backup process. If you are, add more RAM, or Apache and other things will start dying randomly. That will help keep your sites operating normally during a backup. You can't have too much RAM.
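A quick way to watch for this from a cron job, assuming a Linux host with /proc/meminfo; the 50% threshold and the error_log() reaction are illustrative only:

<?php
// Rough swap-usage check, assuming a Linux host with /proc/meminfo.
$meminfo = file_get_contents('/proc/meminfo');
if (preg_match('/^SwapTotal:\s+(\d+)/m', $meminfo, $t)
    && preg_match('/^SwapFree:\s+(\d+)/m', $meminfo, $f)
    && (int) $t[1] > 0) {
    $used = (int) $t[1] - (int) $f[1]; // kB
    if ($used / (int) $t[1] > 0.5) {
        // Log it so you can correlate swap pressure with backup windows.
        error_log(sprintf('Swap usage high: %d of %d kB in use', $used, $t[1]));
    }
}
?>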

I adjust most cron jobs around the backup process.

As for the PHP applications: if your applications are doing database queries, make sure you have proper indexes on your tables where applicable. A lot of people I help get this wrong.
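For example, MySQL's EXPLAIN will show whether a query is using an index or scanning the whole table. A minimal sketch; the orders table, column names and credentials below are all hypothetical:

<?php
// Hypothetical example: check whether a common query uses an index.
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');

$stmt = $pdo->query(
    "EXPLAIN SELECT * FROM orders WHERE customer_id = 123"
);
print_r($stmt->fetchAll(PDO::FETCH_ASSOC));
// If the 'key' column is NULL and 'rows' is huge, add an index, e.g.:
//   ALTER TABLE orders ADD INDEX idx_customer (customer_id);
?>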

RAID 10 is the way to go if you can afford it. Otherwise, throwing more RAM and a better CPU at it is useless if your hard drive read/write speeds can't keep up with all those sites doing things during the backup or heavy traffic.
 
arober11Commented:
What are you backing up?
 
stilliardAuthor Commented:
@slyong Thanks, sys_getloadavg is great for telling the point when the app may need to be stopped. I'll look into it more to decide the best cut-off point.
@ray-solomon Thanks for the tips, I'll definitely be looking to throw more RAM at it and look at better hardware to support it.
@arober11 All the sites' files and MySQL DB backups. However, I'm glad you've asked that question, as we could definitely look to cut down the parts we actually need backups of in terms of files. The sites currently all run the same system, so I guess backups would only be needed for their own user files folder and any config files.
 
arober11Commented:
You could also set up another MySQL instance on a separate server and use basic MySQL replication (master/slave) to maintain a remote copy of your DBs, then simply break the mirror, back up from the slave, and re-enable replication. That would shift the majority of the load elsewhere, along with offering some redundancy.
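A rough sketch of what that backup step could look like from a cron-driven PHP script on the slave; the credentials, dump path and reaction are all placeholders, not a tested procedure:

<?php
// Illustrative only: pause replication, dump, then resume.
$pdo = new PDO('mysql:host=localhost', 'repl_admin', 'secret');

$pdo->exec('STOP SLAVE SQL_THREAD');  // let the slave go quiet
shell_exec('mysqldump --all-databases > /backups/nightly.sql');
$pdo->exec('START SLAVE SQL_THREAD'); // catch back up afterwards
?>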