• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 1334

PHP MySQL - running a long script - how to prevent timeout?

Hi,

I've written a PHP function which opens a lot of database connections. It relies on a lot of legacy code, so it is difficult to change much about how it works.

The code processes some customer information and takes a good few moments to run. It works well when I run it on a few customers, although it does take some time (about 3 minutes for 100 customers).

In fact it's fine even when I run it on up to about 500 customers. But we have to run it on about 50,000 customers, and for any amount above 1,000 the Internet Explorer window never finishes loading or even times out; it just shows the spinning circle indefinitely.
The page effectively never completely loads, but I assume at some point the server times out, as it does not process all of the customers correctly.

How can I approach this so there are intervals that allow the script to catch up with itself, and perhaps provide screen output during the process to show how far it has got?

I did put an echo into the main loop, but it does not seem to actually print the output until the entire script has completed (so when I run it on a small number of customers, it suddenly dumps all of the output onto the screen in one go at the end).
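
I gather PHP and the web server buffer the output, which would explain why nothing appears until the end. If it helps, this is roughly what I imagine trying inside the loop to push the output out as it goes (untested on my setup, and $customerId stands in for whatever my loop actually uses):

echo "Processed customer $customerId<br>\n"; // progress marker
if (ob_get_level() > 0) {
    ob_flush(); // empty PHP's user-level output buffer, if one is active
}
flush();        // ask the web server to send the output to the browser now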

Thanks for any input on this matter.

Asked by: afflik1923

3 Solutions
 
syedasimmeesaq Commented:
At the top of the script that needs a longer execution time, you can do this:

ini_set('max_execution_time', 300); //300 seconds = 5 minutes  

Or alternatively you can use:
set_time_limit($seconds); // allow up to $seconds of execution time
set_time_limit(0);        // 0 means no time limit

 
afflik1923 (Author) Commented:
So if I use just the line:
set_time_limit(0); // for no time limit
in the main start script, will that do the trick? So even if it takes 2 hours to run, it will still work?

Also, should I put the
set_time_limit(0); // for no time limit
command into each sub-script that is called from the main script?

Finally, a related question in case it is running out of memory on the shared server: the code uses the PEAR DB library to make DB connections.
If I set these variables to NULL after use, does that free up the memory right away?
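
That is, something like this once each connection is finished with (I'm assuming disconnect() is the right PEAR DB call):

$db->disconnect(); // explicitly close the PEAR DB connection
$db = null;        // drop the reference so PHP can reclaim the memory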

Thanks
 
absx Commented:
Something on optimizing the PHP-MySQL interaction:

- Check out INSERT DELAYED if you are inserting a lot of records at once
- Make sure your indexing is suitable for the queries; use EXPLAIN with every query to look for trouble
- Look at the PHP file: is it instantiating a new MySQL connection before every query? This slows things down massively. It's usually just as safe to open one connection and run all queries with the same handle (see the sketch below)
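
For example, with PEAR DB (since you mention it), roughly this shape - the DSN and query are placeholders:

require_once 'DB.php';

// One connection, created once, reused for every query.
$db = DB::connect('mysql://user:password@localhost/yourdb'); // placeholder DSN
if (PEAR::isError($db)) {
    die($db->getMessage());
}

$res = $db->query('SELECT id FROM customers'); // example query
while ($row = $res->fetchRow()) {
    // ... every further query in the loop reuses the same $db handle ...
}

$db->disconnect(); // close it once at the end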
 
afflik1923 (Author) Commented:
Hi. Yes, unfortunately it does open a new MySQL connection every time. That would be quite complex to put right, as the code is quite old and there is a lot of it.
 
afflik1923 (Author) Commented:
I take it back. It did not take long at all. I simply added an optional parameter to each function that needed to run a query. The optional parameter was for a DB connection, and it defaulted to null.

Then at the start of the function I checked whether the DB parameter was null: if so, create a new connection; if not, use the DB connection that was passed to the function.
That way the old code kept working, and the new code (and the special function I had to write) has ROCKED in speed. Thanks for making me try!!!
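
In rough outline (names changed and details simplified):

// Each legacy function gains an optional $db parameter. Old callers
// pass nothing and get the old behaviour (a fresh connection per call);
// new code passes in one shared connection.
function processCustomer($customerId, $db = null)
{
    if ($db === null) {
        // Legacy path: open a fresh connection for this call.
        $db = DB::connect(APP_DSN); // APP_DSN: hypothetical constant holding our DSN
        if (PEAR::isError($db)) {
            die($db->getMessage());
        }
    }
    // ... run this customer's queries against $db as before ...
}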
 
afflik1923 (Author) Commented:
Sorry for the many typos in that last posting. I'm excited!
 
Ray Paseur Commented:
Glad you got things headed in the right direction.

About that set_time_limit(0) thing.  Don't do that.  For one thing, many web hosts will not permit it (imagine THAT!).

A better strategy is to set the time limit iteratively for each pass through the logic.  That way, if you ever do hit a data-dependent error that loops the code, you will know exactly what data was in play at the time, and you won't run forever wondering if your script is making progress!
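
Something like this in the main loop; each call to set_time_limit() restarts the timeout counter from zero ($customers here stands in for however you load your data):

foreach ($customers as $customer) {
    set_time_limit(30); // restart the clock: at most 30 seconds per customer
    echo "Working on customer {$customer['id']}...<br>\n"; // progress marker
    flush();            // push it out so you can watch the run in the browser
    // ... process this customer ...
}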

Just a thought from the "sadder-but-wiser" camp!

best regards, ~Ray
 
afflik1923 (Author) Commented:
OK, in the end I still have a bit of a problem with this, as sometimes when I run it, it still seems to time out. Problems of being on a shared server, I guess. But at least I have something much better than I had, and I can do it in batches of a reasonable size.
Thanks
 
afflik1923 (Author) Commented:
Thanks. See last comment.
 
Ray Paseur Commented:
Thanks for the points - glad we are helping move the ball forward! ~Ray
