
PHP MySQL - running a long script - how to prevent timeout?

afflik1923 asked · Medium Priority · 1,383 Views
Last Modified: 2013-12-12
Hi,

I've written a PHP function which opens a lot of database connections. The code contains a lot of legacy code, so it is difficult to change much of how it works.

The code processes some customer information and takes a good few moments to run. It works well when I run it on a few customers, although it does take some time (about 3 minutes for 100 customers).

In fact it's fine when I run it on up to about 500 customers. But when we run it on about 50,000 customers, or anything above 1,000, the Internet Explorer window never completes or even times out; it just shows the spinning circle indefinitely.
The page effectively never finishes loading, and I assume at some point the server times out, as it does not process all of the customers correctly.

How can I approach this so there are intervals that allow the script to catch up with itself, and perhaps provide screen output during the process to show me how far it has got?

I did put an echo into the main loop, but it does not actually print the output until the entire script has completed. (So when I run it on a small number of customers, the output is all dumped onto the screen in one go at the end.)

Thanks for any input on this matter.


At the top of the script that needs a longer execution time, you can do this:

ini_set('max_execution_time', 300); //300 seconds = 5 minutes  

or alternatively you can use this:
set_time_limit($seconds);
set_time_limit(0); // 0 = no time limit
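A minimal sketch of putting this together with progress output, which also addresses the "echoes only appear at the end" problem from the question. The per-customer work and the customer count are placeholders, not the asker's actual code:

```php
<?php
// Raise the limit once at the top of the long-running script.
ini_set('max_execution_time', 300); // 300 seconds = 5 minutes

$total = 100; // stand-in for the real customer count
for ($i = 1; $i <= $total; $i++) {
    // ... process customer $i here ...
    if ($i % 10 === 0) {
        echo "Processed $i of $total customers\n";
        flush(); // push the progress line out now instead of at script end
    }
}
```

Note that flush() alone is not always enough: if PHP output buffering is on you may also need ob_flush(), and server-side compression (e.g. gzip) can still hold output back until the response completes.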


Author

Commented:
So if I just use the line:
set_time_limit(0); // no time limit
in the main start script, will that do the trick? Even if it takes 2 hours to run, will it still work?

Also, should I put the
set_time_limit(0); // no time limit
command into each sub-script that is called from the main script?

Finally, a related question in case it is running out of memory on the shared server: the code uses the PEAR DB library to make DB connections.
If I set these variables to NULL after use, does that free up the memory right away?

Thanks
Commented:
Something on optimizing the PHP-MySQL interaction:

- Check out INSERT DELAYED, if inserting a lot of records at once
- Make sure your indexing is suitable for the queries, use EXPLAIN with every query to look for trouble
- Look at the PHP file: is it instantiating a new MySQL connection before every query? This slows things down massively. It is usually just as safe to open one connection and run all queries through the same handle
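A self-contained sketch of the "one connection, one handle" point. It uses SQLite via PDO so it runs without a MySQL server; the table, data, and index are made up for illustration, but the shape (connect once, prepare once, execute many times) is the same for MySQL:

```php
<?php
// One connection for the whole batch, instead of reconnecting per query.
$db = new PDO('sqlite::memory:');
$db->exec('CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)');
$db->exec("INSERT INTO customers VALUES (1, 'Ann'), (2, 'Bob')");

// An index on the looked-up column keeps each query cheap
// (use EXPLAIN on the MySQL side to confirm the index is used).
$db->exec('CREATE INDEX idx_customers_name ON customers (name)');

// Prepare once, execute per customer with the same handle.
$stmt = $db->prepare('SELECT id FROM customers WHERE name = ?');
foreach (['Ann', 'Bob'] as $name) {
    $stmt->execute([$name]);
    echo $name . ' => ' . $stmt->fetchColumn() . "\n";
}
```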

Author

Commented:
Hi. Yes, unfortunately it does open a new MySQL connection every time. That's quite complex to put right, as the code is quite old and there is a lot of it.

Author

Commented:
I take it back. It did not take long at all. I simply added an optional parameter to each function that was required in the query. The optional parameter was a DB connection which defaulted to null.

Then at the start of the function I checked whether the DB parameter was null; if so, I created a new connection, and if not, I used the DB connection that had been passed to the function.
That way the old code kept working, and the new code (with this special function I had to write) has ROCKED in speed. Thanks for making me try!!!
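The pattern described above can be sketched as follows. The function and helper names are hypothetical stand-ins for the asker's legacy code, and get_db_connection() returns a placeholder string where the real code would call PEAR DB's DB::connect():

```php
<?php
// Hypothetical stand-in for the legacy connect call (DB::connect($dsn)).
function get_db_connection() {
    return "new-connection"; // placeholder so the pattern is visible
}

// Optional $db parameter, defaulting to null, as described in the thread.
function process_customer($customerId, $db = null) {
    if ($db === null) {
        // Old call sites pass nothing: open a fresh connection as before.
        $db = get_db_connection();
    }
    // ... run the queries for $customerId using $db ...
    return $db;
}

// New code: open one connection and pass the handle into every call.
$shared = get_db_connection();
process_customer(1, $shared);
process_customer(2, $shared);

// Old code keeps working unchanged.
process_customer(3);
```

The nice property is backward compatibility: legacy callers are untouched, while the batch loop pays the connection cost only once.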

Author

Commented:
Sorry for many typos in last posting. I'm excited!
Most Valuable Expert 2011
Author of the Year 2014
Commented:
Glad you got things headed in the right direction.

About that set_time_limit(0) thing.  Don't do that.  For one thing, many web hosts will not permit it (imagine THAT!).

A better strategy is to set the time limit iteratively on each pass through the logic. That way, if you ever hit a data-dependent error that puts the code into a loop, you will know exactly what data was in play at the time, and you won't run forever wondering whether your script is making progress!
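A sketch of that per-pass limit: each call to set_time_limit() restarts PHP's execution clock, so every customer gets a fresh budget while the batch as a whole can run as long as it needs. The $customers array and the 30-second budget are illustrative values, not from the thread:

```php
<?php
$customers = range(1, 500); // stand-in for the real customer list

foreach ($customers as $customer) {
    set_time_limit(30); // restart the clock: this pass gets 30 seconds
    // ... process one $customer here ...
    // If one customer's data sends the code into a loop, the script
    // dies on that pass, telling you exactly which record was in play.
}
```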

Just a thought from the "sadder-but-wiser" camp!

best regards, ~Ray

Author

Commented:
OK, in the end I still have a bit of a problem with this, as sometimes I run it and it still seems to time out. A problem of being on a shared server, I guess. But at least I have something much better than I had, and I can do it in batches of a reasonable size.
Thanks

Author

Commented:
Thanks. See last comment.
Most Valuable Expert 2011
Author of the Year 2014

Commented:
Thanks for the points - glad we are helping move the ball forward! ~Ray