PHP loop and high cpu usage

Alw1n asked:
Hi,
I have a loop in a php script (5.2.11 on IIS) that builds data from a database. There could be many lines (20k+) and it can run for several seconds. I have noticed that the cpu usage goes up to 100% while this is busy, which is understandable. I was wondering:
1 - Is this cpu usage going to affect the server badly (for other requests), or will resources be given to other requests as needed?
2 - What could I do about this to free up some cpu time (a la the VB6 DoEvents type of thing)?
Hi,
Yes, it can affect the server badly for other requests.
My suggestion would be to give the script a break of 2-5 seconds every so often and then carry on running the code; that would not only reduce the CPU load but also help with the memory limit.
You can increase the memory limit with ini_set().
Yes, it would take a little more time, but it avoids the risk of the server crashing or the process stopping partway through.
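
For reference, a minimal sketch of the ini_set() part of that suggestion; the values here are arbitrary examples, not recommendations:

```php
<?php
// Raise the memory limit for this request only (the value is an example;
// size it to your data volume and server).
ini_set('memory_limit', '256M');

// Optionally stop PHP from killing the script mid-run if the loop is slow.
set_time_limit(120);
?>
```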

Author

Commented:
thanks for the info, it is kinda what I thought. Could you explain a little more about "give the script a break of 2-5 seconds every so often and then carry on running the code"? Do you mean a 'sleep' in the loop?
Memory is fine as I have set this correctly; it's just the cpu that is taking a beating.
Avinash Zala, Web Expert
Commented:
You should fetch one block of data, then sleep for a second or so in PHP, then fetch another block of data. i.e. fetch 5k rows, then a short sleep, fetch another 5k rows, and so on.
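
Something along these lines, a rough sketch assuming a MySQL database accessed through mysqli; the connection details, table, columns and batch size are made-up placeholders:

```php
<?php
// Fetch rows in blocks of 5,000 and pause between blocks so the loop
// does not monopolise the CPU for the whole run.
$db = new mysqli('localhost', 'user', 'pass', 'mydb'); // hypothetical credentials

$batchSize = 5000;
$offset    = 0;

do {
    $result = $db->query(
        "SELECT id, amount FROM orders ORDER BY id LIMIT $offset, $batchSize"
    );
    $rowCount = $result->num_rows;

    while ($row = $result->fetch_assoc()) {
        // ... build your output from $row here ...
    }
    $result->free();

    $offset += $batchSize;
    usleep(500000); // half a second of breathing room between blocks; tune to taste
} while ($rowCount === $batchSize);

$db->close();
?>
```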

Heavy CPU usage during database retrieval normally means that you have not defined sufficient indexes for your retrieval. Ensure that EVERY column mentioned in a WHERE, ORDER BY or JOIN clause has an index in the database.
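
For example (MySQL syntax; the orders table and its columns are hypothetical stand-ins for your own schema), one index per column the query filters, sorts or joins on:

```php
<?php
// One-off DDL, normally run from a SQL client; shown via mysqli only for context.
// The index names and columns are hypothetical examples.
$db = new mysqli('localhost', 'user', 'pass', 'mydb'); // hypothetical credentials

$db->query("CREATE INDEX idx_orders_status      ON orders (status)");      // WHERE column
$db->query("CREATE INDEX idx_orders_created_at  ON orders (created_at)");  // ORDER BY column
$db->query("CREATE INDEX idx_orders_customer_id ON orders (customer_id)"); // JOIN column
?>
```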

Author

Commented:
Thanks, I hadn't really noticed that a lack of indexes causes high cpu usage, just that the queries take longer, but I had created the indexes already anyway for performance reasons.
It depends on the size of the table. Run the wrong query and the machine has to scan the whole table. Ask it to do that on a big table a few times and you will see a noticeable drop in server response. You mentioned 20,000 rows of data and depending on what you are doing to it that could cause a drop in performance. ORDER BY clauses are particularly good at stuffing a machine's performance.
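
One way to check for that from PHP (MySQL assumed; the query is just a placeholder for the real one) is to run EXPLAIN on the slow statement; type = ALL with a large rows estimate means a full table scan, while ref/range with a small rows value means an index is doing the work:

```php
<?php
// Run EXPLAIN on the problem query and print the access plan.
$db  = new mysqli('localhost', 'user', 'pass', 'mydb'); // hypothetical credentials
$res = $db->query("EXPLAIN SELECT * FROM orders WHERE status = 'open' ORDER BY created_at");

while ($row = $res->fetch_assoc()) {
    printf("table=%s type=%s key=%s rows=%s\n",
        $row['table'], $row['type'], $row['key'], $row['rows']);
}
?>
```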

If possible, do as much summarising of the data as you can before reading it out of the database and into PHP. Databases generally have good aggregating capabilities, and the trick to good performance is to move as little data around as possible. Big data sets eat processor time.
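
For instance, rather than pulling 20,000 raw rows into PHP and totalling them in the loop, let the database do the arithmetic and return one summarised row per group (the table and column names are hypothetical):

```php
<?php
// One aggregated row per customer instead of 20k+ raw order rows.
$db  = new mysqli('localhost', 'user', 'pass', 'mydb'); // hypothetical credentials
$res = $db->query(
    "SELECT customer_id, COUNT(*) AS order_count, SUM(amount) AS total
     FROM orders
     GROUP BY customer_id"
);

while ($row = $res->fetch_assoc()) {
    // Each row is already summarised, so far less data crosses the wire
    // and far less work is left for the PHP loop.
    echo $row['customer_id'], ': ', $row['total'], "\n";
}
?>
```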

Remember the old database rule - do your SELECTs before your JOINs
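
In practice that means cutting the row set down (with WHERE and/or GROUP BY) before the JOIN multiplies it, for example by joining to a derived table; again the schema here is hypothetical:

```php
<?php
// Summarise "orders" down to one row per customer first, then join that
// small result to "customers", instead of joining every raw order row
// and summarising afterwards.
$sql = "SELECT c.name, o.total
        FROM customers AS c
        JOIN (
            SELECT customer_id, SUM(amount) AS total
            FROM orders
            WHERE status = 'open'
            GROUP BY customer_id
        ) AS o ON o.customer_id = c.customer_id";

$db  = new mysqli('localhost', 'user', 'pass', 'mydb'); // hypothetical credentials
$res = $db->query($sql);
while ($row = $res->fetch_assoc()) {
    // ... use the already-joined, already-summarised rows ...
}
?>
```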

Author

Commented:
Thanks All, I split the points to be fair
