Fatal error: Out of memory

Posted on 2007-08-01
Medium Priority
Last Modified: 2011-10-03
I have a problem. I am using a PHP script to generate data and put it in my MySQL db. The problem is that there is a lot of information, and I get something like this:

Fatal error: Out of memory (allocated 503054336) at /usr/ports/lang/php5/work/php-5.2.3/Zend/zend_hash.c:610 (tried to allocate 1048576 bytes) in /usr/home/gtoplive/public_html/php/inc/__mysql.php on line 375

How can I compress the data or do something else to reduce the memory allocation?
Question by:rares_dumitrescu

Expert Comment

ID: 19613273
#1 Use unset() on all the variables you no longer need.
#2 Use ini_set('memory_limit', '-1'); to disable the memory limit.
#3 Write the data to a file rather than storing it in a variable.
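A quick sketch of how tips #1-#3 fit together; the output path and the loop are hypothetical stand-ins for the real query loop:

```php
<?php
// Tip #2: raising the limit is safer than disabling it; '-1' removes
// the cap entirely, which can exhaust the whole server.
ini_set('memory_limit', '512M');

// Tip #3: stream each row straight to a file instead of accumulating
// everything in one big variable.
$path = sys_get_temp_dir() . '/rows.txt';   // hypothetical output path
$out  = fopen($path, 'w');
foreach (range(1, 3) as $id) {              // stand-in for the DB result loop
    $line = "record $id\n";
    fwrite($out, $line);
    unset($line);                           // Tip #1: free what you no longer need
}
fclose($out);
echo file_get_contents($path);
```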

Expert Comment

ID: 19613274
I would need a little more specific information about what you're trying to do, but in general the approach is probably going to be to break up the job and do one piece at a time.  For example, if you are reading 1000 records into memory, altering the values, and writing them back to the database, then instead of reading all 1000 into an array you would read one record, alter it, and write it back - and then loop to do that 1000 times.

Can you give me a little more detail about how the logic of your script works and what kind of data you're reading into PHP?
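A minimal sketch of that record-at-a-time pattern, with a PHP generator (PHP 5.5+, used here purely to illustrate the flow) standing in for an unbuffered database cursor. In real code, the reads would come from something like mysqli with MYSQLI_USE_RESULT and writeRow() would be an UPDATE statement; all names here are hypothetical:

```php
<?php
// Yields one row at a time; a stand-in for an unbuffered DB result set.
function fetchRows() {
    foreach ([['id' => 1, 'v' => 'a'], ['id' => 2, 'v' => 'b']] as $row) {
        yield $row;
    }
}

// Stand-in for the UPDATE that writes a single record back.
function writeRow(array $row, array &$sink) {
    $sink[$row['id']] = $row['v'];
}

$written = [];
foreach (fetchRows() as $row) {
    $row['v'] = strtoupper($row['v']);  // the per-record "alter" step
    writeRow($row, $written);
    // Only one record is ever held in memory at a time.
}
echo implode(',', $written), "\n";      // prints "A,B"
```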

Author Comment

ID: 19615205
The script works like this...

I get about 250,000 records from the db, combine the information from them into an array, and after that I read the array and insert it into another table...

Accepted Solution

etully earned 1500 total points
ID: 19625404
9 times out of 10, you don't need to read all the records from the database, process them, and write them back out.

The remaining 1 time out of 10 is when record #1 is altered in a way that is dependent on the values of records #2-250,000.

A good programming model would be:

Set $i = 0;
Start Loop
  read records $i through $i+X  (where X is a number from 0 to maybe 1000)
  process those records
  write those records
  increment $i by X+1  (each pass handles X+1 records, so skip past all of them)
Finish Loop

If you pick a value of 0 for X,  then you are saying:  Read ONE record, process it, write it.  Repeat.  This is very disk intensive.

If you pick a value of 1000 for X,  then you are saying:  Read 1000 records, process them, write them.  Repeat.  This is less disk intensive BUT you are moving lots of data into RAM and you might run out of memory.  Also, things process more slowly when you push RAM to its limits.

You will need to experiment a little to find a good value for X.  I suspect it will be somewhere between 50 and 500.  You might even do it scientifically by trying a dozen different values for X and timing them all.  This way,  you should be able to optimize the code fairly well.
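The loop above might look like this in PHP, with a plain array standing in for the source table. In real code, the read would be a SELECT with a LIMIT clause and the write a multi-row INSERT; the variable names are hypothetical:

```php
<?php
$source    = range(1, 10);   // stand-in for the 250,000 source records
$batchSize = 4;              // the batch size you tune experimentally
$sink      = [];             // stand-in for the destination table

for ($i = 0; $i < count($source); $i += $batchSize) {
    // read one batch of records starting at $i
    $rows = array_slice($source, $i, $batchSize);
    // process those records
    $rows = array_map(function ($v) { return $v * 2; }, $rows);
    // write those records
    $sink = array_merge($sink, $rows);
}
echo implode(',', $sink), "\n";  // prints "2,4,6,8,10,12,14,16,18,20"
```

Timing a few runs with different batch sizes, as suggested above, is the simplest way to find the sweet spot between disk round-trips and RAM use.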

