[php, mysql] MySQL to CSV download too large

I have a script that queries two tables in my database and grabs a majority of their columns of data and puts that data into a CSV file that is downloaded to a user's computer. Or at least that is what it is supposed to do. Below is my code:

$select = "SELECT foo.foo_id, serial, sku, foo_type, foo_mac, foo_sn, foo_mac, foo_sn, foo_vendor, foo_mdn, foo_hex, foo_dec, ship_to_foo, install_foo, foo.fooid, foo.chain, store, address, tech, bar.fooid, bar.chain, barid, address1, address2, city, state, zip     
    FROM foo, bar";
$export = mysql_query ( $select ) or die ( "Sql error : " . mysql_error( ) );
$fields = mysql_num_fields ( $export );
$header = '';
$data   = '';

for ( $i = 0; $i < $fields; $i++ )
{
    $header .= mysql_field_name( $export , $i ) . "\t";
}

while( $row = mysql_fetch_row( $export ) )
{
    $line = '';
    foreach( $row as $value )
    {                                            
        if ( ( !isset( $value ) ) || ( $value == "" ) )
        {
            $value = "\t";
        }
        else
        {
            $value = str_replace( '"' , '""' , $value );
            $value = '"' . $value . '"' . "\t";
        }
        $line .= $value;
    }
    $data .= trim( $line ) . "\n";
}
$data = str_replace( "\r" , "" , $data );

if ( $data == "" )
{
    $data = "\n(0) Records Found!\n";                        
}

header("Content-type: application/octet-stream");
header("Content-Disposition: attachment; filename=foobar.xls");
header("Pragma: no-cache");
header("Expires: 0");
print "$header\n$data";
?>
The error I am getting is this:

Fatal error: Allowed memory size of 104857600 bytes exhausted (tried to allocate 104333464 bytes) in /home/foobar/public_html/fbar/_csv.php on line 35
Not sure what to do to fix or address this. Thanks so much!
prileyosborne Asked:
Ray PaseurCommented:
Maybe something more like this...  (Did you notice that you are selecting the same columns more than once?)
$csvfile = fopen("cs_data.csv",'w');  

$sql 
= "SELECT 
  foo.foo_id
, serial
, sku
, foo_type
, foo_mac
, foo_sn
, foo_mac
, foo_sn
, foo_vendor
, foo_mdn
, foo_hex
, foo_dec
, ship_to_foo
, install_foo
, foo.fooid
, foo.chain
, store
, address
, tech
, bar.fooid
, bar.chain
, barid
, address1
, address2
, city
, state
, zip 

  FROM 
  foo
, bar
"
;
$result = mysql_query ( $sql ) or die ( "FAIL $sql <br/>" . mysql_error( ) );

// ITERATE OVER THE RESULTS SET
while ($row = mysql_fetch_assoc($result))
{
    fputcsv($csvfile, $row, ',', '"');
}  

// Close File  
fclose($csvfile); 
Instead of writing headers, you could give the client a URL link to the CSV file you have just written on the server.  The file extension CSV is almost universally associated with Excel.
http://filext.com/file-extension/CSV
Dave Baldwin (Fixer of Problems) Commented:
You probably need to increase "memory_limit" in 'php.ini'.  If you are on shared hosting and can't do that, then you need to break your CSV export into smaller chunks that can be used after downloading them.
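A chunked export along those lines might look like the following minimal sketch. It reuses the old `mysql_*` API from the thread; the three-column SELECT and the 5000-row chunk size are illustrative placeholders, not the asker's full query.

```php
<?php
// Sketch: export the result set in fixed-size chunks so the whole
// set never sits in PHP's memory at once.
$fp     = fopen('cs_data.csv', 'w');
$chunk  = 5000;   // illustrative chunk size
$offset = 0;

do {
    // LIMIT offset, count fetches one chunk of rows per query
    $sql = "SELECT foo.foo_id, serial, sku
            FROM foo, bar
            LIMIT $offset, $chunk";
    $result = mysql_query($sql) or die('Sql error : ' . mysql_error());

    $rows = 0;
    while ($row = mysql_fetch_assoc($result)) {
        fputcsv($fp, $row);        // write one row at a time
        $rows++;
    }
    mysql_free_result($result);    // release this chunk before the next
    $offset += $chunk;
} while ($rows === $chunk);        // a short chunk means we hit the end

fclose($fp);
```

Because each chunk is freed before the next query runs, peak memory stays roughly constant regardless of how many rows the tables hold.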
rinfo Commented:
Have you tried putting this line at the start of the script?
ini_set('memory_limit', '256M');
This temporarily raises the php.ini memory limit to 256M, so your script can execute within that limit.
At present it appears to be set to 100M.
Ray PaseurCommented:
Learn about this function.
http://php.net/manual/en/function.fputcsv.php

Your application design should make the SELECT and use fputcsv() to write the rows of data to a temporary file.  Then, when the file has been created and the query result set has been exhausted, you have your CSV file ready to go to the browser.  Issue the headers, then read and feed the contents of the file.  By doing it this way you will not need to hold the entire query result set in memory at once.
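That write-then-stream approach could be sketched roughly as below, again using the thread's `mysql_*` API. The `$select` query string is assumed to be defined as in the question; the temp-file handling is an illustrative choice, not the only way to do it.

```php
<?php
// Sketch: write the result set to a temp file with fputcsv(), then
// stream that file to the browser once it is complete.
$tmp = tempnam(sys_get_temp_dir(), 'csv');
$fp  = fopen($tmp, 'w');

$result = mysql_query($select) or die('Sql error : ' . mysql_error());
while ($row = mysql_fetch_assoc($result)) {
    fputcsv($fp, $row);            // never more than one row in memory
}
fclose($fp);

// The file is complete -- now issue the headers and feed its contents.
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename=cs_data.csv');
header('Pragma: no-cache');
header('Expires: 0');
readfile($tmp);                    // streams the file without loading it all
unlink($tmp);                      // clean up the temporary file
```

readfile() copies the file straight to the output buffer, so the download size is limited by disk space rather than by PHP's memory_limit.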
prileyosborne (Author) Commented:
Thank you all so much for your responses. I tried raising the memory limit to a higher number, but that didn't work.

I reviewed the link from Ray, and that seems to have worked best. I have code that is writing to a file on my server, and then the headers are downloading the file upon completion of the SQL statement.

The problem I am having is twofold:

1. The script is actually loading thousands of the exact same record, instead of thousands of individual records.

2. The downloaded file shows nothing in Excel, but when I open the live server version in something like Sublime Text, I can see all the record data.

Below is my updated script:

$csvfile = fopen("cs_data.csv",'w');  

$select = "SELECT foo.foo_id, serial, sku, foo_type, foo_mac, foo_sn, foo_mac, foo_sn, foo_vendor, foo_mdn, foo_hex, foo_dec, ship_to_foo, install_foo, foo.fooid, foo.chain, store, address, tech, bar.fooid, bar.chain, barid, address1, address2, city, state, zip     
    FROM foo, bar";
$result  = mysql_query ( $select ) or die ( "Sql error : " . mysql_error( ) );
$row_num = mysql_num_rows ( $result );
$db_data = mysql_fetch_array ( $result );

for ( $i = 1; $i <= $row_num; $i++ )  
{  
    if ($i <= $row_num) {
        // Write SELECT data to file
        fputcsv($csvfile, $db_data, ',', '"');
    }
}  

// Close File  
fclose($csvfile); 

header("Content-type: application/octet-stream");
header("Content-Disposition: attachment; filename=cs_data.csv");
header("Pragma: no-cache");
header("Expires: 0");
Thanks for any help you can offer!
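For context, the duplicate-row symptom in the script above most likely comes from calling mysql_fetch_array() once before the loop, so the same row is written on every iteration. A minimal sketch of the fix, following the fetch-inside-the-loop pattern from Ray's earlier code:

```php
<?php
// Sketch: fetch a fresh row on every iteration instead of once
// before the loop, so each CSV line is a distinct record.
$csvfile = fopen('cs_data.csv', 'w');

$result = mysql_query($select) or die('Sql error : ' . mysql_error());
while ($row = mysql_fetch_assoc($result)) {   // new row each iteration
    fputcsv($csvfile, $row, ',', '"');
}
fclose($csvfile);
```

Serving the file with a Content-Type of text/csv and a .csv filename (rather than application/octet-stream) may also help Excel open the download correctly.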
prileyosborne (Author) Commented:
That worked perfectly! Thanks so much!
Ray PaseurCommented:
Thanks for the points, and thanks for using EE!  It's a great question, ~Ray
Question has a verified solution.