aberns
asked on
Properly formatting a CSV file from a MySQL DB using PHP and implode($row)
Hello,
As a means of backing up my tables with a cron job, I am using PHP to query my database, write the rows into a CSV file, and then email the results to myself via PEAR mail. Here is my question...
If I manually list each field in the table (i.e. $nextline = "\"$row[0]\",\"$row[1]\",\"$row[2]\",\"$row[3]\",\"$row[4]\"\r\n";), I end up with a properly formatted CSV file.
However, I wanted to use a shortcut so that I wouldn't have to count and list each field, something like:
$nextline = implode(",", $row) . "\r\n";
The problem is that it doesn't wrap each data field in any kind of delimiter, so, for instance, a text field containing a line break ends up split into two fields in the CSV file, if you see what I mean.
So, can you explain why this is happening, and whether there is a way to add a field delimiter when using PHP's implode function to extract my data?
Thanks!
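For illustration (a hypothetical row, not data from this thread), here is a sketch of why a bare implode() breaks the CSV when a field contains a comma or a line break:

```php
<?php
// Hypothetical row: the second field contains a comma and an embedded newline.
$row = array('47', "red, blue\nand green", '2008-08-26');

// Bare implode: the comma and newline inside the field are indistinguishable
// from the field separator and the record separator, so parsers see extra
// fields and extra rows.
$bad = implode(',', $row);

// Quoting each field keeps the embedded comma and newline inside one field.
$good = '"' . implode('","', $row) . '"';
echo $good, "\n";
```

The quoted version parses as exactly three fields on one logical record, which is the behavior the manually listed $row[0]..$row[4] version was providing.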
$q = 'select * from '.$tn;
$query = mysql_query($q);
while ($row = mysql_fetch_array($query)) {
    $nextline = "\"$row[0]\",\"$row[1]\",\"$row[2]\",\"$row[3]\",\"$row[4]\"\r\n";
    fwrite($fp, $nextline);
}
fclose($fp);
# vs
$q = 'select * from '.$tn;
$result = mysql_query($q) or print(mysql_error());
$i = 0;
while ($row = mysql_fetch_assoc($result)) {
    if ($i++ == 0) {
        $header = implode(array_keys($row), ",") . "\r\n";
        fwrite($fp, $header);
    }
    $nextline = implode(",", $row) . "\r\n";
    fwrite($fp, $nextline);
}
fclose($fp);
I do believe you just have it backwards.
$header = implode(array_keys($row), ",")
should be
$header= implode(",", array_keys($row))
Just guessing, but something like:
$output = '';
$columns = mysql_num_fields($query);
while ($l = mysql_fetch_array($query)) {
    for ($i = 0; $i < $columns; $i++) {
        $output .= '"' . $l[$i] . '",';
    }
}
Actually, if your data contains double quotes, then you will have problems. You need to escape the double quotes. Use this:
$q = 'select * from '.$tn;
$query = mysql_query($q);
while ($row = mysql_fetch_array($query)) {
    $nextline = "";
    foreach ($row as $v) {
        $nextline .= '","' . str_replace($v,'"','""');
    }
    $nextline = substr($nextline, 2) . '"' . "\r\n";
    fwrite($fp, $nextline);
}
fclose($fp);
ASKER
As hielo suggested, I copied and pasted the following line over my implode line. The file now looks good. (There was a comma in the data that caused it to be split into two fields initially. Now all fields are delimited by double quotes, and the comma within the text field did not cause a problem.) Thanks!
I have one follow-up question if you don't mind: what is the security risk of sending a CSV file as an email attachment? As I mentioned, I want to use this as a backup solution for my databases, but should I skip the few pieces of sensitive data within my database?
Thanks again for your speedy response!
$nextline='"'. implode('","',$row).'"' . "\r\n";
I'd zip them up using PHP's gzip support first and then send that as the mail attachment; I think you can also password-protect the zip.
It will be a much smaller email, and safer.
I have the code for this somewhere if you need it.
ASKER
If you could send the code for using php gzip that would be awesome! I will split the points between you and hielo.
-Audg
>>As hielo suggested, I copied and pasted the following line over my implode line
IF you are saying that you are using this:
$nextline='"'. implode('","',$row).'"' . "\r\n";
then don't. Use my second suggestion. It's not as short, but it will not break if your content has double quotes. Imagine one of your fields contains:
She said, "Look out!".
Then what you have now will NOT be encoded properly, thereby "corrupting" your file. You've been warned.
>>I have one follow up question if you don't mind...what is the security risk of sending a csv file as an email attachment?
That depends on how secure your mail server is and whether or not you are sending the email over an encrypted medium!
>> but should I skip the few pieces of sensitive data within my database?
Well, if you are sending this as a backup and you do not include the sensitive data, then you will not be able to restore it, which kind of defeats the purpose of having a backup. My suggestion would be to encrypt the sensitive data in your table(s). So, as you update/insert data, the sensitive data will be saved encrypted rather than as plain text. When you export your backup, the values of those fields will still be encrypted.
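To make the warning concrete, here is a small sketch (mine, not code from the thread) of the quote-doubling rule, using the example sentence above:

```php
<?php
// CSV convention: a double quote inside a quoted field is written twice.
$v = 'She said, "Look out!".';

// Without escaping, the inner quotes terminate the field early, so a CSV
// parser would cut the field off at the first quote inside the sentence.
$broken = '"' . $v . '"';

// Doubling each inner quote keeps the whole sentence in one field.
$escaped = '"' . str_replace('"', '""', $v) . '"';
echo $escaped, "\n";
```

A conforming CSV reader turns each doubled quote back into a single one when it parses the field, so the round trip is lossless.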
Hmm, I can't find my zip-to-email code at the moment. I did find this, though; if you are doing backups, this is a better option, as they are saved as .sql files that you can load straight back in. There are plenty of PHP zip examples to Google for, though.
<?php
$command = 'mysqldump -h localhost -u username -p"passwd" dbname';
$sqlData = shell_exec($command);
$mailheader = 'From: donotreply@domain.com' . "\r\n" . 'Reply-To: donotreply@domain.com' . "\r\n" . 'X-Mailer: PHP/' . phpversion();
mail('you@domain.com','Auto Backup '.date("Y-m-d-H-i-s"),$sqlData,$mailheader);
?>
ASKER
hielo,
When I tried your second solution, I just got a csv file full of "" marks....can you look at it again?
And you're right, I should be encrypting that tiny fraction of my data that is "sensitive".
evcr-I do not have shell access...so I cannot use mysqldump, which was my plan "A".....
"""","""","""","""","""","""","""","""","""",""""
"""","""","""","""","""","""","""","""","""",""""
"""","""","""","""","""","""","""","""","""",""""
"""","""","""","""","""","""","""","""","""",""""
sorry. Change this:
$nextline .= '","' . str_replace($v,'"','""');
to this:
$nextline .= '","' . str_replace('"','""',$v);
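For reference, str_replace() takes its arguments as (search, replace, subject); with the first and third swapped, the function searches for the entire data value inside the two-character string '""', never finds it, and returns '""' unchanged, which is exactly what filled the file with empty quoted fields. A quick sketch (my own example value):

```php
<?php
$v = 'hello "world"';

// Reversed arguments: search for the whole data value inside the literal
// string '""' -- it is never found, so '""' comes back and the actual
// data is thrown away.
$wrong = str_replace($v, '"', '""');

// Correct order (search, replace, subject): double every quote in $v,
// leaving the rest of the data intact.
$right = str_replace('"', '""', $v);
echo $right, "\n";
```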
ASKER
When I tried that, each field printed twice
"47","47","0","0","2008-08-26","2008-08-26","3","3"
$fp = fopen("c:\\data\\info.txt", "w");
if ($fp) {
    $q = 'select * from '.$tn;
    $query = mysql_query($q);
    while ($row = mysql_fetch_array($query)) {
        $nextline = "";
        foreach ($row as $v) {
            $nextline .= '","' . str_replace('"','""',$v);
        }
        $nextline = substr($nextline, 2) . '"' . "\r\n";
        fwrite($fp, $nextline);
    }
    fclose($fp);
}
I was not sure if you made the correct replacement, but above you can see what I meant. I did NOT get the duplicate entries you described.
ASKER
hmmmmmm..
I'm still getting duplicates....I even tried another table.
$tn = 'silk';
$fp = fopen($tmpDir.$prefix.'_'.$tn.date('Y_m_d_g_i_a').'.csv', 'w');
if ($fp) {
    $q = 'select * from '.$tn;
    $query = mysql_query($q) or print(mysql_error());
    while ($row = mysql_fetch_array($query)) {
        $nextline = "";
        foreach ($row as $v) {
            $nextline .= '","' . str_replace('"','""',$v);
        }
        $nextline = substr($nextline, 2) . '"' . "\r\n";
        echo $nextline."<br />";
        fwrite($fp, $nextline);
    }
    fclose($fp);
}
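A likely cause of the doubled fields (my reading; the thread never states it outright): mysql_fetch_array() defaults to MYSQL_BOTH, so every column appears in $row twice, once under a numeric key and once under its column name, and the foreach visits both. A sketch with a hand-built row of that shape:

```php
<?php
// Shape of a row from mysql_fetch_array() in its default MYSQL_BOTH mode:
// each column appears under a numeric key AND under its column name.
$row = array(0 => '47', 'id' => '47', 1 => '2008-08-26', 'day' => '2008-08-26');

$line = '"' . implode('","', $row) . '"';
// Every value comes out twice, matching the doubled output reported above.
echo $line, "\n";

// The shape returned by mysql_fetch_assoc() (or MYSQL_ASSOC) has each
// column exactly once, so the same implode produces the expected line.
$assoc = array('id' => '47', 'day' => '2008-08-26');
$fixed = '"' . implode('","', $assoc) . '"';
```

If that is the issue, swapping mysql_fetch_array($query) for mysql_fetch_assoc($query) in the loop above should drop the numeric duplicates without any other change.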
Have you considered using a .sql file for backups rather than a .csv?
ASKER
I use phpMyAdmin's backup/export utility to do backups already; the solution I am looking for is an automated one that I can run as a cron job on a few critical tables several times a day. I do not have shell access to my data (I'm on a shared server), so that's why I cannot use mysqldump. I guess there are a few PHP classes that achieve the same thing as mysqldump (http://phpclasses.betablue.net/browse/package/2527.html). Is that what you had in mind?
did you try the sql backup script I posted?
ASKER
Unfortunately, I do not have permission to execute mysqldump, so this is not an option.
$command = 'mysqldump -h localhost -u username -p"passwd" dbname';
$sqlData = shell_exec($command);
Ok, how about..
$mytable = 'mytable';
$mybackup = 'backup/mybackup.sql';
$query = "SELECT * INTO OUTFILE '$mybackup' FROM $mytable";
$result = mysql_query($query);
then zip the file and send it as a mail attachment
try:
$fp = fopen("c:\\data\\info.txt", "w");
if ($fp) {
    $q = 'select * from '.$tn;
    $query = mysql_query($q);
    while ($row = mysql_fetch_array($query)) {
        foreach ($row as $k => $v) {
            $row[$k] = str_replace('"','""',$v);
        }
        $nextline = '"' . implode('","', $row) . '"' . "\r\n";
        fwrite($fp, $nextline);
    }
    fclose($fp);
}
ASKER
hielo: I'm sorry...but I'm still getting each field repeated twice...and I can't figure out how to fix it.
to evcr, I've tried the SELECT * INTO OUTFILE etc but as I recall my database user didn't have the ability to write to a file on my server...it was a permissions thing at the user level I think so I gave up. Maybe I was just doing it wrong.
That's why I was going for this more "lo-tech" solution. I guess I can still take the latter approach, going back to my original thought of manually listing each field in the table (i.e. $nextline = "\"$row[0]\",\"$row[1]\",\"$row[2]\",\"$row[3]\",\"$row[4]\"\r\n";), making sure that the comma/quote issues are dealt with.
Yes, yours and hielo's CSV approach will still work (eventually), but you've still got to feed it all back in again to restore it. A properly formatted .sql file is going to make that easier to accomplish.
There is a way to build an .sql file using PHP only, but it will take a while to work out. You'd need to do a real mysqldump, work out what's in the .sql file, then replicate that using PHP/MySQL. Perhaps a thorough Google for this will help; someone is bound to have tried it before.
Or good luck with the csv method.
Or move to a new hosting company! :-))
Cheers
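As a sketch of that PHP-only dump idea (a hypothetical helper of mine, not code from the thread): the core of it is turning each row into an INSERT statement; a real dump script would loop this over every row of every table and prepend the output of SHOW CREATE TABLE for the schema.

```php
<?php
// Hypothetical helper: turn one associative row into an INSERT statement.
function build_insert($table, $row) {
    $cols = '`' . implode('`,`', array_keys($row)) . '`';
    $vals = array();
    foreach ($row as $v) {
        // NULLs must stay unquoted; string values get quote-escaped.
        $vals[] = ($v === null) ? 'NULL' : "'" . addslashes($v) . "'";
    }
    return "INSERT INTO `$table` ($cols) VALUES (" . implode(',', $vals) . ");";
}

echo build_insert('silk', array('id' => '47', 'note' => "it's fine")), "\n";
```

addslashes() is a rough stand-in here; with a live connection, mysql_real_escape_string() would be the safer choice for the values.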
ASKER CERTIFIED SOLUTION
ASKER
Thanks for sticking w/ it.
glad to help.
$nextline='"'. implode('","',$row).'"' . "\r\n";