Properly formatting a CSV file from a MySQL DB using PHP and implode($row)

Hello,

As a means of backing up my tables with a cron job, I am using PHP to query my database, write the rows into a CSV file, and then email the results to myself via PEAR mail. Here is my question...
If I manually list the number of fields in each table, (i.e. $nextline = "\"$row[0]\",\"$row[1]\",\"$row[2]\",\"$row[3]\",\"$row[4]\"\r\n"; ) I end up with a properly formatted csv file.
However, I wanted to use a shortcut so that I wouldn't have to count and list each field, something like:

$nextline= implode(",",$row)."\r\n";

The problem is that it doesn't enclose each data field in any kind of delimiter, so, for instance, a text field with a line break ends up being broken into two fields in the CSV file, if you see what I mean.

So, can you explain why this is happening, and is there a way to add a field delimiter when using PHP's implode() to extract my data?

Thanks!

$q = 'select * from '.$tn;
$query = mysql_query($q);
while ($row = mysql_fetch_array($query)) {
    $nextline = "\"$row[0]\",\"$row[1]\",\"$row[2]\",\"$row[3]\",\"$row[4]\"\r\n";
    fwrite($fp, $nextline);
}
fclose($fp);
 
# vs
 
 
 
$q = 'select * from '.$tn;
$result = mysql_query($q) or print(mysql_error());

$i = 0;
while ($row = mysql_fetch_assoc($result)) {
    if ($i++ == 0) {
        $header = implode(array_keys($row), ",")."\r\n";
        fwrite($fp, $header);
    }
    $nextline = implode(",", $row)."\r\n";
    fwrite($fp, $nextline);
}

fclose($fp);
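As an aside on the quoting problem itself: PHP's built-in fputcsv() encloses and escapes fields automatically, so fields containing commas, quotes, or line breaks survive intact. A minimal sketch, with hard-coded sample rows standing in for the mysql_fetch_assoc() loop and an assumed output filename:

```php
<?php
// fputcsv() quotes a field only when needed and doubles any embedded
// double quotes, which is exactly the CSV rule being hand-rolled above.
// The $rows array here is illustrative sample data, not from the thread.
$rows = array(
    array('id' => 1, 'note' => 'plain text'),
    array('id' => 2, 'note' => 'has, a comma'),
    array('id' => 3, 'note' => 'she said "hi"'),
);

$fp = fopen('backup.csv', 'w');
fputcsv($fp, array_keys($rows[0]));   // header row from the column names
foreach ($rows as $row) {
    fputcsv($fp, $row);               // encloses/escapes each field as needed
}
fclose($fp);
```

With this, the comma in row 2 and the quotes in row 3 come through as single fields, with no manual str_replace() needed.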

abernsAsked:

hieloCommented:
copy and paste this:
$nextline='"'. implode('","',$row).'"' . "\r\n";
Tyler LaczkoCommented:
I do believe you just have it backwards.

$header= implode(array_keys($row),",")
should be
$header= implode(",", array_keys($row))
evcrCommented:
Just guessing but something like:

$output = '';
$columns = mysql_num_fields($query);
while ($l = mysql_fetch_array($query)) {
    for ($i = 0; $i < $columns; $i++) {
        $output .= '"' . $l[$i] . '",';
    }
    $output = rtrim($output, ',') . "\r\n"; // drop the trailing comma, end the row
}

hieloCommented:
Actually, if your data contains double quotes, then you will have problems. You need to escape the double quotes. Use this:
$q = 'select * from '.$tn;
$query = mysql_query($q);
while ($row = mysql_fetch_array($query)) {
    $nextline = "";
    foreach ($row as $v) {
        $nextline .= '","' . str_replace($v, '"', '""');
    }
    $nextline = substr($nextline, 2) . '"' . "\r\n";
    fwrite($fp, $nextline);
}
fclose($fp);

abernsAuthor Commented:
As hielo suggested, I copied and pasted the following line over my implode line. The file now looks good. (There was a comma in the data that caused it to be split into two fields initially; now all fields are delimited by double quotes, and the comma within the text field did not cause a problem.) Thanks!

I have one follow-up question if you don't mind: what is the security risk of sending a CSV file as an email attachment? As I mentioned, I want to use this as a backup solution for my databases, but should I skip the few pieces of sensitive data within my database?

Thanks again for your speedy response!
$nextline='"'. implode('","',$row).'"' . "\r\n";

evcrCommented:
I'd zip them up first using PHP's gzip functions and then send that as the mail attachment; I think you can also password-protect a zip.

It will be a lot smaller email, and safer.

I have the code for this somewhere if you need it.
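A minimal sketch of the compression step described here, using PHP's zlib functions (gzencode() to compress, gzdecode() — PHP 5.4+ — to verify the round trip). The sample CSV line and filenames are illustrative; note that plain gzip cannot be password-protected — that would need ZipArchive or an external tool:

```php
<?php
// Compress the CSV data before attaching it to the email.
$csvData = "\"47\",\"0\",\"2008-08-26\"\r\n";   // in practice, file_get_contents() on the CSV
$gzData  = gzencode($csvData, 9);               // level 9 = maximum compression, .gz format
file_put_contents('backup.csv.gz', $gzData);

// Round trip: gzdecode() recovers the original bytes.
$restored = gzdecode(file_get_contents('backup.csv.gz'));
```

The .gz file is what you would then base64-encode and attach via PEAR Mail_Mime or similar.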
abernsAuthor Commented:
If you could send the code for using php gzip that would be awesome! I will split the points between you and hielo.
-Audg
hieloCommented:
>>As hielo suggested, I copied and pasted the following line over my implode line
If you are saying that you are using this:
$nextline='"'. implode('","',$row).'"' . "\r\n";

then don't. Use my second suggestion. It's not as short, but it will not break if your content has double quotes. Imagine one of your fields contains:
She said, "Look out!".

Then what you have now will NOT be encoded properly, thereby "corrupting" your file. You've been warned.

>>I have one follow up question if you don't mind...what is the security risk of sending a csv file as an email attachment?
That depends on how secure your mail server is and whether or not you are sending the email over an encrypted channel!

>> but should I skip the few pieces of sensitive data within my database?
Well, if you are sending this as a backup and you do not include the sensitive data, then you will not be able to restore it, which kind of defeats the purpose of having a backup. My suggestion would be to encrypt the sensitive data in your table(s). That way, as you update/insert data, the sensitive data will be saved encrypted rather than as plain text, and when you export your backup, the values of those fields will still be encrypted.
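The encrypt-the-sensitive-columns advice above could be sketched like this in present-day PHP with the openssl extension; the key, cipher choice, and storage format below are illustrative assumptions, not part of the original thread:

```php
<?php
// Encrypt a sensitive value before it is stored, so a CSV backup only
// ever contains ciphertext. Key and column layout here are hypothetical.
$key = 'replace-with-a-real-32-byte-secret!!';
$iv  = openssl_random_pseudo_bytes(openssl_cipher_iv_length('aes-256-cbc'));

$plain  = 'card 4111-1111-1111-1111';
$cipher = openssl_encrypt($plain, 'aes-256-cbc', $key, 0, $iv);

// Store base64($iv) . ':' . $cipher in the table column;
// the backup (and anyone reading it) sees only this string.
$stored = base64_encode($iv) . ':' . $cipher;

// On restore, split the stored value and decrypt.
list($iv64, $ct) = explode(':', $stored, 2);
$decrypted = openssl_decrypt($ct, 'aes-256-cbc', $key, 0, base64_decode($iv64));
```

(In the thread's era one would more likely use MySQL's AES_ENCRYPT()/AES_DECRYPT() directly in the queries; the principle is the same.)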
evcrCommented:
Hmm, can't find my zip-to-email code at the moment, but I did find this. If you are doing backups, this is a better option, as they are saved as .sql files that you can load straight back in. There are plenty of PHP zip examples to Google for, though.
<?php
$command = 'mysqldump -h localhost -u username -p"passwd" dbname';
$sqlData = shell_exec($command);
$mailheader = 'From: donotreply@domain.com' . "\r\n" . 'Reply-To: donotreply@domain.com' . "\r\n" . 'X-Mailer: PHP/' . phpversion();
mail('you@domain.com','Auto Backup '.date("Y-m-d-H-i-s"),$sqlData,$mailheader);
?>

abernsAuthor Commented:
hielo,

When I tried your second solution, I just got a CSV file full of "" marks... can you look at it again?
And you're right, I should be encrypting that tiny fraction of my data that is "sensitive".

evcr - I do not have shell access, so I cannot use mysqldump, which was my plan "A"...
"""","""","""","""","""","""","""","""","""",""""
"""","""","""","""","""","""","""","""","""",""""
"""","""","""","""","""","""","""","""","""",""""
"""","""","""","""","""","""","""","""","""",""""

hieloCommented:
Sorry. Change this:
$nextline .= '","' . str_replace($v,'"','""');


to this:
$nextline .= '","' . str_replace('"','""',$v);
abernsAuthor Commented:
When I tried that, each field printed twice
"47","47","0","0","2008-08-26","2008-08-26","3","3"

hieloCommented:

$fp = fopen("c:\\data\\info.txt", "w");
if ($fp) {
    $q = 'select * from '.$tn;
    $query = mysql_query($q);
    while ($row = mysql_fetch_array($query)) {
        $nextline = "";
        foreach ($row as $v) {
            $nextline .= '","' . str_replace('"', '""', $v);
        }
        $nextline = substr($nextline, 2) . '"' . "\r\n";
        fwrite($fp, $nextline);
    }
    fclose($fp);
}

hieloCommented:
I was not sure if you made the correct replacement, but above you can see what I meant. I did NOT get the duplicate entries you described.
0
abernsAuthor Commented:
hmmmmmm..

I'm still getting duplicates....I even tried another table.
$tn = 'silk';
$fp = fopen($tmpDir.$prefix.'_'.$tn.date('Y_m_d_g_i_a').'.csv', 'w');
if ($fp) {
    $q = 'select * from '.$tn;
    $query = mysql_query($q) or print(mysql_error());
    while ($row = mysql_fetch_array($query)) {
        $nextline = "";
        foreach ($row as $v) {
            $nextline .= '","' . str_replace('"', '""', $v);
        }
        $nextline = substr($nextline, 2) . '"' . "\r\n";
        echo $nextline."<br />";
        fwrite($fp, $nextline);
    }
    fclose($fp);
}

RoonaanCommented:
Have you considered using a .sql file for backups rather than a .csv?
abernsAuthor Commented:
I use phpMyAdmin's backup/export utility to do backups already; the solution I am looking for is an automated one that I can run as a cron job on a few critical tables several times a day. I do not have shell access to my data (I'm on a shared server), so that's why I cannot use mysqldump. I guess there are a few PHP classes that achieve the same thing as mysqldump: http://phpclasses.betablue.net/browse/package/2527.html. Is that what you had in mind?
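As a rough illustration of what such a mysqldump-replacement class does, each fetched row can be turned into an INSERT statement and the statements written to a .sql file. In this sketch, sample rows stand in for mysql_fetch_assoc(), and addslashes() stands in for mysql_real_escape_string(), which a real implementation would use with a live connection:

```php
<?php
// Build a tiny .sql dump from row arrays. Table name and rows are
// hypothetical sample data, not from the thread.
$table = 'silk';
$rows = array(
    array('id' => 47, 'qty' => 0, 'added' => '2008-08-26'),
    array('id' => 48, 'qty' => 3, 'added' => "it's quoted"),
);

$sql = '';
foreach ($rows as $row) {
    $cols = implode('`, `', array_keys($row));
    $vals = array();
    foreach ($row as $v) {
        $vals[] = "'" . addslashes($v) . "'";  // escape quotes for the SQL literal
    }
    $sql .= "INSERT INTO `$table` (`$cols`) VALUES (" . implode(', ', $vals) . ");\n";
}
file_put_contents('backup.sql', $sql);
```

A full dump would also emit the CREATE TABLE statement (e.g. from SHOW CREATE TABLE), which is what the linked classes handle for you.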
evcrCommented:
did you try the sql backup script I posted?
abernsAuthor Commented:
Unfortunately, I do not have permission to execute mysqldump, so this is not an option.

$command = 'mysqldump -h localhost -u username -p"passwd" dbname';
$sqlData = shell_exec($command);

evcrCommented:
Ok, how about..

$mytable  = 'mytable';
$mybackup = 'backup/mybackup.sql';
$query  = "SELECT * INTO OUTFILE '$mybackup' FROM $mytable";
$result = mysql_query($query);

then zip the file and send it as a mail attachment
hieloCommented:
try:
$fp = fopen("c:\\data\\info.txt", "w");
if ($fp) {
    $q = 'select * from '.$tn;
    $query = mysql_query($q);
    while ($row = mysql_fetch_array($query)) {
        foreach ($row as $k => $v) {
            $row[$k] = str_replace('"', '""', $v);
        }
        $nextline = '"' . implode('","', $row) . '"' . "\r\n";
        fwrite($fp, $nextline);
    }
    fclose($fp);
}

abernsAuthor Commented:
hielo: I'm sorry...but I'm still getting each field repeated twice...and I can't figure out how to fix it.

To evcr: I've tried the SELECT * INTO OUTFILE approach, but as I recall my database user didn't have permission to write a file on the server... it was a permissions thing at the user level, I think, so I gave up. Maybe I was just doing it wrong.

That's why I was going for this more "lo-tech" solution. I guess I can still take that approach, going way back to my original thought: manually list each field in the table (i.e. $nextline = "\"$row[0]\",\"$row[1]\",\"$row[2]\",\"$row[3]\",\"$row[4]\"\r\n"; ), making sure that the comma/quote issues are dealt with.
evcrCommented:
Yes, yours and hielo's CSV approach will still work (eventually), but you've still got to feed it all back in again to restore it. A properly formatted .sql file is going to make that easier.

There is a way to build an SQL file using PHP only, but it will take a while to work out. You'd need to do a real mysqldump, work out what's in the .sql file, then replicate that using PHP/MySQL. A thorough Google for this will help; someone is bound to have tried it before.

Or good luck with the csv method.

Or move to a new hosting company! :-))

Cheers
hieloCommented:
In my last post, change this:
while ($row = mysql_fetch_array($query))

to this:
while ($row = mysql_fetch_assoc($query))
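This was the missing piece: mysql_fetch_array() returns each field under both a numeric and a string key, so implode() over the row emits every value twice; mysql_fetch_assoc() returns only the string keys. Pulled out of the loop, the thread's final recipe amounts to this small helper (database calls omitted):

```php
<?php
// Double embedded quotes, wrap every field in quotes, join with commas —
// the same escaping hielo's loop performs, as a standalone function.
function csv_line(array $row) {
    foreach ($row as $k => $v) {
        $row[$k] = str_replace('"', '""', $v);
    }
    return '"' . implode('","', $row) . '"' . "\r\n";
}

// csv_line(array('47', 'she said "hi"', '2008-08-26'))
// → "47","she said ""hi""","2008-08-26" followed by \r\n
```

Feed it rows from mysql_fetch_assoc() (not mysql_fetch_array()) and each field appears exactly once, correctly enclosed.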

abernsAuthor Commented:
Thanks for sticking with it.
hieloCommented:
Glad to help.