
Solved

Am I going about exporting CSVs and then zipping them the right way? The flow of the code doesn't seem to confirm this.

Posted on 2011-09-21
Last Modified: 2012-05-12
Hi there.
I've had to create functionality whereby a "stored procedure" (either a MySQL record or a file) is selected and loaded into the editor. This value is passed to the XML handler via AJAX and is processed in the following switch case:
	case 15: // Execute Stored Procedure and Export to Zipped CSV's
		$file = $_POST["text_spdesc"].".csv";
		$spsql = $_POST["text_spsql"];
		$ret = spexportcsv($file, $spsql);
		break;

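As an aside, since text_spdesc comes straight from $_POST and ends up as a filename on disk, it may be worth sanitizing it first. A minimal sketch (safeCsvName is a hypothetical helper, not part of the original code):

```php
<?php
// Hypothetical hardening sketch: the POSTed description becomes a filename,
// so strip any path component and odd characters before using it on disk.
function safeCsvName($desc) {
    $name = basename(trim($desc)); // drop any directory part ("../" etc.)
    $name = preg_replace('/[^A-Za-z0-9 _\[\]().-]/', '_', $name);
    return $name . '.csv';
}
echo safeCsvName('[TCX] Premium Leads CSV Export');
```

This keeps a description like the one below intact while neutralizing traversal attempts.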

The values do exist and are received as follows:
text_spdesc (Stored Procedure Description / File Name) = [TCX] Premium Leads CSV Export

text_spsql (Stored Procedure SQL Queries) = SELECT 
	CONCAT(SL.bigint_LeadID + 11001000,LS.smallint_LeadOrdinal) AS `Lead ID`, 
	S.text_SupplierName AS `Supplier`, 
	IF(LS.tinyint_LeadSent = 1,'TRUE','FALSE') AS `Lead Sent`, 
	LS.timestamp_LeadCreated AS `Lead Created`, 
	SC.text_ServiceDescription AS `Service`, 
	SUBSTRING_INDEX(SUBSTRING_INDEX(SL.text_LeadAttributes,'<br />',1),' = ',-1) AS `Make`, 
	SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING_INDEX(SL.text_LeadAttributes,'<br />',2),'<br />',-1),' = ',-1) AS `Model`, 
	SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING_INDEX(SL.text_LeadAttributes,'<br />',3),'<br />',-1),' = ',-1) AS `Year`, 
	SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING_INDEX(SL.text_LeadAttributes,'<br />',4),'<br />',-1),' = ',-1) AS `Cab Size`, 
	SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING_INDEX(SL.text_LeadAttributes,'<br />',5),'<br />',-1),' = ',-1) AS `Bed Size`, 
	SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING_INDEX(SL.text_LeadAttributes,'<br />',6),'<br />',-1),' = ',-1) AS `Condition`, 
	SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING_INDEX(SL.text_LeadAttributes,'<br />',7),'<br />',-1),' = ',-1) AS `Function`, 
	SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING_INDEX(SL.text_LeadAttributes,'<br />',8),'<br />',-1),' = ',-1) AS `Product`, 
	SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING_INDEX(SL.text_LeadAttributes,'<br />',9),'<br />',-1),' = ',-1) AS `Color`, 
	SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING_INDEX(SL.text_LeadAttributes,'<br />',10),'<br />',-1),' = ',-1) AS `Budget`, 
	SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING_INDEX(SL.text_LeadAttributes,'<br />',11),'<br />',-1),' = ',-1) AS `Fitment`, 
	SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING_INDEX(SL.text_LeadAttributes,'<br />',12),'<br />',-1),' = ',-1) AS `Radius`, 
	C.text_ConsumerName AS `Consumer Name`, 
	C.`text_ConsumerE-Mail` AS `Consumer E-mail`, 
	C.text_ConsumerPhone AS `Consumer Telephone`, 
	CONCAT(C.text_ConsumerCity, '-', RA.text_RegionAbbrev) AS `Consumer City, Zip-ST`, 
	C.text_ConsumerCounty AS `Consumer County`, 
	C.text_ConsumerCountry AS `Consumer Country` 
FROM 25_serviceleads SL 
LEFT JOIN 27_leadssent LS 
ON (
	SL.bigint_LeadID = LS.bigint_LeadID
) 
LEFT JOIN 19_consumers C 
ON (
	SL.bigint_ConsumerID = C.bigint_ConsumerID
) 
LEFT JOIN 1_regions R 
ON (
	C.text_ConsumerState = R.text_RegionDescription
)
LEFT JOIN 35_regionabbrevs RA 
ON (
	R.bigint_RegionID = RA.bigint_RegionID
) 
LEFT JOIN 5_suppliers S 
ON (
	LS.bigint_SupplierID = S.bigint_SupplierID
) 
LEFT JOIN 2_servicescatalogue SC 
ON (
	SL.bigint_ServiceID = SC.bigint_ServiceID
) 
WHERE 
	LS.timestamp_LeadCreated BETWEEN "2011/01/01 00:00:00" AND "2011/09/30 23:59:59" 
GROUP BY 
	`Lead ID` 
ORDER BY 
	`Lead Created` DESC;
SELECT * FROM 1_regions ORDER BY text_RegionDescription ASC;


The function splits this "stored procedure" correctly, as follows:
Array
(
[0] => SELECT
     CONCAT(SL.bigint_LeadID + 11001000,LS.smallint_LeadOrdinal) AS `Lead ID`,
     S.text_SupplierName AS `Supplier`,
     IF(LS.tinyint_LeadSent = 1,\'TRUE\',\'FALSE\') AS `Lead Sent`,
     LS.timestamp_LeadCreated AS `Lead Created`,
     SC.text_ServiceDescription AS `Service`,
     SUBSTRING_INDEX(SUBSTRING_INDEX(SL.text_LeadAttributes,\'<br />\',1),\' = \',-1) AS `Make`,
     SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING_INDEX(SL.text_LeadAttributes,\'<br />\',2),\'<br />\',-1),\' = \',-1) AS `Model`,
     SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING_INDEX(SL.text_LeadAttributes,\'<br />\',3),\'<br />\',-1),\' = \',-1) AS `Year`,
     SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING_INDEX(SL.text_LeadAttributes,\'<br />\',4),\'<br />\',-1),\' = \',-1) AS `Cab Size`,
     SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING_INDEX(SL.text_LeadAttributes,\'<br />\',5),\'<br />\',-1),\' = \',-1) AS `Bed Size`,
     SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING_INDEX(SL.text_LeadAttributes,\'<br />\',6),\'<br />\',-1),\' = \',-1) AS `Condition`,
     SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING_INDEX(SL.text_LeadAttributes,\'<br />\',7),\'<br />\',-1),\' = \',-1) AS `Function`,
     SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING_INDEX(SL.text_LeadAttributes,\'<br />\',8),\'<br />\',-1),\' = \',-1) AS `Product`,
     SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING_INDEX(SL.text_LeadAttributes,\'<br />\',9),\'<br />\',-1),\' = \',-1) AS `Color`,
     SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING_INDEX(SL.text_LeadAttributes,\'<br />\',10),\'<br />\',-1),\' = \',-1) AS `Budget`,
     SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING_INDEX(SL.text_LeadAttributes,\'<br />\',11),\'<br />\',-1),\' = \',-1) AS `Fitment`,
     SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING_INDEX(SL.text_LeadAttributes,\'<br />\',12),\'<br />\',-1),\' = \',-1) AS `Radius`,
     C.text_ConsumerName AS `Consumer Name`,
     C.`text_ConsumerE-Mail` AS `Consumer E-mail`,
     C.text_ConsumerPhone AS `Consumer Telephone`,
     CONCAT(C.text_ConsumerCity, \'-\', RA.text_RegionAbbrev) AS `Consumer City, Zip-ST`,
     C.text_ConsumerCounty AS `Consumer County`,
     C.text_ConsumerCountry AS `Consumer Country`
FROM 25_serviceleads SL
LEFT JOIN 27_leadssent LS
ON (
     SL.bigint_LeadID = LS.bigint_LeadID
)
LEFT JOIN 19_consumers C
ON (
     SL.bigint_ConsumerID = C.bigint_ConsumerID
)
LEFT JOIN 1_regions R
ON (
     C.text_ConsumerState = R.text_RegionDescription
)
LEFT JOIN 35_regionabbrevs RA
ON (
     R.bigint_RegionID = RA.bigint_RegionID
)
LEFT JOIN 5_suppliers S
ON (
     LS.bigint_SupplierID = S.bigint_SupplierID
)
LEFT JOIN 2_servicescatalogue SC
ON (
     SL.bigint_ServiceID = SC.bigint_ServiceID
)
WHERE
     LS.timestamp_LeadCreated BETWEEN \"2011/01/01 00:00:00\" AND \"2011/09/30 23:59:59\"
GROUP BY
     `Lead ID`
ORDER BY
     `Lead Created` DESC
[1] => SELECT * FROM 1_regions ORDER BY text_RegionDescription ASC
)

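The splitting shown above can be reproduced with the same preg_split call used in spexportcsv; a minimal sketch with stand-in queries (note it assumes no literal ';' inside string values):

```php
<?php
// Minimal sketch of the statement splitting done in spexportcsv():
// split on semicolons plus trailing whitespace, dropping empty pieces.
// The queries here are stand-ins for the real stored procedure text.
$spsql = "SELECT 1;\nSELECT * FROM 1_regions ORDER BY text_RegionDescription ASC;";
$sqlarr = preg_split("/(;+[\r\n\t ]*)/", $spsql, -1, PREG_SPLIT_NO_EMPTY);
print_r($sqlarr); // each statement, without its trailing semicolon
```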

Working code is then used to export each query to CSV in the spexportcsv function, which also attempts to wait for the files to be created and added to the zip using the Archive_Zip PEAR class:
function fileszipped($list, $files) {
	$ret = array_fill(0, count($files), false); // one "zipped yet?" flag per file
	foreach ($list as $f) {
		for ($i = 0; $i < count($files); $i++) {
			if (strpos($f[1][1],$files[$i])!==false) {
				$ret[$i] = true;
			}
		}
	}
	foreach ($ret as $b) {
		if ($b !== true) {
			return false;
		}
	}
	return true;
}
function spexportcsv($file, $spsql, $dir="../temp/") {
	global $conn;
	$ret = array();
	$zipfile = preg_replace("/\.csv$/",".zip",$file); // escape the dot so only a ".csv" suffix matches
	$zfilepath = $dir . $zipfile;
	$ffilepath = $zfilepath;
	$i = 0;
	if (file_exists($dir.$zipfile)) {
		$zfilepath = str_replace(".zip"," (".$i.").zip",$ffilepath);
		while (file_exists($zfilepath)) {
			$i++;
			$zfilepath = str_replace(".zip"," (".$i.").zip",$ffilepath);
		}
	}
	$csv_terminated = "\n";
	$csv_separator = ",";
	$csv_enclosed = '"';
	$csv_escaped = "\\";
	$sqlarr = preg_split("/(;+[\r\n\t ]*)/",$spsql,-1,PREG_SPLIT_NO_EMPTY);
	//print_r($sqlarr); // query array validated
	// save individual query results as csv's
	foreach ($sqlarr as $tsql) {
		if (strlen(trim($tsql)) > 0) {
			$tsql = stripslashes(trim($tsql)).";";
			// print_r($tsql); // individual query validation
			$filepath = $dir . $file;
			$firstname = $filepath;
			$i=0;
			if (file_exists($firstname)) {
				$filepath = str_replace(".csv"," (".$i.").csv",$firstname);
				while (file_exists($filepath)) {
					$i++;
					$filepath = str_replace(".csv"," (".$i.").csv",$firstname);
				}
			}
			$filename = basename($filepath);
			$result = mysql_query_errors($tsql , $conn , __FILE__ , __LINE__ );
			$fields_cnt = mysql_num_fields($result);
			$schema_insert = '';
			for ($i = 0; $i < $fields_cnt; $i++) {
				$l = $csv_enclosed . str_replace($csv_enclosed, $csv_escaped . $csv_enclosed,
						stripslashes(mysql_field_name($result, $i))) . $csv_enclosed;
				$schema_insert .= $l;
				$schema_insert .= $csv_separator;
			} // end for
			$out = trim(substr($schema_insert, 0, -1));
			$out .= $csv_terminated;
			// Format the data
			while ($row = mysql_fetch_array($result)) {
				$schema_insert = '';
				for ($j = 0; $j < $fields_cnt; $j++) {
					if ($row[$j] == '0' || $row[$j] != '') {
						if ($csv_enclosed == '') {
							$schema_insert .= $row[$j];
						} else {
							$schema_insert .= $csv_enclosed . 
								str_replace($csv_enclosed, $csv_escaped . $csv_enclosed, $row[$j]) . $csv_enclosed;
						}
					} else {
						$schema_insert .= '';
					}
					if ($j < $fields_cnt - 1) {
						$schema_insert .= $csv_separator;
					}
				} // end for
				$out .= $schema_insert;
				$out .= $csv_terminated;
			} // end while
			$dat = file_put_contents($filepath, $out, FILE_USE_INCLUDE_PATH | LOCK_EX);
			while (!file_exists($filepath)) {
				sleep(1);
			}
			echo "	<result>Successfully exported Stored Procedure (".number_format($dat/1024,2)." KB) to ".$filename.".</result>\n";
			array_push($ret, $dir.$filename);
		}
	}
	// zip csv's and set download if successful
	$obj = new Archive_Zip($zfilepath); // name of zip file
	if ($obj->create($ret)) {
		echo "	<result>Zipped files ".implode(", ",$ret)." to ".$zfilepath.".</result>\n";
		echo "	<download file=\"temp/".basename($zfilepath)."\" ret=\"".implode(", ",$ret)."\" />\n";
	} else {
		echo "	<result>Failed zipping files ".implode(", ",$ret)." to ".$zfilepath.".</result>\n";
	}
	// await files to be zipped
	while (!fileszipped($obj->listContent(),$ret)) {
		sleep(1);
	}
	$obj->close();
	// delete csv's
	foreach ($ret as $filepath) {
		if (unlink($filepath)) {
			echo "	<foperation file=\"".$file."\" success=\"1\" />\n";
		} else {
			echo "	<foperation file=\"".$file."\" success=\"0\" />\n";
		}
	}
}


The first query returns 38 rows and a CSV is saved. The second query, however, returns 44,310 records, and where this PHP is used in another function, it does not wait for the CSV file to be created. This needs to happen in order to zip the results of multiple queries. If I remove the second query, it creates the CSV and adds it to the zip, and all is fine. With the second query present, it times out before creating the zip and adding the CSVs, even though the page has set_time_limit(0); at the top. The timeout is therefore client-side (http_request.status returns something other than 200, which alerts me with 'There was a problem with the request.'). Adding flush(); after writing the individual lines does nothing except finish the XML tree before the remaining data has been added.
This PHP document generates its response as XML 1.0.
It might also be more prudent to export the CSVs via the individual SQL queries by adding the following into $tsql at the appropriate places (lessening the amount of code required in spexportcsv):
INTO OUTFILE '".$zfilepath."'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'

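If you go the INTO OUTFILE route, one way to sketch it in PHP (addIntoOutfile is a hypothetical helper; MySQL also accepts the INTO clause at the end of the statement, and the MySQL server needs the FILE privilege plus write access to the target path):

```php
<?php
// Hypothetical helper: append an INTO OUTFILE clause so MySQL writes the
// CSV itself, instead of PHP building the file row by row.
function addIntoOutfile($sql, $outfile) {
    $clause = " INTO OUTFILE '" . $outfile . "'"
            . " FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'"
            . " LINES TERMINATED BY '\\n'";
    // strip the trailing semicolon, then append the export clause
    return rtrim(trim($sql), ';') . $clause;
}
echo addIntoOutfile("SELECT * FROM 1_regions ORDER BY text_RegionDescription ASC;", "/tmp/regions.csv");
```

Note that OUTFILE paths are resolved on the database server, so this only works when MySQL can write somewhere the web application can later read.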

Any help or suggestions would be greatly appreciated.
Question by:intellisource
4 Comments
 

Author Comment

by:intellisource
ID: 36578453
OK, I have even tried creating the zip first and then adding the CSV files as they are created, but it does not export large tables. This exact same code was used to export another table with 44,310 records, but here it times out on the client side. I have also removed the sleep loops, and I have no idea why it still does not export a CSV from the second "stored procedure".
 

Expert Comment

by:Ray Paseur
ID: 36579567
This appears to be a data-dependent issue. Please post the test data that you are using so we can isolate the problem. Thanks.
 

Accepted Solution

by:
intellisource earned 0 total points
ID: 36579693
OK, I've located a reference for, and decided to use, the built-in PHP class ZipArchive, incorporating the CSVs into the zip as strings! :D
It exports the second one too, as there are fewer delays (and so fewer timeout possibilities) than creating the files, zipping them, and deleting them before returning.
And there is no need for a secondary function! XD
function spexportcsv($file, $spsql, $dir="../temp/") {
	global $conn;
	$ret = array();
	$zipfile = preg_replace("/\.csv$/",".zip",$file); // escape the dot so only a ".csv" suffix matches
	$zfilepath = $dir . $zipfile;
	$ffilepath = $zfilepath;
	$i = 0;
	if (file_exists($dir.$zipfile)) {
		$zfilepath = str_replace(".zip"," (".$i.").zip",$ffilepath);
		while (file_exists($zfilepath)) {
			$i++;
			$zfilepath = str_replace(".zip"," (".$i.").zip",$ffilepath);
		}
	}
	// create zipfile
	$zip = new ZipArchive();
	if ($zip->open($zfilepath, ZIPARCHIVE::CREATE)!==TRUE) {
		exit("	<result>Failed Creating Zip Archive ".$zfilepath.".</result>\n");
	} else {
		echo "	<result>Successfully created zip archive ".$zfilepath.".</result>\n";
	}
	$csv_terminated = "\n";
	$csv_separator = ",";
	$csv_enclosed = '"';
	$csv_escaped = "\\";
	$sqlarr = preg_split("/(;+[\r\n\t ]*)/",$spsql,-1,PREG_SPLIT_NO_EMPTY);
	//print_r($sqlarr); // query array validated
	// save individual query results as csv's
	$n=0;
	foreach ($sqlarr as $tsql) {
		if (strlen(trim($tsql)) > 0) {
			$tsql = stripslashes(trim($tsql)).";";
			// print_r($tsql); // individual query validation
			$filepath = $dir . $file;
			$firstname = $filepath;
			$filepath = str_replace(".csv"," (".$n.").csv",$firstname);
			$n++;
			$filename = basename($filepath);
			$result = mysql_query_errors($tsql , $conn , __FILE__ , __LINE__ );
			$fields_cnt = mysql_num_fields($result);
			$schema_insert = '';
			for ($i = 0; $i < $fields_cnt; $i++) {
				$l = $csv_enclosed . str_replace($csv_enclosed, $csv_escaped . $csv_enclosed,
						stripslashes(mysql_field_name($result, $i))) . $csv_enclosed;
				$schema_insert .= $l;
				$schema_insert .= $csv_separator;
			} // end for
			$out = trim(substr($schema_insert, 0, -1));
			$out .= $csv_terminated;
			// Format the data
			while ($row = mysql_fetch_array($result)) {
				$schema_insert = '';
				for ($j = 0; $j < $fields_cnt; $j++) {
					if ($row[$j] == '0' || $row[$j] != '') {
						if ($csv_enclosed == '') {
							$schema_insert .= $row[$j];
						} else {
							$schema_insert .= $csv_enclosed . 
								str_replace($csv_enclosed, $csv_escaped . $csv_enclosed, $row[$j]) . $csv_enclosed;
						}
					} else {
						$schema_insert .= '';
					}
					if ($j < $fields_cnt - 1) {
						$schema_insert .= $csv_separator;
					}
				} // end for
				$out .= $schema_insert;
				$out .= $csv_terminated;
			} // end while
			$zip->addFromString($filename, $out);
			echo "	<result>Added ".$filename." to ".$zfilepath.".</result>\n";
		}
	}
	echo "	<download file=\"temp/".basename($zfilepath)."\" ret=\"".$zip->status."\" />\n";
	$zip->close();
}


 

Author Closing Comment

by:intellisource
ID: 36708045
I did post that in the initial post, Ray, but I've found a solution ;) This issue is now resolved :D
The problem was with files being created: there is a delay before they actually exist in the directory. I've skipped all those sleep loops and merely added the compiled CSV content string to the zip, which gets it exported well under the timeout limit ;)
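For reference, the core of the in-memory approach boils down to something like this (path and CSV contents are illustrative only):

```php
<?php
// Minimal sketch of the in-memory approach: build the CSV as a string and
// add it straight into the archive; no waiting for files to appear on disk.
$csv = "\"Lead ID\",\"Supplier\"\n\"11001001\",\"Acme\"\n";
$zip = new ZipArchive();
if ($zip->open('/tmp/leads_export.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE) === TRUE) {
    $zip->addFromString('leads.csv', $csv); // no temp file, no unlink needed
    $zip->close();
}
```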
