RETS/MySQL script runs very slowly and stops after ~14,000 records

I found code to pull RETS data and write it to a CSV file (https://github.com/troydavisson/PHRETS/wiki/Connect,-download-listing-data-in-csv-format,-disconnect).

I modified this, after getting it working against my RETS server, to write to MySQL instead. (Note: it did work fine with CSV files, and I was able to get the 300+ MB file into MySQL with LOAD, but I ran into problems getting fields with commas, quotes, etc. to parse correctly, as well as picking a delimiter; this seems to be a common problem.) I am now trying to insert directly into MySQL with a modified version of the CSV script.

It takes forever to process. Is there any way I can speed it up? I have other tables to do as well.

After 5-10 minutes and roughly 14,000 records it dies; I'm trying to pull 118k+ records as a one-time pull.
I have raised my max_execution_time and memory_limit.
Please note it did work with the CSV file; I just could not use the output because of the major parse errors described above.
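(As an aside for anyone who hits the same CSV parse problem: a file written with PHP's fputcsv() quotes any field containing a comma, quote, or newline, and MySQL can load it cleanly if the LOAD DATA clauses match. A minimal sketch of that idea, assuming an open $mysqli connection, a hypothetical listings.csv written by fputcsv() with a header row, and the rets_property_de table; local_infile must be enabled on the server:

<?php
// Sketch only: fputcsv() double-quotes troublesome fields and doubles
// embedded quotes, which MySQL's ENCLOSED BY '"' handling understands.
$sql = "LOAD DATA LOCAL INFILE 'listings.csv'
        INTO TABLE rets_property_de
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'
        IGNORE 1 LINES"; // skip the header row written by fputcsv()
if (!mysqli_query($mysqli, $sql)) {
    die('LOAD DATA failed: ' . mysqli_error($mysqli));
}
?>

That said, the direct-insert approach below avoids the intermediate file entirely.)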


Thank you in advance for any code or help you may provide.
Johnny


<?php 
ini_set("memory_limit","375M");
ini_set('max_execution_time', 12500); // 12,500 seconds (~3.5 hours); the default is 30 seconds
require_once('functions.php'); 
include ("../../mysqli_ctx.php");

$rets_login_url = "xxxx"; 
$rets_username = "xxxxx"; 
$rets_password = "xxxxx"; 

// use http://retsmd.com to help determine the SystemName of the DateTime field which 
// designates when a record was last modified 
//$rets_modtimestamp_field = "LastModifiedDateTime"; // this MLS uses RECORDMODDATE instead
$rets_modtimestamp_field = "RECORDMODDATE";
// use http://retsmd.com to help determine the names of the classes you want to pull. 
// these might be something like RE_1, RES, RESI, 1, etc. 
$property_classes = array("DE"); 

// DateTime which is used to determine how far back to retrieve records. 
// Using a really old date so we can get everything. 
// (Note: not actually used below; the search query filters on $sixmonths instead.) 
$previous_start_time = "1980-01-01T00:00:00"; 


$sixmonths = date('Y-m-d\TH:i:s', time()-15778800); // get listings updated within last 6 months
////////////////////////////// 

require_once("../lib/phrets.php"); 

// start rets connection 
$rets = new phRETS; 

//$rets->AddHeader("RETS-Version", "RETS/1.7.2"); 
//$rets->AddHeader('User-Agent', "YOUR_USER_AGENT_HEADER_IF_NECESSARY"); 

$rets->SetParam("compression_enabled", true); 

echo " * Connecting to {$rets_login_url} as {$rets_username}<br>\n"; 
$connect = $rets->Connect($rets_login_url, $rets_username, $rets_password); 

if ($connect) { 
    echo "  * Connected<br>\n"; 
} 
else { 
        echo "  * Not connected:<br>\n"; 
        print_r($rets->Error()); 
        exit; 
} 

foreach ($property_classes as $class) { 

        echo " * Property class: {$class}<br>\n"; 

        $maxrows = true; 
        $offset = 1; 
        $limit = 1000; 
        $fields_order = array(); 
        while ($maxrows) { 
                //$query = "(LN=0+),(Status=|A,C),(County=|Barnstable,Dukes,Nantucket,Plymouth)";
                $query = '(RECORDMODDATE='.$sixmonths.'+)'; 
                // run RETS search 
                echo "  * Query: {$query}  Limit: {$limit}  Offset: {$offset}<br>\n"; 
                $search = $rets->SearchQuery(
                "Property",
                $class,
                $query,
                array('Limit' => $limit, 'Offset' => $offset, 'Format' => 'COMPACT-DECODED', 'Count' => 1)); 
                echo "  * Total found: {$rets->TotalRecordsFound()}<br>\n";
                echo "Proccessing...<br>";
                if ($rets->NumRows() > 0) { 
                        if ($offset == 1) { 
                                // print filename headers as first line 
                                $fields_order = $rets->SearchGetFields($search); 
                                //fputcsv($fh, $fields_order); 
                        } 
                        // process results 
                        while ($record = $rets->FetchRow($search)) { 
                                $this_record = array(); 
                                foreach ($fields_order as $fo) { 
                                        $this_record[] = '"'.mysqli_real_escape_string($mysqli,$record[$fo]).'"';
                                }//for each
                                //mysql inject
                                 $fields_order_l = implode(',', $fields_order);
                                 $fields_order_l = str_replace(",MOD,", ",MODx,", $fields_order_l); // MOD is a reserved word in MySQL, so that column goes in as MODx
                                 $this_record = implode(',', $this_record);
				    $table="rets_property_de";
				    $query = "INSERT INTO `".$table."` (".$fields_order_l.") VALUES (".$this_record.")";
				        //echo $query;
				        $r = mysqli_query($mysqli,$query);
				        if(!$r )
					{
					  die('<br>Could not enter data: ' . mysqli_error($mysqli));
					}
					else 
					{
					//echo "<br>Entered data successfully<br>\n";
					// printf("%d Row inserted.<br>\n", $mysqli->affected_rows);
					//echo ".";
					}//if r
					//echo "<br>"; 
			} // WHILE
                        $offset = ($offset + $rets->NumRows()); 
                } // if num
                $maxrows = $rets->IsMaxrowsReached(); 
                $rets->FreeResult($search); 
        } // while max
 }// for each
mysqli_close($mysqli);
echo "- done<br>\n"; 
echo " * Disconnecting<br>\n"; 
$rets->Disconnect(); 

?> 


gr8gonzo (Consultant) commented:
You can speed up bulk inserts by pushing multiple records into the database in one INSERT instead of one INSERT per record. This reduces a LOT of overhead when you're working with a ton of records.

Basically the syntax is:
INSERT INTO table
(fieldA,fieldB,fieldC,etc)
VALUES
(rec1A,rec1B,rec1C,etc),
(rec2A,rec2B,rec2C,etc),
(rec3A,rec3B,rec3C,etc);

So assuming the fields are always the same for each record (and in the same order), I would suggest filling an array with your records instead of running an INSERT each time. Then, when the array gets to 100 or 500 or 1000 records (test different volumes to see what works for you), construct the bulk query, run it, clear the array, and let it keep filling up with more records. A sketch of that pattern follows below.

If each individual record takes 0.3 seconds to insert (just a hypothetical), then 10 records should take 3 seconds to insert individually. However, you can likely insert those same 10 records in 1 second or less with the bulk insert syntax.
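Here's a minimal sketch of that flush-when-full pattern, reusing your existing $mysqli, $table, $fields_order, and $fields_order_l variables; the flushQueue() helper name and the batch size of 500 are just for illustration:

<?php
// Sketch only: $mysqli, $table, $fields_order, and $fields_order_l come
// from the surrounding script; the batch size of 500 is a starting point to tune.
function flushQueue($mysqli, $table, $fields_order_l, array &$queue) {
    if (empty($queue)) {
        return; // nothing buffered; avoid an INSERT with an empty VALUES list
    }
    $sql = "INSERT INTO `{$table}` ({$fields_order_l}) VALUES " . implode(',', $queue);
    if (!mysqli_query($mysqli, $sql)) {
        die('<br>Could not enter data: ' . mysqli_error($mysqli));
    }
    $queue = array(); // clear the buffer for the next batch
}

$batchSize = 500;
$queue = array();
while ($record = $rets->FetchRow($search)) {
    // build the escaped row exactly as you do now
    $this_record = array();
    foreach ($fields_order as $fo) {
        $this_record[] = '"' . mysqli_real_escape_string($mysqli, $record[$fo]) . '"';
    }
    $queue[] = '(' . implode(',', $this_record) . ')';
    // fire one bulk INSERT whenever the buffer is full
    if (count($queue) >= $batchSize) {
        flushQueue($mysqli, $table, $fields_order_l, $queue);
    }
}
flushQueue($mysqli, $table, $fields_order_l, $queue); // flush the remainder
?>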
gr8gonzo (Consultant) commented:
For the sake of simplicity, here's the adjusted while loop of your code. Basically, you:

1. Add a new $queue array before each while() loop that fetches your records.
2. Instead of querying, you add the $this_record to the $queue array.
3. Move the query to AFTER the end of the while() loop, and change the query to implode all values in $queue.

                        // Define a new, empty queue
                        $queue = array();

                        // process results 
                        while ($record = $rets->FetchRow($search)) { 
                                $this_record = array(); 
                                foreach ($fields_order as $fo) { 
                                        $this_record[] = '"'.mysqli_real_escape_string($mysqli,$record[$fo]).'"';
                                }//for each
                                //mysql inject
                                 $fields_order_l = implode(',', $fields_order);
                                 $fields_order_l = str_replace(",MOD,",",MODx,",$fields_order_l);
                                 $this_record = implode(',', $this_record);
				    $table="rets_property_de";

                                    // Add to queue instead of querying
                                    $queue[] = "(" . $this_record . ")";

                                     // REMOVED THE QUERY CODE FROM THE WHILE LOOP
			} // WHILE

                        // Now that the loop is finished...
				    $query = "INSERT INTO `".$table."` (".$fields_order_l.") VALUES " . implode(",",$queue);
				        //echo $query;
				        $r = mysqli_query($mysqli,$query);
				        if(!$r )
					{
					  die('<br>Could not enter data: ' . mysqli_error($mysqli));
					}
					else 
					{
					//echo "<br>Entered data successfully<br>\n";
					// printf("%d Row inserted.<br>\n", $mysqli->affected_rows);
					//echo ".";
					}//if r
					//echo "<br>"; 

Johnny (Author) commented:
@gr8gonzo I have over 118k records I'm trying to insert here. Would storing them all up and inserting in one shot be able to handle it? You mentioned firing off the query after so many records are in the array (how would I do that?), or would it be better to break the full query array into chunks after it's built?
No, the field names do not change between the sets.
Since it pulls 1000 records per block from the RETS server, wouldn't that be a good starting number to try?

thank you for your help
Johnny
Johnny (Author) commented:
Actually, that would fire off 1k blocks.
I'm testing the code now.

Update: so far it looks like it's working out great, MUCH faster. I'm going to accept this as the answer and reference this post if I have any more problems. Thank you again so much.