
Solved

Starting point in a foreach()

Posted on 2007-08-07
Medium Priority
320 Views
Last Modified: 2013-12-12
Hello

I have a tab-delimited text file which contains 9,000 rows and 23 columns.  I am importing this into a MySQL database.  However, the script which imports it is timing out at around row 4860.

What I want to do is have the script stop when it reaches row 1,000, refresh, and then continue from that point, to break the process down a bit.  I can store the row number in a MySQL table ready to be read by the script on refresh so it can continue from there.

However, the method I am using to break the file down is as follows.

if (file_exists($file_name)) { // check the file exists and can be opened
    $file_to_process = file($file_name); // read the file into an array, one row per element

Then, once I have the file in an array, I do the following to process each row:

foreach ($file_to_process as $row) {
    $data = explode("\t", $row);
    // From here I do my validation and insert into the database

However, if I am restarting the script, my current script is always going to start from row zero and will never get anywhere.  How can I make the foreach start at row 1001 (or whatever value I read back from the database) so that only the data which has not yet been processed gets processed?

Also, each time the script runs I need to read row 0 to get the column names, as this is part of my verification system.

Note:  The 9,000 rows is an example.  The file could contain 23,000 rows, or just 10 rows.  It depends on what my customers upload to be processed.

I did wonder if it was possible to do the following

foreach($file_to_process[$i] as $row)

and then $i++ after processing, although I have a strong feeling that would not work, as it would only retrieve one row.
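
Roughly what I am trying to achieve, if that helps (untested, and $start_row here is just a placeholder for the value I would read back from the database):

$columns = explode("\t", $file_to_process[0]); // row 0: column names for my verification

for ($i = $start_row; $i < count($file_to_process); $i++) {
    $data = explode("\t", $file_to_process[$i]);
    // my validation and insert into the database would go here, as before
}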

I am sure I am missing something simple but for the life of me cannot figure it out.

Cheers
Question by:ParadyneDesigns
6 Comments
 
LVL 11

Expert Comment

by:Chris Gralike
ID: 19650468
Instead, why not count the rows in the file, then set a limit for a for loop and work with that instead?

regards,
 
LVL 11

Expert Comment

by:Chris Gralike
ID: 19650480
$starting_offset = "0";
$stopping_offset = "999";
$current_count = "0";
for ($i = $starting_offset; $stopping_offset; $i++) {
    // process one row here
    $current_count++;
}

Use the current count to check whether everything was successful and to know where the second run should start...
 
LVL 11

Expert Comment

by:Chris Gralike
ID: 19650501
You might also want to read up on this function (do be careful with it though):
http://nl3.php.net/manual/en/function.set-time-limit.php
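
For example (just a minimal illustration of the function linked above):

// Lift PHP's own max_execution_time limit for this run only.
// Careful: a runaway loop will then never be stopped by PHP itself,
// and limits enforced outside PHP (e.g. by the web server) still apply.
set_time_limit(0);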

regards,

 
LVL 11

Expert Comment

by:Chris Gralike
ID: 19650549
Oops, that for loop will run forever :P

for($i = $starting_offset; $i <= $stopping_offset; $i ++)
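
Putting the two snippets above together, a rough sketch could look like this (variable names follow my earlier post; the per-row work is only indicated by a comment, and the starting offset would come from wherever you store it):

$rows_per_run    = 1000;                        // rows to handle per run
$starting_offset = 0;                           // read this back in on a restart
$total_rows      = count($file_to_process);     // rows counted from the file array
$stopping_offset = min($starting_offset + $rows_per_run, $total_rows);

for ($i = $starting_offset; $i < $stopping_offset; $i++) {
    $data = explode("\t", $file_to_process[$i]);
    // validate and insert the row into the database here
}

// $i now holds the offset the next run should start from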
 

Author Comment

by:ParadyneDesigns
ID: 19650591
Thanks for the ideas, but the loop suggested still would not split the array.  Each run of the script would still commence from the beginning of the array.  What I needed was something to move the internal pointer to the new starting point in the array when the script restarts.

What I have done for the time being is add an array_slice() to see if that will work.  I am just uploading the file to my FTP to run the tasks and seeing if that solves it.

set_time_limit() would not really have worked either, as a 16,000-row file takes my server around an hour to process (the validation in my cron script is extremely intensive; I forgot to add that this runs from cron).  There is still the execution time limit on the server itself, which in my case overrules the time limit set in PHP.
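
For reference, the slice I am trying looks roughly like this ($start_row stands in for the value I read back from the database; untested as yet):

$columns = explode("\t", $file_to_process[0]);             // row 0: column names for verification

$chunk = array_slice($file_to_process, $start_row, 1000);  // the next 1,000 unprocessed rows

foreach ($chunk as $row) {
    $data = explode("\t", $row);
    // validation and insert into the database, as before
}

// afterwards, store $start_row + count($chunk) back in the database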
 
LVL 11

Accepted Solution

by: AlexanderR (earned 1000 total points)
ID: 19657986
session_start();

// First run: start at row 1, since row 0 holds the column names.
if (!isset($_SESSION["position"]) || $_SESSION["position"] < 1) {
    $_SESSION["position"] = 1;
}

if (file_exists($file_name)) {
    $file_to_process = file($file_name);
}

for ($i = $_SESSION["position"]; $i <= ($_SESSION["position"] + 1000); $i++) {
    if (!isset($file_to_process[$i])) {
        break; // reached the end of the file
    }
    echo $file_to_process[$i];
    if (($i + 1) == ($_SESSION["position"] + 1000)) {
        $_SESSION["position"] = $i + 1; // remember where the next run should start
        break;
    }
}

At the point where it says echo $file_to_process[$i]; you can put whatever you need to do for that batch of 1,000 records.

What's happening here is that your position is held in a session variable.  That way the position can be pulled up when you come back to the page, so as to "resume" from that point.  Once another set is done, it gets incremented for another "resume" once you reload the page (or implement some other way to restart it).
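
If the script runs from cron rather than in a browser (so no session is available), the same idea works with the position kept in the database instead, as you mentioned you can do.  A rough sketch, where the import_progress table and the connection details are just made up for illustration:

// Hypothetical table: import_progress (id INT PRIMARY KEY, position INT)
$db = new mysqli("localhost", "db_user", "db_pass", "db_name"); // placeholder credentials

// Read the saved position; default to 1 so row 0 (the column names) is skipped.
$position = 1;
$result = $db->query("SELECT position FROM import_progress WHERE id = 1");
if ($result && ($row = $result->fetch_assoc())) {
    $position = max(1, (int) $row["position"]);
}

$file_to_process = file($file_name);
$columns = explode("\t", $file_to_process[0]);      // column names for verification

$end = min($position + 1000, count($file_to_process));
for ($i = $position; $i < $end; $i++) {
    $data = explode("\t", $file_to_process[$i]);
    // validation and insert, as in the original import script
}

// Remember where the next cron run should resume.
$db->query("UPDATE import_progress SET position = $end WHERE id = 1");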
