PHP script to transfer a CSV file to MySQL

Hi experts,

I need a script that will transfer a csv file to a table in my db.  Sounds pretty straightforward but I'm going to throw a slight kink in it...

The CSV file and the db table do not match.  I'm going from MSSQL to MySQL and I've dumped all the data from each table into a CSV file.  My company is converting our ASP site to a PHP site, and the information we handle is Law Enforcement Sensitive, so they have to have someone with the proper security clearance to do it.  When the new site db was made, a lot of features were changed, upgraded, or dropped.  So in my CSV file I have, for instance:

First_Name, Last_Name, email, password, date_joined, date_last_posted

and in the new db I would have the fields:

name, email, password, something else, ...etc

The way I was thinking in my head was taking each CSV line, using explode() on the commas, and then using the array keys to enter the data?  I'm sure there is probably an easier way.  I need all the help I can get.

There are around 51,000 users, if that makes a difference, but the largest CSV file is well over 250k records.


rinfo Commented:
$path_length   = strrpos(__FILE__, DIRECTORY_SEPARATOR);
       $path          = substr(__FILE__, 0, $path_length) . DIRECTORY_SEPARATOR;
       $csv_with_path = "{$path}{$temp_csv_file}"; // absolute path to the CSV

       $q = "LOAD DATA LOCAL INFILE '$csv_with_path' INTO TABLE yourtablename
                                FIELDS TERMINATED BY ','
                                LINES TERMINATED BY '\\n'
                                (your mysql table fieldlist here, separated by commas and unquoted)";

       mysql_query($q) or die(mysql_error());

If the CSV file contains a header row of field names, you have to add IGNORE 1 LINES after the LINES TERMINATED BY clause, before the opening bracket of the fieldlist.
This is tested code and I have used it successfully.
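For the mismatched columns specifically, LOAD DATA can read unneeded CSV columns into user variables and remap the rest with SET. A sketch only; the table and column names here are assumptions based on the layouts described in the question:

```sql
-- Sketch: old CSV columns that aren't needed go into @variables and are
-- simply discarded; SET remaps the rest onto the new schema.
LOAD DATA LOCAL INFILE '/path/to/users.csv'
INTO TABLE users
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(@first_name, @last_name, email, password, @date_joined, @date_last_posted)
SET name = CONCAT(@first_name, ' ', @last_name);
```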
Explode is not going to be a good solution. If there is a comma or something else random in a field, it will kill the import.

You're better off looping through the files to get them structured correctly: read the file, and write a new one that is properly formatted for the new DB.

THEN... use mysqlimport to import the CSV file. It will be faster and better.

Depending on your structure, you'll want to use these mysqlimport switches:

*--columns: in case you need to map the columns.

*--fields-enclosed-by: if your fields are enclosed with quotes.

*--fields-optionally-enclosed-by: if some fields are enclosed in quotes and others aren't.

*--fields-escaped-by: if you have any escaped fields (usually a "no"... but included for completeness).

*--fields-terminated-by: appears that this will be a comma in your case (note the default is a tab). But if you have a tab or ;, use this.

*--lines-terminated-by: usually a CRLF on Windows or just an LF on Linux / Mac.
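Putting those switches together, an invocation might look like this. This is a sketch only: the database name, user, columns, and file path are all made up for illustration, and mysqlimport derives the target table name from the file name (users.csv loads into table `users`):

```shell
# Hypothetical names throughout; adjust to your schema.
mysqlimport --local \
  --columns=name,email,password \
  --fields-terminated-by=',' \
  --fields-optionally-enclosed-by='"' \
  --lines-terminated-by='\r\n' \
  -u dbuser -p mydatabase /path/to/users.csv
```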
deputycheek (Author) Commented:
What I do now is import the CSV file straight into MySQL, and I wrote a function that takes what I need from one table and places it into the other.  The only problem with that is some of the files time out and have to be broken into pieces, etc...

I was just hoping there was a way to "cut out the middle man" so to speak because the new user table is so much different.

I didn't know if I could take, say, the 1st, 8th, and 11th value from each line of the CSV and insert them straight into the db... which would make my life soooo much easier.  Pretty much picking and choosing.

Another reason is some of the old tables were combined into one on the new site, so it'd be nice to read line by line from two separate CSV files and take what I need from them.
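Picking specific values per line is what fgetcsv() is for; unlike explode(), it copes with quoted fields that contain commas. A sketch under the assumption of a comma-delimited file; the function name and column positions are just for illustration:

```php
<?php
// Sketch: pick, e.g., the 1st, 8th, and 11th values (indexes 0, 7, 10)
// from each line of a CSV file.
function pick_columns($csvPath, $indexes)
{
    $rows = array();
    $fh = fopen($csvPath, 'r');
    while (($line = fgetcsv($fh)) !== false) {
        $picked = array();
        foreach ($indexes as $i) {
            // null if the column is missing on this line
            $picked[] = isset($line[$i]) ? $line[$i] : null;
        }
        $rows[] = $picked;
    }
    fclose($fh);
    return $rows;
}

// Usage: the 1st, 8th, and 11th values are indexes 0, 7, 10.
// $rows = pick_columns('users.csv', array(0, 7, 10));
```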

Here's what I have now for CI in case someone wants to use the long way:

 public function users_to_user($start, $limit)
 {
		$users = $this->migrate_model->getUsers($start, $limit); // gets from users table (old field names)
		foreach ($users as $u) {

		    // recid -> id
		    // UserId -> username
		    // Password -> password
		    $u->Password = md5($u->Password);
		    // UserType -> user_type_id
		    // in the new sys, admin is 99, old is 3
		    if ($u->UserType == 3) {
			$u->UserType = 99;
		    }
		}
 }


and of course I get the table name from the model.  

but there HAS to be a way to do it the other way.

Ray PaseurCommented:
You might be able to get something to work using phpMyAdmin.  Failing that, I would write a thing that we computer science geeks call a polymorphic adapter.  It has one side that is the output side, and in that side it creates a CSV file that is exactly what you need for something like LOAD DATA INFILE.  It has one or more sides that are the input sides, and these are the polymorphs.  Each of these reads a file with some kind of data that is sort of like what you need (the old stuff).  They map the data in such a way that it's exactly what the output side needs to write.

If you want to post the exact field names, and a description of the mappings, including the CREATE TABLE statements for all of the tables involved, we may be able to show you how an adapter will work.
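In outline, that adapter idea might look something like the sketch below. Every class, method, and column name here is hypothetical; the old-CSV layout is taken from the question:

```php
<?php
// Sketch of the "polymorphic adapter": each input adapter reads one
// old-format source and maps its rows to the one new shape; the output
// side writes a clean CSV ready for LOAD DATA INFILE / mysqlimport.
interface InputAdapter
{
    /** Returns rows shaped like the NEW schema: array(name, email, password) */
    public function rows();
}

class OldUsersCsvAdapter implements InputAdapter
{
    private $path;
    public function __construct($path) { $this->path = $path; }
    public function rows()
    {
        $out = array();
        $fh = fopen($this->path, 'r');
        while (($r = fgetcsv($fh)) !== false) {
            // old layout: First_Name, Last_Name, email, password,
            //             date_joined, date_last_posted
            $out[] = array($r[0] . ' ' . $r[1], $r[2], $r[3]);
        }
        fclose($fh);
        return $out;
    }
}

// The output side: one writer, any number of input adapters.
function write_output_csv($adapters, $destPath)
{
    $fh = fopen($destPath, 'w');
    foreach ($adapters as $a) {
        foreach ($a->rows() as $row) {
            fputcsv($fh, $row); // quotes fields as needed
        }
    }
    fclose($fh);
}
```

Combining old tables then just means passing several adapters to the same writer.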
Hi, simply use LOAD DATA INFILE:

$file  = 'xyz.csv';
$query = "LOAD DATA LOCAL INFILE '" . $file . "' INTO TABLE tablename
          FIELDS TERMINATED BY ','
          LINES TERMINATED BY '\\n'
          IGNORE 0 LINES
          (table_field1, table_field2, table_field3);";
If the CSV is created the right way, you can insert the records from PHP per line:

$file = fopen($xCsvFile, 'r');
      $rec = fgets($file);
      while (!feof($file)) {
            $rec    = rtrim($rec); // strip the trailing newline
            $query  = "INSERT INTO Track (Fld01, Fld02, Fld03, Fld04, Fld05) VALUES ($rec)";
            $result = mysql_query($query) or die(mysql_error());
            $rec    = fgets($file);
      }
      fclose($file);

If $rec contains more fields than you need, use a worktable first, or add temporary dummy fields to the target table,
but adding a CSV record to a table is no problem :-)