Solved

PHP Script to transfer csv file to MySql

Posted on 2012-04-09
6
385 Views
Last Modified: 2012-04-24
Hi experts,

I need a script that will transfer a csv file to a table in my db.  Sounds pretty straightforward but I'm going to throw a slight kink in it...

The csv file and the db table do not match.  I'm going from mssql to mysql, and I've dumped all the data from each table into a csv file.  My company is converting our asp site to a php site, and the information we handle is Law Enforcement Sensitive, so they have to have someone with the proper security clearance to do it.  When the new site db was made, a lot of features were changed, upgraded, or dropped entirely.  So in my csv file I have, for instance:

First_Name, Last_Name, email, password, date_joined, date_last_posted

and in the new db I would have the fields:

name, email, password, something else, ...etc

The way I was thinking of doing it was to take each csv line, explode() on the commas, and then use the array keys to enter the data.  I'm sure there is probably an easier way.  I need all the help I can get.

There are around 51,000 users if that makes a difference but the largest csv file is well over 250k records.
Question by:deputycheek
6 Comments
 
LVL 32

Expert Comment

by:DrDamnit
ID: 37825572
Explode is not going to be a good solution. If there is a comma or something else random in a field, it will kill the import.

You're better off looping through the files to get them structured correctly using: http://php.net/manual/en/function.fgetcsv.php.

Read the file, and write a new one that is properly formatted for the new DB.

THEN... use mysqlimport to import the CSV file. It will be faster and better.

http://dev.mysql.com/doc/refman/5.0/en/mysqlimport.html

Depending on your structure, you'll want to use these switches:

--columns=column_list
*In case you need to map the columns.

--fields-enclosed-by=string
*If your fields are enclosed with quotes.

--fields-optionally-enclosed-by=string
*If some fields are enclosed in quotes and others aren't.

--fields-escaped-by
*If you have any escaped fields (usually a "no"... but included for completeness).

--fields-terminated-by=string
*Appears that this will be a comma in your case, which is the default. But if you have a tab or ;, use this.

--lines-terminated-by=string
*Usually CRLF on Windows or just an LF on Linux / Mac.
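
A minimal sketch of that read-and-rewrite step, assuming the old export has the six columns listed in the question and the new table only needs name, email, and password (the file names, column positions, and table layout here are illustrative, not the actual schema):

// Read the old export with fgetcsv() so quoted fields and embedded commas
// are handled, then write a new file whose columns line up with the new table.
$in  = fopen('old_users.csv', 'r');   // assumed name of the mssql dump
$out = fopen('new_users.csv', 'w');   // file to feed to mysqlimport

while (($row = fgetcsv($in)) !== false) {
    // old layout: 0=First_Name, 1=Last_Name, 2=email, 3=password, 4=date_joined, 5=date_last_posted
    fputcsv($out, array(
        $row[0] . ' ' . $row[1],      // name
        $row[2],                      // email
        $row[3],                      // password
    ));                               // fputcsv() quotes fields only when needed
}

fclose($in);
fclose($out);

After that, something along the lines of

mysqlimport --local --fields-terminated-by=',' --fields-optionally-enclosed-by='"' --columns=name,email,password yourdb new_users.csv

should load it. Note that mysqlimport derives the table name from the file name, so the output file would need to be named after the target table (new_users here).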
 

Author Comment

by:deputycheek
ID: 37825614
What I do now is import the csv file straight into mysql, and I wrote a function that takes what I need from one table and places it into the other.  The only problem with that is some of the files time out and have to be broken into pieces, etc.

I was just hoping there was a way to "cut out the middle man" so to speak because the new user table is so much different.

I didn't know if I could take, say, the 1st, 8th, and 11th value from each line of the csv and insert them straight into the db...which would make my life soooo much easier.  Pretty much picking and choosing.

Another reason is some of the old tables were combined into one on the new site, so it'd be nice to read line by line from two separate csv files and take what I need from them.
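
Something like this is what I have in mind for the two-file case (the file names and column positions here are just placeholders):

// index the second export by recid, then walk the first export and
// pull whatever extra columns are needed for the combined table
$extra = array();
$fh = fopen('profiles.csv', 'r');             // second old table, placeholder name
while (($row = fgetcsv($fh)) !== false) {
    $extra[$row[0]] = $row;                   // key on recid (column 0)
}
fclose($fh);

$in  = fopen('users.csv', 'r');               // first old table, placeholder name
$out = fopen('combined.csv', 'w');
while (($row = fgetcsv($in)) !== false) {
    $recid = $row[0];
    $more  = isset($extra[$recid]) ? $extra[$recid] : array();
    // pick and choose: recid, username, plus (say) column 3 from the other file
    fputcsv($out, array($recid, $row[1], isset($more[3]) ? $more[3] : ''));
}
fclose($in);
fclose($out);

At 51,000 users the lookup array fits in memory easily; the 250k-record files should too, as long as only the needed columns are kept.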

Here's what I have now for CI in case someone wants to use the long way:

public function users_to_user($start, $limit)
{
    $users = $this->migrate_model->getUsers($start, $limit); // gets from users table (old field names)
    foreach ($users as $u) {

        // recid -> id
        $this->db->set('id', $u->recid);
        // UserId -> username
        $this->db->set('username', $u->UserId);
        // Password -> password
        $u->Password = md5($u->Password);
        $this->db->set('password', $u->Password);
        // UserType -> user_type_id
        // in the new sys admin is 99, in the old it is 3
        if ($u->UserType == 3) {
            $u->UserType = 99;
        }
.......



and of course I get the table name from the model.  

but there HAS to be a way to do it the other way.
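
For the "take the 1st, 8th, and 11th value and insert it straight in" idea, a sketch like the following should work; the table, column names, and file name are placeholders, and it assumes an open mysql_* connection as used elsewhere in this thread:

// walk the CSV once and insert only the columns that are needed
$fh = fopen('old_table.csv', 'r');                // placeholder file name
while (($row = fgetcsv($fh)) !== false) {
    $id    = mysql_real_escape_string($row[0]);   // 1st value
    $email = mysql_real_escape_string($row[7]);   // 8th value
    $last  = mysql_real_escape_string($row[10]);  // 11th value
    mysql_query("INSERT INTO user (id, email, last_posted)
                 VALUES ('$id', '$email', '$last')") or die(mysql_error());
}
fclose($fh);

Row-by-row inserts are what make the large files time out, though; batching several rows per INSERT, or using LOAD DATA as in the accepted solution below, will be much faster on the 250k-record files.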
 
LVL 9

Accepted Solution

by:
rinfo earned 500 total points
ID: 37826834
$path_length   = strrpos(__FILE__, DIRECTORY_SEPARATOR);
$path          = substr(__FILE__, 0, $path_length) . DIRECTORY_SEPARATOR;
$csv_with_path = "{$path}{$temp_csv_file}";

$q = "LOAD DATA LOCAL INFILE '$csv_with_path' INTO TABLE yourtablename
      FIELDS TERMINATED BY ','
      LINES TERMINATED BY '\\n'
      (mysql table field list, separated by commas and unquoted)";

mysql_query($q) or die(mysql_error());

If the csv file contains a header row with the field names, you have to add IGNORE 1 LINES after the LINES TERMINATED BY clause, just before the field list.
This is tested code and I have used it successfully.
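
If the CSV columns don't line up with the table, LOAD DATA can also map and transform them in one pass using user variables. A sketch assuming the six-column export from the question, a header row, and placeholder table/column names:

// unwanted CSV columns go into throwaway @variables;
// others are reshaped with SET before they hit the table
$q = "LOAD DATA LOCAL INFILE 'old_users.csv'
      INTO TABLE user
      FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
      LINES TERMINATED BY '\\n'
      IGNORE 1 LINES
      (@first, @last, email, @pw, @joined, @posted)
      SET name     = CONCAT(@first, ' ', @last),
          password = MD5(@pw)";

mysql_query($q) or die(mysql_error());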
 
LVL 108

Expert Comment

by:Ray Paseur
ID: 37826992
You might be able to get something to work using phpMyAdmin.  Failing that, I would write a thing that we computer science geeks call a polymorphic adapter.  It has one side that is the output side, and on that side it creates a CSV file that is exactly what you need for something like LOAD DATA INFILE.  It has one or more sides that are the input sides, and these are the polymorphs.  Each of these reads a file with some kind of data that is sort of like what you need (the old stuff).  They map the data in such a way that it's exactly what the output side needs to write.

If you want to post the exact field names, and a description of the mappings, including the CREATE TABLE statements for all of the tables involved, we may be able to show you how an adapter will work.
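
A bare-bones sketch of that adapter idea, with the class names, file names, and target columns invented for illustration:

// one output side, many input sides: each reader turns one legacy CSV
// layout into the same normalized row shape the writer expects
interface LegacyReader {
    public function nextRow();   // returns array('name'=>.., 'email'=>.., 'password'=>..) or false
}

class OldUsersCsvReader implements LegacyReader {
    private $fh;
    public function __construct($path) { $this->fh = fopen($path, 'r'); }
    public function nextRow() {
        $row = fgetcsv($this->fh);
        if ($row === false) return false;
        // old layout: First_Name, Last_Name, email, password, ...
        return array(
            'name'     => $row[0] . ' ' . $row[1],
            'email'    => $row[2],
            'password' => md5($row[3]),
        );
    }
}

// the output side writes exactly what LOAD DATA INFILE (or mysqlimport) will expect
$reader = new OldUsersCsvReader('old_users.csv');
$out    = fopen('load_ready.csv', 'w');
while (($row = $reader->nextRow()) !== false) {
    fputcsv($out, array($row['name'], $row['email'], $row['password']));
}
fclose($out);

Each additional legacy file gets its own LegacyReader implementation; the output loop never changes.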
 
LVL 15

Expert Comment

by:Insoftservice
ID: 37827093
Hi, simply use LOAD DATA INFILE:

http://zetcode.com/databases/mysqltutorial/exportimport/

$file  = 'xyz.csv';
$query = "LOAD DATA LOCAL INFILE '" . $file . "' INTO TABLE tablename
          FIELDS TERMINATED BY ','
          LINES TERMINATED BY '\\n'
          IGNORE 0 LINES
          (table_field1, table_field2, table_field3)";
mysql_query($query) or die(mysql_error());
 
LVL 16

Expert Comment

by:theo kouwenhoven
ID: 37827117
If the CSV is created the right way, you can insert the records from PHP per line

$file = fopen($xCsvFile, "r");
$rec  = fgets($file);
while (!feof($file)) {
    // assumes each line is already a valid VALUES list: fields comma separated, strings quoted
    $query  = "INSERT INTO Track (Fld01, Fld02, Fld03, Fld04, Fld05) VALUES (" . trim($rec) . ")";
    $result = mysql_query($query) or die(mysql_error());
    $rec    = fgets($file);
}

If $rec contains more fields than you need, use a work table first, or add temporary dummy fields to the target table,
but adding a CSV record to a table is no problem :-)