deputycheek
asked on
PHP Script to transfer a CSV file to MySQL
Hi experts,
I need a script that will transfer a CSV file to a table in my db. Sounds pretty straightforward, but I'm going to throw a slight kink in it...
The CSV file and the db table do not match. I'm going from MSSQL to MySQL, and I've dumped all the data from each table into a CSV file. My company is converting our ASP site to a PHP site, and the information we handle is Law Enforcement Sensitive, so they have to have someone with the proper security clearance to do it. When the new site db was made, a lot of features were changed, upgraded, or dropped. So in my CSV file I have, for instance:
First_Name, Last_Name, email, password, date_joined, date_last_posted
and in the new db I would have the fields:
name, email, password, something else, ...etc
The way I was thinking in my head was taking each CSV line, using explode() on the commas, and then using the array keys to enter the data. I'm sure there is probably an easier way. I need all the help I can get.
There are around 51,000 users, if that makes a difference, but the largest CSV file is well over 250k records.
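A minimal sketch of that explode() idea, with one caveat: str_getcsv()/fgetcsv() are safer than explode(',') because they handle quoted fields containing commas. The old column order is taken from the question; the target field names are assumptions.

```php
<?php
// Sketch: map one line of the old CSV to the new table's fields.
// Old layout (from the question): First_Name, Last_Name, email, password,
// date_joined, date_last_posted. Target field names are hypothetical.
function map_csv_line($line)
{
    $row = str_getcsv($line); // handles quoted fields, unlike explode(',')
    return array(
        'name'     => $row[0] . ' ' . $row[1], // First_Name + Last_Name -> name
        'email'    => $row[2],
        'password' => $row[3],
        // date_joined and date_last_posted dropped in the new schema
    );
}
```

From there, each mapped array can be fed to an INSERT (with every value escaped first).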
ASKER
What I do now is import the CSV file straight into MySQL, and I wrote a function that takes what I need from one table and places it into the other. The only problem with that is some of the files time out and have to be broken into pieces etc...
I was just hoping there was a way to "cut out the middle man" so to speak, because the new user table is so much different.
I didn't know if I could take, say, the 1st, 8th, and 11th value from each line of the CSV and insert straight into the db... which would make my life soooo much easier. Pretty much picking and choosing.
Another reason is some of the old tables were combined into one on the new site, so it'd be nice to read line by line from two separate CSV files and take what I need from them.
Here's what I have now for CI in case someone wants to use the long way:
public function users_to_user($start, $limit)
{
    // Gets rows from the old users table (old field names)
    $users = $this->migrate_model->getUsers($start, $limit);
    foreach ($users as $u) {
        // recid -> id
        $this->db->set('id', $u->recid);
        // UserId -> username
        $this->db->set('username', $u->UserId);
        // Password -> password
        $u->Password = md5($u->Password);
        $this->db->set('password', $u->Password);
        // UserType -> user_type_id
        // in the new system, admin is 99; in the old, 3
        if ($u->UserType == 3) {
            $u->UserType = 99;
        }
        .......
and of course I get the table name from the model.
but there HAS to be a way to do it the other way.
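The "two separate CSV files read line by line" part could be sketched like this. The file names and column positions are made up, and this assumes both files are sorted identically; matching rows on a shared key would be safer.

```php
<?php
// Hypothetical sketch: read two old CSV files in parallel and combine
// selected columns from each into one row for the new, merged table.
function merge_rows($usersFile, $profilesFile)
{
    $a = fopen($usersFile, 'r');
    $b = fopen($profilesFile, 'r');
    $rows = array();
    // Assumes both files have the same row count and the same sort order
    while (($u = fgetcsv($a)) !== false && ($p = fgetcsv($b)) !== false) {
        // e.g. take the 1st and 2nd values from the users file
        // and the 3rd value from the profiles file
        $rows[] = array($u[0], $u[1], $p[2]);
    }
    fclose($a);
    fclose($b);
    return $rows;
}
```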
ASKER CERTIFIED SOLUTION
You might be able to get something to work using phpMyAdmin. Failing that, I would write a thing that us computer science geeks call a polymorphic adapter. It has one side that is the output side, and in that side it creates a CSV file that is exactly what you need for something like LOAD DATA INFILE. It has one or more sides that are the input sides, and these are the polymorphs. Each of these reads a file with some kind of data that is sort of like what you need (the old stuff). They map the data in such a way that it's exactly what the output side needs to write.
If you want to post the exact field names, and a description of the mappings, including the CREATE TABLE statements for all of the tables involved, we may be able to show you how an adapter will work.
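As a rough illustration of the adapter idea (the file layout, the target columns, and the md5() hashing are assumptions taken from the code posted above, not your real schema): the input side reads the old users CSV, and the output side writes a file that LOAD DATA INFILE can consume directly.

```php
<?php
// Hypothetical adapter: reads the old users CSV and writes a new CSV whose
// columns match the new user table (id, username, password, user_type_id).
function adapt_users_csv($oldFile, $newFile)
{
    $in  = fopen($oldFile, 'r');
    $out = fopen($newFile, 'w');
    while (($row = fgetcsv($in)) !== false) {
        // Assumed old layout: 0 recid, 1 UserId, 2 Password, 3 UserType
        $userType = ($row[3] == 3) ? 99 : $row[3]; // old admin 3 -> new admin 99
        fputcsv($out, array(
            $row[0],      // recid    -> id
            $row[1],      // UserId   -> username
            md5($row[2]), // Password -> password (hashed, as in the question)
            $userType,    // UserType -> user_type_id
        ));
    }
    fclose($in);
    fclose($out);
}
```

Adding another old table is just another input side: one more function that reads its file and emits rows in the same output layout.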
Hi, simply use LOAD DATA INFILE:
http://zetcode.com/databases/mysqltutorial/exportimport/
$file = 'xyz.csv';
$query = "LOAD DATA LOCAL INFILE '" . $file . "' INTO TABLE tablename FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' IGNORE 0 LINES (table_field1, table_field2, table_field3);";
mysql_query($query);
If the CSV is created the right way, you can insert the records from PHP line by line:
$file = fopen($xCsvFile, "r");
$rec = fgets($file);
while (!feof($file)) {
    // $rec is spliced into the SQL as-is, so each value in the CSV must
    // already be quoted/escaped the way MySQL expects
    $query = "INSERT INTO Track (Fld01, Fld02, Fld03, Fld04, Fld05) VALUES (" . trim($rec) . ")";
    $result = mysql_query($query) or die(mysql_error());
    $rec = fgets($file);
}
If $rec contains more fields than you need, use a worktable first, or add temporary dummy fields to the target table,
but adding a CSV record to a table is no problem :-)
You're better off looping through the files to get them structured correctly using: http://php.net/manual/en/function.fgetcsv.php.
Read the file, and write a new one that is properly formatted for the new DB.
THEN... use mysqlimport to import the CSV file. It will be faster and better.
http://dev.mysql.com/doc/refman/5.0/en/mysqlimport.html
Depending on your structure, you'll want to use these switches:
--columns=column_list
*In case you need to map the columns.
--fields-enclosed-by=string
*If your fields are enclosed with quotes.
--fields-optionally-enclosed-by=string
*If some fields are enclosed in quotes and others aren't.
--fields-escaped-by=string
*If you have any escaped fields (usually a "no"... but included for completeness).
--fields-terminated-by=string
*This will be a comma in your case. The default is tab, so you'll need this switch.
--lines-terminated-by=string
*Usually CRLF (\r\n) on Windows or just LF (\n) on Linux / Mac.
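The first step (rewriting the CSV so only the needed columns remain, in the new order) might look like the sketch below. File names, column positions, and the target column names are made up; one real detail worth knowing is that mysqlimport derives the table name from the file name, so a file named user.txt loads into the user table.

```php
<?php
// Sketch: rewrite the old CSV keeping only selected columns, in the order
// the new table expects. Column positions and names are hypothetical.
function restructure_csv($inFile, $outFile)
{
    $in  = fopen($inFile, 'r');
    $out = fopen($outFile, 'w');
    while (($row = fgetcsv($in)) !== false) {
        // keep the 1st, 8th, and 11th values, as in the question
        fputcsv($out, array($row[0], $row[7], $row[10]));
    }
    fclose($in);
    fclose($out);
}
// Then, from the shell (the db and column names are placeholders):
//   mysqlimport --local --fields-terminated-by=, \
//     --columns=id,username,email dbname user.txt
```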