Solved

Import multiple CSV files into a MySQL database

Posted on 2012-04-12
324 Views
Last Modified: 2012-04-12
Hi,

I have 3 CSV files (persons.csv, time.csv, history.csv). I want to import all of these CSV files into a single MySQL table (alldata).

Each of these CSV files has a variable number of columns.

I need a PHP script/function that will allow me to import all of these files, with all of their columns, into my table like this:

alldata - table
ID | FileName | Date | ColumnName | DataImport


So if I have data like this in the CSV file (persons.csv):

Name | Code | Actions | ActionsNo
Rick | 28   | sales   | 3
Mike | 14   | man     | 1

The alldata table after import should look like this:

alldata - table
ID | Date       | FileName    | RowNumber | ColumnName | DataImport
1  | 2012-04-12 | persons.csv | 2         | Name       | Rick
2  | 2012-04-12 | persons.csv | 2         | Code       | 28
3  | 2012-04-12 | persons.csv | 2         | Actions    | sales
4  | 2012-04-12 | persons.csv | 2         | ActionsNo  | 3
5  | 2012-04-12 | persons.csv | 3         | Name       | Mike
6  | 2012-04-12 | persons.csv | 3         | Code       | 14
7  | 2012-04-12 | persons.csv | 3         | Actions    | man
8  | 2012-04-12 | persons.csv | 3         | ActionsNo  | 1


Please help me!
Question by:Qw M
4 Comments
 
LVL 35

Accepted Solution

by:
gr8gonzo earned 500 total points
ID: 37836923
<?php

mysql_connect("host","user","pass");
mysql_select_db("your database");

importCSVIntoMainTable("persons.csv");
importCSVIntoMainTable("time.csv");
importCSVIntoMainTable("history.csv");

function importCSVIntoMainTable($csvFile)
{
	$row = 1;
	$importDate = date("Y-m-d");
	$baseFileName = mysql_real_escape_string(basename($csvFile));
	
	if (($handle = fopen($csvFile, "r")) !== FALSE)
	{
		$headerRow = true;
		while (($data = fgetcsv($handle, 1000, ",")) !== FALSE)
		{
			// Get column names from the first row
			if($headerRow == true)
			{
				$columnNames = $data;
				$headerRow = false;
				$row++; // Count the header as row 1 so data rows match the file's line numbers
				continue; // Skip the rest of the processing
			}
			
			
			foreach($data as $idx => $value)
			{
				$columnName = mysql_real_escape_string($columnNames[$idx]);
				
				// Build query
				$query = "INSERT INTO alldata (Date,FileName,RowNumber,ColumnName,DataImport) VALUES ('{$importDate}','{$baseFileName}','{$row}','{$columnName}','".mysql_real_escape_string($value)."');";
				
				// Uncomment below to run query
				// mysql_query($query);
			}
			
			// Increment row #
			$row++;

		}
		fclose($handle);
	}
}
?>

LVL 35

Expert Comment

by:gr8gonzo
ID: 37836956
On a separate note, storing data like that for any long term purpose is usually a terrible idea. If it's all temporary and you get dynamic files all the time that you're simply transforming, that's one thing, but it's horribly inefficient to try to have this kind of dynamic structure. Even with indexes, the data will be cumbersome to retrieve, and there's no indication of the order of columns (if that's important to you). You also run the risk of running out of IDs if you have a lot of rows unless you use something like a BIGINT for your ID column, and the resulting database size will be much larger than the original files.

If it's truly all temporary, you could also do something where you simply json_encode() the row and store the resulting text in the database, so each row in the file is only taking up 1 ID in the database. At least you can probably stick with an INT for your ID. If you actually want to search on any values in the data, you're probably going to see some slow queries, but if you just want to pull all records from a particular file later, this might be a good approach.

Your table structure would look like:

ID, Date, FileName, RowNumber, JSONData
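
For what it's worth, here's a rough sketch of what that table definition could look like (the column types and the index are my own assumptions, not something specified in the question), reusing the connection opened at the top of the script:

// Hypothetical table definition for the JSON approach - adjust types/sizes to taste
mysql_query("CREATE TABLE IF NOT EXISTS alldata (
	ID INT UNSIGNED NOT NULL AUTO_INCREMENT,
	Date DATE NOT NULL,
	FileName VARCHAR(255) NOT NULL,
	RowNumber INT UNSIGNED NOT NULL,
	JSONData TEXT NOT NULL,
	PRIMARY KEY (ID),
	KEY idx_file_date (FileName, Date)
)");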

Then the above function would still work, but you'd change the query-building section a bit:

Old:
			foreach($data as $idx => $value)
			{
				$columnName = mysql_real_escape_string($columnNames[$idx]);
				
				// Build query
				$query = "INSERT INTO alldata (Date,FileName,RowNumber,ColumnName,DataImport) VALUES ('{$importDate}','{$baseFileName}','{$row}','{$columnName}','".mysql_real_escape_string($value)."');";
				
				// Uncomment below to run query
				// mysql_query($query);
			}


New:
			$mappedData = array();
			foreach($data as $idx => $value)
			{
				$columnName = $columnNames[$idx];
				$mappedData[$columnName] = $value;
			}

			// Build query
			$query = "INSERT INTO alldata (Date,FileName,RowNumber,JSONData) VALUES ('{$importDate}','{$baseFileName}','{$row}','{$columnName}','".mysql_real_escape_string(json_encode($mappedData))."');";
				
			// Uncomment below to run query
			// mysql_query($query);

LVL 35

Expert Comment

by:gr8gonzo
ID: 37836959
One other advantage to the second method (with JSONData) is that you run only one INSERT query per row of the CSV instead of one query for every column of every row, which could dramatically slow down the import.
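
If you later want to pull the rows for a particular file back out, here's a quick sketch of how you might decode them again (the function name and query are just an illustration, using the same mysql_* calls as above):

// Hypothetical helper: fetch every row imported from one file and decode the JSON back into arrays
function getRowsForFile($fileName)
{
	$rows = array();
	$sql = "SELECT RowNumber, JSONData FROM alldata WHERE FileName = '" . mysql_real_escape_string($fileName) . "' ORDER BY RowNumber";
	$result = mysql_query($sql);
	while ($record = mysql_fetch_assoc($result))
	{
		// json_decode(..., true) returns the same columnName => value array that was stored
		$rows[$record['RowNumber']] = json_decode($record['JSONData'], true);
	}
	return $rows;
}

// Example usage:
// $personsRows = getRowsForFile("persons.csv");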
 

Author Comment

by:Qw M
ID: 37837024
gr8gonzo - thank you so much for your help!
