• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 334

Import multiple CSV files into a MySQL database

Hi,

I have 3 CSV files (persons.csv, time.csv, history.csv). I want to import all of these CSV files into a single MySQL table (alldata).

Each of these CSV files has a variable number of columns.

I need a PHP script/function that will allow me to import all of these files, with all of their columns, into my table like this:

alldata - table
ID | Date | FileName | RowNumber | ColumnName | DataImport


so if I have data like this in the CSV file (persons.csv):

Name | Code | Actions | ActionsNo
Rick | 28   | sales   | 3
Mike | 14   | man     | 1

the alldata table after import should look like this:

alldata - table
ID | Date       | FileName    | RowNumber | ColumnName | DataImport
 1 | 2012-04-12 | persons.csv | 2         | Name       | Rick
 2 | 2012-04-12 | persons.csv | 2         | Code       | 28
 3 | 2012-04-12 | persons.csv | 2         | Actions    | sales
 4 | 2012-04-12 | persons.csv | 2         | ActionsNo  | 3
 5 | 2012-04-12 | persons.csv | 3         | Name       | Mike
 6 | 2012-04-12 | persons.csv | 3         | Code       | 14
 7 | 2012-04-12 | persons.csv | 3         | Actions    | man
 8 | 2012-04-12 | persons.csv | 3         | ActionsNo  | 1


Please help me!
Asked by: Qw M
1 Solution
 
gr8gonzo (Consultant) commented:
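Here's a script that should do it. It assumes an alldata table along these lines (just a rough sketch; the column types and sizes are my guesses, so adjust them to your data):

-- Hypothetical table definition matching the layout described above;
-- types and sizes are assumptions, not from the question.
CREATE TABLE alldata (
    ID         INT UNSIGNED NOT NULL AUTO_INCREMENT,
    Date       DATE NOT NULL,
    FileName   VARCHAR(255) NOT NULL,
    RowNumber  INT UNSIGNED NOT NULL,
    ColumnName VARCHAR(255) NOT NULL,
    DataImport TEXT,
    PRIMARY KEY (ID)
);
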
<?php

// NOTE: This uses the old mysql_* extension, which was standard when this
// answer was written (it was later deprecated in PHP 5.5 and removed in
// PHP 7 -- a PDO sketch follows after this block).
mysql_connect("host","user","pass");
mysql_select_db("your database");

importCSVIntoMainTable("persons.csv");
importCSVIntoMainTable("time.csv");
importCSVIntoMainTable("history.csv");

function importCSVIntoMainTable($csvFile)
{
	$row = 1;
	$importDate = date("Y-m-d");
	$baseFileName = mysql_real_escape_string(basename($csvFile));

	if (($handle = fopen($csvFile, "r")) !== FALSE)
	{
		$headerRow = true;
		// 1000 = maximum line length; raise it if your rows are longer
		while (($data = fgetcsv($handle, 1000, ",")) !== FALSE)
		{
			// Get column names from the first row
			if($headerRow == true)
			{
				$columnNames = $data;
				$headerRow = false;
				$row++; // Count the header as row 1, so data starts at row 2
				continue; // Skip the rest of the processing
			}

			// One INSERT per cell: each value becomes its own record
			foreach($data as $idx => $value)
			{
				$columnName = mysql_real_escape_string($columnNames[$idx]);

				// Build query
				$query = "INSERT INTO alldata (Date,FileName,RowNumber,ColumnName,DataImport) VALUES ('{$importDate}','{$baseFileName}','{$row}','{$columnName}','".mysql_real_escape_string($value)."');";

				// Uncomment below to run the query
				// mysql_query($query);
			}

			// Increment row #
			$row++;
		}
		fclose($handle);
	}
}
?>

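A quick hedge for anyone reading this later: the mysql_* extension above no longer exists in PHP 7+. A rough PDO equivalent of the insert step, as a sketch only (the DSN and credentials are placeholders), would be:

<?php
// Sketch only: a PDO version of the same per-cell insert. A prepared
// statement replaces manual mysql_real_escape_string() escaping.
$pdo = new PDO('mysql:host=localhost;dbname=your_database;charset=utf8mb4',
               'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$stmt = $pdo->prepare('INSERT INTO alldata (Date, FileName, RowNumber, ColumnName, DataImport)
                       VALUES (?, ?, ?, ?, ?)');

// Inside the same fgetcsv() loop as above, in place of mysql_query():
$stmt->execute(array($importDate, basename($csvFile), $row, $columnNames[$idx], $value));
?>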

 
gr8gonzo (Consultant) commented:
On a separate note, storing data like that for any long-term purpose is usually a terrible idea. If it's all temporary and you constantly receive dynamic files that you're simply transforming, that's one thing, but maintaining this kind of dynamic structure is horribly inefficient. Even with indexes, the data will be cumbersome to retrieve, and there's no indication of the order of the columns (if that's important to you). You also run the risk of running out of IDs if you have a lot of rows, unless you use something like a BIGINT for your ID column, and the resulting database will be much larger than the original files.

If it's truly all temporary, you could also do something where you simply json_encode() the row and store the resulting text in the database, so each row in the file is only taking up 1 ID in the database. At least you can probably stick with an INT for your ID. If you actually want to search on any values in the data, you're probably going to see some slow queries, but if you just want to pull all records from a particular file later, this might be a good approach.

Your table structure would look like:

ID, Date, FileName, RowNumber, JSONData
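
A minimal sketch of that table (again, the types are my assumptions):

-- Hypothetical definition for the JSON variant; types are assumptions.
CREATE TABLE alldata (
    ID        INT UNSIGNED NOT NULL AUTO_INCREMENT,
    Date      DATE NOT NULL,
    FileName  VARCHAR(255) NOT NULL,
    RowNumber INT UNSIGNED NOT NULL,
    JSONData  TEXT,
    PRIMARY KEY (ID)
);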

Then the above function would still work, but you'd change the query-building section a bit:

Old:
			foreach($data as $idx => $value)
			{
				$columnName = mysql_real_escape_string($columnNames[$idx]);
				
				// Build query
				$query = "INSERT INTO alldata (Date,FileName,RowNumber,ColumnName,DataImport) VALUES ('{$importDate}','{$baseFileName}','{$row}','{$columnName}','".mysql_real_escape_string($value)."');";
				
				// Uncomment below to run query
				// mysql_query($query);
			}



New:
			$mappedData = array();
			foreach($data as $idx => $value)
			{
				$columnName = $columnNames[$idx];
				$mappedData[$columnName] = $value;
			}

			// Build query
			$query = "INSERT INTO alldata (Date,FileName,RowNumber,JSONData) VALUES ('{$importDate}','{$baseFileName}','{$row}','".mysql_real_escape_string(json_encode($mappedData))."');";
				
			// Uncomment below to run query
			// mysql_query($query);

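To get usable rows back out later, json_decode() reverses the mapping. A sketch, using the same mysql_* style as above (the WHERE clause is just an example):

<?php
// Sketch: pull all rows imported from one file and decode each one
// back into a columnName => value array.
$result = mysql_query("SELECT RowNumber, JSONData FROM alldata WHERE FileName = 'persons.csv'");
while ($dbRow = mysql_fetch_assoc($result))
{
    $rowData = json_decode($dbRow['JSONData'], true); // true => associative array
    // $rowData is now e.g. array('Name' => 'Rick', 'Code' => '28', ...)
}
?>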

 
gr8gonzo (Consultant) commented:
One other advantage of the second method (with JSONData) is that you run only one INSERT query per CSV row instead of one query per column value, so the import itself should be dramatically faster.
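
If import speed really matters, you could take that one step further and batch multiple rows into a single INSERT statement. A rough sketch of the idea (the batch size is arbitrary):

<?php
// Sketch: accumulate escaped VALUES tuples and flush them in batches,
// so many CSV rows go into MySQL with one INSERT statement.
$values = array();
$batchSize = 500; // arbitrary; tune against max_allowed_packet

// Inside the fgetcsv() loop, after building $mappedData for a row:
$json = mysql_real_escape_string(json_encode($mappedData));
$values[] = "('{$importDate}','{$baseFileName}','{$row}','{$json}')";

if (count($values) >= $batchSize)
{
    mysql_query("INSERT INTO alldata (Date,FileName,RowNumber,JSONData) VALUES " . implode(',', $values));
    $values = array();
}

// After the loop ends, flush any remainder:
if (count($values) > 0)
{
    mysql_query("INSERT INTO alldata (Date,FileName,RowNumber,JSONData) VALUES " . implode(',', $values));
}
?>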
 
Qw M (Author) commented:
gr8gonzo - thank you so much for your help!