Leo Juan
asked on
Import multiple CSV files into multiple MySQL tables at once
I have 50 CSV files, and they all share the same structure: the same column names and the same number of columns. I would like to import the CSV files into a MySQL database as separate tables (one table per file). It would take me a lot of time to do this manually one by one, so I would like to know if there is an easy way to do it all at once. I am new to MySQL and only know the basics, so any help would be really appreciated.
Should be 50 of the same command where you only need to change the file name and table name. Seems like a good editor and some search and replace would work. I use Vim with a keyboard macro and search/replace to do this sort of thing all the time.
Create a script and use LOAD DATA:
https://dev.mysql.com/doc/refman/8.0/en/load-data.html
Here is how to execute commands in a script:
https://dev.mysql.com/doc/refman/8.0/en/mysql-batch-commands.html
ASKER
Thanks for replying.
You're right, we just need to change the name, but MySQL is extremely slow and it takes a very long time. I tried it for a few files and it keeps lagging. Plus, I need an efficient way to do this.
I need a script that automatically imports multiple files into multiple tables without my giving the names, e.g. given a directory of files, it matches the file names to table names.
Sorry but I don't understand that last comment.
If you create a script with 50 lines in it, it should just run. If your database is slow, that is a different issue.
Are you wanting to get the data loaded or are you looking for help with performance or something else?
I've only ever used the InnoDB engine and found it relatively decent on performance but I also never loaded a LOT of data or performed a speed test comparing it with any other database.
I suppose you could create the tables as a CSV storage and just access the files directly:
https://dev.mysql.com/doc/refman/8.0/en/csv-storage-engine.html
That might give you faster access to the data but I would assume actually querying it would be slow.
ASKER
I need to write code that automatically imports all tables into MySQL.
Define automatically.
I would go to a CMD prompt and do a DIR > myscript.txt. I would open Vim and, with some search/replace and a macro or two, create the LOAD DATA syntax. Save it, execute the script from the MySQL prompt, and come back a few minutes/hours later to see that it has finished.
If you want something that you need to run once a night that finds all the new files, generates a way to "automatically" load them, and logs the results, that is a completely different question than the one you asked.
ASKER
What do you mean by LOAD DATA syntax?
The link I provided in my first post:
https://dev.mysql.com/doc/refman/8.0/en/load-data.html
Example from that doc link:
LOAD DATA INFILE 'data.txt' INTO TABLE tbl_name
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
ASKER
So this script will import all CSV files into tables by matching the file names with table names?
Could you please tell me what data.txt contains.
That is an example from the MySQL documentation that loads CSV data into a table.
It takes ONE CSV file and loads it into ONE table.
You need to copy that command 50 times if you have 50 CSV files. That is what my Vim suggestions are for. That is how I would do it.
You use whatever editor/program you have.
You will end up with 50 of those "load data" commands in a single file. Then you execute that single file from the examples in the second link I provided in my first post.
You need to tweak the LOAD DATA statement to match your specific files. I can't help with that because I don't know your files.
No, that is just one line for one file.
To do multiple files you would need to change the 'data.txt' and tbl_name for each and every CSV file.
To do that, depending on your OS you could use a batch file / bash script to either enumerate each file/table or loop through the files and substitute values for each line.
That requires some non-MySQL scripting expertise.
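That loop-and-substitute script can be sketched in a few lines (a sketch assuming bash on Linux/macOS; the demo directory, file names, and CSV format options below are hypothetical stand-ins, and each table is named after its CSV file):

```shell
#!/bin/bash
# Sketch: write load_all.sql with one LOAD DATA statement per CSV file,
# naming each table after its file (data1.csv -> table data1, etc.).
# /tmp/csv_demo and the file names are hypothetical stand-ins.
mkdir -p /tmp/csv_demo && cd /tmp/csv_demo
rm -f ./*.csv load_all.sql
touch data1.csv data2.csv        # stand-ins for the real 50 files
for f in *.csv; do
  t="${f%.csv}"                  # table name = file name minus .csv
  printf "LOAD DATA INFILE '%s' INTO TABLE %s FIELDS TERMINATED BY ',' ENCLOSED BY '%s' LINES TERMINATED BY '%s' IGNORE 1 LINES;\n" \
    "$f" "$t" '"' '\r\n'
done > load_all.sql
cat load_all.sql
```

The generated load_all.sql can then be executed from the mysql prompt with the source command; if the files sit on the client machine rather than the server, LOAD DATA LOCAL INFILE would be needed instead.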
ASKER
So basically you suggested that I need to copy the "LOAD DATA" command 50 times, but where do I copy it to?
ASKER
Hi Dennis,
This is what I was thinking, but I was not sure, since I am new to MySQL and still finding my way around.
>>That requires some non-MySQL scripting expertise.
Not really. It's 50 SQL commands in a file that is executed by MySQL.
>>since I am new to mysql
Scripting and executing scripts isn't a MySQL thing. You just need the script you are executing to contain MySQL commands.
>>So basically you suggested that I need to Copy the "Load Data" 50 times but where do I copy it .
Into a text file.
It's a SQL script. Just like a Batch/Shell/??? script.
ASKER
Sorry, I was confirming that I understood correctly what you proposed. Then I run the text file through the "LOAD DATA" function?
>>correctly what you proposed. Then I run the text file into "Load data" function.
No.
Create a text file with the 50 load commands with the 50 files and tables you have.
Then run MySQL and "execute" the text file as a script.
For example: You have file1.csv, file2.csv and file3.csv. You want to load them into table1,table2 and table3.
Each file has TWO lines as header lines so you want to skip them.
Assuming Linux
Open up vi
paste this in:
LOAD DATA INFILE 'file1.csv' INTO TABLE table1 FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\r\n' IGNORE 2 LINES;
LOAD DATA INFILE 'file2.csv' INTO TABLE table2 FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\r\n' IGNORE 2 LINES;
LOAD DATA INFILE 'file3.csv' INTO TABLE table3 FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\r\n' IGNORE 2 LINES;
Save it to /tmp/I_am_a_sql_script.sql
Then run mysql.
At the mysql prompt:
source /tmp/I_am_a_sql_script.sql
ASKER
The script doesn't specify a database name, so how can I be sure I am importing into the right one?
>>how I can be sure I am importing in the right one.
You manually connect to mysql and the correct database as the correct user. All that is done beforehand and you are sitting at the "mysql>" prompt. Then you source the file.
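For example, the connect-then-source sequence looks like this (user and database names are placeholders; the script path matches the earlier example):

```
mysql -u myuser -p mydatabase
mysql> source /tmp/I_am_a_sql_script.sql
```

The database named on the command line becomes the default database, so the unqualified table names in the script resolve against it.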
ASKER
Sorry, I forgot we do that when we log into the MySQL server.
But I still have to create 50 tables manually, one by one.
Prefix each line with "USE 'databasename';" - or just the whole script. Actually just once will do. And yes you might have to feed it account credentials for the database (user, password) as well.
>>But still I have to create 50 tables manually one by one.
Script it!
You will need to issue the CREATE TABLE command, yes. How else do you expect the database to understand your data types, max lengths, etc.?
You only need to type it in once. Then copy/paste it 50 times, change the table names, save the file as flat text, and "source" it. The CREATE TABLE and LOAD DATA statements can even be in the same file, but I would probably make it two: create all the tables first, then load all of them.
10 or 10,000 tables/files: type the CREATE TABLE and LOAD DATA text once, then copy/paste/edit using Vim. Total time: probably about 2 minutes.
Even if you want to do it a really bad way: use Excel with parts of each command in a different column, then convert it to a text file. Maybe 10 minutes start to finish.
>>Prefix each line with "USE 'databasename';"
I probably wouldn't add that to the script. Accidentally run the wrong script, you might trash something important.
ASKER
I need to create like 50 tables at once without having to do it one at a time.
>>I need to create like 50 tables at once without having to do it one at a time.
I'm not aware of any parallel DDL options in MySQL.
Why can't all 50 create table statements be in one script file and you execute it once?
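Since all 50 tables share one structure, the CREATE TABLE script itself can be generated rather than typed out. A minimal sketch (assuming bash; the table-name pattern and column list are hypothetical placeholders):

```shell
#!/bin/bash
# Sketch: emit all 50 CREATE TABLE statements into one script file.
# The column list is a made-up placeholder; substitute the real one.
cols="(id INT, name VARCHAR(100), amount DECIMAL(10,2))"
for i in $(seq 1 50); do
  echo "CREATE TABLE table$i $cols;"
done > /tmp/create_all.sql
wc -l < /tmp/create_all.sql
```

Sourced once at the mysql prompt, that single file creates all 50 tables in one pass.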
ASKER
I am not sure if that's possible, but I will do my research and then update you guys.
The more information you give about exactly what you are trying to do, the more accurate our advice will be.
There must be some requirement you aren't telling us: Why can't you script the table creation?
ASKER
This is what I am trying to figure out: how to do this.
>>trying to figure out like how to do this
How to do what?
If it is the same as the original question: How to load 50 CSV files into 50 tables, that should be answered.
If you need to figure out whether you can script the CREATE TABLE statements, the only thing that would prevent it is some internal requirement, and that is the requirement we need to know about.
I guess you could create an OS script file (bash/BAT/???) that runs 50 sub-scripts all at the same time, but that would probably slow down the server to the point where nothing runs well.
Are you running under Linux or Windows?
ASKER
I am using windows
So what you want is to write a batch file that will generate the SQL you need. You can capture that output, edit it, and then pipe it into mysql, or simply copy/paste it once you have the mysql/mariadb prompt and have created and selected the database.
The batch file comes down to:
- select the data directory
- iterate over the file names *.csv, capturing the * as a variable
- use that to generate a line of SQL
- output the SQL line
An example of this can be found at e.g.
https://stackoverflow.com/questions/4663928/loop-through-file-names-in-a-batch-script
>>So what you want is to write a batch file that will generate or the SQL you need
From what we know right now, a BAT script like that is extreme overkill.
But, we don't have the complete requirements.
If that level of automation is necessary, I sure wouldn't use BAT. I would use PowerShell.
Here is where BAT/PowerShell will probably fall short:
How do you plan on the script generating the correct column formats and lengths in the CREATE TABLE statement?
ASKER
I don't have any experience writing batch scripts.
>>I don't have any experience writing batch script
You might not need it.
We still need to know exactly what problem you are trying to solve.
Is this a one-time load of the 50 files?
Or will you be getting 50 files every hour/week/month/year and need some automated process to recognize they have arrived, magically create tables and load the data, verify all the data loaded, then move the files to some archive folder and wait for the next files to arrive?
You really haven't told us anything about the problem you are trying to solve.
ASKER
This is just a one-time thing. I need to create 50 tables that all share the same structure in terms of column names and count. Then I would like to import the 50 CSV files into those tables. I can do this using the MySQL Workbench import function, but that's too exhausting, so I need something like a script that lets me do this task efficiently.
Hope it makes sense.
>>This is just one time thing...so I need something like script to allow me do this task efficiently.
I have answered that the best way I know how. Create a .txt file with 50 create statements. Create a .txt file with 50 load data statements. Use the "source" command to load them.
If you are having trouble generating the script with an editor on your machine (don't try Notepad, it isn't capable enough), let's try this: attached is an Excel spreadsheet.
There are two sheets. Sheet1 is the basics for the CREATE TABLE statements; Sheet2 is the data load.
Column A is static. column B is a sequential number for the table names. Column C is the column definitions.
Change row 1 to whatever you want. Then use Excel's drag-to-autofill feature to populate the other 49 lines:
https://support.office.com/en-us/article/fill-data-automatically-in-worksheet-cells-74e31bdd-d993-45da-aa82-35a236c5b5db
Sheet2 works the same way, with one exception: you cannot autofill the file names. You can populate that column by copy/pasting from a list of the files you have.
Save both sheets to a .txt file and replace the tabs with nothing. That should give you a valid script with all 50 rows in it.
Connect to the database using mysql and use the source command.
>>but that's too exhausting
You do realize that in the amount of time you have spent on this question, you could have probably had them loaded by now?
Book1.xlsx
Unless your files are large (multiple gigabytes), just load them sequentially.
If you must parallelize this process, just do the 50 loads, tacking an & onto the end of each load to run the loads in the background.
>>tacking on an & at the end of each load to run the loads in background.
I'm familiar with that in *nix at a single script/command level. But on Windows?
My MySQL is rusty but I don't remember that as an option with that utility either.
I would be interested in reading anything you can point to in the documentation for MySQL on Windows.
I hate to ask this far into the thread, but why are you creating 50 tables if:
they all share same structure like same column names and number of columns.
ASKER
It's for data analysis purposes. Thanks
Are you planning on updating the tables on an ongoing basis or a one-shot upload?
Do you have PHP installed on your Windows machine?
If so, what about running a loop that creates each table if it doesn't exist, pulls the CSV files from the directory, then imports them one by one using fopen() and fgetcsv()?
Snippet:
$file = fopen($filename, "r");
while (($getData = fgetcsv($file, 10000, ",")) !== FALSE)
{
    // access fields via $getData[column_number_starting_from_zero]
}
https://www.cloudways.com/blog/import-export-csv-using-php-and-mysql/
Using PHP or any other programming/scripting language is overkill.
>>Are you planning on updating the tables on an ongoing basis or a one-shot upload?
Please read previous posts:
"This is just one time thing."
https://www.experts-exchange.com/questions/29114957/import-multiple-table-into-multiple-mysql-table-at-once.html?anchorAnswerId=42662028#a42662028
ASKER
I just want to upload one time but need an easy way to do this.
I have posted the simplest and easiest method I can think of. Two sql script files that are sourced at the MySQL prompt.
No OS scripting. No programming. Just create flat text files and execute.
ASKER
Thanks for your proposed solution and I think that seem simple to me than what others suggested.
I'm not sure suggesting a BAT script should be part of the solution. Any programming/scripting was deemed unnecessary for a one-time load.
Especially given Leo's final comment (I added the missing space in "tome"):
Thanks for your proposed solution and I think that seem simple to me than what others suggested.
Seems to imply my simple script is the best solution in this case.
I do not disagree that it took a long time to get to that solution but participation shouldn't get points.
It seems the OP was averse to extensive manual editing and wanted to avoid it. Writing a SQL script is a good alternative, but it did not make allowance for file-name extraction and table naming. Clearly the OP was out of his depth and did not indicate successful resolution either way, so a point split of some sort seems fair.
There were at least two adequate answers to the question, one involving a scripted approach and the other a more manual approach. I believe the OP had some success with the latter. To now delete the question is not helpful to others who might find the content useful.
Creating a script isn't rocket science. If I Googled "create a script to generate DML" I would expect something better in the results than this question.
mod: I vote to delete.