dovercomputers
asked on
Import very large text file into mysql using php
Hello experts,
Background:
Every night an automated process runs on the internal server and creates a zip file, which is uploaded to the web server. I have written a script which unzips the file and imports each txt file contained within to a MySQL database. There are 4 files: 3 small(ish) files of up to 10MB, containing 50,000 rows. These import fine.
Problem:
1 file is 1.5GB and contains 27,000,000 rows. I am using the code below to import the file, but it errors out at around 5 minutes with a server 500 error, which means nothing to me.
mysql_query("TRUNCATE TABLE `accountproducts_schema`;") or die(mysql_error());
mysql_query("LOAD DATA LOCAL INFILE '/home/wgoffice/public_html/FilesToImport/wgo_accountproducts_replace".$fn.".txt' INTO TABLE `accountproducts_schema` FIELDS TERMINATED BY '|' LINES TERMINATED BY '\n';") or die(mysql_error());
If I open the file and pull off 1 million rows and write them to a separate file, it imports fine. So obviously it is a size issue. Any ideas how I can solve it? I know I could sit here at 3am every morning and split the file, but that's not the ideal solution! :)
Many thanks
Jim
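Since the 500 error is almost certainly PHP hitting its execution-time or memory limit, one workaround (assuming shell access on the server; all file names and the chunk loop below are illustrative, not from the thread) is to let the OS split the big file into chunks that each import comfortably within the limit:

```shell
# Demo on a small generated file so the sketch runs anywhere; for the real
# 27M-row file you would point this at the nightly txt and use -l 1000000.
seq 1 10 | awk '{print $1 "|product" $1}' > bigfile.txt  # stand-in pipe-delimited data
split -l 3 bigfile.txt chunk_                            # 3-line chunks: chunk_aa, chunk_ab, ...
# Each chunk could then be imported in turn, e.g.:
# for f in chunk_*; do
#   mysql --local-infile=1 -u user -ppass -D db \
#     -e "LOAD DATA LOCAL INFILE '$f' INTO TABLE accountproducts_schema FIELDS TERMINATED BY '|'"
# done
cat chunk_* | wc -l   # the chunks together still hold every row
```

Because `split` is doing the heavy lifting, PHP never has to hold the 1.5GB file, and the loop can run from cron rather than a web request.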
ASKER
But I need this to be an automated process. The file gets sent up at 2am every morning, so I need some kind of automation to import it. Speed isn't so much an issue at 3am; when I split the file I calculated that it would take around 20 minutes to import the whole lot.
If you are on Linux / Mac then use a cron job. It is specifically designed for this kind of task. On a command line enter
crontab -e
and to run at 2am every day add this line
0 2 * * * mysql -u username --password=password -D databasename < mysql_text.sql
and any commands in the file mysql_text.sql will be executed. By using the --password= option you can provide the password and the system will not prompt you for it.
An additional thought: on many systems the cron line will be echoed into the log, thus showing the password. One way to prevent this is to put the mysql command in a script file, make the file executable with chmod +x filename, and then run that script from the cron job
0 2 * * * /path/to/file/textfile.sh
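A minimal sketch of that wrapper script (the filename, credentials and .sql name are placeholders, not from the thread); `bash -n` only checks the syntax, so this runs without a MySQL server:

```shell
# Create the wrapper that cron will call; keep it outside the web root
# and readable only by its owner, since it embeds the password.
cat > importer.sh <<'EOF'
#!/bin/sh
# Nightly import, invoked by cron at 2am.
mysql -u username --password=password -D databasename < mysql_text.sql
EOF
chmod +x importer.sh   # make it executable, as described above
bash -n importer.sh    # syntax check only; does not touch the database
```

The cron entry then becomes just `0 2 * * * /path/to/importer.sh`, so the password never appears on the cron line itself.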
ASKER
OK, I understand what you have written, but the text file isn't an SQL-formatted file. It's a CSV file (using | as a separator), so how would I import it? I think using your method it would need to be a proper SQL file, no?
No, it does not need to be a proper SQL file/dump; you can simply add your LOAD DATA command, or any other command, to the text file that will be executed by the cron job.
the command can take a form like:
mysql -e "LOAD DATA LOCAL INFILE 'thefile' INTO TABLE thetable FIELDS TERMINATED BY '|' LINES ......."
Similarly, other commands can be added to the file in sequence as required, and that file can be executed through cron as explained in the last few replies.
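One detail worth flagging: the statement handed to -e has to be quoted as a single argument, and the client needs --local-infile=1 before the server will accept LOAD DATA LOCAL. A sketch that just assembles and prints the statement (the table and path are placeholders; the real invocation is shown as a comment because it needs a live server):

```shell
# Assemble the statement; the outer double quotes keep it as one -e argument
# while preserving the single quotes MySQL needs around the file path.
SQL="LOAD DATA LOCAL INFILE '/path/to/data.txt' INTO TABLE accountproducts_schema FIELDS TERMINATED BY '|' LINES TERMINATED BY '\n'"
# Real invocation:
#   mysql --local-infile=1 -u username --password=password -D databasename -e "$SQL"
printf '%s\n' "$SQL"
```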
ASKER
Sorry, having a really bad day here.
So I create a text file on the server (dataimport.txt) with my command in:
mysql -e LOAD DATA LOCAL INFILE '/home/wgoffice/public_html/FilesToImport/wgo_accountproducts_replace".$fn.".txt' INTO TABLE `accountproducts_schema` FIELDS TERMINATED BY '|' LINES TERMINATED BY '\n';
then create a cron job 0 2 * * * /path/to/file/dataimport.txt
Is this correct?
ASKER
Worked out a solution myself taking an alternative approach.
mysql -u username -p -D database_name < text-file.sql
It will prompt you for the password for whatever username you are using. You may also need to increase MySQL buffer sizes and such (e.g. max_allowed_packet), but I'll need to dig that one out.
Using the command line like this is far, far, far faster than any other method.
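The end-to-end shape of that approach, with placeholder names throughout, is to keep the statements in the .sql file and redirect it into the client. The sketch below only writes the file; the actual run is shown as a comment since it needs the live server:

```shell
# text-file.sql carries the same TRUNCATE + LOAD DATA pair as the PHP version did.
cat > text-file.sql <<'EOF'
TRUNCATE TABLE accountproducts_schema;
LOAD DATA LOCAL INFILE '/path/to/data.txt' INTO TABLE accountproducts_schema
  FIELDS TERMINATED BY '|' LINES TERMINATED BY '\n';
EOF
# Real run (note --local-infile; -p will prompt for the password):
#   mysql --local-infile=1 -u username -p -D database_name < text-file.sql
```

For unattended 2am runs, the -p prompt would be replaced with --password= inside a chmod +x script, as discussed earlier in the thread.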