What is the best option for importing a large amount of data (700,000 rows) into a MySQL database? This has to be done daily.
Posted on 2010-09-18
I am trying to import a large XML file into my MySQL DB. I have already written the script to FTP a zip file, unzip it, and parse the XML.
The problem I am having is that looping over the rows and inserting them one at a time takes forever! I am planning on creating a cron job for this, but it seems there should be a more efficient way of importing that much data.
I had thought about creating an .sql file (via PHP) and then running that file from a PHP script using shell commands, if possible. Or maybe even a stored procedure?!
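To make that idea concrete, here is roughly what I have in mind (a sketch only; the paths, credentials, table name, and columns are all placeholders, not my real setup):

```php
<?php
// Sketch: write every row to one .sql file, then let the mysql CLI run the
// whole file in a single pass instead of issuing 700k separate queries.
// $records stands in for the array my XML parser produces.
$fh = fopen('/tmp/import.sql', 'w');
foreach ($records as $row) {
    fwrite($fh, sprintf(
        "INSERT INTO items (id, name, price) VALUES (%d, '%s', %s);\n",
        (int) $row['id'],
        mysql_real_escape_string($row['name']),
        (float) $row['price']
    ));
}
fclose($fh);

// Hand the generated file to the mysql client via the shell.
exec('mysql -u someuser -psomepass somedb < /tmp/import.sql');
```

The cron job would just run this script after the FTP/unzip/parse steps finish.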
So, is running an INSERT via mysql_query in PHP for every row really the best option? I know that mysql_query cannot execute multiple statements in a single call.
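One workaround I've read about: a multi-row INSERT is still a *single* statement, so as I understand it mysql_query should accept it, and it cuts the per-row round-trips way down. Something like this (table and values are made up for illustration):

```sql
-- One statement carrying many rows; far fewer round-trips than one
-- INSERT per row. (Table/column names here are placeholders.)
INSERT INTO items (id, name, price) VALUES
    (1, 'widget', 9.99),
    (2, 'gadget', 4.50),
    (3, 'gizmo', 12.00);
```

I assume I'd build these in batches of a few hundred rows so the statement stays under max_allowed_packet.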
What are my options? If I only had to do this once, I would just wait it out. But I will be running this once a day, and it looks like it will take about 2 hours the way it is now!
One last question: is there a more efficient way of deciding whether a row needs to be updated? Right now I update the row if it's found and insert it if it's not. It seems like a waste to update rows that haven't changed, but wouldn't comparing every column to detect a change take even more time?
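For what it's worth, one thing I've been reading about is MySQL's INSERT ... ON DUPLICATE KEY UPDATE, which seems to collapse the found/not-found check into one statement. My understanding is that MySQL skips the write when the new values match the existing row, which would cover the "don't update unchanged rows" part for free. A sketch, assuming the table has a unique key on id (names are placeholders):

```sql
-- Insert the row, or update it in place when the unique key already exists.
-- If the values are identical, MySQL reports 0 affected rows and leaves the
-- row untouched. (Table/column names are placeholders.)
INSERT INTO items (id, name, price)
VALUES (1, 'widget', 9.99)
ON DUPLICATE KEY UPDATE
    name = VALUES(name),
    price = VALUES(price);
```

Is that the right tool here, or is there a better pattern?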
Thank you in advance!