I have a system where the client uploads a CSV file (approx. 4 MB) to their website; its contents are extracted and inserted into a MySQL database.
I have written a script that processes the CSV data and splits it into three separate tables.
The upload and the script both work; the problem is that when the file is around 4 MB it takes so long to process that the browser times out. The roughly 4,000 rows in the CSV become one table with 3,960 rows, another with 40 rows, and a third with 200,000+ rows.
Clearly the big problem is the 200,000+ records: I am inserting each one individually.
INSERT INTO table (field1, field2, field3) VALUES (value1, value2, value3);
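For illustration, the row-by-row pattern described above looks roughly like this (a minimal sketch using Python's built-in sqlite3 as a stand-in for the MySQL connection; the table and column names are hypothetical):

```python
import csv
import io
import sqlite3

# In-memory database standing in for the real MySQL connection.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (field1 TEXT, field2 TEXT, field3 TEXT)")

# Sample CSV data standing in for the uploaded file.
csv_data = io.StringIO("a1,b1,c1\na2,b2,c2\na3,b3,c3\n")

# One INSERT per row: each statement is a separate round trip to the
# database, which is what makes 200,000+ rows so slow.
for row in csv.reader(csv_data):
    conn.execute(
        "INSERT INTO records (field1, field2, field3) VALUES (?, ?, ?)",
        row,
    )
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM records").fetchone()[0]
print(count)  # 3
```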
Is there a better way to do this?