• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 411

Upload CSV extract to database - server timeout

Hi All

I have a system where the client needs to upload a CSV file (approx. 4 MB) to their website, the contents of which are extracted and placed into a MySQL database.

I have created a script to process the CSV data and split it into 3 separate tables.

The script and the upload are working fine; the problem I have is that when the file is c. 4 MB it takes a long time to process (c. 4,000 rows in the CSV converted into one table with 3,960 rows, another with 40 rows and the third with 200,000+) and the browser is timing out...

Clearly the big problem I have is the 200,000+ records... I am inserting each one individually.

INSERT INTO table(field1,field2,field3) VALUES(value1,value2,value3);
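
For illustration, the one-insert-per-row pattern described above usually looks something like this sketch (not the actual script; it assumes a mysqli connection and placeholder table/column names):

<?php
// Minimal sketch of a per-row insert loop (placeholder names throughout).
$db = new mysqli('localhost', 'user', 'pass', 'mydb');

foreach ($rows as $row) {                  // $rows = parsed CSV lines
    $stmt = $db->prepare('INSERT INTO table3 (field1, field2, field3) VALUES (?, ?, ?)');
    $stmt->bind_param('sss', $row[0], $row[1], $row[2]);
    $stmt->execute();                      // one statement and one round trip per row is what makes this slow
}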

Is there a better way to do this?

Cheers

cycledudeAsked:
3 Solutions
 
dj_alikCommented:
 
cycledudeAuthor Commented:
dj_alik:

Thanks, I have thought of using that, but the problem is the CSV file is 4,040 rows by 130 columns.

the data is split into 3 tables:

table 1 has 4 cols and c. 4,000 rows
table 2 has 4 cols and c. 40 rows
table 3 has 4 cols and 200,000+ rows

Table 3 contains 120 rows for each entry in table 1 and table 2, the data coming from 120 columns of the CSV.
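
In other words, it is a wide-to-long reshape: for each CSV line, 120 of the 130 columns each become their own row in table 3. A rough sketch of the idea (column positions, the key field and the table names are placeholders, not the actual layout):

<?php
// Hypothetical reshape: 120 rows are collected for table 3 per CSV line.
$table3Rows = array();
foreach ($csvRows as $row) {              // $csvRows = parsed 130-column CSV lines
    $key = $row[0];                       // identifier shared with tables 1 and 2
    for ($i = 0; $i < 120; $i++) {
        $table3Rows[] = array($key, $i + 1, $row[10 + $i]);   // key, slot, value
    }
}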

 
dj_alikCommented:
First load the data into a single table, then move it around to the needed tables using SQL statements. If you need something more complex, some programming will be required.
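
A minimal sketch of that staging-table approach, assuming a hypothetical holding table csv_raw with one column per CSV field (table and column names are placeholders):

<?php
$db = new mysqli('localhost', 'user', 'pass', 'mydb');

// 1) Bulk-load the whole CSV into one staging table in a single statement
//    (LOCAL requires local_infile to be enabled on both client and server).
$db->query("LOAD DATA LOCAL INFILE '/tmp/upload.csv'
            INTO TABLE csv_raw
            FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
            LINES TERMINATED BY '\\n'
            IGNORE 1 LINES");

// 2) Redistribute into the real tables with set-based SQL instead of a PHP loop.
$db->query('INSERT INTO table1 (field1, field2, field3, field4)
            SELECT col1, col2, col3, col4 FROM csv_raw');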
 
cycledudeAuthor Commented:
So it will be faster to manipulate the data in the database than in the CSV?

 
cycledudeAuthor Commented:
how about something like

INSERT INTO tablename VALUES ('Bill', 29, 'Ford'), ('Mike', 16, 'Beetle'), ('Alisa', 36, 'Van');

how many 'values' can you put into a single SQL statement?

 
cycledudeAuthor Commented:
I will need approx 120 sets of values.... can it be done this way?
 
cycledudeAuthor Commented:
Well, I tried it, and my word did it make a difference!

I just created a variable to store all of the ('Bill', 29, 'Ford') values for each insert, so 120 per insert...

Massively quicker than my original single insert per row.

Cheers
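
For anyone landing here later, the batching described above looks roughly like the sketch below (a sketch only, assuming mysqli and placeholder table/column names; each value is escaped before being concatenated into the statement):

<?php
$db = new mysqli('localhost', 'user', 'pass', 'mydb');

$values = array();
foreach ($rows as $row) {                          // $rows = parsed CSV data
    $values[] = "('" . $db->real_escape_string($row[0]) . "', '"
                     . $db->real_escape_string($row[1]) . "', '"
                     . $db->real_escape_string($row[2]) . "')";

    if (count($values) == 120) {                   // 120 rows per INSERT
        $db->query('INSERT INTO table3 (field1, field2, field3) VALUES ' . implode(', ', $values));
        $values = array();
    }
}
if ($values) {                                     // flush the final partial batch
    $db->query('INSERT INTO table3 (field1, field2, field3) VALUES ' . implode(', ', $values));
}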
 
hieloCommented:
>>it is taking a long time to process
You are probably doing an insert per record. You can try multiple rows per insert.

>>...and the browser is timing out
At the beginning of your script, put set_time_limit(0):
http://php.net/manual/en/function.set-time-limit.php

>>the data is split into 3 tables
OK, then create a txt file for each of your tables, then do:
LOAD DATA INFILE 'tableX.txt' INTO TABLE testX

for each of the files

>>how many 'values' can you put into a single SQL statement?
I have not seen a fixed limit on the number of rows per INSERT; the practical cap is max_allowed_packet, i.e. the maximum size of the whole statement. However, you need to take your system resources into consideration. If you are inserting "small" amounts of data (like age, status, firstName, lastName, as opposed to paragraphs and paragraphs of info per record) then somewhere between 100-200 rows per insert might speed things up.
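
Putting the two suggestions above together, a rough sketch (file paths and table names are placeholders; LOCAL needs local_infile enabled):

<?php
set_time_limit(0);                 // lift PHP's max execution time for this request
$db = new mysqli('localhost', 'user', 'pass', 'mydb');

// One pre-split text file per destination table, loaded in one statement each.
$files = array('table1.txt' => 'table1',
               'table2.txt' => 'table2',
               'table3.txt' => 'table3');

foreach ($files as $file => $table) {
    $db->query("LOAD DATA LOCAL INFILE '/tmp/$file'
                INTO TABLE $table
                FIELDS TERMINATED BY ','
                LINES TERMINATED BY '\\n'");
}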
 
cycledudeAuthor Commented:
I was able to solve this one by merging the numerous inserts into a single insert, which massively reduced the time taken to perform the update.
