Importing a very large CSV file into an RDBMS

I have a 4 GB data file with 4 million rows and 314 columns/fields. I am trying to import it into either a MySQL or SQL Server database. I have tried using the Import/Export and Migration toolkits on SQL Server, but I encounter an error each time I try to import. In SQL Server it is usually truncation errors, and in MySQL the error is that the length of the row is too large.

Any help in this matter is greatly appreciated.
m3mdicl asked:
Luan Jubica (Project Manager) commented:
The first thing that comes to mind is to split the file; see the sketch below.
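A minimal sketch of splitting on Unix, assuming a file named data.csv with a single header row (the file name and chunk size are just placeholders):

    # Keep the header aside, then split the remaining rows into
    # 100k-line chunks named part_aa, part_ab, ...
    head -n 1 data.csv > header.csv
    tail -n +2 data.csv | split -l 100000 - part_

    # Re-attach the header to each chunk before importing
    for f in part_*; do
      cat header.csv "$f" > "with_header_$f"
    done

Each chunk can then be imported on its own, which also narrows down which part of the file triggers the error.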
 
Garry Glendown (Consulting and Network/Security Specialist) commented:
Is the "length of the row" error in reference to a certain column? Have you checked the character sets used? E.g. a string of 10 UTF-8 characters might take something like 30 bytes of storage.
Also, on the import, unless you use transactions, you should end up with n rows inserted before the failure, so you could check the CSV after those n lines and see if the next line is in fact malformed/defective (as sketched below). Maybe delete it, try importing the rest of the data, then manually fix the corrupt lines.
On Unix, splitting the CSV into smaller chunks, e.g. 100k lines apiece, might also help get it in (see the split sketch in the first comment).
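One quick way to spot such defective lines, assuming a plain comma-separated file with no quoted, embedded commas (the file name and delimiter are assumptions here), is to count fields per line:

    # Print the line number and field count of every row that does not
    # have exactly 314 fields (naive: quoted commas would throw it off)
    awk -F',' 'NF != 314 { print NR ": " NF " fields" }' data.csv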
 
ob2s commented:
Hi,

Assuming your data file doesn't use fixed-length records, check whether your import tool's expectation of line terminators matches what is actually used in the file (e.g. \n vs. \r\n).

If you're using mysqlimport, see the --lines-terminated-by option.
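A minimal sketch, assuming the file is named mytable.csv and a matching table mytable already exists in database mydb (all names here are placeholders; mysqlimport derives the table name from the file name):

    # Inspect the first line's terminator: DOS files show "\r \n" in od output
    head -n 1 mytable.csv | od -c | tail -n 2

    # Tell mysqlimport which terminators the file actually uses
    # (add -u/-p credentials as needed)
    mysqlimport --local \
      --fields-terminated-by=',' \
      --lines-terminated-by='\r\n' \
      mydb mytable.csv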

Hope this helps.