m3mdicl
asked on
Importing a very large CSV file into an RDBMS
I have a 4 GB data file with 4 million rows and 314 columns/fields. I am trying to import it into either a MySQL or a SQL Server database. I have tried the Import/Export and Migration tool kits in SQL Server, and I encounter an error each time I try to import it. In SQL Server it is usually truncation errors, and in MySQL the error is that the row length is too large.
Any help in this matter is greatly appreciated.
Here are the limits for MySQL: http://dev.mysql.com/doc/refman/5.6/en/column-count-limit.html This is about SQL Server 2005: http://weblogs.sqlteam.com/mladenp/archive/2007/07/24/60267.aspx And I believe this is about SQL Server 2008: http://msdn.microsoft.com/en-us/library/ms143432.aspx
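Since both errors point at column sizing (truncation in SQL Server, row length too large in MySQL), it can help to measure the widest value in each of the 314 columns before defining the table. Here is a minimal Python sketch (the function name `max_field_widths` is just illustrative); columns whose maximum width pushes the combined row past MySQL's 65,535-byte in-row limit are candidates for TEXT, which is largely stored off-row:

```python
import csv

def max_field_widths(path, delimiter=","):
    """Scan a CSV and return the longest value seen in each column.

    Useful for sizing columns before import: oversized VARCHARs cause
    truncation errors, and too many wide columns can exceed MySQL's
    65,535-byte row-size limit.
    """
    widths = []
    with open(path, newline="") as f:
        for row in csv.reader(f, delimiter=delimiter):
            # Grow the widths list if a row has more fields than seen so far.
            if len(widths) < len(row):
                widths.extend([0] * (len(row) - len(widths)))
            for i, value in enumerate(row):
                widths[i] = max(widths[i], len(value))
    return widths
```

One pass over a 4 GB file takes a while, but it is a one-time cost that tells you exactly which columns need TEXT instead of VARCHAR.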
Hi,
Assuming your data file doesn't use fixed-length rows, check whether your import tool's expected line terminator matches what is actually used in the data file (e.g. \n vs \r\n).
If you're using mysqlimport, see the --lines-terminated-by option.
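If you're not sure what terminator the file actually uses, a quick Python check can tell you what to pass to --lines-terminated-by (a sketch; `detect_line_terminator` is just an illustrative name, not part of any import tool):

```python
def detect_line_terminator(path, sample_bytes=1024 * 1024):
    """Guess the line terminator used in a file by inspecting a sample.

    Returns '\\r\\n', '\\r', or '\\n'; pass the result to mysqlimport via
    --lines-terminated-by, or to LOAD DATA INFILE's LINES TERMINATED BY.
    """
    with open(path, "rb") as f:
        sample = f.read(sample_bytes)
    # Check CRLF first, since a CRLF file also contains plain CR and LF bytes.
    if b"\r\n" in sample:
        return "\r\n"
    if b"\r" in sample:
        return "\r"
    return "\n"
```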
Hope this helps.
Also, unless you wrap the import in a transaction, a failed run should still leave "n" rows inserted. You can then check the CSV just after those n lines to see whether the next line is malformed, delete or fix it, import the rest of the data, and manually repair the corrupt lines afterwards.
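Rather than eyeballing the file around line n, you can scan for every row whose field count doesn't match. A minimal Python sketch using the csv module (`find_malformed_lines` is an illustrative name; the 314-column count from the question is assumed as the default):

```python
import csv

def find_malformed_lines(path, expected_fields=314, delimiter=","):
    """Yield (line_number, field_count) for rows whose field count
    differs from the expected column count."""
    with open(path, newline="") as f:
        reader = csv.reader(f, delimiter=delimiter)
        for lineno, row in enumerate(reader, start=1):
            if len(row) != expected_fields:
                yield lineno, len(row)
```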
On Unix, splitting the CSV into smaller pieces, e.g. 100k lines each, might also help get it imported.
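As an alternative to the Unix `split` command, here is a Python sketch that does the same 100k-line chunking but repeats the header row in every piece, so each chunk imports on its own (`split_csv` is an illustrative name; this assumes no quoted field spans multiple lines):

```python
def split_csv(path, lines_per_chunk=100_000, header=True):
    """Split a large CSV into numbered chunk files, repeating the
    header row in each chunk so every piece imports independently."""
    chunk_paths = []
    with open(path, newline="") as src:
        header_line = src.readline() if header else ""
        chunk, out = 0, None
        for i, line in enumerate(src):
            if i % lines_per_chunk == 0:
                # Start a new chunk file, e.g. data.csv.part001, part002, ...
                if out:
                    out.close()
                chunk += 1
                chunk_path = f"{path}.part{chunk:03d}"
                out = open(chunk_path, "w", newline="")
                if header:
                    out.write(header_line)
                chunk_paths.append(chunk_path)
            out.write(line)
        if out:
            out.close()
    return chunk_paths
```

Importing the pieces one at a time also makes it much easier to pin down which chunk (and therefore roughly which lines) trips the error.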