I have to import a 15 GB CSV file with 7+ million rows and 1,650 columns into SQL Server. The number of columns is the problem. Here is what I've tried:
1) Import and Export Wizard. The wizard can't handle that many columns, and deleting the excess columns defined in the wizard causes the import to fail: a conversion error is always generated on the last column that is not deleted.
2) BULK INSERT. I've tried a bulk import, specifying the columns, but this generates an error on the last column in each row, so no records are imported. Error message:
Msg 4863, Level 16, State 1, Line 2
Bulk load data conversion error (truncation) for row 1, column nn (VARNAME), where column nn is the last column specified and VARNAME is its name. It doesn't seem to matter which column I stop at.
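A truncation error pinned to the last specified column is often a symptom of rows whose delimiter count doesn't match the header (the kind of stray commas I suspect in approach 4 below), because everything after the last expected delimiter gets jammed into the final column. Before fighting the bulk load further, it may be worth scanning the file and reporting rows whose field count differs from the header's. A minimal sketch (the file path is a placeholder, and this assumes a plain comma-delimited layout):

```python
import csv
from collections import Counter

def field_count_report(path, delimiter=","):
    """Tally how many fields each row of a delimited file has.

    Rows whose count differs from the header's are the ones that make
    a bulk load fail with a truncation error on the 'last' column.
    Returns (expected_count, Counter of counts, line numbers of bad rows).
    """
    counts = Counter()
    bad_rows = []
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.reader(f, delimiter=delimiter)
        expected = len(next(reader))  # header row sets the expectation
        for lineno, row in enumerate(reader, start=2):
            counts[len(row)] += 1
            if len(row) != expected:
                bad_rows.append(lineno)
    return expected, counts, bad_rows
```

On a 15 GB file this streams one row at a time, so memory use stays flat; the line numbers it returns point straight at the rows worth inspecting by hand.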
3) BCP command-line utility. I can't seem to get the syntax correct for this, so I tried ipcsv in PowerShell (note that ipcsv is the alias for Import-Csv, which is not BCP): ipcsv -Path c:\file.csv. Error message: the member "1" is already present. That error usually means the header row contains duplicate (or blank) column names.
4) I wrote a VB program to read the file and write it out as two separate files so that each one is under SQL Server's maximum column limit of 1,024. This seems to be working, but it takes a very long time to read and write the files. I'm using TextFieldParser and File.WriteAllText. These seem to be the safest choice because I suspect the file has a few extra commas here and there, but they are much slower than Line Input.
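For comparison, the same split can be sketched with Python's csv module, which streams the file once and handles quoted fields, so it degrades gracefully on rows with stray commas. The paths and the 1,000-column split point are illustrative, and prepending a shared row number to both halves is my own assumption about how the two tables would be re-joined after import:

```python
import csv

SPLIT_AT = 1000  # stay safely under SQL Server's 1,024-column table limit

def split_columns(src, dst_left, dst_right, split_at=SPLIT_AT):
    """Stream src once, writing the first split_at columns to dst_left
    and the remaining columns to dst_right.

    A shared row number is prepended to both halves so the two files
    can be joined back together after they are imported separately.
    """
    with open(src, newline="", encoding="utf-8") as fin, \
         open(dst_left, "w", newline="", encoding="utf-8") as fl, \
         open(dst_right, "w", newline="", encoding="utf-8") as fr:
        left = csv.writer(fl)
        right = csv.writer(fr)
        for rownum, row in enumerate(csv.reader(fin)):
            left.writerow([rownum] + row[:split_at])
            right.writerow([rownum] + row[split_at:])
```

Because both output files are written inside the same pass over the input, the whole job is one sequential read and two sequential writes, which should be close to I/O-bound rather than parser-bound.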