I need to import about 600 different CSV files from different sources and then begin normalizing the data.
What I want to do is import them using bulk copy (BULK INSERT) into per-file staging tables.
Once the files are imported, I can run utilities in ColdFusion to show me the first 10 rows of each, figure out which column is really the first name, which is the last name, and so on, and then put together the mappings to move it all into one consolidated table.
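For context, the inspect-then-map step I have in mind looks roughly like this (table and column names here are placeholders, not my real schema):

```sql
-- Peek at the first 10 rows of one staging table to identify the columns
SELECT TOP 10 * FROM Import001;

-- Once the mapping for that file is known, copy it into the consolidated table
INSERT INTO Consolidated (FirstName, LastName)
SELECT col003, col007
FROM Import001;
```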
My problem is the bulk copy ...
What I need to do with a SQL statement is:
1 - Create the import table with 100 columns (I can do this)
2 - Issue the bulk copy against the file, such as c:\products1.csv
(note: some are CSV, some are TSV, but that's just a delimiter issue)
3 - Have it import
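To make the three steps concrete, here is a sketch of the kind of statement I'd generate per file. The table name (Import001) and file path are placeholders, and every column is VARCHAR so that messy data loads without type errors:

```sql
-- Step 1: build a generic 100-column staging table with dynamic SQL
DECLARE @sql NVARCHAR(MAX), @i INT;
SET @sql = N'CREATE TABLE Import001 (';
SET @i = 1;
WHILE @i <= 100
BEGIN
    SET @sql = @sql + N'col' + RIGHT('00' + CAST(@i AS VARCHAR(3)), 3)
             + N' VARCHAR(500) NULL'
             + CASE WHEN @i < 100 THEN N', ' ELSE N'' END;
    SET @i = @i + 1;
END;
SET @sql = @sql + N')';
EXEC sp_executesql @sql;

-- Steps 2 and 3: bulk copy the file into the staging table
BULK INSERT Import001
FROM 'c:\products1.csv'
WITH (
    FIRSTROW = 1,
    MAXERRORS = 999999,
    FIELDTERMINATOR = ',',   -- '\t' for the TSV files
    ROWTERMINATOR = '\n'
);
```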
I have seen examples like this (table and file names below are mine):

BULK INSERT Import001
FROM 'c:\products1.csv'
WITH (
    FIRSTROW = 1,
    MAXERRORS = 999999,
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR = '\n'
);
It fails with errors:
Msg 4832, Level 16, State 1, Line 1
Bulk load: An unexpected end of file was encountered in the data file.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
(by coincidence, the Database I created to do the bulk imports is named BULK).
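From what I've read, Msg 4832 often means the terminators don't match the file, or that the file has fewer fields per row than the target table has columns (BULK INSERT expects the counts to match unless a format file is used). One hedged variant I've seen suggested for files with Windows line endings, again with a placeholder table name:

```sql
BULK INSERT Import001            -- hypothetical staging table
FROM 'c:\products1.csv'
WITH (
    FIRSTROW = 1,
    MAXERRORS = 999999,
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\r\n'       -- spell out CR+LF instead of '\n'
);
```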
I'm fully aware that the data may have errors and inconsistent formatting between CSV files.
The question is this: what's the best way to "just get the data in there" into SQL, and then I can clean it up in phase 2 once it is in the temp table?
I'm running MS SQL 2005, and I need to do this with SQL statements. I have hundreds of files to import, and I will be using a separate box running ColdFusion to find the files and fire off each import via a cfquery against the SQL server.
Doing the wizard 600 times is not an option...