I'm using SQL Server 2005 and VB.NET 2005. I want to import a very large CSV file (600,000+ records) into VB.NET and insert those records into a SQL Server table called "OLP_PFM".
Here's how I currently do it:
1. Use OpenFileDialog to select the CSV file.
2. Import the CSV file into a DataSet.
3. Match the CSV headers to the SQL columns using listboxes (the CSV files all have different headers: one may say "Zip5", another "Zip Code", another just "Zip", but they all map to the same "Zip" column in SQL). There's no reliable way to anticipate what the CSV headers will be, which is why I let the user match the columns.
4. Rename the DataSet columns to their matching SQL column names.
5. Click an "Import" button that loops through the DataSet and inserts each record into SQL one by one (see the sketch after this list).
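Here's a stripped-down sketch of the current approach (VB.NET 2005). "OpenFileDialog1", "mySqlConnString", and the "Zip5"/"Zip" columns are just placeholders, and I've shown the Jet text driver as one example of filling the DataSet; the real code renames whichever columns the user matched:

    ' Stripped-down version of my current approach. Placeholder names
    ' throughout; assumes the usual System.Data / OleDb / SqlClient imports.
    Dim csvPath As String = OpenFileDialog1.FileName
    Dim csvDir As String = System.IO.Path.GetDirectoryName(csvPath)
    Dim csvFile As String = System.IO.Path.GetFileName(csvPath)

    ' Step 2: pull the whole CSV into a DataSet (here via the Jet text driver).
    Dim csvConn As String = "Provider=Microsoft.Jet.OLEDB.4.0;" & _
        "Data Source=" & csvDir & ";" & _
        "Extended Properties=""text;HDR=Yes;FMT=Delimited"""
    Dim ds As New DataSet()
    Using da As New OleDb.OleDbDataAdapter("SELECT * FROM [" & csvFile & "]", csvConn)
        da.Fill(ds, "Import")
    End Using

    ' Step 4: rename a CSV column to the SQL name the user matched it to.
    ds.Tables("Import").Columns("Zip5").ColumnName = "Zip"

    ' Step 5: insert row by row -- this is the part that crawls on 600k records.
    Using conn As New SqlClient.SqlConnection(mySqlConnString)
        conn.Open()
        Dim cmd As New SqlClient.SqlCommand( _
            "INSERT INTO OLP_PFM (Zip) VALUES (@Zip)", conn)
        cmd.Parameters.Add("@Zip", SqlDbType.VarChar, 5)
        For Each row As DataRow In ds.Tables("Import").Rows
            cmd.Parameters("@Zip").Value = row("Zip").ToString()
            cmd.ExecuteNonQuery()
        Next
    End Using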
All of the above works fine for small files, but for very large files it's slow, presumably because the entire CSV is being loaded into an in-memory dataset. My question is: what's the most efficient way to accomplish all of this? That is, import a CSV, let the user match/map its columns to the database columns, then get the data into SQL Server quickly.

I may also want to pad some fields with leading zeros, e.g. make sure a column's contents come out as "08829" instead of "8829". This is easy now while I loop through the dataset, but I'll need the same ability with whatever method is suggested.
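For reference, the padding today is just String.PadLeft inside the insert loop ("Zip" is again only an example column; "row" and "cmd" are from the sketch above):

    ' Pad to 5 digits so "8829" becomes "08829" before the insert.
    Dim zip As String = row("Zip").ToString().PadLeft(5, "0"c)
    cmd.Parameters("@Zip").Value = zip

Thanks!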