I need to parse a couple of CSV files (varying in size from a few KB to around 100 MB) and load the data into a SQL Server 2000 database.
I am using CsvReader from CodeProject (http://www.codeproject.com/KB/database/CsvReader.aspx) and it seems to work great for reading large CSV files.
Here is the structure of the CSV files I am dealing with:
Field1  Field1_CHG  Field2  Field2_CHG
So, I need to read the <fieldname>_CHG column: if it contains a "Y", I need to load the data from <fieldname>; otherwise I ignore it.
I have 60+ field names in the CSV files and I need a way to map them to the database fields.
I started by creating a class for the field names.
Then I thought I would be able to map each field in the CSV file to a class member. Something like:
UpdateElement e = new UpdateElement();
Hashtable mappingTable = new Hashtable();
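To be concrete, this is roughly what I mean by the mapping table (the database column names here are made up for illustration):

```csharp
using System;
using System.Collections;

class MappingSketch
{
    static void Main()
    {
        // Hypothetical mapping from CSV field name -> database column name
        Hashtable mappingTable = new Hashtable();
        mappingTable["Field1"] = "Customer_Name";   // made-up column names
        mappingTable["Field2"] = "Customer_Phone";

        // Only columns present in the mapping get processed;
        // the _CHG columns fall through the ContainsKey check
        string[] csvHeaders = { "Field1", "Field1_CHG", "Field2", "Field2_CHG" };
        foreach (string header in csvHeaders)
        {
            if (mappingTable.ContainsKey(header))
                Console.WriteLine(header + " -> " + mappingTable[header]);
        }
    }
}
```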
Then, when I read the data, I could check whether each column from the CSV exists in the hashtable's Keys collection. Something like:
foreach (string field in fieldHeaders)
{
    // check for a "Y" in the <fieldname>_CHG column
    if (csv[field + "_CHG"].ToUpper() == "Y")
        LoadField(field, csv[field]); // hypothetical helper
}
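Putting it together, the loop I have in mind looks something like the sketch below. The row here is a stand-in Hashtable (CsvReader would supply the real values), and I added a null check since a _CHG column could be missing:

```csharp
using System;
using System.Collections;

class ChangeFlagSketch
{
    static void Main()
    {
        // Stand-in for one parsed CSV row (CsvReader would supply this)
        Hashtable row = new Hashtable();
        row["Field1"] = "Acme Corp";  row["Field1_CHG"] = "Y";
        row["Field2"] = "555-0100";   row["Field2_CHG"] = "N";

        string[] fieldHeaders = { "Field1", "Field2" };
        foreach (string field in fieldHeaders)
        {
            object flag = row[field + "_CHG"];
            // Only load the value when the _CHG column says "Y"
            if (flag != null && flag.ToString().ToUpper() == "Y")
                Console.WriteLine("load " + field + " = " + row[field]);
            else
                Console.WriteLine("skip " + field);
        }
    }
}
```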
I'm not even sure my logic is correct. Is this the right approach, and if not, what's the best way to do this?
I also don't know how easy it will be to implement a bulk insert in this scenario, since the columns from the CSV won't match the columns of the destination table.
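One thing I did find: if .NET 2.0 is an option, SqlBulkCopy can bulk-load into SQL Server 2000 even when the source columns don't match the destination table, because you declare explicit ColumnMappings. A rough sketch (the connection string, table name, and column names are all made up; the actual write is commented out since there's no database here):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class BulkCopySketch
{
    static void Main()
    {
        // Staging table holding only the fields whose _CHG flag was "Y"
        DataTable staging = new DataTable();
        staging.Columns.Add("Customer_Name", typeof(string));   // made-up columns
        staging.Columns.Add("Customer_Phone", typeof(string));
        staging.Rows.Add("Acme Corp", "555-0100");

        Console.WriteLine(staging.Columns.Count + " columns, " +
                          staging.Rows.Count + " row(s) staged");

        // Hypothetical connection string; ColumnMappings is what lets the
        // source and destination column names differ
        using (SqlConnection conn = new SqlConnection("Server=...;Database=...;"))
        using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
        {
            bulk.DestinationTableName = "dbo.Customers";  // made-up table
            bulk.ColumnMappings.Add("Customer_Name", "Name");
            bulk.ColumnMappings.Add("Customer_Phone", "Phone");
            // conn.Open();                 // commented out: no database here
            // bulk.WriteToServer(staging); // would perform the bulk insert
        }
    }
}
```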