I'm hoping the solution to this is something really easy that I'm just overlooking. Currently I'm using DTS to load data from a comma-delimited CSV file into temporary tables, which are then merged into my active application tables. I'm trying to put together a system that minimizes the manual part of this data transfer and moves away from DTS. I'll still have to manually pull data out of the other system and get it into CSV files for the import, but I'm hoping that at that point I can let my application take over.
What I've done so far is have my application create the temp tables with CREATE TABLE, then take the CSV files that get uploaded to the server and run a BULK INSERT on them. The data all goes in without any errors (now that I've solved my premature-EOF error), but the problem is that the files are comma-delimited and several of the fields themselves contain commas. Client names can be multiple names separated by commas (e.g. Smith, Smith, and Jones, LLC), and there are also several numeric fields that contain commas (e.g. 1,100 sq ft). Of course, what happens on these rows is that the data gets spread across more columns than it should, which throws off the column alignment for everything after the offending field.
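For context, the import is currently set up along these lines; the table, column, and file names here are just placeholders for illustration, not my actual ones:

-- Staging table created by the application (columns are illustrative)
CREATE TABLE #ClientImport (
    ClientName    varchar(255),
    PropertySize  varchar(50),
    City          varchar(100)
);

-- Load the uploaded CSV; FIELDTERMINATOR = ',' treats EVERY comma as a
-- column break, including the ones inside the data values themselves
BULK INSERT #ClientImport
FROM 'C:\Uploads\clients.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n'
);

With that setting, a row like Smith, Smith, and Jones, LLC,1,100 sq ft,Portland ends up spread across seven columns instead of the intended three, which is exactly the misalignment described above.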
So, my question is: is there a way to have BULK INSERT ignore the commas that appear inside data fields but still use the commas that delimit those fields?
Thanks very much!