My problem is simple.
I need to import data and insert/update my database tables.
I am keeping track of the adds, updates, and deletes.
My current structure first reads all the key values from the target table into a DataSet. During the import I scan this table for each incoming key. If the key is found, the record is an update; if it is not found, I add the key and insert into the database; if the record is a delete, I remove the key from the table.
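Here is a simplified sketch of what I'm doing now (the class, table, and column names are made up for illustration, not my real code):

```csharp
// Simplified sketch of my current approach -- names are illustrative.
// The key table has no primary key or index set, so lookups scan.
using System;
using System.Data;

class KeyTracker
{
    private readonly DataTable keys = new DataTable("Keys");

    public KeyTracker()
    {
        keys.Columns.Add("Key", typeof(string));
    }

    // First pass: pull every existing key out of the target table.
    public void LoadExistingKeys(DataTable target)
    {
        foreach (DataRow row in target.Rows)
            keys.Rows.Add(row["Key"]);
    }

    // True means the incoming record is an update, false means insert.
    // Select() walks the table; this is where the time seems to go.
    // (Assumes keys never contain a single quote.)
    public bool IsUpdate(string key)
    {
        return keys.Select("Key = '" + key + "'").Length > 0;
    }

    public void Add(string key)
    {
        keys.Rows.Add(key);
    }

    public void Delete(string key)
    {
        foreach (DataRow row in keys.Select("Key = '" + key + "'"))
            row.Delete();
        keys.AcceptChanges();
    }
}
```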
This works and avoids unnecessary hits on my database and web site. My database is accessed via web services, so it is critical not to make too many round trips.
I ran into a problem the other day when I tried to import 200,000 records into a blank table. As the DataSet did repeated adds, my performance went into the tank.
Can anyone suggest the best way to build an in-memory table of keys, holding up to a quarter of a million records, that allows fast adds, deletes, and searches? Keys can be up to 40 characters.
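For what it's worth, something with roughly this shape is what I'm after. Is a plain HashSet<string> the right tool? This is only a rough sketch of the idea (assuming .NET 3.5 or later, where HashSet<T> is available; the key format below is invented):

```csharp
// Rough sketch of what I'm hoping exists: constant-time
// add/remove/contains on string keys.
using System;
using System.Collections.Generic;

class Program
{
    static void Main()
    {
        var keys = new HashSet<string>(StringComparer.Ordinal);

        // Pretend load of a quarter-million existing keys.
        for (int i = 0; i < 250000; i++)
            keys.Add("KEY-" + i.ToString("D36")); // 40-character keys

        // Per-record decision during the import:
        string incoming = "KEY-" + 12345.ToString("D36");
        if (keys.Contains(incoming))
        {
            Console.WriteLine("update"); // key already in the target table
        }
        else
        {
            keys.Add(incoming);          // new key -> insert into the database
            Console.WriteLine("insert");
        }

        // A delete just drops the key from the set.
        keys.Remove(incoming);
    }
}
```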
By the way, I know the DataSet management is the culprit: I disabled it on my initial import (where the table was blank) and was able to update my table at a rate of 100+ records a second. With the DataSet enabled, my performance dropped to 5 records a second.
Hope this all makes sense.