On SQL Server 2000, I get a batch of about 100,000 inserts into a table every 10-15 seconds. There are no performance issues with the writes.
However, I need to make the last few minutes of data available for multiple clients to query in near real time. Querying the table while it is being inserted into would make the selects very slow, and the inserts would be directly affected as well.
I was thinking of making an in-memory temp table that holds the last 30-60 seconds of inserts. Each row is about 30 bytes, so RAM is a non-issue in terms of the temp table's footprint.
The select can be a VERY DIRTY read; the data does not have to be committed yet.
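For the dirty-read part on its own, this is the kind of query I have in mind: a minimal sketch, assuming the live table is called dbo.BigTable and that C7 happens to carry an insert timestamp as Unix epoch seconds (both names and that use of C7 are assumptions for illustration, not fixed yet).

```sql
-- Dirty read of roughly the last 60 seconds from the live table.
-- NOLOCK = READ UNCOMMITTED: no shared locks taken, so the scan does
-- not block (and is not blocked by) the bulk inserts.
SELECT C1, C2, C3, C4, C5, C6, C7
FROM dbo.BigTable WITH (NOLOCK)
WHERE C7 >= DATEDIFF(s, '19700101', GETUTCDATE()) - 60
```

Even with NOLOCK this still scans the big table unless C7 is indexed, which is why I'm leaning toward a separate small table for the recent window.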
Any takes on how to do this?
I can think of 2 possible ways:
1. Parallel double-insert: write to the in-memory temp table and also do the real committed insert.
2. Single insert, with a trigger on insert that performs the additional insert.
Thinking the trigger in #2 will have large overhead and cause performance issues.
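To make option #2 concrete, here is a rough sketch of what I mean (table and trigger names are made up; the staging table is a small permanent table rather than a true temp table, since #temp tables are session-scoped and couldn't be shared by multiple client connections):

```sql
-- Small staging table holding only the recent window; clients query this.
CREATE TABLE dbo.RecentRows (C1 int, C2 int, C3 int, C4 int,
                             C5 int, C6 int, C7 int)
GO
-- AFTER INSERT trigger that copies each inserted batch into the staging
-- table. "inserted" is the pseudo-table holding the entire batch, so this
-- fires once per 100k-row batch, not once per row.
CREATE TRIGGER trg_BigTable_CopyRecent ON dbo.BigTable
AFTER INSERT
AS
    SET NOCOUNT ON
    INSERT dbo.RecentRows (C1, C2, C3, C4, C5, C6, C7)
    SELECT C1, C2, C3, C4, C5, C6, C7 FROM inserted
GO
```

Since the trigger does one set-based insert per batch, maybe the overhead is smaller than I fear? Corrections welcome.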
More ways are welcome!
7 columns (all int): C1,C2,C3,C4,C5,C6,C7
A workable example is preferred!
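Whichever option wins, the staging table would also need pruning so it only ever holds 30-60 seconds of rows. Something like this from a scheduled job every 30 seconds or so (again assuming a dbo.RecentRows staging table and C7 as epoch seconds; both are my assumptions):

```sql
-- Trim the staging table to a rolling 60-second window.
-- Cheap because the table never grows beyond a couple of batches.
DELETE dbo.RecentRows
WHERE C7 < DATEDIFF(s, '19700101', GETUTCDATE()) - 60
```

If the DELETE contends with readers, an alternative would be two staging tables swapped behind a view, but I'd rather keep it simple if the delete is cheap enough.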