I have a high-throughput C# console app that saves around 250-350 records per second to a log table (SQL Server 2017 Enterprise).
We are looking at ways to increase performance: sometimes a report is run against the table and things grind to a halt, and we are also struggling to keep up at busy times.
We currently do not use partitioning, and are investigating it.
We also insert one record at a time and attach the inserted ID to the C# log object for further operations.
I'm considering partitioning, but I'm not sure what granularity to partition on: hour, day, week, or month.
We generally store 6 months of historical data (it could also be 1 month or 24 months for smaller numbers of IoT devices).
Data is usually queried by day, last 7 days, week, or month, and sometimes by custom date ranges.
I would welcome ideas and comments about the best option here. We write more than we query, but queries will span multiple days/weeks.
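Given that queries are by day or by ranges of days, a daily partition granularity would line up with both the query patterns and a sliding-window purge of the 6-month retention. A minimal sketch of what that might look like (table, column, and object names here are my own placeholders, not anything from an existing schema):

```sql
-- Assumes the log table has a datetime2 column LoggedAt.
-- Daily partitions over ~6 months is roughly 180+ partitions, well within
-- SQL Server 2017's 15,000-partition limit.
CREATE PARTITION FUNCTION pfLogByDay (datetime2(3))
AS RANGE RIGHT FOR VALUES ('2024-01-01', '2024-01-02' /* ...one boundary per day... */);

CREATE PARTITION SCHEME psLogByDay
AS PARTITION pfLogByDay ALL TO ([PRIMARY]);

-- The table (or at least its clustered index) is created on the scheme,
-- keyed on the partitioning column so date-range queries do partition elimination:
CREATE TABLE dbo.Log
(
    Id       bigint IDENTITY(1,1) NOT NULL,
    LoggedAt datetime2(3)         NOT NULL,
    DeviceId int                  NOT NULL,
    Message  nvarchar(400)        NULL,
    CONSTRAINT PK_Log PRIMARY KEY CLUSTERED (LoggedAt, Id)
) ON psLogByDay (LoggedAt);

-- Sliding window maintenance (e.g. a nightly job): add tomorrow's boundary,
-- then switch the oldest partition out to a staging table and truncate it,
-- which purges a day of data as a metadata-only operation.
ALTER PARTITION SCHEME psLogByDay NEXT USED [PRIMARY];
ALTER PARTITION FUNCTION pfLogByDay() SPLIT RANGE ('2024-07-01');
-- ALTER TABLE dbo.Log SWITCH PARTITION 2 TO dbo.LogStage;
-- TRUNCATE TABLE dbo.LogStage;
```

Monthly partitions would also work for the retention side but give coarser elimination for "by day" queries; hourly is probably more maintenance than the query patterns justify.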
I'm also considering a different approach to inserting: instead of one row at a time, do a batch insert.
However, I'm unsure how I can then associate each database-generated ID with its C# object.
Currently it's one at a time, so I just add the inserted ID to the object and send that off to message queues for further processing.
I was looking at the SQL OUTPUT clause and INSERTED.ID, but I'm not sure, when batching 500 records, how to marry each inserted ID back to its C# object.
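One answer to my own question that I'm evaluating: a plain `INSERT ... OUTPUT` can't reference columns of the source rowset, but `MERGE` with an always-false match condition can, so the batch can carry a client-side sequence number that comes back alongside each identity value. A sketch under assumed names (`dbo.Log`, a TVP type `dbo.LogRowType`, and a `LogRecord` class are all placeholders):

```csharp
// One-time setup assumed on the SQL Server side:
//   CREATE TYPE dbo.LogRowType AS TABLE
//     (SeqNo int NOT NULL, LoggedAt datetime2(3) NOT NULL,
//      DeviceId int NOT NULL, Message nvarchar(400) NULL);
using System;
using System.Collections.Generic;
using System.Data;
using Microsoft.Data.SqlClient;

public sealed class LogRecord
{
    public long Id;            // filled in after the batch insert
    public DateTime LoggedAt;
    public int DeviceId;
    public string Message;
}

public static class LogWriter
{
    // MERGE with "ON 1 = 0" always takes the INSERT branch, but unlike a plain
    // INSERT its OUTPUT clause may reference source columns, so we can emit
    // (SeqNo, Id) pairs and correlate them regardless of row order.
    const string Sql = @"
MERGE dbo.Log AS t
USING @rows AS s ON 1 = 0
WHEN NOT MATCHED THEN
    INSERT (LoggedAt, DeviceId, Message)
    VALUES (s.LoggedAt, s.DeviceId, s.Message)
OUTPUT s.SeqNo, inserted.Id;";

    public static void InsertBatch(SqlConnection conn, IReadOnlyList<LogRecord> batch)
    {
        // Build the table-valued parameter; SeqNo is the index into the batch.
        var table = new DataTable();
        table.Columns.Add("SeqNo", typeof(int));
        table.Columns.Add("LoggedAt", typeof(DateTime));
        table.Columns.Add("DeviceId", typeof(int));
        table.Columns.Add("Message", typeof(string));
        for (int i = 0; i < batch.Count; i++)
            table.Rows.Add(i, batch[i].LoggedAt, batch[i].DeviceId, batch[i].Message);

        using var cmd = new SqlCommand(Sql, conn);
        var p = cmd.Parameters.AddWithValue("@rows", table);
        p.SqlDbType = SqlDbType.Structured;
        p.TypeName = "dbo.LogRowType";

        // Read the OUTPUT rows and map each identity back onto its object.
        using var reader = cmd.ExecuteReader();
        while (reader.Read())
        {
            int seqNo = reader.GetInt32(0);       // our correlation key
            batch[seqNo].Id = reader.GetInt64(1); // database-generated identity
        }
    }
}
```

After `InsertBatch` returns, each object has its `Id` set and can go to the message queues exactly as the one-at-a-time code does today.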
So, in summary, I would welcome ideas on:
1) partition setup
2) batching inserts and marrying the inserted IDs with the C# objects (for further processing)