We have a web app that saves vehicle tracking data, and its database is currently 25 GB.
We are onboarding a lot more customers, and with that the database is set to grow into the hundreds of gigabytes.
That gets really expensive when it comes to RAM at the data centre.
The main driver of the growth is that we need to keep the last 12 months of data, which really makes the DB balloon in size.
In reality, most people only care about the last month, but we do need to keep the full 12 months for accident data, etc.
So my question is about RAM versus DB size, and whether we can just cache the last month's worth of data from one massive table.
I've looked into SQL Server 2016 Stretch Database, but we aren't there yet, so it's not an option.
Someone mentioned partitioning on dates, but I'm not really sure what that means in practice.
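From what I can gather, it would mean splitting the big table into monthly chunks so that queries for the last month only touch the newest chunk, something roughly like the sketch below (the table and column names are made up for illustration, our real schema is different):

```sql
-- Rough sketch of monthly date partitioning (made-up table/column names).
-- Partition function: each boundary value starts a new month (RANGE RIGHT).
CREATE PARTITION FUNCTION pfTrackingByMonth (datetime2)
AS RANGE RIGHT FOR VALUES
    ('2016-01-01', '2016-02-01', '2016-03-01', '2016-04-01',
     '2016-05-01', '2016-06-01', '2016-07-01', '2016-08-01',
     '2016-09-01', '2016-10-01', '2016-11-01', '2016-12-01');

-- Partition scheme: maps each partition to a filegroup (all on PRIMARY here to keep it simple).
CREATE PARTITION SCHEME psTrackingByMonth
AS PARTITION pfTrackingByMonth ALL TO ([PRIMARY]);

-- The big tracking table created on the partition scheme, keyed on the date column.
CREATE TABLE dbo.VehicleTracking
(
    TrackingId bigint       NOT NULL,
    VehicleId  int          NOT NULL,
    RecordedAt datetime2    NOT NULL,
    Latitude   decimal(9,6) NOT NULL,
    Longitude  decimal(9,6) NOT NULL,
    CONSTRAINT PK_VehicleTracking
        PRIMARY KEY CLUSTERED (RecordedAt, TrackingId)
) ON psTrackingByMonth (RecordedAt);
```

If that's the right idea, would queries filtered on RecordedAt then only read (and keep in memory) the recent partitions, with the older months able to live on cheaper storage? Or have I misunderstood it?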
Others also mentioned that SQL Server is clever and does this for you, but we do notice a slowdown when memory usage on the server is through the roof.
We are also seeing a growth rate of about 200 MB per day, which works out to roughly 6 GB a month, or around 70 GB a year.
We've just run out of RAM on the server again.
Do I just need to endlessly throw RAM at an ever-growing database, or is there a more efficient way to only cache the latest data in memory?
We are using SQL Server 2016.