I have a very large SQL Server table which stores GPS data for more than 1,000 vehicles (more than 100 million records over three months). Each month, I manually delete data older than three months. Now I want to do this automatically. I have planned to write a SQL Server Agent job with the following steps:
Use the bcp utility to export only the last three months of data to a backup file on disk
Truncate the table
Drop all indexes on the table
Use the bcp command to reload the table from the backup file
Recreate all the indexes
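For reference, the steps above could be sketched roughly as below. All of the names here (database `Fleet`, table `dbo.GpsLog`, column `RecordedAt`, index `IX_GpsLog_RecordedAt`, server `MyServer`, file path) are placeholders, not my actual schema; the bcp calls would run as CmdExec job steps and are shown as comments:

```sql
-- Step 1 (CmdExec): export the last three months of data in native format.
-- bcp "SELECT * FROM Fleet.dbo.GpsLog WHERE RecordedAt >= DATEADD(MONTH, -3, GETDATE())"
--     queryout D:\Backup\GpsLog_last3m.dat -n -S MyServer -T

-- Step 2 (T-SQL): empty the table.
TRUNCATE TABLE dbo.GpsLog;

-- Step 3 (T-SQL): drop the nonclustered index before the bulk load.
DROP INDEX IX_GpsLog_RecordedAt ON dbo.GpsLog;

-- Step 4 (CmdExec): reload the exported rows in batches.
-- bcp Fleet.dbo.GpsLog in D:\Backup\GpsLog_last3m.dat -n -S MyServer -T -b 50000

-- Step 5 (T-SQL): recreate the index.
CREATE NONCLUSTERED INDEX IX_GpsLog_RecordedAt ON dbo.GpsLog (RecordedAt);
```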
My question is: is this the right way to do this? Is there any other way to optimize it?