I have a SQL Server database with one primary data table in which most fields are used (about 20 columns). Index optimisation over a few months has resulted in 5 indexes. The concern I have is that the data itself is 50 GB but the indexes take a further 95 GB. This simply feels wrong to me.
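For reference, this is roughly how I measured the per-index sizes (a sketch; `dbo.MainTable` is a stand-in for my actual table name):

```sql
-- Size of each index on one table, in MB (8 KB pages).
SELECT i.name AS index_name,
       SUM(ps.used_page_count) * 8 / 1024.0 AS used_mb
FROM sys.dm_db_partition_stats AS ps
JOIN sys.indexes AS i
  ON ps.[object_id] = i.[object_id]
 AND ps.index_id   = i.index_id
WHERE ps.[object_id] = OBJECT_ID(N'dbo.MainTable')  -- hypothetical table name
GROUP BY i.name
ORDER BY used_mb DESC;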
Is there a tool that can analyse the existing indexes and the frequency of use of their key and included columns, and suggest more efficient alternatives?
I have used SQL Server Profiler to capture transactions, and the Database Engine Tuning Advisor to recommend indexes. I have then checked these against my knowledge of the queries and combined them where logic says I should. This has been an ongoing process and the database is fast; no queries take longer than a second or two. But having the indexes take up more space than the data suggests I have missed a trick.
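To supplement the Tuning Advisor output, I have also been looking at the usage DMVs along these lines (again a sketch, with `dbo.MainTable` a stand-in; note these counters reset when the instance restarts, so they only cover recent activity):

```sql
-- How often each index is read (seeks/scans/lookups) vs. maintained (updates).
SELECT i.name AS index_name,
       us.user_seeks, us.user_scans, us.user_lookups, us.user_updates
FROM sys.indexes AS i
LEFT JOIN sys.dm_db_index_usage_stats AS us
  ON us.[object_id]  = i.[object_id]
 AND us.index_id     = i.index_id
 AND us.database_id  = DB_ID()
WHERE i.[object_id] = OBJECT_ID(N'dbo.MainTable')  -- hypothetical table name
ORDER BY us.user_updates DESC;
```

Indexes with high `user_updates` but few reads look like candidates for dropping or merging, but I would like a tool that reasons about overlapping key/included columns rather than eyeballing this myself.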