I have created an SSIS package that creates and populates tables in a DB every night.
These tables hold a lot of data, so naturally they need to be indexed. Because I do not yet know who will query this database or what those queries will look like, I do not know which indexes to create. I set up SQL Server Profiler to capture all queries run against the database, along with their execution times, so that for anything taking longer than 10 seconds I can grab the missing-index suggestions and create new indexes to improve load speeds on those queries.
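For what it's worth, the missing-index suggestions can also be pulled directly from SQL Server's missing-index DMVs rather than query-by-query from individual plans. A rough sketch (the "benefit" formula is a common heuristic, not an official metric, and the suggestions still need human review before creating anything):

```sql
-- Sketch: list SQL Server's missing-index suggestions for the current database,
-- ordered by a rough estimate of how much each suggested index might help.
SELECT  d.statement              AS table_name,
        d.equality_columns,
        d.inequality_columns,
        d.included_columns,
        s.user_seeks,
        s.avg_user_impact,
        -- crude "benefit" score often used with these DMVs
        s.user_seeks * s.avg_total_user_cost * (s.avg_user_impact / 100.0)
            AS est_benefit
FROM    sys.dm_db_missing_index_details       AS d
JOIN    sys.dm_db_missing_index_groups        AS g
        ON g.index_handle = d.index_handle
JOIN    sys.dm_db_missing_index_group_stats   AS s
        ON s.group_handle = g.index_group_handle
WHERE   d.database_id = DB_ID()
ORDER BY est_benefit DESC;
```

Note these DMVs reset on instance restart, so the numbers only reflect activity since the server last came up.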
The problem is that I am speaking to an audience that thinks you can index a table based on the uniqueness of its columns alone, and that that is all there is to it. In my experience that is not possible; it would be nothing more than a guess. I even recall thinking, a few years ago, that if I indexed every column then all my indexing problems would be solved. So my question is: is my approach of optimizing the database based on the queries users actually execute against it the correct one, or is there a better approach?
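To make that point concrete, a hypothetical example (the table, columns, and query are invented for illustration) showing that the right index is shaped by the query's predicates, not by column uniqueness:

```sql
-- Hypothetical: suppose the trace shows this query running slowly:
--   SELECT OrderID, OrderDate, TotalDue
--   FROM   dbo.Orders
--   WHERE  CustomerID = @cust AND OrderDate >= @since;
--
-- Indexing every column individually would not serve it nearly as well as
-- one composite index shaped around the WHERE clause and selected columns:
CREATE NONCLUSTERED INDEX IX_Orders_CustomerID_OrderDate
    ON dbo.Orders (CustomerID, OrderDate)   -- equality column first, then the range column
    INCLUDE (TotalDue);                     -- covering column avoids key lookups
```

The same table with a different dominant query would call for a completely different index, which is why the workload has to be observed first.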
Also, how do I explain how indexing works to a non-technical audience? I have tried, but it seems to go over their heads, keeping in mind these are people who have been hooked on MS Access and are clinging to it for dear life.