Is there a way to monitor the "percent used" of an index and kick off an online rebuild of that index when a 20% threshold is crossed?
Here's what happens: a session spawns multiple inserts, deletes, and updates on tables to set up a "data scenario" that the application uses. When the application has finished, the Java framework on the app side spawns additional SQL that deletes all the data for the scenario and restores the original pre-session data. Each time the session closes (hundreds of times a day), the indexes on these tables (some with 3 million+ rows) are left unusable because of the massive DML churn. The next user then runs another session, every query goes to a full table scan, and performance is terrible.
Any ideas how to deal with this?
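To make the question concrete, here is a hedged sketch of the kind of job I have in mind, assuming the tables live in a schema I'll call MYAPP (a placeholder) and using deleted leaf rows from INDEX_STATS as a stand-in for "percent used". Note that ANALYZE ... VALIDATE STRUCTURE locks the object while it runs, so this would have to go in a quiet window or right after the reset step, perhaps via DBMS_SCHEDULER — this is a sketch of the idea, not a vetted implementation:

```sql
-- Hedged sketch: schema name MYAPP is an assumption, and the 20% test uses
-- deleted leaf rows (DEL_LF_ROWS / LF_ROWS) from INDEX_STATS as a proxy
-- for index "percent used". ANALYZE ... VALIDATE STRUCTURE locks the table,
-- so run this only in a quiet window (e.g. from a DBMS_SCHEDULER job).
BEGIN
  FOR ix IN (SELECT owner, index_name
               FROM dba_indexes
              WHERE owner = 'MYAPP'          -- placeholder schema
                AND status = 'VALID') LOOP
    -- Populate INDEX_STATS for this one index (one row at a time).
    EXECUTE IMMEDIATE 'ANALYZE INDEX ' || ix.owner || '.' || ix.index_name ||
                      ' VALIDATE STRUCTURE';
    FOR s IN (SELECT del_lf_rows, lf_rows FROM index_stats) LOOP
      IF s.lf_rows > 0 AND s.del_lf_rows / s.lf_rows > 0.20 THEN
        -- Rebuild online so DML on the table is not blocked.
        EXECUTE IMMEDIATE 'ALTER INDEX ' || ix.owner || '.' || ix.index_name ||
                          ' REBUILD ONLINE';
      END IF;
    END LOOP;
  END LOOP;
END;
/
```

Unusable indexes would need `REBUILD` rather than the threshold test, since the optimizer ignores them entirely until they are rebuilt.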