The batch job I am working on is responsible for deleting certain records from a very large table (81+ million rows). Once the job completes, almost 75% of the table will have been cleared. We have a couple of approaches that can achieve the final result in under 4 hours, but they all involve dropping the actual table, or creating a new temp table and then renaming it.
At this point, we want to go back to basics and see if we can achieve the same result in 4-5 hours using a simple DELETE. The current approach is:
- Fetch the employee IDs that meet the criteria, in blocks of 1717 records.
- For every ID fetched, perform a DELETE against the table.
- Commit at the end of each block, i.e. after processing 1717 records, then continue with the next block (the loop runs until the end of the table).
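For reference, the loop described above can be sketched as follows. This is a minimal sketch using Python's sqlite3 as a stand-in for the real database; the table name `employees`, the columns `emp_id`/`status`, and the deletion criterion are all placeholder assumptions, while 1717 is the block size from the job.

```python
import sqlite3

BATCH_SIZE = 1717  # block size used by the job

# Hypothetical schema: an "employees" table whose "status" column
# drives the deletion criterion (assumption for this sketch).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (emp_id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany(
    "INSERT INTO employees (emp_id, status) VALUES (?, ?)",
    [(i, "inactive" if i % 4 else "active") for i in range(10000)],
)
conn.commit()

deleted_total = 0
while True:
    # Step 1: fetch the next block of IDs that meet the criterion.
    ids = [row[0] for row in conn.execute(
        "SELECT emp_id FROM employees WHERE status = 'inactive' LIMIT ?",
        (BATCH_SIZE,),
    )]
    if not ids:
        break
    # Step 2: delete one row per fetched ID, as in the original job.
    conn.executemany(
        "DELETE FROM employees WHERE emp_id = ?", [(i,) for i in ids]
    )
    # Step 3: commit once per block, not per row.
    conn.commit()
    deleted_total += len(ids)

print(deleted_total)
```

One thing this structure makes visible is that each block costs one index probe per fetched ID plus one commit; on a real 81-million-row table, the per-row DELETEs (and the redo/undo each one generates) are where most of the I/O goes.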
The problem I am seeing is that the job uses only about 5% CPU, which suggests it is heavily I/O-bound. Can anyone get me started on how to go about improving the I/O in this job? Any articles on improving I/O would also help.
Thanks for your time.