I have a procedure with a cursor that does the following:
It selects all row ids from one table (say table a) where the same id exists in another table (say table b) and also matches an additional criterion.
It then iterates through these ids and updates a column in table a (say mycol) to a value.
It fetches with "fetch todo bulk collect into v_id limit 5000;", so it processes 5000 rows at a time and commits after each batch.
It also needs to write each id it has changed with dbms_output.put_line.
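To make the description above concrete, here is a minimal sketch of roughly what the procedure looks like. Only the cursor name (todo), the collection (v_id), the limit (5000), and mycol come from my description; the table/column names, the EXISTS criterion (b.some_flag), and the value being assigned are placeholders.

```sql
-- Generic sketch; identifiers other than todo, v_id and mycol are placeholders.
CREATE OR REPLACE PROCEDURE test AS
  CURSOR todo IS
    SELECT a.id
      FROM a
     WHERE EXISTS (SELECT 1
                     FROM b
                    WHERE b.id = a.id
                      AND b.some_flag = 'Y');  -- the additional criterion
  TYPE t_ids IS TABLE OF a.id%TYPE;
  v_id t_ids;
BEGIN
  OPEN todo;
  LOOP
    FETCH todo BULK COLLECT INTO v_id LIMIT 5000;
    EXIT WHEN v_id.COUNT = 0;
    FORALL i IN 1 .. v_id.COUNT
      UPDATE a SET mycol = 'somevalue' WHERE a.id = v_id(i);
    FOR i IN 1 .. v_id.COUNT LOOP
      dbms_output.put_line('Updated id ' || v_id(i));
    END LOOP;
    COMMIT;  -- commit after each 5000-row batch
  END LOOP;
  CLOSE todo;
  COMMIT;  -- final commit at the end
END test;
/
```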
The attached code works (though I have modified it to be generic, so hopefully there are no typos).
However, it takes a few hours to run with 3.8 million rows in table a and 2,900 rows in table b.
It is run from a SQL script as "exec test;".
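For context, the wrapper script looks roughly like this (an illustrative sketch; the file name and buffer size are assumptions, not my actual script):

```sql
-- run.sql (hypothetical wrapper script, run from SQL*Plus)
SET SERVEROUTPUT ON SIZE UNLIMITED
EXEC test;
```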
So questions are:
1. With a small amount of test data it runs through quickly and immediately shows all the entries it is updating. With the volumes above there is no sign that anything is happening (yet), which partly defeats the purpose of the dbms_output.put_line calls and the limit 5000. Why don't I see each row (or batch of rows) as it is updated? I can see the sqlplus process running, but there is no further output so far.
2. It needs to commit after each batch of 5000 rows and also once at the very end (do I perhaps have an extra/too many commits?).
3. How can this code be improved, or are there any flaws in the approach?
Thanks in advance. Any assistance/comments appreciated.