I am developing a PL/SQL program. It runs quickly and successfully for a small number of rows (1,000), but when scaled up to 10,000 rows it fails after a long run (about 30 minutes) with an out-of-memory error, ORA-04030.
What it does:
The program processes records in a loop, calling several functions along the way. Many of these functions populate a record type, as in the following example:
declare
  rMyRec MyRecType;  -- a type we defined
  cursor c is select * from blah;
begin
  for r in c loop
    MyFunction(rMyRec);
    -- do something with the current record "r" and the data in rMyRec
  end loop;
end;
MyFunction does this:

  rMyRec := MyRecType(null, null, null, null, null, null);

and then reads a record and puts its data into rMyRec.
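To make the pattern concrete, here is a simplified, self-contained sketch of roughly what the code does. The names here are made up: I am assuming MyRecType is an object type (since the constructor-style call needs one), cutting it down to two fields instead of six, using the DEPT demo table as a stand-in for our real tables, and writing MyFunction as a procedure because that is how the loop calls it.

  -- assumed: MyRecType is an object type we created ourselves
  create or replace type MyRecType as object (
    id    number,
    name  varchar2(100)
  );
  /

  declare
    rMyRec MyRecType;
    cursor c is select * from dept;

    procedure MyFunction(p_rec in out MyRecType) is
    begin
      -- re-initialize the record with the constructor, as described above
      p_rec := MyRecType(null, null);
      -- ...then read a row and copy its data into the record
      select d.deptno, d.dname
        into p_rec.id, p_rec.name
        from dept d
       where rownum = 1;
    end MyFunction;
  begin
    for r in c loop
      MyFunction(rMyRec);
      -- do something with the current record "r" and the data in rMyRec
    end loop;
  end;
  /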
My question is: does initializing this record over and over, without ever releasing it, cause it to use up more and more memory, so that running with a larger number of records results in the memory error? That is, does it use new memory every time I do:

  rMyRec := MyRecType(null, null, null, null, null, null);
If this is the problem, should I be destroying the initialized record every time, initializing it once in the calling routine, or doing something else?
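For example, is the right fix to construct the record once in the calling routine and have MyFunction only overwrite its fields? Something like this variant of the sketch above (again just an illustration, using the same made-up names):

  declare
    -- construct the record once, up front, instead of inside MyFunction
    rMyRec MyRecType := MyRecType(null, null);
    cursor c is select * from dept;

    procedure MyFunction(p_rec in out MyRecType) is
    begin
      -- no constructor call here any more; just overwrite the fields
      select d.deptno, d.dname
        into p_rec.id, p_rec.name
        from dept d
       where rownum = 1;
    end MyFunction;
  begin
    for r in c loop
      MyFunction(rMyRec);
      -- do something with the current record "r" and the data in rMyRec
    end loop;
  end;
  /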
Thanks,
Terry