Bulk Collect Large union cursor

GNOVAK asked
Medium Priority · 2,386 Views · Last Modified: 2013-12-19
I've used BULK COLLECT in the past with small tables and cursors and am familiar with the overall process. I'm using Oracle 11g on Windows Server 2003.
 
The current problem is that I have a cursor (GC$CA) that is defined like:
(SELECT a.*, b.fd FROM somefile a, someother b WHERE a.condition = b.condition AND a.conditionA = 'U' AND a.field1 = b.field1
UNION
SELECT a.*, 1 AS fd
FROM somefile a WHERE a.conditionA = 'B')

The code looks like this:
loop_cnt := 0;
SELECT COUNT(*) INTO v$row_counter FROM somefile;

WHILE loop_cnt < v$row_counter
LOOP
   FOR cur$ca IN GC$CA
   LOOP
      loop_cnt := loop_cnt + 1;
      <CODE>


I wish to convert this to a BULK COLLECT. My only concern is that table a has 300+ columns! (inherited code...)
I'm having trouble with:
1) Defining the collection.
   I think there's an easier way than creating a table type for each of the 300+ fields in the table...?
 
2) the bulk collect statement.
The statement should be:

Select ?? Bulk Collect into  ???
from ???

Because of the UNION and the WHERE clauses, I'm confused here.

I'm not intending to clean up the tables at this point, since that would be a major effort.
I'm taking this a step at a time, and anything that can make this step easier would be greatly appreciated.




Commented:
declare
   -- Collection type based on the table's row structure, so the
   -- 300+ columns do not have to be declared one by one
   TYPE t_type IS TABLE OF table_name%ROWTYPE;
   l_type   t_type;

   CURSOR c1 IS
      select query;   -- your query goes here

BEGIN
   OPEN c1;

   LOOP
      FETCH c1 BULK COLLECT INTO l_type LIMIT 500;
      EXIT WHEN l_type.COUNT = 0;   -- nothing fetched in this batch

      FORALL i IN l_type.FIRST .. l_type.LAST
         INSERT INTO table_name
              VALUES l_type (i);
   END LOOP;

   CLOSE c1;
   COMMIT;
END;

Hope this helps.
I agree with andytw about point 1.
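On point 1: since the extra fd column means no single table's %ROWTYPE matches, the collection can be declared from the cursor's own %ROWTYPE instead, so none of the 300+ columns need to be listed individually. A sketch, using the GC$CA cursor from the question (the LIMIT of 500 is an arbitrary batch size):

```sql
DECLARE
   -- The UNION cursor from the question: a.* plus the extra fd column
   CURSOR gc$ca IS
      SELECT a.*, b.fd
        FROM somefile a, someother b
       WHERE a.condition  = b.condition
         AND a.conditionA = 'U'
         AND a.field1     = b.field1
      UNION
      SELECT a.*, 1 AS fd
        FROM somefile a
       WHERE a.conditionA = 'B';

   -- One collection type derived from the cursor itself;
   -- the UNION and WHERE clauses make no difference here
   TYPE t_ca_tab IS TABLE OF gc$ca%ROWTYPE;
   l_data   t_ca_tab;
BEGIN
   OPEN gc$ca;
   LOOP
      FETCH gc$ca BULK COLLECT INTO l_data LIMIT 500;
      EXIT WHEN l_data.COUNT = 0;   -- nothing fetched in this batch

      FOR i IN 1 .. l_data.COUNT LOOP
         NULL;   -- process l_data(i) here
      END LOOP;
   END LOOP;
   CLOSE gc$ca;
END;
```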

Author

Commented:
Thanks folks.
The reason for the exercise is that after 5 million records, the current code slows to a crawl. It seems like a memory issue somewhere. Bulk collect should solve it.

One last clarification - where the program references cur$ca.<field>, all I need to do is a global search-and-replace of cur$ca with ltData, correct?

Any other gotchas that I might notice?
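To make that concrete, a before/after sketch of what I mean (some_proc and field1 here are just placeholder names):

```sql
-- Before: one row at a time through the cursor record
-- FOR cur$ca IN gc$ca LOOP
--    some_proc(cur$ca.field1);
-- END LOOP;

-- After: loop over the bulk-collected batch; note the added
-- subscript, so cur$ca.field1 becomes ltData(i).field1
FOR i IN 1 .. ltData.COUNT LOOP
   some_proc(ltData(i).field1);
END LOOP;
```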
Commented:
If you are inserting bulk data, commit after some limit; otherwise a large amount of redo/undo will build up.
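A sketch of that batched-commit pattern (the cursor, target_table, and the 10,000-row interval are all placeholders):

```sql
DECLARE
   CURSOR c1 IS SELECT * FROM somefile;
   TYPE t_tab IS TABLE OF c1%ROWTYPE;
   l_data          t_tab;
   l_since_commit  PLS_INTEGER := 0;
BEGIN
   OPEN c1;
   LOOP
      FETCH c1 BULK COLLECT INTO l_data LIMIT 500;
      EXIT WHEN l_data.COUNT = 0;

      FORALL i IN 1 .. l_data.COUNT
         INSERT INTO target_table VALUES l_data (i);

      -- Commit every ~10,000 rows instead of holding one huge transaction
      l_since_commit := l_since_commit + l_data.COUNT;
      IF l_since_commit >= 10000 THEN
         COMMIT;
         l_since_commit := 0;
      END IF;
   END LOOP;
   CLOSE c1;
   COMMIT;   -- final commit for the last partial batch
END;
```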

Author

Commented:
Great job - thanks folks