Bulk Collect with a large UNION cursor

I've used BULK COLLECT in the past with small tables and cursors, and I'm familiar with the overall process. I'm using Oracle 11g on Windows Server 2003.
 
The current problem is that I have a cursor (GC$CA) that is defined like:
(Select a.*, b.fd from somefile a, someother b where a.condition = b.condition and a.conditionA = 'U' and a.field1 = b.field1
UNION
Select a.*, 1 as fd
from somefile a where a.conditionA = 'B')

The code looks like this:
loop_cnt := 0;
SELECT COUNT(*) INTO v$row_counter FROM somefile;

WHILE loop_cnt < v$row_counter
LOOP
   FOR cur$ca IN gc$ca
   LOOP
      loop_cnt := loop_cnt + 1;
      -- <CODE>


I wish to convert this to a BULK COLLECT. My only concern is that table a has 300+ columns! (inherited code...)
I'm having trouble with:
1) defining the collection
   I think there's an easier way than creating a separate collection for each of the 300+ fields in the table...?
 
2) the BULK COLLECT statement.
The statement should be:

Select ?? BULK COLLECT INTO ???
from ???

Because of the UNION and WHERE clauses, I'm confused here.

I don't intend to clean up the tables at this point, since that would be a major effort.
I'm taking this one step at a time, and anything that can make this step easier would be greatly appreciated.



GNOVAK asked:
 
Andytw commented:
1) It's no problem that the table has 300+ columns, since it's not necessary to create a PL/SQL table for each of the fields. You can define a single record type based on the columns selected from the cursor - you do this using anchored types, i.e. %ROWTYPE.
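
For illustration, here's a minimal sketch of the anchored-type idea, using DUAL as a stand-in for the real query (the cursor, type, and variable names are made up for the example):

DECLARE
  CURSOR c_demo IS SELECT * FROM dual;  -- stands in for the big UNION query
  -- One anchored type picks up every column the cursor selects, so no
  -- per-column declarations are needed even with 300+ columns
  TYPE t_demo_tab IS TABLE OF c_demo%ROWTYPE INDEX BY BINARY_INTEGER;
  l_demo t_demo_tab;
BEGIN
  NULL;  -- the declarations are the point here
END;
/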

2) The correct syntax is:
 SELECT ....
 BULK COLLECT INTO ....
 FROM ...

Or, since you're declaring a cursor, you can open it and then use the following syntax:
FETCH .... BULK COLLECT INTO ....

If the returned dataset has many rows, you may want to make use of the LIMIT option of BULK COLLECT (BULK COLLECT INTO .... LIMIT <numeric_expression>). This allows you to limit the number of rows returned by each fetch. For an example, see the following chapter from the Oracle docs:

Oracle Database PL/SQL User's Guide and Reference
http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14261/tuning.htm#sthref2231     

Putting all that together, you'll end up with something along the lines of the following ...
DECLARE
  CURSOR gc$ca is ( Select a.*, b.fd  
                    from somefile a, 
                         someother b 
                    where a.condition = b.condition 
                      and a.conditionA = 'U' 
                      and a.field1 = b.field1
                    UNION
                    Select a.*, 1 as fd
                    from somefile a
                    where a.conditionA = 'B' );
  TYPE ltDataTable is TABLE OF gc$ca%ROWTYPE INDEX BY BINARY_INTEGER;
  ltData ltDataTable;
BEGIN
  OPEN gc$ca;
  FETCH gc$ca BULK COLLECT INTO ltData;
  CLOSE gc$ca;
  
  FOR i IN 1.. ltData.COUNT
  LOOP
    -- process data here
    -- <CODE>
    NULL;
  END LOOP;
END;
/

 
shru_0409 commented:
declare
   TYPE t_type IS TABLE OF table_name%ROWTYPE;
   l_type   t_type;

   CURSOR c1 IS
      select query;   -- your query goes here

BEGIN
   OPEN c1;

   LOOP
      FETCH c1 BULK COLLECT INTO l_type LIMIT 500;
      EXIT WHEN l_type.COUNT = 0;   -- guard: an empty final fetch would break FORALL

      FORALL i IN 1 .. l_type.COUNT
         INSERT INTO table_name
              VALUES l_type (i);
   END LOOP;

   CLOSE c1;
   COMMIT;
END;

Hope this helps.
I agree with andytw about point 1.

 
GNOVAK (author) commented:
Thanks folks.
The reason for the exercise is that after 5 million records the current code slows to a crawl; it seems like a memory issue somewhere. BULK COLLECT should solve it.

One last clarification - wherever the program references cur$ca.<field>, all I need to do is a global search-and-replace of cur$ca with ltData, correct?

Any other gotchas I should watch for?
 
Andytw commented:
Yes, that's correct, but more precisely you'll have to replace every "cur$ca.<field>" with "ltData(i).<field>". Since it's a collection, you need to supply the index of the element you're referencing.
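
To make the rename concrete, a before/after sketch using the gc$ca cursor and ltData collection from the block above (field1 is a column name taken from your query; the 'X' test value is made up):

-- Before: row-by-row cursor FOR loop
FOR cur$ca IN gc$ca LOOP
  IF cur$ca.field1 = 'X' THEN
    NULL;
  END IF;
END LOOP;

-- After: indexed access into the collection
FOR i IN 1 .. ltData.COUNT LOOP
  IF ltData(i).field1 = 'X' THEN
    NULL;
  END IF;
END LOOP;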

With 5 million records, you may run into memory problems storing a collection of that size in memory. In that case, I would definitely recommend using LIMIT (provided you don't need to process the entire collection at once). So, you would do:
DECLARE
  CURSOR gc$ca is ( Select a.*, b.fd  
                    from somefile a, 
                         someother b 
                    where a.condition = b.condition 
                      and a.conditionA = 'U' 
                      and a.field1 = b.field1
                    UNION
                    Select a.*, 1 as fd
                    from somefile a
                    where a.conditionA = 'B' );
  TYPE ltDataTable is TABLE OF gc$ca%ROWTYPE INDEX BY BINARY_INTEGER;
  ltData ltDataTable;
BEGIN
  OPEN gc$ca;
  LOOP
    FETCH gc$ca BULK COLLECT INTO ltData LIMIT 5000;
    EXIT WHEN ltData.COUNT = 0;
    
    FOR i IN 1.. ltData.COUNT
    LOOP
      -- process data here
      -- <CODE>
      NULL;
    END LOOP;
  END LOOP;
  CLOSE gc$ca;
END;
/

 
shru_0409 commented:
If you are inserting bulk data, commit after some limit; otherwise a large amount of redo/undo will be generated.
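
A sketch of that pattern, folded into the LIMIT loop from my earlier post (the 500-row batch size and table_name are placeholders):

LOOP
   FETCH c1 BULK COLLECT INTO l_type LIMIT 500;
   EXIT WHEN l_type.COUNT = 0;

   FORALL i IN 1 .. l_type.COUNT
      INSERT INTO table_name
           VALUES l_type (i);

   COMMIT;   -- commit each 500-row batch instead of one huge transaction
END LOOP;

(One caveat: committing while the cursor is still open can expose you to ORA-01555 "snapshot too old" if undo space is tight.)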
 
GNOVAK (author) commented:
Great job - thanks folks