Bulk Collect Large union cursor

I've used BULK COLLECT in the past with small tables and cursors and am familiar with the overall process. I'm using Oracle 11g on Windows Server 2003.
 
The current problem is that I have a cursor (GC$CA) that is defined like:
(Select a.*, b.fd from somefile a, someother b where a.condition = b.condition and a.conditionA = 'U' and a.field1 = b.field1
UNION
Select a.*, 1 as fd
from somefile a where a.conditionA = 'B')

The code looks like this:
loop_cnt := 0;
select count(*) into v$row_counter from somefile;

WHILE loop_cnt < v$row_counter
LOOP
   FOR cur$ca IN gc$ca
   LOOP
     loop_cnt := loop_cnt + 1;
     <CODE>


I wish to convert this to a BULK COLLECT. My only concern is that table a has 300+ columns! (inherited code...)
I'm having trouble with
1) defining the collection
  I think there has to be an easier way than creating a collection for each of the 300+ fields in the table...?
 
2) the bulk collect statement.
The statement should be:

Select ?? Bulk Collect into  ???
from ???

Because of the UNION and the WHERE clauses, I'm confused here.

I'm not intending to clean up the tables at this point, since that would be a major effort.
I'm trying to take a step at a time and anything that can make this step easier would be greatly appreciated.



GNOVAKAsked:
AndytwCommented:
1) It's no problem that you have over 300 columns, since it's not necessary to create a PL/SQL table for each of the fields. You can define a type based on the columns selected from the cursor - you do this using anchored types, e.g. %ROWTYPE.
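For example, a minimal sketch of the anchored declarations (the names here are illustrative; the cursor body is your UNION query):

```sql
DECLARE
  CURSOR gc$ca IS
    SELECT ... ;  -- the UNION query from the question goes here
  -- One record type that automatically covers all 300+ columns plus fd:
  TYPE t_ca_tab IS TABLE OF gc$ca%ROWTYPE INDEX BY BINARY_INTEGER;
  l_ca_rows t_ca_tab;
```

Because the type is anchored to the cursor, it picks up column additions or changes automatically; you never list the 300+ fields by hand.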

2).  The correct syntax is:
 SELECT ....
 BULK COLLECT INTO ....
 FROM ...

Or since you're declaring a cursor, you open it, then use the following syntax:
FETCH .... BULK COLLECT INTO ....

If the returned dataset has many rows, you may want to make use of the LIMIT option of BULK COLLECT (BULK COLLECT INTO .... LIMIT <numeric_expression>). This allows you to limit the number of rows returned by each fetch. For an example, see the following chapter from the Oracle docs:

Oracle Database PL/SQL User's Guide and Reference
http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14261/tuning.htm#sthref2231     

Putting all that together, you'll end up with something along the lines of the following ...
DECLARE
  CURSOR gc$ca is ( Select a.*, b.fd  
                    from somefile a, 
                         someother b 
                    where a.condition = b.condition 
                      and a.conditionA = 'U' 
                      and a.field1 = b.field1
                    UNION
                    Select a.*, 1 as fd
                    from somefile a
                    where a.conditionA = 'B' );
  TYPE ltDataTable is TABLE OF gc$ca%ROWTYPE INDEX BY BINARY_INTEGER;
  ltData ltDataTable;
BEGIN
  OPEN gc$ca;
  FETCH gc$ca BULK COLLECT INTO ltData;
  CLOSE gc$ca;
  
  FOR i IN 1.. ltData.COUNT
  LOOP
    -- process data here
    -- <CODE>
    NULL;
  END LOOP;
END;
/

shru_0409Commented:
DECLARE
   TYPE t_type IS TABLE OF table_name%ROWTYPE;
   l_type   t_type;

   CURSOR c1 IS
      select query;

BEGIN
   OPEN c1;

   LOOP
      FETCH c1 BULK COLLECT INTO l_type LIMIT 500;
      EXIT WHEN l_type.COUNT = 0;

      FORALL i IN 1 .. l_type.COUNT
         INSERT INTO table_name
              VALUES l_type (i);
   END LOOP;

   CLOSE c1;
   COMMIT;
END;
/

Hope this helps.
I agree with andytw about point 1.

GNOVAKAuthor Commented:
Thanks folks.
The reason for the exercise is that after 5 million records, the current code slows to a crawl. It seems like a memory issue somewhere. BULK COLLECT should solve it.

One last clarification - currently the program references cur$ca.<field>. All I need to do is a global search-and-replace of cur$ca with ltData, correct?

Any other gotchas that I might notice?
AndytwCommented:
Yes, that's correct, but more precisely you'll have to replace all "cur$ca.<field>" with "ltData(i).<field>".  Since it's a collection, you need to provide the index of the element you're referencing.
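For example, assuming a reference to the conditionA column from the question, a line inside the old cursor FOR loop would change like this:

```sql
-- before: inside "FOR cur$ca IN gc$ca LOOP"
IF cur$ca.conditionA = 'B' THEN ...

-- after: inside "FOR i IN 1 .. ltData.COUNT LOOP"
IF ltData(i).conditionA = 'B' THEN ...
```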

With 5 million records, you may well suffer memory problems storing a collection of that size in memory. In that case, I would definitely recommend using LIMIT (provided you don't need to process the entire collection at once). So, you would do:
DECLARE
  CURSOR gc$ca is ( Select a.*, b.fd  
                    from somefile a, 
                         someother b 
                    where a.condition = b.condition 
                      and a.conditionA = 'U' 
                      and a.field1 = b.field1
                    UNION
                    Select a.*, 1 as fd
                    from somefile a
                    where a.conditionA = 'B' );
  TYPE ltDataTable is TABLE OF gc$ca%ROWTYPE INDEX BY BINARY_INTEGER;
  ltData ltDataTable;
BEGIN
  OPEN gc$ca;
  LOOP
    FETCH gc$ca BULK COLLECT INTO ltData LIMIT 5000;
    EXIT WHEN ltData.COUNT = 0;
    
    FOR i IN 1.. ltData.COUNT
    LOOP
      -- process data here
      -- <CODE>
      NULL;
    END LOOP;
  END LOOP;
  CLOSE gc$ca;
END;
/

shru_0409Commented:
If you're inserting bulk data, commit after some limit of rows; otherwise a single huge transaction will generate a lot of undo/redo.
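For example, one way to bound the transaction size is to commit once per fetched batch (a sketch only; target_table and the 5000-row batch size are placeholders):

```sql
LOOP
  FETCH gc$ca BULK COLLECT INTO ltData LIMIT 5000;
  EXIT WHEN ltData.COUNT = 0;

  FORALL i IN 1 .. ltData.COUNT
    INSERT INTO target_table VALUES ltData (i);

  COMMIT;  -- one commit per batch keeps the undo for each transaction small
END LOOP;
```

The trade-off is that a failure mid-run leaves earlier batches committed, so make sure the job can be restarted safely.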
GNOVAKAuthor Commented:
Great job - thanks folks