androknego

asked on

Import/processing large volume of data (Oracle 9i)

Hi,

I need advice on the method to be used for the following requirement:
- A large flat file (circa 1 million records) needs to be imported into an Oracle 9i database; several queries then need to be run against the existing data to validate the newly imported data; the new data then needs to be inserted into production tables, and some other production tables need to be updated (for the records that have been validated).

As far as I know, the most efficient way to do this is to import the data from the flat file into a NOLOGGING table using SQL*Loader. After that comes the hard part: how do you efficiently scan 1 million records and validate/insert/merge the data into the production tables? The best thing I could think of is to take chunks of 1,000-5,000 records, insert them into a temporary table and then do the rest of the processing there. However, I'm not sure how to actually scan the data from the large NOLOGGING table and then delete it from that table. I could declare a cursor on the whole table, loop through 1,000-5,000 rows at a time, insert them into a temporary table, delete them (using DELETE WHERE CURRENT OF) and then process the temporary table, but I'm not sure whether this would take a lot of CPU/memory resources.
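To make it concrete, the kind of loop I have in mind is roughly the following (STAGE_LOAD, WORK_BATCH and the column names are just placeholders, not my real tables):

DECLARE
   -- STAGE_LOAD is the NOLOGGING table filled by SQL*Loader;
   -- WORK_BATCH is the small working table the validation queries run against
   CURSOR stage_cur IS
      SELECT col1, col2, col3
      FROM STAGE_LOAD
      FOR UPDATE;                 -- required for DELETE WHERE CURRENT OF
   v_in_batch PLS_INTEGER := 0;
BEGIN
   FOR rec IN stage_cur LOOP
      -- copy the row into the working table
      INSERT INTO WORK_BATCH (col1, col2, col3)
      VALUES (rec.col1, rec.col2, rec.col3);

      -- remove it from the staging table so it is not picked up again
      DELETE FROM STAGE_LOAD WHERE CURRENT OF stage_cur;

      v_in_batch := v_in_batch + 1;
      IF v_in_batch = 5000 THEN
         -- run the validation queries and the production inserts/updates
         -- against WORK_BATCH here, then empty WORK_BATCH for the next chunk
         v_in_batch := 0;
      END IF;
   END LOOP;
   -- process whatever is left in WORK_BATCH, then commit
   COMMIT;
END;
/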

Any help would be greatly appreciated,

Regards

Andrew
ASKER CERTIFIED SOLUTION
MikeOM_DBA
nadhuvi

BULK COLLECT and FORALL will definitely solve your problem. Refer to the manual.
androknego

ASKER

Hi again,

Thanks for the tip - but I've looked at the docs and tried to run a test script, and it always returns an error. Here is what I have - two tables with identical structure called test_trader and test_trader1:

TRADER_ID                           NOT NULL NUMBER(16)
PROGRAM_ID                          NOT NULL NUMBER(16)
NAME                                NOT NULL NVARCHAR2(200)

test_trader1 has some data in it, and I want to insert 5 of its rows into test_trader. Here is the script I have (I ran it on Oracle 9.2):

DECLARE
   TYPE trader_table IS TABLE OF Test_TRADER1%ROWTYPE;
   trader_rec trader_table;
   
   CURSOR curs IS
      SELECT *
      FROM Test_TRADER1;
BEGIN
   OPEN curs;
   FETCH curs
      BULK COLLECT INTO trader_rec
       LIMIT 5;
   FORALL i IN trader_rec.FIRST..trader_rec.LAST
      --INSERT INTO Test_Trader VALUES (trader_rec(i).trader_id, trader_rec(i).program_id, trader_rec(i).name);
      INSERT INTO Test_Trader VALUES (trader_rec(i));

   CLOSE curs;
END;

When I try to use the commented-out INSERT INTO Test_Trader statement, I get error PLS-00436: implementation restriction: cannot reference fields of BULK In-BIND table of records. I guess what that means is that I can refer to specific columns of Row which is contained in a Table container. Fair enough...

However, when I use the statement just as I've shown it here, I get error PL/SQL: ORA-00947: not enough values. From the official Oracle docs I saw that you can do this, and I've checked many times that the structures of the two tables are identical (I created them using CREATE TABLE AS SELECT...).

Regards

Andrew
Just a small correction: " I guess what that means is that I CAN'T refer to specific columns of Row which is contained in a Table container."
You cannot use an array of 'record' or 'rowtype'; you need to define separate arrays, for example:

type num_type is table of number;
type nvchr_type is table of varchar2(200);

v_TRADER_ID num_type;
v_PROGRAM_ID num_type;
v_NAME nvchr_type;

And then do:

FETCH curs
 BULK COLLECT INTO v_TRADER_ID, v_PROGRAM_ID, v_NAME
..etc...
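
Put together (using NVARCHAR2 for the NAME array to match the column), the test would look something like this:

DECLARE
   TYPE num_type   IS TABLE OF NUMBER;
   TYPE nvchr_type IS TABLE OF NVARCHAR2(200);

   v_TRADER_ID  num_type;
   v_PROGRAM_ID num_type;
   v_NAME       nvchr_type;

   CURSOR curs IS
      SELECT trader_id, program_id, name
      FROM Test_TRADER1;
BEGIN
   OPEN curs;
   FETCH curs
      BULK COLLECT INTO v_TRADER_ID, v_PROGRAM_ID, v_NAME
      LIMIT 5;
   CLOSE curs;

   -- FORALL binds the three arrays element by element
   IF v_TRADER_ID.COUNT > 0 THEN
      FORALL i IN 1 .. v_TRADER_ID.COUNT
         INSERT INTO Test_Trader (trader_id, program_id, name)
         VALUES (v_TRADER_ID(i), v_PROGRAM_ID(i), v_NAME(i));
   END IF;

   COMMIT;
END;
/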



Hi Mike,

Sorry, but my idea was OK. I got it working by simply removing two brackets; it should be:
INSERT INTO Test_Trader VALUES trader_rec(i);
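
So the full test block, with just that one change, ends up as:

DECLARE
   TYPE trader_table IS TABLE OF Test_TRADER1%ROWTYPE;
   trader_rec trader_table;

   CURSOR curs IS
      SELECT *
      FROM Test_TRADER1;
BEGIN
   OPEN curs;
   FETCH curs
      BULK COLLECT INTO trader_rec
      LIMIT 5;

   -- no parentheses around trader_rec(i): the whole record supplies the row
   FORALL i IN trader_rec.FIRST..trader_rec.LAST
      INSERT INTO Test_Trader VALUES trader_rec(i);

   CLOSE curs;
END;
/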

I'll give you the points anyway; you did point me in the right direction.

Regards

Andrew