I need advice on which method to use for the following requirement:
- A large flat file (circa 1 million records) needs to be imported into an Oracle 9i database. Several queries then need to be run against the existing data to validate the newly imported records, after which the new data needs to be inserted into production tables and some other production tables need to be updated for the records that passed validation.
As far as I know, the most efficient way to do the import itself is to load the data from the flat file into a NOLOGGING staging table using SQL*Loader. After that comes the hard part: how do you efficiently scan 1 million records and validate/insert/merge the data into the production tables?

The best approach I could think of is to take chunks of 1,000-5,000 records, insert them into a temporary table, and do the rest of the processing there. However, I'm not sure how to actually scan the data from the large NOLOGGING table and then delete the processed rows from it. I could declare a cursor on the whole table (FOR UPDATE, which I believe DELETE WHERE CURRENT OF requires), loop through 1,000-5,000 rows at a time, insert them into a temporary table, delete them with DELETE ... WHERE CURRENT OF, and then process the temporary table. However, I'm not sure whether that would eat a lot of CPU and memory. I've sketched roughly what I have in mind below.
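For reference, here is roughly what I had in mind for the staging table and the SQL*Loader run. All table, column, and file names below are placeholders I invented, and I've assumed a comma-delimited file, which may not match the real layout:

    -- Staging table created NOLOGGING, so a direct path load
    -- generates minimal redo (columns are placeholders)
    CREATE TABLE stage_load (
        cust_id   NUMBER,
        amount    NUMBER,
        load_date DATE
    ) NOLOGGING;

    -- load.ctl
    LOAD DATA
    INFILE 'newdata.dat'
    APPEND
    INTO TABLE stage_load
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (cust_id,
     amount,
     load_date DATE "YYYY-MM-DD")

    -- invoked with a direct path load:
    -- sqlldr userid=scott/tiger control=load.ctl log=load.log direct=true

My understanding is that direct=true bypasses the buffer cache and, against a NOLOGGING table, generates minimal redo, which is what makes this the fast path for the initial import.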
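And here is a rough sketch of the chunked processing I described. I've used BULK COLLECT ... LIMIT and a delete by ROWID rather than a row-by-row DELETE WHERE CURRENT OF (committing across an open FOR UPDATE cursor would break, as far as I know). Again, every name is invented:

    DECLARE
        CURSOR c_stage IS
            SELECT ROWID rid, cust_id, amount
              FROM stage_load;

        TYPE t_rids IS TABLE OF ROWID;
        TYPE t_ids  IS TABLE OF stage_load.cust_id%TYPE;
        TYPE t_amts IS TABLE OF stage_load.amount%TYPE;

        l_rids t_rids;
        l_ids  t_ids;
        l_amts t_amts;
    BEGIN
        OPEN c_stage;
        LOOP
            -- fetch the next chunk of up to 5000 rows
            FETCH c_stage BULK COLLECT INTO l_rids, l_ids, l_amts LIMIT 5000;
            EXIT WHEN l_rids.COUNT = 0;

            -- copy the chunk into the temporary work table
            FORALL i IN 1 .. l_rids.COUNT
                INSERT INTO work_chunk (cust_id, amount)
                VALUES (l_ids(i), l_amts(i));

            -- ... validation queries and production DML against work_chunk go here ...

            -- remove the processed rows from the staging table
            FORALL i IN 1 .. l_rids.COUNT
                DELETE FROM stage_load WHERE ROWID = l_rids(i);

            COMMIT;  -- per-chunk commit keeps undo small, but risks
                     -- ORA-01555 on the open cursor if undo is recycled
        END LOOP;
        CLOSE c_stage;
    END;
    /

Is this the right general shape, or is the chunking itself unnecessary overhead?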
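For the final step into production, I was picturing a 9i MERGE per chunk, along these lines (prod_balances and its columns are also made up):

    MERGE INTO prod_balances p
    USING work_chunk w
    ON (p.cust_id = w.cust_id)
    WHEN MATCHED THEN
        UPDATE SET p.amount = p.amount + w.amount
    WHEN NOT MATCHED THEN
        INSERT (p.cust_id, p.amount)
        VALUES (w.cust_id, w.amount);

As I understand it, on 9i MERGE requires both the WHEN MATCHED and WHEN NOT MATCHED branches, so one statement would cover both the inserts into production and the updates for validated records.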
Any help would be greatly appreciated,