Swaminathan K
asked on
Oracle SQL queries -- Challenging question
Hi Team,
I have a requirement: I need to write an ETL process using Oracle SQL and PL/SQL to apply transformations and then load the results into the target tables. The data is already loaded in the staging tables; I need to come up with performance-oriented code to do the transformation. I have around 15 lakh records in my staging table.
I am planning two approaches for this:
1. Use normal SQL statements to apply the transformation, use temporary tables to store the data, and then load it into the target tables.
2. Use PL/SQL collections to process 1000 records at a time, bulk-insert them into temporary tables, and then load them into the target tables.
In both cases I need to join multiple tables to fetch the data and then apply the transformation. The transformation can be applied by looking into some lookup tables or by direct formatting of the data.
I want to know which method is best for this requirement.
Any help is really appreciated.
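A minimal sketch of what approach 2 could look like, assuming hypothetical staging, lookup and target tables named stg_orders, lkp_region and tgt_orders:

```sql
DECLARE
  CURSOR c_stg IS
    SELECT s.order_id,
           UPPER(TRIM(s.cust_name)) AS cust_name,   -- direct formatting
           l.region_code                            -- lookup-table transformation
    FROM   stg_orders s
    JOIN   lkp_region l ON l.region_id = s.region_id;
  TYPE t_rows IS TABLE OF c_stg%ROWTYPE;
  v_rows t_rows;
BEGIN
  OPEN c_stg;
  LOOP
    FETCH c_stg BULK COLLECT INTO v_rows LIMIT 1000;  -- 1000 records at a time
    EXIT WHEN v_rows.COUNT = 0;
    FORALL i IN 1 .. v_rows.COUNT                     -- one bulk insert per batch
      INSERT INTO tgt_orders VALUES v_rows(i);
  END LOOP;
  CLOSE c_stg;
  COMMIT;
END;
/
```

All table and column names above are placeholders; the real cursor would carry whatever transformations apply.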
In my opinion you can combine both methods. You can also load the data into the target tables without using temporary tables.
In the past I have done similar work to what you describe. Our ETL process was similar to 2), but we didn't use temporary tables; materialized views were our target tables.
This documentation was our basis:
http://docs.oracle.com/cd/E11882_01/server.112/e25554/toc.htm
I am looking forward to answers from other experts, because ETL is very interesting and I think every ETL is unique.
Temporary tables are really good if we are dealing with a large number of rows.
ASKER
Can I create partitions in my temporary tables to increase performance? Is it possible to create an index on temporary tables, and will it have any impact on query performance?
No, Oracle global temporary tables cannot be partitioned. You can, however, create indexes on them.
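For illustration, a sketch of a global temporary table with an index (table and column names are made up):

```sql
-- A global temporary table cannot be partitioned, but it can be indexed.
CREATE GLOBAL TEMPORARY TABLE tmp_orders (
  order_id    NUMBER,
  region_code VARCHAR2(10),
  amount      NUMBER
) ON COMMIT PRESERVE ROWS;   -- rows survive commits, until the session ends

-- The index is allowed; its data, like the table's, is private to each session.
CREATE INDEX tmp_orders_ix ON tmp_orders (order_id);
```

Whether the index helps depends on how the temp table is queried; for a full-scan-and-load pattern it mostly just slows the inserts down.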
1.5 million records is not all that much.
Try to create them in one go and then commit.
I hope, for your sake, that they don't load the data into the staging table at a one-commit-per-record rate.
Why do you want to use a temporary table?
Why do you need PL/SQL? Are your transformations something that can't be done in SQL?
I agree that temporary tables should be a last resort. They are necessary for some database products but rarely necessary in Oracle.
I also agree that PL/SQL should be avoided unless absolutely necessary.
I wanted to comment on the indexes: Indexes slow down inserts. If you want performance, why would you want indexes on your temp tables?
As far as which method you use: test, test, then test some more. As mentioned above, every situation is different. What works for one situation may not work in another.
If you go with smaller batches and commit often, what happens when the 6th batch fails? Will you need to back out the previous 5? If so, don't do batches. Do everything as a single transaction, unless you just don't have the resources.
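The single-transaction idea can be sketched like this (stg_orders and tgt_orders are hypothetical names):

```sql
BEGIN
  INSERT INTO tgt_orders (order_id, amount)
  SELECT order_id, ROUND(amount, 2)
  FROM   stg_orders;
  COMMIT;              -- one commit: the load is all-or-nothing
EXCEPTION
  WHEN OTHERS THEN
    ROLLBACK;          -- on failure there are no earlier batches to back out
    RAISE;
END;
/
```

Because nothing is committed until the end, a failure anywhere leaves the target table untouched.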
ASKER
Hi Sdstuber,
The process goes like this:
1. First we pull the records from the base tables, i.e. staging.
2. We apply transformations using functions or some lookup tables.
3. Then we load into the target table.
This is what we are doing.
The fastest way in Oracle to do a three-step process like that is to use a single SQL statement, something like this:
insert into [target_table]
select ...
from [source_table]
with no temp tables or PL/SQL at all.
If the transformations or lookups are too complex to write into a single query, or perform too slowly when you try to do them all at once, then it will make sense to use a temporary table or a PL/SQL procedure with a cursor loop, or even both, depending on your data and your transformations.
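As one possible shape for the temp-table variant, here is a sketch that materializes the transformation first and then loads the target (all names and expressions are illustrative):

```sql
-- Step 1: materialize the complex transformation into a temporary table.
INSERT INTO tmp_orders (order_id, region_code, amount)
SELECT s.order_id,
       NVL(l.region_code, 'UNKNOWN'),     -- lookup with a default for misses
       ROUND(s.amount * s.fx_rate, 2)     -- direct formatting of the data
FROM   stg_orders s
LEFT JOIN lkp_region l ON l.region_id = s.region_id;

-- Step 2: load the target from the already-transformed rows.
INSERT INTO tgt_orders (order_id, region_code, amount)
SELECT order_id, region_code, amount
FROM   tmp_orders;

COMMIT;
```

Splitting the work this way only pays off when the combined query is genuinely too complex or too slow as a single statement.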
I like slightwv's suggestion to "test, test and test" with a few different options to see what works best with your data, your Oracle and O/S versions, and your hardware.
Also note that "lakh" is not an English word. The usual English expression uses millions: 1,000,000 is 10 lakh, so 15 lakh is 1.5 million. I only know what "lakh" means because I read a book by a Pakistani author who used the word but explained it, knowing it is mainly understood by readers from a Hindi (or Hindi-influenced) background.
I agree with sdstuber; the best way is to use select queries for insertion, using levels of subqueries. If the process is very complex, you can save long subqueries into temporary tables and then use them in your main select.
Where possible, instead of using user-defined functions, try to get the values through a select.
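To illustrate the point about user-defined functions, a sketch with a hypothetical function get_region_code and lookup table lkp_region:

```sql
-- Slower: a PL/SQL function called once per row forces a SQL-to-PL/SQL
-- context switch for every record.
SELECT s.order_id,
       get_region_code(s.region_id) AS region_code
FROM   stg_orders s;

-- Faster: the same lookup expressed as a plain join, resolved entirely in SQL.
SELECT s.order_id,
       l.region_code
FROM   stg_orders s
JOIN   lkp_region l ON l.region_id = s.region_id;
```

A scalar subquery in the select list is another pure-SQL option when the lookup is optional.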
ASKER
awesome