TClevel (United States of America) asked:
Oracle PL/SQL Procedure is running very slow

I have an Oracle PL/SQL procedure that is running very slowly. It is trying to process over a million records: it checks a field and does an insert, update, or delete depending on the value of that field. Can someone show me a better way to write this procedure to process these records? The process has been running for 4 1/2 days now.
I have attached the procedure below.

thanks


load-procedure.txt
ASKER CERTIFIED SOLUTION
jwahl (Austria)
[solution text available to members only]
SOLUTION
[solution text available to members only]
It's worth pointing out that the data_custom_staging table should, at a minimum, have an index on the account_number column. This would probably help with the subqueries; however, it really depends on data volumes and which version of Oracle/optimiser is being used.
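A minimal sketch of adding that index, assuming the table and column names used in this thread (the index name is illustrative; adapt it to your naming conventions):

```sql
-- Index the staging table's account_number column (names assumed from this thread).
CREATE INDEX data_custom_staging_acc_idx
    ON data_custom_staging (account_number);

-- Gather fresh statistics so the optimiser knows about the new index.
EXEC DBMS_STATS.GATHER_TABLE_STATS(USER, 'DATA_CUSTOM_STAGING');
```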
TClevel (Asker):
I do not have an index on the staging table.
The index on the staging table would only be helpful with the set-based SQL example I gave you. It wouldn't make any difference with your original PL/SQL code, as you only access the staging table once and read every row within a cursor loop.

Adding an index on account_number to the data_custom_accs table (as mentioned by jwahl above) would help your original script, however, I believe using set processing would give you a much bigger performance increase.
TClevel (Asker):
Okay I will try the set processing.

What is the syntax for the set processing?
Set processing is a concept rather than a specific syntax.

I'm sure you're aware that you can write straightforward SQL statements (SELECT, INSERT, UPDATE, DELETE, etc.) to manipulate data. These statements act on a "set" of data. If you have a lot of data to process, it is preferable to write fewer SQL statements, each acting on a large "set".

When you start to use cursor loops in PL/SQL, you are introducing lots of repetitive processing, which doesn't lend itself to handling large volumes of data.

So rather than writing cursor loops to process a single row at a time (or even multiple rows using bulk fetches etc) it is always better to write SQL statements that act on many rows in one go.
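To illustrate the contrast, here is a hypothetical sketch (table and column names are made up for illustration, not taken from the asker's procedure): the first form issues one UPDATE per row from a cursor loop, the second touches every matching row with a single statement.

```sql
-- Row-by-row (slow): one UPDATE per iteration of a cursor loop.
BEGIN
    FOR r IN (SELECT account_number, balance FROM staging_tab) LOOP
        UPDATE accounts_tab
           SET balance = r.balance
         WHERE account_number = r.account_number;
    END LOOP;
END;
/

-- Set-based (fast): one correlated UPDATE acting on all matching rows at once.
UPDATE accounts_tab a
   SET balance = (SELECT s.balance
                    FROM staging_tab s
                   WHERE s.account_number = a.account_number)
 WHERE EXISTS (SELECT 1
                 FROM staging_tab s
                WHERE s.account_number = a.account_number);
```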

The sample code snippet I gave above is an example of completely changing the structure of your program to use SQL statements that perform a particular function on large sets of data e.g. identify ALL duplicate rows in the staging table and move them to a sidelined table, delete ALL duplicate rows from the staging table,  insert ALL new rows into the account table, update ALL existing rows in the account table and finally, delete ALL redundant rows from the accounts table.
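Since the original snippet is visible only to members, here is a rough sketch of the sequence of steps just described. All table names (data_custom_staging, data_custom_accs, data_custom_sideline) and the duplicate test are assumptions based on this thread; the column lists and matching rules must come from your actual schema.

```sql
-- 1. Copy duplicate staging rows to a sideline table for reporting.
INSERT INTO data_custom_sideline
SELECT *
  FROM data_custom_staging
 WHERE account_number IN (SELECT account_number
                            FROM data_custom_staging
                           GROUP BY account_number
                          HAVING COUNT(*) > 1);

-- 2. Delete those duplicates from the staging table.
DELETE FROM data_custom_staging
 WHERE account_number IN (SELECT account_number FROM data_custom_sideline);

-- 3. Insert staging rows that do not yet exist in the accounts table.
INSERT INTO data_custom_accs
SELECT *
  FROM data_custom_staging s
 WHERE NOT EXISTS (SELECT 1
                     FROM data_custom_accs a
                    WHERE a.account_number = s.account_number);

-- 4. Update accounts rows that already exist (column list depends on your
--    schema); steps 3 and 4 could instead be combined into a single MERGE.

-- 5. Delete accounts rows no longer present in staging.
DELETE FROM data_custom_accs a
 WHERE NOT EXISTS (SELECT 1
                     FROM data_custom_staging s
                    WHERE s.account_number = a.account_number);
```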

The key here (and the essence of set processing) is that each sql statement performs a specific task, but it performs it on a large data set and performs it only once.

As these are still standard SQL statements, you can run them from a PL/SQL block or stored procedure, so the sample code snippet I gave earlier could exist within a PL/SQL block. Obviously, if you want to report selected data (as with the sidelined rows), you would need to perform the select as a cursor loop so dbms_output can write out each row. Using a bulk fetch would increase performance; however, I would assume you don't expect that many accounts to be in error, so this is probably unnecessary.
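The reporting loop mentioned above might look like this (a sketch only; the sideline table and its columns are assumptions from this thread):

```sql
SET SERVEROUTPUT ON

-- Report each sidelined (duplicate) account via DBMS_OUTPUT.
BEGIN
    FOR r IN (SELECT account_number, COUNT(*) AS dup_count
                FROM data_custom_sideline
               GROUP BY account_number) LOOP
        DBMS_OUTPUT.PUT_LINE('Duplicate account: ' || r.account_number ||
                             ' (' || r.dup_count || ' rows)');
    END LOOP;
END;
/
```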

Alternatively, the whole script could be run through SQL*Plus.  The choice on how to run the code is down to you.
SOLUTION
[solution text available to members only]