Identify Poor and Expensive Oracle SQL and Automate

enthuguy asked:
Hi Database Experts,

Scenario:
As part of code deployment, we also execute database DML and DDL. Recently a cursor statement (select/update) badly affected our PPTE environment. Since we have less data in the lower environments, we couldn't catch this there; in PPTE it took 3+ hours.

Is there an automated way to catch this kind of poor coding when executing in a lower environment, please?
E.g.:
  1. Can we add custom SQL (as a post-deployment step on every release) to generate a profile or gather stats when executing in the lower environment, so we can check that report every time?
  2. Anything that works with a non-SYS / non-SYSTEM user, i.e. the user account that executes the SQL?
  3. A SonarQube profile?
Top Expert 2008

Commented:
The passive way is to use AWR reports to identify long-running SQL statements.
The active way is to install OEM 13c and analyze its views. They will give you the whole picture. It is not easy, but it is possible.

Author

Commented:
Thx. Can this AWR report be run in a lower environment, where we have less data, to identify the poorly performing SQL?

Can the same DB user perform this, or does it need the SYSTEM user?
Top Expert 2008
Commented:
The SYS user is responsible for this:

To see the snapshot intervals:

select snap_interval, retention from dba_hist_wr_control;

select * from dba_hist_snapshot order by begin_interval_time;

select snap_id, startup_time, begin_interval_time, end_interval_time
  from dba_hist_snapshot
 order by begin_interval_time;

To run the AWR report (choose HTML format for easier reading):

SQL> spool c:\AWR_P646.txt

SQL> @?/rdbms/admin/awrrpt.sql

? stands for the ORACLE_HOME directory.
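As a sketch of how the snapshots behind that report can be controlled (assuming you are licensed for AWR and have EXECUTE on the package), the DBMS_WORKLOAD_REPOSITORY package lets you adjust the snapshot schedule and take snapshots on demand, e.g. immediately before and after a deployment so the report brackets exactly the deployment window:

```sql
-- Shorten the snapshot interval to 15 minutes and keep 30 days of history
-- (both settings are in minutes); values here are examples only.
begin
  dbms_workload_repository.modify_snapshot_settings(
    interval  => 15,
    retention => 30 * 24 * 60);
end;
/

-- Take a snapshot manually, e.g. right before and right after the deployment.
exec dbms_workload_repository.create_snapshot;
```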

Geert G, Oracle DBA
Top Expert 2009
Commented:
AWR is a paid option on Enterprise Edition.

The free way:
You could also poll v$sql regularly and sort by disk reads, buffer gets and executions in descending order,
and save the matching execution plans from v$sql_plan,
along with the history of those plans. Sometimes a new plan is generated, used only once, and causes very bad performance,
usually due to stale statistics.
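The polling described above could be sketched roughly as follows (the views and columns are standard v$sql / v$sql_plan; the top-10 cutoff and the bind variable are arbitrary examples):

```sql
-- Top 10 statements by buffer gets; run this periodically and store the results
-- so you can compare runs and spot regressions.
select *
  from (select sql_id, plan_hash_value, executions,
               disk_reads, buffer_gets, elapsed_time, sql_text
          from v$sql
         order by buffer_gets desc, disk_reads desc)
 where rownum <= 10;

-- Save the execution plan rows for a suspect statement found above.
select sql_id, plan_hash_value, id, operation, options, object_name, cost
  from v$sql_plan
 where sql_id = :suspect_sql_id
 order by id;
```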
Most Valuable Expert 2012
Distinguished Expert 2018

Commented:
I fail to see how AWR on a small test system will identify which queries will not perform well on a scaled-up production system.

The execution plans will be completely different between the two systems.

You cannot test against 1,000 rows if production has 1,000,000 rows.

You will need a test system that has enough data to represent the production system.  It won't be 100% linear but it should be close.

Author

Commented:
Thx.
What would be a reasonable approach in my scenario, please? I'm trying to avoid manual review.

I understand that data matters; is there anything I can integrate and perform without a DBA, please?
Mark Geerlings, Database Administrator
Commented:
Yes, you can (and should) calculate fresh statistics for your tables when you copy or deploy new data to a test (or production!) system.  And yes, you can do this as the schema owner.  You do not need to use the Oracle SYS or SYSTEM accounts for this.
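A minimal sketch of that statistics refresh, runnable as the schema owner as part of a post-deployment step (the options shown are common defaults, not a prescription):

```sql
-- Gather fresh optimizer statistics for the deploying schema's own objects.
-- Run as the schema owner; no SYS or SYSTEM access is needed for your own schema.
begin
  dbms_stats.gather_schema_stats(
    ownname          => user,                          -- current schema
    estimate_percent => dbms_stats.auto_sample_size,   -- let Oracle pick the sample
    cascade          => true);                         -- include index statistics
end;
/
```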

But an automated way to catch poor code? That may be a challenge! Some coding environments (like Oracle's SQL Developer) will detect some (but not all) poor code as it is being written. If data volumes or the distribution of records within a table change later, or if the amount of RAM and/or the SGA size changes later, or differs between systems, any testing or development done earlier may not be valid for the new system or new data volumes.
Most Valuable Expert 2012
Distinguished Expert 2018

Commented:
>>I understand data matters, is there anything I can integrate & perform without DBA pls

You now know the SQL that performed poorly.  In your test system, it performed well.

Load up your test system until you see an indication that it starts performing slowly. That would be the minimal data set you need. It still won't be perfect, but it is a place to start.

There is no magic to tuning.  You have to understand the data and the system accessing it.  You should know your problem tables.  I would look at any SQL that accesses them.
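One crude way to "load up" a test table as suggested above, until the problem SQL's plan changes (the table name here is hypothetical, used only for illustration):

```sql
-- Repeatedly double the row count of a copy of the problem table, then
-- re-gather its statistics and re-check the execution plan of the slow SQL
-- after each doubling.
insert into orders_test select * from orders_test;   -- hypothetical test table
commit;

exec dbms_stats.gather_table_stats(user, 'ORDERS_TEST');
```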
