Automatically Identify Poor and Expensive Oracle SQL

Hi Database Experts,

Scenario:
As part of code deployment, we also execute database DML and DDL. Recently a cursor statement (select/update) badly affected our PPTE environment. Because we have less data in the lower environments, we couldn't catch it there; in PPTE it took 3+ hours.

Is there an automated way to catch this kind of poor coding when it executes in a lower environment?
E.g.:
  1. Can we add custom SQL (as a post-deployment step on every deployment) to generate a profile or gather stats when executing in the lower environment, so we can check that report every time?
  2. Anything that works with a non-SYS / non-SYSTEM user, i.e. the user account that executes the SQL? (A rough sketch of such a check is below.)
  3. A SonarQube profile?
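
A minimal sketch of a post-deployment check that a regular (non-SYS/non-SYSTEM) user could run, assuming it has been granted SELECT on v$sql; DEPLOY_USER and the one-second-per-execution cutoff are placeholders to adapt:

-- Hypothetical post-deployment check: flag statements parsed by the
-- deployment schema whose average elapsed time exceeds a threshold.
-- DEPLOY_USER and the 1-second cutoff are placeholders.
SELECT sql_id,
       executions,
       ROUND(elapsed_time / GREATEST(executions, 1) / 1e6, 2) AS avg_elapsed_sec,
       buffer_gets,
       disk_reads,
       sql_text
  FROM v$sql
 WHERE parsing_schema_name = 'DEPLOY_USER'
   AND elapsed_time / GREATEST(executions, 1) > 1e6   -- > 1 second per execution
 ORDER BY avg_elapsed_sec DESC;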
enthuguy Asked:

schwertner Commented:
The passive way is to use AWR reports and identify long-running SQL statements.
The active way is to install OEM 13c and analyze the views it provides. They will give you the whole picture. It is not easy, but it is possible.
enthuguy (Author) Commented:
Thanks. Can this AWR report be executed in a lower environment, where we have less data, to find the poorly performing SQL?

Can the same DB user perform this, or does it have to be the SYSTEM user?
schwertner Commented:
The SYS user is responsible for this.

To see the snapshot intervals:

SELECT snap_interval, retention FROM dba_hist_wr_control;

To list the available snapshots:

SELECT * FROM dba_hist_snapshot ORDER BY begin_interval_time;

SELECT snap_id, startup_time, begin_interval_time, end_interval_time
  FROM dba_hist_snapshot
 ORDER BY begin_interval_time;

To run the AWR report (choose HTML format for easier reading):

SQL> spool c:\AWR_P646.txt
SQL> @?/rdbms/admin/awrrpt.sql

The ? stands for the ORACLE_HOME directory.
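
To bracket a deployment in its own AWR window, you can also take manual snapshots around it. A minimal sketch using the documented DBMS_WORKLOAD_REPOSITORY package (requires appropriate privileges and a Diagnostics Pack license):

-- Take a snapshot just before running the deployment DML/DDL ...
EXEC DBMS_WORKLOAD_REPOSITORY.CREATE_SNAPSHOT;

-- ... run the deployment scripts here ...

-- ... then take another just after, and run awrrpt.sql
-- against these two snap IDs.
EXEC DBMS_WORKLOAD_REPOSITORY.CREATE_SNAPSHOT;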

Geert G (Oracle DBA) Commented:
AWR requires the Diagnostics Pack, a paid option on Enterprise Edition.

A free way: you could poll v$sql regularly and sort by disk reads, buffer gets, and executions in descending order, then save the matching execution plans from v$sql_plan, plus the history of those plans. Sometimes a new plan gets generated, is used only once, and causes very bad performance, usually due to stale statistics. A sketch of such a poll is below.
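
A minimal sketch of that poll, assuming SELECT access to v$sql and v$sql_plan (FETCH FIRST needs 12c or later; the top-20 cutoff is arbitrary):

-- Top resource consumers currently cached in the shared pool, worst first.
SELECT sql_id, plan_hash_value, disk_reads, buffer_gets,
       executions, sql_text
  FROM v$sql
 ORDER BY disk_reads DESC, buffer_gets DESC, executions DESC
 FETCH FIRST 20 ROWS ONLY;

-- Save the execution plan of a suspect statement
-- (substitute sql_id and plan_hash_value from the query above).
SELECT *
  FROM v$sql_plan
 WHERE sql_id = '&sql_id'
   AND plan_hash_value = &plan_hash_value;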
slightwv (Netminder) Commented:
I fail to see how AWR on a small test system will identify which queries will not perform well on a scaled-up production system.

The execution plans will be completely different between the two systems.

You cannot test against 1,000 rows if production has 1,000,000 rows.

You will need a test system that has enough data to represent the production system. It won't be 100% linear, but it should be close.
enthuguy (Author) Commented:
Thanks.
What would be a reasonable approach in my scenario, please? I'm trying to avoid manual review.

I understand that data matters; is there anything I can integrate and perform without a DBA?
Mark Geerlings (Database Administrator) Commented:
Yes, you can (and should) gather fresh statistics for your tables when you copy or deploy new data to a test (or production!) system. And yes, you can do this as the schema owner; you do not need the Oracle SYS or SYSTEM accounts for this (see the sketch below).
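
A minimal sketch of doing that as the schema owner with the documented DBMS_STATS package; the defaults shown (auto sample size, cascade to indexes) are generally sensible starting points:

-- Gather fresh optimizer statistics for the current schema's objects;
-- runnable by the schema owner, no SYS/SYSTEM account needed.
BEGIN
   DBMS_STATS.GATHER_SCHEMA_STATS(
      ownname          => USER,                           -- current schema
      estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,
      cascade          => TRUE);                          -- indexes too
END;
/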

But an automated way to catch poor code? That may be a challenge! Some coding environments (like Oracle's SQL Developer) will detect some (but not all) poor code as it is being written. If data volumes or the distribution of records within a table change later, or if the amount of RAM and/or the SGA size changes later or differs between systems, any testing or development done earlier may no longer be valid for the new system or new data volumes.
slightwv (Netminder) Commented:
>>I understand data matters, is there anything I can integrate & perform without DBA pls

You now know the SQL that performed poorly.  In your test system, it performed well.

Load up your test system until you see an indication that it starts performing slowly. That would be the minimal data set you need. It still won't be perfect, but it is a place to start.

There is no magic to tuning. You have to understand the data and the system accessing it. You should know your problem tables; I would look at any SQL that accesses them. A sketch of such a check is below.
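
A minimal sketch of finding every cached statement that touches a known problem table; MY_PROBLEM_TABLE is a placeholder, and the LIKE match is crude (it can also catch comments or similarly named objects):

-- Find cached statements referencing a known problem table,
-- heaviest readers first. MY_PROBLEM_TABLE is a placeholder.
SELECT sql_id, executions, buffer_gets, disk_reads, sql_text
  FROM v$sql
 WHERE UPPER(sql_fulltext) LIKE '%MY_PROBLEM_TABLE%'
 ORDER BY buffer_gets DESC;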