Solved

Cost is high, but the query is fast. How?

Posted on 2014-09-07
344 Views
Last Modified: 2014-09-15
Dear Experts,

My application is full of queries with a cost greater than 2000, but the application is still fast.

Which criteria should I use, and which queries should I consider, for tuning?
Why is the cost high while the queries are fast?
Question by:sakthikumar
6 Comments
 
LVL 74

Expert Comment

by:sdstuber
ID: 40308529
Cost doesn't represent speed.

It's an internal number that has no meaning outside of the optimizer.

You could consider it a design bug that the number is exposed at all.

Also, even if you try to decipher some meaning from the cost, remember that the optimizer is a collection of estimates and is sometimes wrong, so, again, don't rely on it.
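For reference, the number in question is just the COST column of a plan. A minimal sketch of where it shows up, using the standard EXPLAIN PLAN / DBMS_XPLAN tools (the table and predicate here are purely illustrative):

    -- Illustrative table/predicate; the point is where COST appears.
    EXPLAIN PLAN FOR
      SELECT * FROM employees WHERE department_id = 10;

    -- The COST column in this output is the optimizer's internal estimate,
    -- not a measurement or prediction of elapsed time.
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);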
 
LVL 38

Expert Comment

by:Gerwin Jansen, EE MVE
ID: 40308586
>> But application is still faster
Faster than what? Faster than before you made some changes? If so, what were the changes? Do you have indexes, and do you gather statistics on a regular basis?
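For what it's worth, a quick way to check both, assuming access to the DBA views (owner and table name below are placeholders):

    -- Statistics freshness for a table (placeholder owner/table name).
    SELECT table_name, num_rows, last_analyzed
    FROM   dba_tables
    WHERE  owner = 'APP_OWNER'
    AND    table_name = 'ORDERS';

    -- Indexes available on the same table, and when they were last analyzed.
    SELECT index_name, uniqueness, last_analyzed
    FROM   dba_indexes
    WHERE  owner = 'APP_OWNER'
    AND    table_name = 'ORDERS';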
 

Author Comment

by:sakthikumar
ID: 40309544
Yes, statistics are gathered on a regular basis, but indexes might not always be available.

So how can I evaluate the performance of a query and decide whether it needs tuning?
Where can I find the poorly performing queries?
 
LVL 35

Expert Comment

by:johnsone
ID: 40309616
The best place to get "poor performance" queries is the users.  If they aren't complaining about it, then it isn't a problem.

What you think is statistically a poor performing query may be within an acceptable limit to a user.  Why waste time improving a query whose result nobody will see or care about?

You could go through every query that comes through the database and try to improve it, but that seems like a waste of effort.
 
LVL 74

Assisted Solution

by:sdstuber
sdstuber earned 250 total points
ID: 40311898
In the absence of user input, ask your system admins.  If the system load were to grow, what would break first?  Would you run out of memory, CPU, or I/O?  Then check your AWR, Statspack, or other monitoring tools to find the queries that are the biggest consumers of those constrained resources.
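As a rough sketch (not a substitute for an AWR or Statspack report), something like this pulls the top statements by cumulative elapsed time from the shared pool:

    -- Top 10 statements by cumulative elapsed time currently in V$SQL.
    -- Swap the ORDER BY for buffer_gets or disk_reads depending on which
    -- resource is the constrained one.
    SELECT *
    FROM  (SELECT sql_id,
                  ROUND(elapsed_time / 1e6, 1) AS elapsed_secs,
                  ROUND(cpu_time / 1e6, 1)     AS cpu_secs,
                  buffer_gets,
                  disk_reads,
                  executions,
                  SUBSTR(sql_text, 1, 60)      AS sql_text_snippet
           FROM   v$sql
           ORDER  BY elapsed_time DESC)
    WHERE ROWNUM <= 10;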

Or, rather than looking at specific queries for tuning, look at design issues like parse-to-execute ratios on an OLTP system.  If you have a high parse ratio, you probably have developers who need to learn to use bind variables.
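A minimal way to eyeball that ratio instance-wide, using the standard V$SYSSTAT counters (cumulative since instance startup, so treat it as a rough indicator only):

    -- Parses as a percentage of executes; a high percentage on an OLTP
    -- system often points at missing bind variables.
    SELECT ROUND(
             (SELECT value FROM v$sysstat WHERE name = 'parse count (total)')
             / NULLIF((SELECT value FROM v$sysstat WHERE name = 'execute count'), 0)
             * 100, 1) AS parse_to_execute_pct
    FROM dual;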

But these are secondary to what johnsone said.  Go by your users first, specifically the processes that are costing your company money.  If you see a query that you think is bad but the customers don't care about it, move on to something they do care about.

If you have a shared system, things might get a little more complicated.  If users in db1 are complaining about a memory-intensive query, you might want to look at db2 on the same server and tune some queries there that are memory hogs.
 
LVL 4

Accepted Solution

by:jtrifts
jtrifts earned 250 total points
ID: 40311945
Cost is essentially the optimizer's estimate of the resources a query is expected to consume, as reported in its explain plan. There are no true units to the cost.

The problem with the notion of cost is that it gives the developer or DBA the false impression that small means fast and large means slow. This is not the case.

Neither is it necessarily the case that small means good and large means bad -- though most developers will aim for a lower number in general.

It is entirely possible, and in fact happens all the time, that queries with higher costs return results in less time than those with smaller costs. However, high-cost queries place a greater burden on the system when many of them are executed at the same time, hence the general practice of seeking lower-cost queries.
When tuning the wider database, attacking high-cost queries can have an overall positive impact on performance (but, as I say, it might not actually speed up specific queries).
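One way to see the gap between the estimate and reality for a single statement is to compare the optimizer's figures with actual row-source statistics. A sketch, assuming the session may use the GATHER_PLAN_STATISTICS hint and DBMS_XPLAN (the table and predicate are illustrative):

    -- Run the statement while collecting actual row-source statistics.
    SELECT /*+ GATHER_PLAN_STATISTICS */ COUNT(*)
    FROM   orders
    WHERE  order_date > SYSDATE - 30;

    -- Compare estimated rows and cost (E-Rows, Cost) against actual rows
    -- and time (A-Rows, A-Time) for the statement just executed in this
    -- session. Keep SERVEROUTPUT off so it is still the last cursor.
    SELECT *
    FROM   TABLE(DBMS_XPLAN.DISPLAY_CURSOR(NULL, NULL, 'ALLSTATS LAST +COST'));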

In terms of criteria for tuning, you should consider addressing the problem areas. How do you determine this? Do you have processes that are taking a long time to run? Are they getting longer and longer such that jobs are at risk of not completing within their available time windows?

Do you have users complaining that certain actions take too long (or that they execute something then have to get up and go get a coffee)?
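On the long-running-processes question, one rough indicator (alongside actually talking to the users) is what the database itself reports as long-running work:

    -- Operations the database currently reports as long-running
    -- (full scans, large sorts, backups, etc.).
    SELECT sid, opname, target, sofar, totalwork,
           ROUND(sofar / NULLIF(totalwork, 0) * 100, 1) AS pct_done,
           elapsed_seconds, time_remaining
    FROM   v$session_longops
    WHERE  totalwork > 0
    AND    sofar < totalwork
    ORDER  BY time_remaining DESC;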

Tuning activities can often be a process of user management -- what the focus areas should be often comes down to who is evaluating the benefit.
End users want screens to return data to them quickly.
Accountants / the finance department want you to ensure that they do not have to upgrade the tin (hardware).
Managers often want their reports to get to them faster.
Executives want their BI data loaded quickly so they are better able to get their strategic reporting from warehouses.

At the end of the day -- you will get the most value out of addressing the issues that affect the most (or the most *important*) people.
If it is the end users who hate using your system because their most common tasks take ages to complete due to poor performance, that is a good candidate.
If the end users love the system, but managers are frustrated because the batch jobs often don't finish before the batch window slips into office hours, you might need to tackle the batch jobs.
If your BI team is complaining that they cannot load the warehouse because your ODS load takes forever, go there.

Knowing your users and communicating with them to understand what makes them tick (and what pisses them off) is key to your role as a consultant. Happy users mean more business, more $$$, more contracts.

From a tuning point of view, it has been commonly believed that 80% of performance issues are found at the application layer. In our case, that means SQL, PL/SQL, etc.

After that, you can look at tuning the DB itself, then the OS, and then the hardware.

With any tuning exercise, you need to have a goal in mind. You can tune until the cows come home -- but if the system is already *good enough*, leave it alone and find something to do that makes your client happy.
