Solved

Make this query run faster

Posted on 2004-09-24
1,492 Views
Last Modified: 2008-01-09
We are running this query on a DB2 database and are wondering if there is anything that can be done to make it run faster?

with temp(id,desk,CUR_MTD_TOTAL, WTD, CUR_FY_YTD, PRV_MTD_TOTAL, PRV_BUS_DAY,PRV_DAY_AVG, PRV_WEEK_AVG,PRV_MONTH_AVG, PRV_FY_YTD,CUR_REV_ANN,PRV_FY,FORECAST,TOTAL_YTD_ACCRUED, NEXT_PLAN_PRIM,NEXT_PLAN_SECDY,NEXT_PLAN) as (
    select rootoe.row_id,rootoe.name as desk, sum(prev.CUR_MTD_TOTAL), sum(prev.WTD), sum(prev.CUR_FY_REV), sum(prev.PRV_MTD_TOTAL) , sum(prev.PRV_BUS_DAY) , sum(prev.DAILY_AVG) ,sum(prev.WEEKLY_AVG) ,sum(prev.PRV_MONTH_AVG) ,sum(prev.PRV_YTD_REV) ,sum(prev.PRV_FY_REV),sum(prev.CUR_FY_ANN_REV),0,0,0,0,0
    from cx_matrix_pos tr
    inner join cx_revenue prev on prev.postn_id=tr.child_pos
    inner join s_org_ext oe on oe.row_id=prev.oe_id
    inner join s_postn postn on postn.row_id=tr.child_pos
    inner join s_org_ext rootoe on rootoe.row_id=oe.frght_terms_cd
    where tr.parent_pos='1-156J4' and oe.FRGHT_TERMS_INFO like 'Investment Advisor' and Ucase(rootoe.name) like '%'
    group by rootoe.row_id,rootoe.name
    union all
    select rootoe.row_id,rootoe.name as desk,  0,0,0,0,0,0,0,0,0,0,0,sum(bu.forecast),sum(bu.total_ytd_accrued),0,0,0
    from cx_matrix_pos tr
    inner join cx_forecast_fy bu on bu.position_id=tr.child_pos
    inner join cx_fiscal_year fy on fy.fiscal_year=bu.fiscal_year
    inner join s_postn postn on postn.row_id=tr.child_pos
    inner join s_org_ext oe on oe.row_id=bu.ou_id
    inner join s_org_ext rootoe on rootoe.row_id=oe.frght_terms_cd
    where tr.parent_pos='1-156J4' and oe.FRGHT_TERMS_INFO like 'Investment Advisor' and Ucase(rootoe.name) like '%'
    group by rootoe.row_id,rootoe.name
   union all
     select rootoe.row_id,rootoe.name as desk,  0,0,0,0,0,0,0,0,0,0,0,0,0,sum(bu.ni_forecast),sum(bu.sec_forecast),sum(bu.forecast)
    from cx_matrix_pos tr
    inner join cx_forecast bu on bu.position_id=tr.child_pos
    inner join s_period per on per.row_id=bu.period_id
    inner join s_postn postn on postn.row_id=tr.child_pos
    inner join s_org_ext oe on oe.row_id=bu.ou_id
    inner join s_org_ext rootoe on rootoe.row_id=oe.frght_terms_cd
    where tr.parent_pos='1-156J4' and oe.FRGHT_TERMS_INFO like 'Investment Advisor' and Ucase(rootoe.name) like '%'
    and per.desc_text='Fiscal Year 2005'
    group by rootoe.row_id,rootoe.name
)
 select id, desk,
        cast(sum(CUR_MTD_TOTAL) as double) as Cur_Month,
        cast(sum(FORECAST) as double) as Cur_Plan_FY,
        cast(sum(TOTAL_YTD_ACCRUED) as double) as Cur_Plan_YTD,
        cast(sum(WTD) as double) as Cur_Week,
        cast(sum(CUR_FY_YTD) as double) as Cur_YTD,
        cast(sum(PRV_MTD_TOTAL) as double) as Prv_Month,
        cast(sum(PRV_BUS_DAY) as double) as Prv_Day,
        cast(sum(PRV_DAY_AVG) as double) as Prv_Day_Avg,
        cast(sum(PRV_WEEK_AVG) as double) as Prv_Week_Avg,
        cast(sum(PRV_MONTH_AVG) as double) as Prv_Month_Avg,
        cast(sum(PRV_FY_YTD) as double) as Prv_YTD,
        cast(sum(PRV_FY) as double) as Prv_FY,
        cast(sum(CUR_REV_ANN) as double) as Cur_Rev_Ann,
        cast(sum(NEXT_PLAN_PRIM) as double) as Next_Plan_Prim,
        cast(sum(NEXT_PLAN_SECDY) as double) as Next_Plan_Secdy,
        cast(sum(NEXT_PLAN) as double) as Next_Plan
 from temp
 group by id, desk
 order by Cur_YTD desc
Question by:leonstryker
19 Comments
 
LVL 10

Expert Comment

by:Jay Toops
ID: 12143585
Ideas ..
a) looks like this would make a nice View ...
b) Remove/Replace Ucase(rootoe.name) like '%'
with something like rootoe.name <> ''
c) Create an SP to do this... and run it every so often, caching the results in a table


Questions
a) how long does it take
b) how many rows does it return
c) what are the tables involved, and
d) an example of what it returns.


Jay

ps... THAT'S one fat hairy monster of a query you have there
 
LVL 29

Author Comment

by:leonstryker
ID: 12143832
This query runs on DB2 v6; stored procedures are not supported.

It currently runs in 11 sec.

CX_REVENUE 500000 rows
CX_FORECAST 140000 rows
Cx_forecast_fy 15000 rows

Returns 634 rows

Purpose of Ucase() like '%' -- at runtime this gets replaced with the user's query string
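
For example, if a user searched for "Smith", the filter that actually runs would end up looking something like this (the 'SMITH' value is purely an illustration):

    where tr.parent_pos='1-156J4'
      and oe.FRGHT_TERMS_INFO like 'Investment Advisor'
      and Ucase(rootoe.name) like '%SMITH%'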
 
LVL 29

Author Comment

by:leonstryker
ID: 12143898
How will a view improve this?
 
LVL 10

Assisted Solution

by:Jay Toops
Jay Toops earned 100 total points
ID: 12143930
You could create the view with all the code EXCEPT the like statement, which is the part that changes.

Views are basically compiled queries: the data is already arranged and organized the way it is needed, and then you query the view just like any other table, using your Like statement, which is basically the only "DYNAMIC" piece of this query.
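
Something along these lines (untested, the view name is made up, and I'm only showing the first branch of your UNION to keep it short):

    create view v_desk_revenue (id, desk, cur_mtd_total, wtd, cur_fy_ytd,
                                prv_mtd_total, prv_bus_day, prv_day_avg, prv_week_avg,
                                prv_month_avg, prv_fy_ytd, cur_rev_ann, prv_fy) as
    select rootoe.row_id, rootoe.name,
           sum(prev.cur_mtd_total), sum(prev.wtd), sum(prev.cur_fy_rev),
           sum(prev.prv_mtd_total), sum(prev.prv_bus_day), sum(prev.daily_avg),
           sum(prev.weekly_avg), sum(prev.prv_month_avg), sum(prev.prv_ytd_rev),
           sum(prev.prv_fy_rev), sum(prev.cur_fy_ann_rev)
    from cx_matrix_pos tr
    inner join cx_revenue prev on prev.postn_id = tr.child_pos
    inner join s_org_ext oe on oe.row_id = prev.oe_id
    inner join s_postn postn on postn.row_id = tr.child_pos
    inner join s_org_ext rootoe on rootoe.row_id = oe.frght_terms_cd
    where tr.parent_pos = '1-156J4'
      and oe.FRGHT_TERMS_INFO like 'Investment Advisor'
    group by rootoe.row_id, rootoe.name

At run time the application would then only have to do something like:

    select * from v_desk_revenue where Ucase(desk) like '%'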

According to the manual, DB2 v6 supports stored procedures.

Jay
 
LVL 29

Author Comment

by:leonstryker
ID: 12144058
Stored procedures for DB2 v6 are written in C++.
 
LVL 10

Expert Comment

by:Jay Toops
ID: 12144089
bummer...
Jay
 
LVL 29

Author Comment

by:leonstryker
ID: 12144356
Yeah ....
 
LVL 10

Expert Comment

by:Jay Toops
ID: 12144466
Really, a view should help.

Jay
 
LVL 29

Author Comment

by:leonstryker
ID: 12144560
We will try it.
 
LVL 18

Expert Comment

by:SjoerdVerweij
ID: 12145775
Like Jay said: the like '%' stuff is killing you. It is very likely forcing a table scan. Replace with <> ''.
 
LVL 29

Author Comment

by:leonstryker
ID: 12146018
SjoerdVerweij,

In this case the like '%' is a variable to which a value is bound at run time. It is not a condition for items which are not empty.
 
LVL 18

Expert Comment

by:SjoerdVerweij
ID: 12146450
Man oh man I'm fried today. Okay, what kind of terms do you put in there?
 
LVL 10

Expert Comment

by:Jay Toops
ID: 12146458
LOL have a nice weekend guys..

JAY
 
LVL 29

Author Comment

by:leonstryker
ID: 12146609
It's the name of a person.
 
LVL 4

Expert Comment

by:derekpapesch
ID: 12149368
That query is ugly.  

There is nothing wrong with it if you only plan to run it once or twice, run it infrequently, or run it on very small tables; but if that isn't the case, you should look at each join and see which ones you can remove.

Take a look at the tables you are using, and find which ones are the biggest, to work out which joins are going to be the most expensive. Then check to make sure you have indexes on the join columns, for example as sketched below.
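
Something along these lines (the index names are made up, and the row_id primary keys are very likely indexed already, so check what exists before adding anything):

    create index cx_matrix_pos_ix1  on cx_matrix_pos (parent_pos, child_pos);
    create index cx_revenue_ix1     on cx_revenue (postn_id);
    create index cx_forecast_ix1    on cx_forecast (position_id, period_id);
    create index cx_forecast_fy_ix1 on cx_forecast_fy (position_id, fiscal_year);
    create index s_org_ext_ix1      on s_org_ext (frght_terms_cd);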

Find out where the query is burning all its time, by using an analysis package (I don't know DB2) or just by breaking it down into smaller queries to find out which part is slow.

Also, when you join data, joining 100 rows will usually be much faster than joining 10,000, so if you filter the data first (perhaps using a subquery) and then join, it will be much faster than joining the data and then filtering. (And don't forget indexes.) See the sketch below.
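
In your case, for instance, you could push the position filter into a derived table so the join against the 500,000-row cx_revenue table only ever sees the children of '1-156J4'. A rough, untested sketch of just the first branch, trimmed to one measure:

    select rootoe.row_id, rootoe.name, sum(prev.cur_mtd_total)
    from (select child_pos
          from cx_matrix_pos
          where parent_pos = '1-156J4') as tr
    inner join cx_revenue prev on prev.postn_id = tr.child_pos
    inner join s_org_ext oe on oe.row_id = prev.oe_id
    inner join s_postn postn on postn.row_id = tr.child_pos
    inner join s_org_ext rootoe on rootoe.row_id = oe.frght_terms_cd
    where oe.FRGHT_TERMS_INFO like 'Investment Advisor'
    group by rootoe.row_id, rootoe.name

Whether that actually changes the access plan depends on the optimizer, so compare timings before and after.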

Lastly, if the speed of this query is important, you might want to de-normalize your data. DB2 should have a feature like Oracle's snapshot, which will create a view, then keep updating it at set time intervals to keep it current. You treat the snapshot like a normal table and get very fast access to the data.

Hope this gets you started.

P.S. Unless DB2 has special features built into its views, I can't see how a view will speed up your query. The execution path may be precompiled, but that cost will be cheap alongside joins on 500,000 rows.

P.P.S. Indexes should be your first port of call.
 
LVL 3

Expert Comment

by:granbajo
ID: 12168751

DB2 has something called "summary tables"; I am pretty sure they are available in version 6, although you would have to check. Check the SQL reference for the CREATE TABLE statement. They are used to create "persistent views", and they can be recalculated every time you add a row to the base tables, so querying them would be much faster.
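
If they do turn out to be available, the rough shape would be something like this (untested, the table name is made up, and I'm only sketching the first branch of the query):

    create summary table desk_rev_summary as
    ( select rootoe.row_id as id, rootoe.name as desk,
             sum(prev.cur_mtd_total) as cur_mtd_total
      from cx_matrix_pos tr
      inner join cx_revenue prev on prev.postn_id = tr.child_pos
      inner join s_org_ext oe on oe.row_id = prev.oe_id
      inner join s_org_ext rootoe on rootoe.row_id = oe.frght_terms_cd
      where tr.parent_pos = '1-156J4'
      group by rootoe.row_id, rootoe.name )
    data initially deferred refresh deferred;

    refresh table desk_rev_summary;

REFRESH IMMEDIATE is the flavour that is kept in sync on every insert, but it has more restrictions on the fullselect, so I've shown REFRESH DEFERRED here.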

 
LVL 29

Author Comment

by:leonstryker
ID: 12183446
"summary tables" are not available in version 6

:(
 
LVL 4

Accepted Solution

by:derekpapesch
derekpapesch earned 400 total points
ID: 12185836
Looks like there are a few things missing in v6!

Depending on how frequently your data is updated, how fast you need the query to run, and whether you need the very latest data, you could create a table, schedule a script (to be called from outside the database if v6 doesn't have any way of doing it!), and keep refreshing the table every day (/hour/month). That way you would have a lightning-fast query, but would only see the data as of the latest update.

Of course, this approach will only work if you are happy with serving data which may be several hours (/minutes/days) old.
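
A rough sketch of that pattern (all names and column types are guesses, and I've trimmed the measures down to two; the real insert would use your full WITH temp(...) query):

    -- one-off setup: a permanent results table
    create table desk_summary (
        id            varchar(15) not null,
        desk          varchar(100),
        cur_month     double,
        cur_ytd       double,
        refreshed_at  timestamp
    );

    -- refresh script, kicked off by an external scheduler (cron or similar)
    delete from desk_summary;

    insert into desk_summary (id, desk, cur_month, cur_ytd, refreshed_at)
    select rootoe.row_id, rootoe.name,
           cast(sum(prev.cur_mtd_total) as double),
           cast(sum(prev.cur_fy_rev) as double),
           current timestamp
    from cx_matrix_pos tr
    inner join cx_revenue prev on prev.postn_id = tr.child_pos
    inner join s_org_ext oe on oe.row_id = prev.oe_id
    inner join s_org_ext rootoe on rootoe.row_id = oe.frght_terms_cd
    where tr.parent_pos = '1-156J4'
      and oe.FRGHT_TERMS_INFO like 'Investment Advisor'
    group by rootoe.row_id, rootoe.name;

The online query then becomes a trivial select * from desk_summary where Ucase(desk) like '%' order by cur_ytd desc.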
 
LVL 29

Author Comment

by:leonstryker
ID: 12186680
derekpapesch,

I was actually considering this approach and am currently attempting to implement it.  I will let you know as soon as possible.