Finding corresponding column of a row that has a maximum value

Posted on 2009-05-12
Last Modified: 2012-05-06
I have 2 tables: a daily table and an hourly table. The daily table has the following columns:
daily_metric float;
avg_30day float;
max_hour_value float;
max_hour date;
timestamp date; /* timestamp is always = trunc(timestamp) */

while the hourly table has:
metric_value float;
timestamp date; /* timestamp is always = trunc(timestamp, 'hh') */

I'm trying to create an update statement that will update the daily table as follows:
avg_30day = average of previous 30 days' daily_metric
max_hour_value = maximum metric_value in hour table for that day
max_hour = corresponding timestamp of the row that has maximum metric_value

The problem with my statement below is that I can't find the correct SQL to return, from the hourly table, the timestamp of the row that has the maximum metric_value for that day. I get ORA-00937: not a single-group group function. If I remove the busy_hour column and h1.timestamp from the statement, it works.
update daily d2
set (avg_30day, busy_hour_value, busy_hour) =
  (select avg(d1.daily_metric), max(h1.metric_value), h1.timestamp
   from daily d1
     left outer join hourly h1
       on d1.id = h1.id and d1.timestamp = trunc(h1.timestamp)
   where d1.id = d2.id
     and d1.timestamp > (d2.timestamp - 29)
     and d1.timestamp <= d2.timestamp)


Question by:jdymahal
Expert Comment

ID: 24363940
Just run the select statement on its own to check whether it retrieves the correct values. I suspect the problem is that the group by clause is missing from the select statement. It should contain group by h1.timestamp, since avg() and max() are group functions.

Hope this helps. Let me know if you have any more questions.

Accepted Solution

gatorvip earned 125 total points
ID: 24367357
You can accomplish this via analytic functions; see below for an example that you should be able to adapt to your 30-day requirement. If this is not enough, post some sample values with the desired result.

What happens if you have two different timestamps that share the same maximum daily value for your metric? If you have a specific criterion (for example, the first one during the day), add it as a tie-breaker to the 'order by metric_value' clause.
select avg(d1.daily_metric) over() avg_30_days,
       h1.ts max_hour_ts, h1.metric_value max_hour_value
from daily d1
  left outer join (select h.*,
                          row_number() over(partition by trunc(h.ts)
                                            order by metric_value desc) max_hour
                   from hourly h) h1
    on d1.ts = trunc(h1.ts)
where h1.max_hour = 1
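The row_number() trick above can be seen outside Oracle as well: rank the rows within each day by metric_value descending, then keep only the row ranked 1, which carries its timestamp along. Below is a minimal sketch using Python's sqlite3 (SQLite >= 3.25 supports window functions); the table name, column names, and sample rows are made up for illustration.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
create table hourly (ts text, metric_value real);
insert into hourly values
  ('2009-05-01 01:00', 3.0),
  ('2009-05-01 14:00', 9.5),
  ('2009-05-01 23:00', 7.1),
  ('2009-05-02 08:00', 4.2),
  ('2009-05-02 09:00', 6.6);
""")

# row_number() = 1 within each day picks the row holding that day's maximum,
# and the timestamp comes along for free -- the same idea as the Oracle query.
rows = con.execute("""
select day, ts, metric_value from (
  select date(ts) day, ts, metric_value,
         row_number() over (partition by date(ts)
                            order by metric_value desc) rn
  from hourly)
where rn = 1
order by day
""").fetchall()
print(rows)
```

Each result row pairs a day with the timestamp and value of its busiest hour, which is exactly what the update statement needs.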



Expert Comment

ID: 24367821
Try the statement below. It uses the FIRST function, which lets you get a value from the first row of a sorted group where that value is not what you're sorting on. See the Oracle docs for more info:

Oracle® Database SQL Reference

update daily d2
set (avg_30day, max_hour_value, max_hour) =
  ( select avg(d1.daily_metric),
           max(h1.metric_value),
           max(h1.date_time) keep (dense_rank first order by h1.metric_value desc)
    from daily d1
      left outer join hourly h1
        on ( d1.id = h1.id and d1.date_time = trunc(h1.date_time) )
    where d1.id = d2.id
      and d1.date_time > (d2.date_time - 29)
      and d1.date_time <= d2.date_time)
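The KEEP (DENSE_RANK FIRST ...) clause aggregates over only the top-ranked rows: restrict to the rows that rank first under the ORDER BY, then apply the outer aggregate to them. A plain-Python sketch of those semantics, on made-up hourly rows:

```python
# Emulate Oracle's  max(ts) keep (dense_rank first order by metric desc):
# keep only the rows ranked first by metric desc, then take max(ts) of those.
hourly = [
    ("2009-05-01 01:00", 3.0),
    ("2009-05-01 14:00", 9.5),
    ("2009-05-01 23:00", 7.1),
]

top_metric = max(m for _, m in hourly)            # dense_rank first on metric desc
busy_hours = [ts for ts, m in hourly if m == top_metric]
max_hour = max(busy_hours)                        # outer aggregate: max(ts)
print(max_hour)   # timestamp of the day's maximum metric
```

Note that if several hours tie on the top metric value, the outer max(ts) resolves the tie by taking the latest of them.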


Author Closing Comment

ID: 31580475
Thank you! I had never heard of analytic functions before (hence my slow response, since I spent the last day studying them), but they were exactly what I needed to solve my problem. I had to split the solution into two update statements for performance reasons. Here's my final solution:

update daily d2 set (avg_30day) =
(select avg(d1.daily_metric)
 from daily d1
 where d1.id = d2.id
   and d1.timestamp > (d2.timestamp - 29)
   and d1.timestamp <= d2.timestamp)
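One detail of the window predicate is worth checking: with day-granularity timestamps, the strict ">" in "d1.timestamp > (d2.timestamp - 29)" excludes day t-29, so the window actually covers 29 calendar days including the current one. A tiny pure-Python check of the windowing (the date is arbitrary):

```python
from datetime import date, timedelta

# Window predicate from the update:  d1.ts > (d2.ts - 29) and d1.ts <= d2.ts
# With daily timestamps the strict ">" excludes day t-29, leaving 29 days
# including t itself; use ">= t - 29" (or "> t - 30") if a full 30-day
# window is intended.
t = date(2009, 5, 12)
window = [t - timedelta(days=k) for k in range(0, 60)
          if t - timedelta(days=k) > t - timedelta(days=29)]
print(len(window))   # 29
```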

the second update is similar to what you specified in your solution.

Expert Comment

ID: 24384062
Have you tried Andytw's solution? I believe it might actually solve your problem better than my answer did. I would suggest splitting the points.

Author Comment

ID: 24386686
Yes, I did, but using row_number() and picking out the row with row_number() = 1 in the outer select, as in your solution, worked better. Andytw's solution using "keep first" did find the correct timestamp, but it returned ORA-01427: single-row subquery returns more than one row. In other words, I would have needed to add a row_number() column in the inner select and filter on row_number() = 1 in an outer select to make it work in the update, which would have made the "keep first" clause superfluous.

I hope I explained myself clearly here. Thanks!

