# Rolling average query

I have a database in which the first column is a datestamp (dd/mm/yy hh:nn:ss) and the second column is readings taken at 10-minute intervals.

I want to set up a query that computes a rolling average of the readings over the previous 12 hours, for every point in the previous 7 days (so I'll need 7 1/2 days of data to produce 7 days of averages).

I'm sure it's not impossible, but I have absolutely no idea of how to go about it.
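The computation being asked for can be sketched in Python to make the target concrete: for each reading, average every reading in the preceding 12 hours. All names and data here are made up for illustration; the real data lives in an Access table.

```python
from datetime import datetime, timedelta

def rolling_average(readings, window=timedelta(hours=12)):
    """For each (timestamp, value) pair, average all values whose
    timestamp falls in (t - window, t]. `readings` must be sorted."""
    result = []
    for t, _ in readings:
        in_window = [v for ts, v in readings if t - window < ts <= t]
        result.append((t, sum(in_window) / len(in_window)))
    return result

# 13 hours of fake readings at 10-minute intervals, cycling 12.0/12.1/12.2
start = datetime(2002, 7, 9, 0, 0, 0)
data = [(start + timedelta(minutes=10 * i), 12.0 + 0.1 * (i % 3))
        for i in range(79)]
averages = rolling_average(data)
```

This scan is quadratic in the number of readings, which is fine for a sketch; the SQL suggestions in this thread do the same windowing inside the database instead.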

Many thanks

Commented:
Per recommendation, points NOT refunded and question closed.

Netminder
CS Moderator

Commented:
Here goes a first attempt (the SELECT and FROM clauses are omitted; substitute your own table and field names):

WHERE (((Int([DateStamp]))>=Date()-7 And (Int([DateStamp]))<Now()) AND ((Int((Now()-[DateStamp])*24/12)/2)=Int(Int((Now()-[DateStamp])*24/12)/2)))
GROUP BY Int((Now()-[DateStamp])*24/12)
WITH OWNERACCESS OPTION;

My test data was:
09/07/2002 18:00:00     10
09/07/2002 17:50:00     11
08/07/2002 18:00:00     12
02/07/2002 18:00:00     10
10/07/2002 08:00:00     1000
09/07/2002 08:00:00     100
09/07/2002 06:00:00     10000
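The Int((Now()-[DateStamp])*24/12) expression bins each reading into a 12-hour bucket counted back from Now(). A rough Python equivalent of that bucketing, as a sketch only (the function name and the fixed stand-in for Now() are made up; the rows mirror the test data above):

```python
from collections import defaultdict
from datetime import datetime

def bucket_averages(readings, now):
    """Bucket 0 is the most recent 12 hours before `now`, bucket 1 the
    12 hours before that, and so on; each bucket is then averaged."""
    buckets = defaultdict(list)
    for t, value in readings:
        hours_ago = (now - t).total_seconds() / 3600.0
        buckets[int(hours_ago // 12)].append(value)
    return {b: sum(vs) / len(vs) for b, vs in buckets.items()}

now = datetime(2002, 7, 10, 18, 0, 0)  # fixed stand-in for Now()
data = [
    (datetime(2002, 7, 9, 18, 0, 0), 10),
    (datetime(2002, 7, 9, 17, 50, 0), 11),
    (datetime(2002, 7, 8, 18, 0, 0), 12),
    (datetime(2002, 7, 2, 18, 0, 0), 10),
    (datetime(2002, 7, 10, 8, 0, 0), 1000),
    (datetime(2002, 7, 9, 8, 0, 0), 100),
    (datetime(2002, 7, 9, 6, 0, 0), 10000),
]
by_bucket = bucket_averages(data, now)
```

The extra Int(.../2) condition in the WHERE clause above appears to keep only the even-numbered buckets, i.e. every other 12-hour window.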

Cheers, Andrew

Commented:
Nice problem ;-)
It also raises some questions:
1) Are the readings without gaps?
If there are gaps, what should happen?
2) Are the readings individual values to be averaged, or
are they the values of a running counter, so they need to be subtracted to get the differences for the actual calculation?

Nic;o)

prog Commented:
I defined a table called tblReadings with a date field called datTime and a numeric field called reading.

The best way seems to be to calculate the total for the last 12 hours and the count of readings in the last 12 hours using subqueries that alias the table.

You can then query that query to get values over 7 days:

SELECT summed/counted AS RollingAverage, *
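A sketch of that subquery approach, run through SQLite from Python purely for illustration (Access syntax differs, e.g. it would use DateAdd for the 12-hour offset, but the shape is the same; the table and field names are the ones defined above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tblReadings (datTime TEXT, reading REAL)")
rows = [
    ("2002-07-09 06:00:00", 12.0),
    ("2002-07-09 12:00:00", 12.2),
    ("2002-07-09 17:50:00", 12.4),
    ("2002-07-09 18:00:00", 12.6),
]
conn.executemany("INSERT INTO tblReadings VALUES (?, ?)", rows)

# Correlated subqueries: for each reading, sum and count everything in
# the preceding 12 hours, then divide (the summed/counted idea above).
sql = """
SELECT a.datTime,
       (SELECT SUM(b.reading) FROM tblReadings b
         WHERE b.datTime >  datetime(a.datTime, '-12 hours')
           AND b.datTime <= a.datTime)
     /
       (SELECT COUNT(*) FROM tblReadings b
         WHERE b.datTime >  datetime(a.datTime, '-12 hours')
           AND b.datTime <= a.datTime) AS RollingAverage
FROM tblReadings a
ORDER BY a.datTime
"""
result = conn.execute(sql).fetchall()
```

The first row averages only itself (nothing earlier exists in its window); the 18:00 row averages the three readings after 06:00.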


Author Commented:
nic;o)

I've never been complimented on the quality of my questions, but I'll take it as a compliment anyway - after all, a girl has to take them where she can get them ;-)

In answer to your questions, there shouldn't be any gaps in the data, and if there are, we'll just average over what's there.  The readings themselves are voltages (all between 11 and 13), which are later converted into lengths (I'm monitoring movement).  The overall movement is fairly slow, but the instrument reading them is showing more movement than we know is occurring; the averaging smooths this out.  Hope that isn't too much detail; no one ever said the world of civil engineering was exciting.

I'll have a look at the suggestions so far and get back to you.

prog Commented:
...it's always best to give your table name and field names; in this case it's only 3 names.

Then the SQL we write will be more likely to work directly.

Commented:
bigc,

Some Q's are just more fun than others, and this is one that puts the brain to work ;-)
As the number of measurements in the 12 hrs can vary, we'll need the SUM / COUNT calculation deighton used in his query.

This can be done straightforwardly using DSum and DCount; it will, however, be slow...
You'll get a query like:
WHERE Datestamp > Now()-7.5;

Personally I would probably use a recordset processing function to speedup things.
BTW I think it's a bad idea to use a plain average to smooth the measurement errors out.
I would probably go for the 95th percentile of the measurements, to drop the extreme measurements, both the high and the low ones...
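That percentile idea amounts to a trimmed mean: sort the window, discard the extreme tails, and average what's left. A hypothetical sketch (the function name is made up; the 95% figure comes from the suggestion above):

```python
def trimmed_average(values, keep=0.95):
    """Average after discarding the extreme (1 - keep) fraction of the
    values, split evenly between the low and high ends."""
    ordered = sorted(values)
    drop = int(len(ordered) * (1 - keep) / 2)
    kept = ordered[drop:len(ordered) - drop] if drop else ordered
    return sum(kept) / len(kept)

# 70 well-behaved voltages plus two instrument spikes
window = [12.1] * 70 + [11.0, 13.0]
smoothed = trimmed_average(window)
```

With 72 readings per 12-hour window (10-minute intervals), keeping 95% works out to dropping just the single highest and single lowest reading in each window.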

Success !

Nic;o)

Commented:

No comment has been added lately, so it's time to clean up this TA.
I will leave a recommendation in the Cleanup topic area that this question be:
- PAQ'd and pts removed
unless there is objection or activity within the next seven days.

Nic;o)