JENNIFER NEWTON
asked on
Grouping on dates different only by seconds
I have an update query where one of its functions is to track changes made on a table.
The update query sometimes processes large amounts of data and takes a few seconds to run. This means that the date/time stamp on the tracking table looks something like this:
9/2/2016 10:36:29 AM
9/2/2016 10:36:29 AM
9/2/2016 10:36:30 AM
9/2/2016 10:36:30 AM
9/2/2016 10:36:31 AM
9/2/2016 10:36:31 AM
9/2/2016 10:36:32 AM
I'd like to run a report on what changes have been made using this table. However, I'm having a hard time grouping on the change time because seconds are too specific and minutes are too vague.
What I'd like to do is set up some kind of expression that captures a ~10-second range, and group on that expression. Does anyone have ideas on how I could do this?
Note: The 10-second range would have to be a dynamic moving range; I wouldn't want to just set hard numbers like 11-20, etc. In my example above, I wouldn't want the 29-second changes to be grouped separately from the 30-, 31-, and 32-second changes.
So create a calculated column on that data that gives you the date/time to just the precision you want.
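In Access SQL that could look something like this (a rough sketch only; tblTracking and ChangeTime are placeholder names for your tracking table and its timestamp field), truncating each timestamp down to a fixed 10-second boundary:

SELECT DateAdd("s", (DateDiff("s", #1/1/2000#, t.ChangeTime) \ 10) * 10, #1/1/2000#) AS ChangeBucket, Count(*) AS Changes
FROM tblTracking AS t
GROUP BY DateAdd("s", (DateDiff("s", #1/1/2000#, t.ChangeTime) \ 10) * 10, #1/1/2000#);

The #1/1/2000# epoch is arbitrary; it only anchors the integer second count for the division.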
This works to get a grouping for every 10 records:
SELECT t.id, t.DateTimeField, CInt((SELECT Count(a.id) FROM MyTablename AS a WHERE a.id <= t.id) \ 10) AS RankGroup
FROM MyTablename AS t;
BUT it crashes Access (for me) when grouping by RankGroup in a report.
I did not really understand what you are after ... my suggestion is to create a field in the table to use for grouping, then run an update query to fill it with 'moving' logic. That way, you can group by a field that is not calculated.
You mean something like.....
"So create a calculated column on that data that gives you the date time to just the precision you want."
"So create a calculated column on that data that gives you the date time to just the precision you want."
ASKER
Neilsr, your "solution" is terrible. You either don't understand what I'm asking for or you're being willfully ignorant so you can come across as condescending. No one asked you to post here. If you're going to spend your time here being a jerk to me for asking a question or being a jerk to Crystal for actually trying to help me then I'll have no problem reporting you to the mods.
Re-reading your question, I see you want about a 10-second range ... best bet, IMO, is to add a field to group by, then loop through a recordset and fill it. When looping, each time the group changes, store the start time for the group, then count how many records are in the group, and have a tolerance for the amount of time and number of records in each group.
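A minimal sketch of that loop, assuming DAO and placeholder names (a tracking table tblTracking with a Date/Time field ChangeTime and a new Long field GroupID to group on):

Public Sub FillGroupIDs()
    Dim db As DAO.Database
    Dim rs As DAO.Recordset
    Dim datGroupStart As Date
    Dim lngGroupID As Long

    Set db = CurrentDb
    Set rs = db.OpenRecordset( _
        "SELECT ChangeTime, GroupID FROM tblTracking ORDER BY ChangeTime;", _
        dbOpenDynaset)

    Do While Not rs.EOF
        ' Start a new group when this record falls more than
        ' 10 seconds after the first record of the current group.
        If lngGroupID = 0 Or DateDiff("s", datGroupStart, rs!ChangeTime) > 10 Then
            lngGroupID = lngGroupID + 1
            datGroupStart = rs!ChangeTime
        End If
        rs.Edit
        rs!GroupID = lngGroupID
        rs.Update
        rs.MoveNext
    Loop

    rs.Close
    Set rs = Nothing
    Set db = Nothing
End Sub

A cap on the number of records per group could be added inside the loop the same way, with a counter that resets whenever a new group starts.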
If you don't want to add a field to the data table, you can create another table with the ID of the data table and the RankGroup that you join in to the RecordSource for the report.
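For instance (names hypothetical again), with a side table tblRankGroups holding id and RankGroup, the report's RecordSource could be:

SELECT t.*, g.RankGroup
FROM MyTablename AS t INNER JOIN tblRankGroups AS g ON t.id = g.id;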
I'm sorry that you feel that way.
I fail to see how the accepted solution is a fat lot different. It is basically creating an extra column and doing a calculation to populate it OR creating a new table and populating that with a calculation based on your dates and storing that along with the key of your data rows.
You ASKED for solutions and my input was to give you a solution that would work very simply. There is only one individual who is bad mouthing here....
ASKER
Then ask for clarification instead of being flippant.
There is no date/time format that will match what I need. Changes happen on a timescale of about 15-30 seconds. If I format as just seconds, then I get the problem of chopped-up groupings:
29
--
30
30
30
--
31
If I set a static range (like 10-19, 20-29), then it still gets chopped inappropriately:
29
--
30
30
30
31
If I do minutes, then I might merge unrelated groups. It will also chop related groups on the minute mark:
12
12
12
59
59
--
00
So, sorry, but "format your column better" isn't helpful advice and misunderstands the problem.
Can you give the exact expected result? If the data is:
9/2/2016 10:36:11 AM
9/2/2016 10:36:11 AM
9/2/2016 10:36:12 AM
9/2/2016 10:36:13 AM
9/2/2016 10:36:14 AM
9/2/2016 10:36:14 AM
9/2/2016 10:36:15 AM
9/2/2016 10:36:29 AM
9/2/2016 10:36:29 AM
9/2/2016 10:36:30 AM
9/2/2016 10:36:30 AM
9/2/2016 10:36:31 AM
9/2/2016 10:36:31 AM
9/2/2016 10:36:32 AM
9/2/2016 10:36:49 AM
9/2/2016 10:36:49 AM
9/2/2016 10:36:50 AM
9/2/2016 10:36:50 AM
9/2/2016 10:36:51 AM
9/2/2016 10:36:51 AM
9/2/2016 10:36:52 AM
what is expected?
The first (median? last?) of each sequential range?
9/2/2016 10:36:11 AM
9/2/2016 10:36:29 AM
9/2/2016 10:36:49 AM
Such calculations could take a lot of time on large tables.
ASKER CERTIFIED SOLUTION
Use the simple function below to round your time entries to ten seconds.
Then your query will follow this layout:
Select Max(SomeField) From YourTable
Group By DateRound([TimestampField], 0, 0, 10)
Public Function DateRound( _
    ByVal datDate As Date, _
    Optional ByVal intHours As Integer, _
    Optional ByVal intMinutes As Integer, _
    Optional ByVal intSeconds As Integer) _
    As Date

' Rounds datDate to hours and/or minutes and/or seconds as
' specified in parameters intHours, intMinutes, and intSeconds.
'
' Will accept any value within the range of data type Date:
'
'   From 100-01-01 00:00:00 to 9999-12-31 23:59:59
'
' In case the range is exceeded due to rounding, the min. or max.
' value will be returned.
'
' Examples:
'   DateRound(#9999-12-31 23:57:50#, 0, 5, 0)
'   returns: 9999-12-31 23:59:59
'   DateRound(#9999-12-31 23:57:10#, 0, 5, 0)
'   returns: 9999-12-31 23:55:00
'   DateRound(#9999-12-30 22:57:50#, 0, 5, 0)
'   returns: 9999-12-30 23:00:00
'   DateRound(#2015-02-28 12:37:50#, 0, 15, 0)
'   returns: 2015-02-28 12:45:00
'   DateRound(#2015-05-05 11:27:52#, 3, 0, 0)
'   returns: 2015-05-05 12:00:00
'   DateRound(#2015-05-25 11:11:13#, 0, 0, 2)
'   returns: 2015-05-25 11:11:14
'
' Round to the tenth of a day:
'   DateRound(#2012-11-15 15:00:00#, 2, 24, 0)
'   returns: 2012-11-15 14:24:00
'
' 2008-04-16. Gustav Brock, Cactus Data ApS, CPH.
' 2010-05-12. Modified CDate expression.

    Dim datRounding As Date

    On Error GoTo Err_DateRound

    datRounding = TimeSerial(intHours, intMinutes, intSeconds)
    If datRounding <= 0 Then
        ' Round to the second.
        datRounding = TimeSerial(0, 0, 1)
    End If
    ' Apply CDec to prevent rounding errors from Doubles and allow large values.
    datDate = CDate(Int(CDec(datDate) / CDec(datRounding) + 0.5) * datRounding)

Exit_DateRound:
    DateRound = datDate
    Exit Function

Err_DateRound:
    If datDate < 0 Then
        datDate = #1/1/100#
    Else
        datDate = #12/31/9999 11:59:59 PM#
    End If
    Resume Exit_DateRound

End Function
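With the sample timestamps from the question, all seven rows round to the same value and so fall into one group. For example, in the Immediate window:

? DateRound(#9/2/2016 10:36:29 AM#, 0, 0, 10)
9/2/2016 10:36:30 AM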
/gustav