Solved

Calculate the difference between two timestamps - Redux

Posted on 2011-03-09
597 Views
Last Modified: 2012-05-11
Please refer to the accepted answer at http://www.experts-exchange.com/Microsoft/Development/MS-SQL-Server/Q_26869322.html

The solution works great until I have a table with over 1M records; at 1.5M records the query goes from mere seconds to who knows how long, and I usually get impatient and kill it after 30 minutes. I even created an index on the columns used in the script (that took all of three seconds), thinking it would improve performance. After a little research it appears to be related to the ROW_NUMBER() call, or maybe CTEs in general just don't scale well; it doesn't really matter which. What I would like to know is whether this code can be rewritten to perform better on really large data sets: not use ROW_NUMBER(), not use a CTE, whatever makes it work. It really should not take this long to find the difference between two numbers in consecutive rows, and the original solution works superbly for all of my other, smaller tables.
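For reference, the pattern from that answer is roughly along these lines (a sketch from memory, not the exact query; RandomData, Weeks, and Seconds are the table and columns I am working with):

-- Sketch only: the kind of ROW_NUMBER()/CTE pattern referenced above,
-- not the exact accepted answer from the linked question.
WITH Ordered AS (
    SELECT ID, Weeks, Seconds,
           ROW_NUMBER() OVER (ORDER BY Weeks, Seconds) AS rn
    FROM RandomData
)
SELECT cur.ID, cur.Weeks, cur.Seconds,
       (nxt.Weeks - cur.Weeks) * 604800 + (nxt.Seconds - cur.Seconds) AS TimeDiff  -- 604800 = seconds per week
FROM Ordered cur
LEFT JOIN Ordered nxt ON nxt.rn = cur.rn + 1
ORDER BY cur.Weeks, cur.Seconds;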

Thanks


Question by:Mr_Difficult
9 Comments
 
LVL 40

Assisted Solution

by:Sharath
Sharath earned 150 total points
ID: 35089826
Give this a try.
-- For each row, find the next larger Seconds value and subtract to get the gap.
SELECT ID,
         Weeks,
         Seconds,
         NextSeconds - Seconds TimeDiff
    FROM (SELECT *,
                 (SELECT MIN(Seconds)
                    FROM your_table t2
                   WHERE t2.Seconds > t1.Seconds) NextSeconds
            FROM your_table t1) t1
ORDER BY Seconds


 
LVL 11

Expert Comment

by:JoeNuvo
ID: 35089865
For that table with over 1M rows: do you have any condition that would narrow down the data of concern? (I mean, put a WHERE clause inside the CTE part.)

For example, compare only certain Weeks, etc.
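Something along these lines (a sketch; the CTE follows the pattern from the original question, and @WeekFrom/@WeekTo are just placeholder values):

-- Sketch: narrow the data inside the CTE before ROW_NUMBER() runs over it.
-- @WeekFrom / @WeekTo are placeholders for whatever range you actually need.
DECLARE @WeekFrom int = 2080, @WeekTo int = 2081;

WITH Ordered AS (
    SELECT ID, Weeks, Seconds,
           ROW_NUMBER() OVER (ORDER BY Weeks, Seconds) AS rn
    FROM RandomData
    WHERE Weeks BETWEEN @WeekFrom AND @WeekTo
)
SELECT cur.ID, cur.Weeks, cur.Seconds,
       (nxt.Weeks - cur.Weeks) * 604800 + (nxt.Seconds - cur.Seconds) AS TimeDiff
FROM Ordered cur
LEFT JOIN Ordered nxt ON nxt.rn = cur.rn + 1
ORDER BY cur.Weeks, cur.Seconds;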
 

Author Comment

by:Mr_Difficult
ID: 35089937
No, it is all data from a sensor array, and this is only a few minutes' worth of data.
 
LVL 11

Expert Comment

by:JoeNuvo
ID: 35090387
This is the best I can get (the query uses 90% CPU on a quad-CPU Intel Xeon 2.66 GHz, 6 cores):
sample data, 50k rows: 30 seconds
sample data, 100k rows: 2 minutes

This assumes the following conditions are true:
1. The data in RandomData has more than one value for Weeks.
2. There is an index over the fields (Weeks, Seconds).

-- 604800 = seconds per week; look in the current week and the next one
-- for the earliest reading that comes after this row.
SELECT ID, Weeks, Seconds,
	(SELECT MIN(Weeks * 604800 + Seconds)
	 FROM RandomData t2 WITH (INDEX = IX_Weeks_Seconds)
	 WHERE t2.Weeks IN (t1.Weeks, t1.Weeks + 1)
	   AND t2.Seconds > CASE WHEN t2.Weeks = t1.Weeks THEN t1.Seconds ELSE 0 END
	) - (t1.Weeks * 604800 + t1.Seconds) TimeDiff
FROM RandomData t1
ORDER BY Weeks, Seconds



 
LVL 11

Accepted Solution

by:JoeNuvo
JoeNuvo earned 250 total points
ID: 35090508
But if you don't insist on "no temp table":

this code runs in only 1 second for the 100k-row sample,
and 3 seconds for the 250k-row sample.

Remark: my system's physical disk is a kind of external RAM drive, so disk access is considerably faster than on a regular disk system.

-- Load the data in (Weeks, Seconds) order into a table variable with an IDENTITY
-- column, so consecutive readings get consecutive IDs.
DECLARE @R TABLE (ID int PRIMARY KEY IDENTITY(1,1), Weeks int, Seconds int)

INSERT INTO @R (Weeks, Seconds)
SELECT Weeks, Seconds
FROM RandomData
ORDER BY Weeks, Seconds

-- Self-join each row to the next one (ID = ID - 1) and compute the gap in seconds.
SELECT R1.*, (R2.Weeks - R1.Weeks) * 604800 + (R2.Seconds - R1.Seconds) TimeDiff
FROM @R R1
	INNER JOIN @R R2 ON R1.ID = R2.ID - 1


 

Author Comment

by:Mr_Difficult
ID: 35090583
Sharath's solution looks promising so far; how would I add a test for all values over a threshold? (Something like the sketch below is what I have in mind.)

JoeNuvo, it appears that not using a temp table was a bad constraint for very large tables; I will check your solution now.
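The kind of thing I mean (sketch only; RandomData is my table and @Threshold is a placeholder value):

-- Sketch: wrap the earlier query and keep only the gaps above a threshold.
DECLARE @Threshold int = 5;

SELECT d.ID, d.Weeks, d.Seconds, d.TimeDiff
FROM (
    SELECT ID, Weeks, Seconds,
           NextSeconds - Seconds AS TimeDiff
    FROM (SELECT *,
                 (SELECT MIN(Seconds)
                  FROM RandomData t2
                  WHERE t2.Seconds > t1.Seconds) AS NextSeconds
          FROM RandomData t1) t1
) d
WHERE d.TimeDiff > @Threshold
ORDER BY d.Seconds;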
 
LVL 40

Expert Comment

by:Sharath
ID: 35103535
>> Sharath's solution looks promising so far, how would I add a test for all values over a threshold?

Did you run my query against your whole table? What is the query execution time?
 

Author Comment

by:Mr_Difficult
ID: 35113949
This is the test: 1,408,926 rows of already-sorted data, no indexes.

JoeNuvo: 4 seconds
Sharath_123: killed the test after 30 minutes

Same table with an index on Weeks and Seconds:
JoeNuvo: 4 seconds
Sharath_123: 3 seconds to create the index, 4 seconds for the TimeDiff

So it turns out that creating an index is extremely important when working on the master table, and the results are essentially a tie (which is not a bad thing).
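For the record, the index was just a plain nonclustered one on those two columns, along these lines (the name here matches the hint in JoeNuvo's earlier query; the exact definition is a sketch):

-- Roughly the index used for the second round of tests.
CREATE NONCLUSTERED INDEX IX_Weeks_Seconds
    ON RandomData (Weeks, Seconds);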

 

Author Comment

by:Mr_Difficult
ID: 35113958
Forgot to bump the points
