Solved

SQL Query Optimisation

Posted on 2012-04-03
309 Views
Last Modified: 2012-04-04
I have a query that selects the last known Latitude and Longitude, along with the Name, from a table. At the moment the query takes over 10 seconds to execute with only 18 units to sort through, and I can see this becoming a rather large issue when there are 300+ units.

So I need a way to optimise this query so that it executes much faster.

SELECT DISTINCT T1.ID, T1.sVesselName,
       (SELECT TOP (1) Latitude
        FROM tGPSData AS T3
        WHERE VesselID = T1.ID
        ORDER BY ID DESC) AS LastLat,
       (SELECT TOP (1) Longitude
        FROM tGPSData AS T3
        WHERE VesselID = T1.ID
        ORDER BY ID DESC) AS LastLon
FROM tUser
     INNER JOIN tUserVesselMap ON tUser.ID = tUserVesselMap.iUserID
     INNER JOIN tVesselMaster AS T1 ON tUserVesselMap.iVesselID = T1.ID
WHERE T1.bEnabled = 1
  AND tUser.sUsername = @Username
Question by:mgordon-spi
8 Comments
 
LVL 69

Accepted Solution

by:
Scott Pletcher earned 500 total points
ID: 37803956
For *this* query -- not taking into account any other workload on the tables -- you need to have these indexes:

tUser (sUsername) --preferably clustered
tUserVesselMap (iUserID)
tVesselMaster (ID)
tGPSData (VesselID, bEnabled ) --if bEnabled is bit, make it a tinyint instead
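
For reference, a minimal sketch of what that DDL might look like in T-SQL; the index names here are invented and whether each index can be clustered depends on what already exists on the tables, so treat this as illustrative only:

-- Illustrative index definitions for the list above; names are made up
-- and nothing is verified against the real schema.
CREATE CLUSTERED INDEX IX_tUser_sUsername
    ON tUser (sUsername);        -- "preferably clustered"; use NONCLUSTERED if a clustered index already exists
CREATE NONCLUSTERED INDEX IX_tUserVesselMap_iUserID
    ON tUserVesselMap (iUserID);
CREATE NONCLUSTERED INDEX IX_tVesselMaster_ID
    ON tVesselMaster (ID);
CREATE NONCLUSTERED INDEX IX_tGPSData_VesselID
    ON tGPSData (VesselID);      -- add bEnabled as a second key column only if that column exists on tGPSData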
 

Author Comment

by:mgordon-spi
ID: 37804014
I'm sorry - I don't quite understand your response.

I was more referring to the fact that I was using two nested select statements to retrieve the last record in the GPS table for each vessel. I was wondering if there was a faster way to do that?

The GPSData table holds hundreds of thousands of records for many vessels.

Ultimately I am trying to retrieve a list of vessels and their last known location. This query achieves that, but it takes about 20 seconds to execute.
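
One common alternative to the two correlated subqueries is to fetch both columns in a single pass per vessel with OUTER APPLY; a sketch, reusing the table and column names from the query above and otherwise making no assumptions about the schema:

SELECT DISTINCT T1.ID, T1.sVesselName,
       G.Latitude AS LastLat, G.Longitude AS LastLon
FROM tUser
     INNER JOIN tUserVesselMap ON tUser.ID = tUserVesselMap.iUserID
     INNER JOIN tVesselMaster AS T1 ON tUserVesselMap.iVesselID = T1.ID
     OUTER APPLY (SELECT TOP (1) Latitude, Longitude
                  FROM tGPSData
                  WHERE VesselID = T1.ID
                  ORDER BY ID DESC) AS G   -- latest GPS row per vessel
WHERE T1.bEnabled = 1
  AND tUser.sUsername = @Username

Whether this is actually faster still depends on having a suitable index on tGPSData, so the indexing advice above applies either way.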
 
LVL 75

Expert Comment

by:Anthony Perkins
ID: 37804200
I was wondering if there was a faster way to do that?
Yes, as Scott has pointed out, it is called indexes. Until you have confirmed the existence of those indexes, there is not a lot we can add...
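
If it helps, one way to check which indexes already exist (a sketch; the table names are simply those from the query above):

-- Existing indexes on one table:
EXEC sp_helpindex 'tGPSData';

-- Or all four tables at once via the catalog views:
SELECT OBJECT_NAME(i.object_id) AS table_name, i.name AS index_name, i.type_desc
FROM sys.indexes AS i
WHERE OBJECT_NAME(i.object_id) IN ('tUser', 'tUserVesselMap', 'tVesselMaster', 'tGPSData')
  AND i.name IS NOT NULL;   -- skip heaps, which have a NULL index name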
 

Author Comment

by:mgordon-spi
ID: 37804406
I'm sorry, but this might be stretching my SQL abilities. Just to confirm, we are not talking about primary keys?

I have not created indexes before. Will creating them help with query execution speed?
 
LVL 75

Expert Comment

by:Anthony Perkins
ID: 37804525
Just to confirm, we are not talking about primary keys?
Maybe, but not necessarily.

Will creating them help with query execution speed?
Yes, although you should be aware that too many indexes can slow down INSERTs and UPDATEs. But I suspect you are a long way from that.
 

Author Comment

by:mgordon-spi
ID: 37804535
So it is as simple as creating indexes (the ones mentioned above) and I should see a difference?
 

Author Closing Comment

by:mgordon-spi
ID: 37804592
Your help has directed me to learn a whole new aspect of SQL: indexing!

Cheers.
 
LVL 69

Expert Comment

by:Scott Pletcher
ID: 37806452
>> creating indexes ... and I should see a difference? <<

Yes, definitely!
