How to determine which LINQ query is more efficient

Given two or more LINQ queries that bring back the right answer, is there a way of determining which one is more efficient?

For example, suppose we have these two LINQ queries.


List<AddressCandidate> returnedCandidates = args.Results;
AddressCandidate addCand = (
                from address in returnedCandidates
                orderby address.Score
                select address).Last(); 



List<AddressCandidate> returnedCandidates = args.Results;
AddressCandidate addCand = (
                from address in returnedCandidates                 
                where address.Score == returnedCandidates.Max(adr => adr.Score)
                select address).First();


They both bring back the correct result.
What are some ways of determining which of several queries is more efficient?
I'm looking for ways that can be applied to most LINQ queries in general, not just the two above.
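(One structural difference between the two queries is worth noting up front. Assuming LINQ to Objects, since `args.Results` is a `List<AddressCandidate>`, the second query calls `returnedCandidates.Max(...)` inside the `where` clause, so the maximum may be recomputed for every element examined, which is O(n²) in the worst case. Hoisting the maximum out of the query keeps it O(n). A sketch:)

```csharp
// Sketch, assuming LINQ to Objects over a List<AddressCandidate>.
// Compute the maximum score once, instead of once per element:
var maxScore = returnedCandidates.Max(adr => adr.Score);
AddressCandidate addCand = (
                from address in returnedCandidates
                where address.Score == maxScore
                select address).First();
```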


Jacques Bourgeois (James Burger), President, commented:
This is something you lose when working with LINQ and similar technologies: you are never really sure what happens in the background, and you lose control over part of the process.

Simply build your test so that you take that into account. You might look at the following link that explains deferred execution.

In your case, since you are requesting .Last and .First, my understanding is that both will execute at the point where you declare the variable. So timing the declaration itself should do the trick.

Note that you probably need a lot of data in order to evaluate a significant difference. And you should also run the test many times and average the results, because a lot of things can happen in the background while your code is running, both on the client station and on the server.
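A minimal harness along those lines might look like this (a sketch; `RunQuery` is a placeholder for whichever query is being measured, and it must force execution, e.g. by ending in `.Last()`, `.First()`, or `.ToList()`):

```csharp
using System;
using System.Diagnostics;

// Run the query many times and average, so that background noise on the
// client or the server is smoothed out.
const int iterations = 1000;
var timer = Stopwatch.StartNew();
for (int i = 0; i < iterations; i++)
{
    RunQuery();   // placeholder: the query under test, execution forced
}
timer.Stop();
Console.WriteLine("Average: {0} ms",
    timer.Elapsed.TotalMilliseconds / iterations);
```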

There are tools in SQL Server that enable you to log and evaluate almost anything that happens there. A DBA a few years ago showed me the commands that a DataAdapter was sending to the server during an Update. I do not remember how he did it, however. Some tool he had in Management Studio, if my bad memory is better than usual this morning.
Jacques Bourgeois (James Burger), President, commented:
You cannot generalize about query efficiency. Each query is different, so what is best for one might not be good for another. The data that you have and the design of the database come into play and can make a huge difference.

Two queries can seem built the same way, but if one sorts on an indexed field while the other sorts on a non-indexed field, they won't have the same efficiency. It has nothing to do with the query; it has to do with the design of the database.

A query that is efficient when looking for people's names will not have the same efficiency as the same query looking for states, because there could be thousands of different names while there are fewer than 100 states and provinces.

That comes into play in your example. There is no way to tell whether ordering would be faster than searching with a WHERE. It depends on the data and the way the address field is indexed. The only way to know is the same as for almost everything in programming: test both and see which one gives you the best results.

Also, because a database is a living thing in which the data changes over time, the query that is the best today might not be the best in a year or so. Staying with the idea of a states field: if you start with only Canada, where you have only 10 provinces, and after a couple of years you add the US and Mexican states, you might see that the query that was efficient suddenly takes a lot more time, just because you have more variety in the data.

This is why databases such as SQL Server have a query optimizer that builds an execution plan that a good DBA would refresh from time to time. Depending on the current data, it might be faster to first go for the WHERE and then the JOIN. But a few months later, the data will have changed and the optimizer might find out that doing the JOIN before the WHERE is now more efficient.

The same holds for your queries.
XTO, Author, commented:
Thank you. That was very well explained.
Suppose, as in my case, the database structure is going to remain mostly the same for a long time and the data will change very little.
I considered using a C# Stopwatch object to take a before and after time like this:

var queryTimer = new Stopwatch();

queryTimer.Start();
// run query #1
queryTimer.Stop();
MessageBox.Show("Query #1 time is: " + queryTimer.Elapsed.TotalSeconds.ToString());

queryTimer.Restart();
// run query #2
queryTimer.Stop();
MessageBox.Show("Query #2 time is: " + queryTimer.Elapsed.TotalSeconds.ToString());


But then, LINQ uses deferred execution, and I don't know how that comes into play.
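To make the deferred-execution concern concrete (a sketch, assuming LINQ to Objects): defining the query costs almost nothing; the work happens only when the result is consumed, so the timed region must include the call that enumerates it.

```csharp
// Defining the query does not execute it.
var query = from address in returnedCandidates
            orderby address.Score
            select address;            // nothing has run yet

var timer = Stopwatch.StartNew();
AddressCandidate addCand = query.Last();  // execution happens here
timer.Stop();
```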

Maybe there is a way to look at the SQL that gets generated.
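(If the query ran against a database through LINQ to SQL, rather than over an in-memory `List<T>` as here, the generated SQL could be inspected via the `DataContext.Log` property. A sketch, where `MyDataContext` and its `Addresses` table are hypothetical names:)

```csharp
// Applies only to LINQ to SQL, not to LINQ to Objects over a List<T>.
using (var db = new MyDataContext())
{
    db.Log = Console.Out;   // the generated SQL is written to this TextWriter
    var results = db.Addresses.Where(a => a.Score > 80).ToList();
}
```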

How do the suggestions above sound? Are there better ones?


I just found a class called ExecutionStopwatch on CodeProject:

According to the article, the regular .NET Stopwatch measures real-world time while ExecutionStopwatch measures CPU time.
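(A similar CPU-time measurement is available in the framework itself via `Process.TotalProcessorTime`, without a third-party class. Note this value is process-wide, not per-thread. A sketch:)

```csharp
using System;
using System.Diagnostics;

TimeSpan cpuBefore = Process.GetCurrentProcess().TotalProcessorTime;

// run the query here, forcing execution (e.g. .Last() or .ToList())

TimeSpan cpuUsed =
    Process.GetCurrentProcess().TotalProcessorTime - cpuBefore;
Console.WriteLine("CPU time: {0} ms", cpuUsed.TotalMilliseconds);
```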
