O(n log n) performance

Dear experts,
Can anyone explain in layman's terms what it means to say that Collections.sort(), which uses merge sort, guarantees
O(n log n) performance?

I understand O(log n), but I'm not sure what O(n log n) means.
An explanation in layman's terms would be greatly appreciated.
Thanks.
royjayd Asked:
Andrew Reinman (student) Commented:
O(log n) means you are guaranteed that the running time grows no faster than (a constant times) log n, where n is the size of what you are sorting. So O(n log n) simply guarantees that the running time grows no faster than (a constant times) n log n. Graphing log n and n log n together may help you picture what this actually looks like.
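If you don't want to graph it, a small sketch like the following (the class name and output format are just placeholders of my own) prints the two side by side so you can see how slowly the log factor grows:

// My own illustration: print log n and n log n side by side as n grows.
public class GrowthTable {
    public static void main(String[] args) {
        for (int n = 10; n <= 1_000_000; n *= 10) {
            double logN = Math.log(n) / Math.log(2);   // log base 2
            System.out.printf("n = %,9d   log n = %5.1f   n log n = %,12.0f%n",
                    n, logN, n * logN);
        }
    }
}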
dpearson Commented:
OK, so O(log n) means the time taken is proportional to log n.  So if n is 1000, then log n (base 10) is 3.

O(n log n) means "n * O(log n)" - i.e. each step in the process takes O(log n) time and there are n of them.
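One concrete way to see "n steps, each costing O(log n)" (my own illustration - this is not how Collections.sort actually works internally) is inserting n elements into a java.util.TreeSet, where every add() is an O(log n) balanced-tree insertion:

import java.util.Random;
import java.util.TreeSet;

// n insertions, each an O(log n) balanced-tree step, so the loop is O(n log n) overall.
public class NLogNSteps {
    public static void main(String[] args) {
        TreeSet<Integer> set = new TreeSet<>();
        Random rng = new Random(42);
        int n = 1000;
        for (int i = 0; i < n; i++) {
            set.add(rng.nextInt());   // one O(log n) step, repeated n times
        }
        System.out.println("TreeSet now holds " + set.size() + " elements");
    }
}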

So to go back to our example where n is 1000
log n is 3
n is 1000
so
n log n is 3000

So O(n log n) where n is 1000 would be O(3000) which compares well to O(n) which would be O(1000).

(You shouldn't actually write values into O(x) statements like I just did there, but I hope you get the idea).

If you're sorting a list you need to look at each element in the list.  So it's going to take at least O(n) steps.
So O(n log n) is "almost as good as O(n)" (because log n grows so slowly) and the task cannot take less than O(n) for any sorting algorithm - even ones nobody has discovered yet (technically this is assuming a fixed number of CPUs).

So in increasing time it goes:
O(1)         // Constant time
O(log n)  // Log time
O(n)        // Linear time
O(n log n)  // Log linear time
O(n^2)    // Quadratic time (horrible)
O(2^n)   // Exponential time (lifetime of the universe stuff)

All of the best general-purpose sorting algorithms are O(n log n).
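And that's exactly the class of the Collections.sort() from your question - it's merge-sort based, so it stays in O(n log n) even in the worst case. A minimal usage sketch (the numbers in the list are made up):

import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

// Sorting a list with Collections.sort(), the O(n log n) method from the question.
public class SortDemo {
    public static void main(String[] args) {
        List<Integer> numbers = new ArrayList<>(Arrays.asList(5, 3, 8, 1, 9, 2));
        Collections.sort(numbers);
        System.out.println(numbers);   // prints [1, 2, 3, 5, 8, 9]
    }
}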

Does that help?

Doug

royjayd (Author) Commented:
Thanks.
royjayd (Author) Commented:
One final question:
In some of the forums I read, it says O(n) may be slower than O(n log n).
Can that ever happen?
Thanks.
dpearson Commented:
In some of the forums I read, it says O(n) may be slower than O(n log n).
Can that ever happen?

An algorithm designed to run in O(n) time can - on a particular run - take more time than an algorithm designed to run in O(n log n) time.  That's because these big-O measures are talking about the complexity of the algorithm as n grows.  They ignore all "constants" in the algorithm.  For example, take this loop:

int count = 0;
for (int i = 0; i < n; i++) {
    Thread.sleep(1000);   // pause for 1 second on every step (in Java this throws InterruptedException, which the enclosing method must handle)
    count++;
}

So that algorithm is O(n) - as n grows, the time it takes grows in linear proportion to n.
But it's also clearly really slow - it sleeps for 1 second each step.
I could change that to sleep for 10 minutes for each step - it's still an O(n) algorithm.

Clearly though, an O(n log n) algorithm that's written without the big sleeps will be faster on lots of problems than an O(n) one that has big sleeps.  But if I make n really, really big (they call that "asymptotic behavior"), then the O(n log n) algorithm is always going to end up slower than the O(n) one (even one with a big sleep on each step).  I just have to make n large enough.

So "can an O(n) be slower than O(n log n)" is yes it can, but  only for some 'n' values.  But for a large enough n, O(n) will always be faster.

Does that help clear it up?

Doug
royjayd (Author) Commented:
Thanks, Doug.