
# Complexity of Algorithms

I want to know how the complexity of algorithms is calculated, and how the best algorithm is selected depending on those complexity values. Let's take sorting or searching algorithms as examples.

Thanks

Commented:
babashri,

I assume you already know the algorithm or the logic for quicksort. Here is how the complexity becomes O(N*LogN):

Imagine that M = Log N to the base 2.
On the first pass, there are approximately N comparisons (actually N-1). After this, the input data splits into two sublists, each of size approximately N/2. For each of these sublists, there are approx. N/2 comparisons, resulting in a total of four sublists, each of size N/4, and so on.

After halving the sublists M times, there are N sublists each of size 1. Thus, the total number of comparisons becomes

N + 2*(N/2) + 4*(N/4) + ... + N*(N/N)
or
N + N + ... + N (M terms),
which is N * M, which is N * Log N
(because M = Log N to the base 2).

This is how the complexity of quicksort becomes O(N Log N), which is considerably better than the O(N^2) of Bubblesort for large values of N. (Strictly speaking, this is the average case, which assumes the pivot splits each list roughly in half; in the worst case quicksort degrades to O(N^2).)

Hope this finally gives you the complete answer you needed. (Whew, that was quite a bit of work for 10 points! In the future, please consider offering larger points to attract more experts and a greater variety of answers :-)

cheers
- stochastic

Commented:
1. Time taken (number of comparisons)
2. Pointer (memory) overheads to implement the solution
are a few of the measures.
Regards,
Prem.

Author Commented:
Can you explain the terms such as O, log(n), and similar terms that appear in the complexities, with an example?

Thanks

Commented:
g(n) is O(f(n)) if there exists a constant c such that
g(n) <= c*f(n) for all but a finite set of n > 0

Author Commented:
ozo, you are using terms that are difficult to understand.
Can you explain the terms O, log, ... with some examples?

Thanks

Commented:
e.g. log(n+2) is O(log(n)) because we can let c=2, so that log(n+2) <= c*log(n) for all n >= 2 (since n+2 <= n^2 there)

Commented:
To find the maximum number in an array (with n cells) takes visiting all cells: O(n).
Of course each visit might take SOME operations, but we round them to one because it is a CONSTANT number.
(Even if it is one line in C, it might be 4-5 in assembly...)

To calculate a multiplication table, you fill up a matrix of size n*n.
Each cell takes a CONSTANT (O(1)) number of operations ==> O(n*n)*O(1) = O(n^2).

To explain sorting is a bit more difficult, and needs some extended understanding.

Yair


Commented:
babashri,

Let me attempt to make it a little bit simpler.

Order O(n) means that the time taken by the algorithm is proportional to n, if you are processing n items through it.
Example: linear search in an unordered array.

Order O(log n) means that the time taken is proportional to log n.
Example: binary search in a sorted array.

Order O(n^2) means proportional to n-square. Example: Bubble Sort.

And so on.

Which algorithm suits a purpose best depends on the expected number of items to process, n. It sometimes also depends on the sequence in which the input items come in. For example, when building an unbalanced (rather, not-necessarily-balanced) binary search tree by successive insertion, the time taken depends on the sequence of the input; if the input is in ascending order, the time taken can be very large.

Hope this puts things in a simpler perspective. If you need more clarification or examples, please ask.

- stochastic

Author Commented:

Can you tell me how the term 'O(log n)' comes about for binary search?

Thanks

Commented:
Binary search, for example: it splits the problem in half each iteration.
So if you have 128 items, the next iteration you will have 64,
then 32, 16, 8, 4, 2, 1.

So for a given n items, how many steps are needed?
for 1 item - 0  (2^0 = 1)
for 2 items - 1  (2^1 = 2)
for 4 items - 2  (2^2 = 4)
for 8 items - 3  (2^3 = 8)
...
for n items - log(n)  (2^(log n) = n)


Commented:
I hope you know how binary search works. It works by halving the scope of search each time. It starts with the middle element first, and compares the middle element with your search key. Thereafter, it reduces the scope of search to the left half (if middle element is larger than your search key) or to right half (if middle element is smaller than your search key).

It keeps doing this halving, till it finds a match, or concludes that there is no match.

Let me take a simplified example. Imagine that you have a sorted array of numbers 1 to 127 in ascending order (so n=127). Say you are searching for the key value 13. The steps of comparison will be as follows:

step 1: Searching 1..127. middle point is 64. 13<64. Take left half.
step 2: searching 1..63. Middle point is 32. 13<32. Take left half.
step 3: searching 1..31. Middle point is 16. 13<16. Take left half.
step 4: searching 1..15. Middle point is 8. 13>8. Take right half.
Step 5: searching 9..15. Middle point is 12. 13>12. Take right half.
Step 6: searching 13..15. Middle point is 14. 13<14. Take left half.
Step 7: searching 13..14. Middle point is 13 (integer division). Found.

So you see, we took 7 steps to do this.
Log of 128 to the base 2 is 7.

Now if you try to do this with n=256, you will see that you will need (up to) 8 steps, as log of 256 to the base 2 is 8.

So in general, in binary search, the worst case number of comparisons you will need will be log of n to the base 2 where n is the number of items in the array.

I hope this example helps.

- stochastic.

Author Commented:
Thanks stochastic,
Lastly, I want to ask how we arrive at the O(N Log N) complexity for quicksort.

Thanks a lot
Baba..

Author Commented:
Adjusted points from 10 to 20

Author Commented:
Thanks a lot, stochastic.
Hey, I have increased the points. I am a bit of a miser because I think if I exhaust all the available points, I would not be able to ask questions with 0 points.
...Moreover, I am not an expert either.

Author Commented:

Thanks to all
Question has a verified solution.
