adrian_ang (United States of America) asked:

Algorithm Design :)

How can we modify almost any algorithm to have a good best-case running time?
ASKER CERTIFIED SOLUTION
baboo_

adrian_ang (ASKER):
I mean best-case!
There is always an input, for example n integers, if we measure the input size in number of items.
_TAD_

adrian_ang>

As baboo mentioned, the best-case running time can always be made a constant. If you prefer big-O notation, the best case can always be brought down to O(1).
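
To make that idea concrete, here is a minimal Python sketch (my own illustration, not code from the thread): bolt a constant-time special-case check onto the front of any algorithm, and that check becomes the best case.

    def sort_with_constant_best_case(items):
        # O(1) check: an input of length 0 or 1 is already sorted,
        # so we can return immediately without doing any real work.
        if len(items) <= 1:
            return list(items)
        # Otherwise run the "real" algorithm; its cost (here Python's
        # built-in O(n log n) sort) covers every other case.
        return sorted(items)

The same wrapper works for almost any problem: recognize one cheap-to-check input up front, answer it directly, and fall through to the original algorithm otherwise.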


Case in point:

  Suppose we are looking at sorting algorithms.  Bubble sort is the most basic and has one of the worst "worst-case" scenarios, O(n^2).  Conversely, bubble sort also has the best performance, a single O(n) pass, if all of the items are already in order (see the sketch after this comparison).

Quicksort (the most common advanced algorithm) has one of the best typical running times overall, O(n log n) on average (although its true worst case is O(n^2)), and its best case is also O(n log n), which comes close to, but is not quite as fast as, the bubble sort's O(n) best case.

Finally there is the shearsort.  For very large datasets that are radically out of order, it is one of the better choices.  But as it turns out, its efficiency does not change much at all if the data is already sorted.  This makes its "best case" rather weak.
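
To illustrate the bubble-sort point above, here is a sketch (again my own, in Python) of bubble sort with an early-exit flag: on already-sorted input the first pass makes no swaps and the loop stops, so the best case is a single O(n) pass, while the worst case remains O(n^2).

    def bubble_sort(items):
        a = list(items)
        n = len(a)
        for end in range(n - 1, 0, -1):
            swapped = False
            for i in range(end):
                if a[i] > a[i + 1]:
                    a[i], a[i + 1] = a[i + 1], a[i]
                    swapped = True
            if not swapped:
                # No swaps means the array is already sorted:
                # the best case finishes after this one O(n) pass.
                break
            # Otherwise keep going; the largest remaining element
            # has bubbled into position `end`.
        return a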


In summary, I believe what you are really looking for is minimizing an algorithm's worst-case running time.  Analysis in that direction will always give you meaningful results, whereas optimizing only for the best case will not yield accurate quantitative results.


I suggest you look into Big-O (big-oh) notation; I think you'll have more success with what you are looking for.