Algorithm Design :)

How can we modify almost any algorithm to have a good best-case running time?
LVL 5
adrian_angAsked:

baboo_Commented:
Uhh...  "Best-case" running time would be with a dataset of size 0, in which case the running time would be constant.  So you wouldn't have to modify anything.

Typically, algorithm analysis is performed for the average or worst case.  The best-case situation for *any* algorithm is when it has no data on which to run.

Did you mean "worst-case" running time?  I'm not trying to be flip here, but I'm making sure I understand the question...

baboo_
0

adrian_angAuthor Commented:
I mean best-case!
0
adrian_angAuthor Commented:
There is always input, for example n integers, if we measure in number of items.
0
_TAD_Commented:
adrian_ang>

as baboo mentioned, the best-case running time is always going to be a constant.  If you prefer to use big-O notation, then best-case scenarios will always be O(1)


Case in point:

  Suppose we are looking at sorting algorithms.  Bubble sort is the most basic and has the absolute worst "worst-case" scenario: quadratic time.  Conversely, the bubble sort (in its early-exit variant) also has the best performance if all of the items are already in order, finishing in a single linear pass.
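(The claim above relies on the early-exit variant of bubble sort.  A minimal Python sketch, not from the original thread, just to illustrate the point:)

```python
def bubble_sort(items):
    """Bubble sort with an early-exit flag.

    If a full pass makes no swaps, the list is already sorted and we
    stop immediately.  Worst case is O(n^2) on reversed input, but the
    best case (already-sorted input) is a single O(n) pass.
    """
    n = len(items)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:  # no swaps on this pass => sorted, so quit early
            break
    return items
```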

Quick sort (the most common advanced algorithm) has excellent typical performance overall, although its textbook worst case is actually quadratic unless the pivot is chosen carefully (e.g. at random or by median-of-three).  With a good pivot, its behaviour on already-sorted input comes reasonably close to that of the bubble sort's best case.

Finally there is shearsort.  For very large datasets that are radically out of order it performs well (it was designed for parallel, mesh-connected machines), but as it turns out its running time hardly changes when the data is already sorted.  That makes its "best case" rather weak.


In summary, I believe what you are really looking for is minimizing the "worst case".  An analysis in that direction will always give you some kind of result, whereas optimizing only for the best case will not yield accurate quantitative results.
0
_TAD_Commented:


I suggest you look into Big O (Big-oh) notation.   I think you'll have more success with what you are looking for
0
Question has a verified solution.
