1. ## Asymptotic notation clarification

To clarify: if we're talking about the worst case, we should use big O, and the average case can be big O, but for the best case I should use big Omega and not big O, since big O classifies the upper bound, which would mean we could do better, and that isn't true. Big Omega classifies some c such that c*g(n) is at or below f(n), so c*g(n) is the lower bound, meaning we can only do as well or worse. It's just that Wikipedia uses big O for the worst, best, and average cases.

e.g. for quicksort it would be:
average case/best case: Big-Omega of n log2 n (when the list is half sorted)
worst case: Big-O of n^2 (when the items are already sorted or reversed)
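To make those cases concrete, here is a minimal quicksort sketch (my own illustrative code, not from the thread) that counts comparisons; picking the first element as the pivot makes already-sorted input the worst case:

```python
def quicksort(items, count=None):
    """Return a sorted copy of items, tallying comparisons in count[0]."""
    if count is None:
        count = [0]
    if len(items) <= 1:
        return items
    pivot, rest = items[0], items[1:]   # first-element pivot
    count[0] += len(rest)               # one comparison per remaining item
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    return quicksort(left, count) + [pivot] + quicksort(right, count)

c_random, c_sorted = [0], [0]
quicksort([3, 1, 4, 1, 5, 9, 2, 6], c_random)
quicksort(list(range(8)), c_sorted)
print(c_random[0], c_sorted[0])  # sorted input costs n(n-1)/2 = 28 comparisons
```

On sorted input every partition is maximally lopsided, so the comparison count grows quadratically; on typical input the splits are closer to even and the count tracks n log2 n.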

2. To say that f is big-O of g is to say that f is asymptotically bounded ABOVE by g.

To say that f is big-Omega of g is to say that f is asymptotically bounded BELOW by g.

So Wikipedia is correct to use big-O for all the cases you mention, since it's talking about upper bounds.
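Those two definitions can be checked numerically. A small sketch (the function f and the constants c = 3 and c = 2 are my own illustrative choices, not from the thread):

```python
def f(n):
    return 2 * n * n + 3 * n

def g(n):
    return n * n

# f is O(g): with c = 3, c*g(n) sits at or above f(n) for all n >= 3.
assert all(f(n) <= 3 * g(n) for n in range(3, 1000))

# f is Omega(g): with c = 2, c*g(n) sits at or below f(n) for all n >= 1.
assert all(f(n) >= 2 * g(n) for n in range(1, 1000))
```

The same f is bounded above AND below by multiples of n^2, which is why a statement like "the best case is O(n log n)" is only half of what you could say about it.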

3. Let's reuse quicksort. If the best case is O(n log2 n), then that means this is the worst it can do, but wouldn't that be wrong, since we can do worse, namely quadratic time? I'm assuming Wikipedia is using big O loosely. That's why I thought it should be big Omega for the best case, because the best case means we can do no better, only worse; but by saying the best case is big O of n log2 n we're saying this is the upper bound, when really we can do worse, O(n^2). Hope this makes sense.

4. I see what you mean. Now you've got me confused. Hopefully someone else will chime in here.

5. EDIT: I was never the best at this stuff, but as far as I understand it...

So I think you're mixing two concepts here. Best case performance is like saying "under ideal circumstances" or "with the best possible data set, parameters, etc". Thus, the big-O of the best case is like saying "under ideal circumstances, your algorithm will never do worse than some function". I find it easier to visualize the best and worst cases in terms of a binary tree:

In the best case, your BST will end up balanced. That means (for example) an insert operation will never be worse than O(log2(n)). Under the most ideal circumstances, your tree will never take more than log2(n) operations to insert a node.
In the worst case, your BST will be completely degenerate, i.e. it will basically be a linked list. In such a case an insert operation, for example, will never be worse than O(n). That is, it cannot be worse than linear time no matter how bad the data set is. A BST will never operate in quadratic time, exponential time, etc.
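The two tree shapes above can be demonstrated with a small BST sketch (hypothetical code, not from the thread): inserting keys in sorted order degenerates the tree into a linked list, while shuffled keys keep insert depth shallow:

```python
import random

class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def insert(root, key):
    """Insert key into the BST; return (root, depth of the new node)."""
    if root is None:
        return Node(key), 0
    if key < root.key:
        root.left, depth = insert(root.left, key)
    else:
        root.right, depth = insert(root.right, key)
    return root, depth + 1

def max_insert_depth(keys):
    """Insert keys in order and report the deepest insertion seen."""
    root, worst = None, 0
    for k in keys:
        root, d = insert(root, k)
        worst = max(worst, d)
    return worst

print(max_insert_depth(range(100)))   # degenerate tree: depth 99, i.e. O(n)

random.seed(0)
shuffled = list(range(100))
random.shuffle(shuffled)
print(max_insert_depth(shuffled))     # far shallower, on the order of log2(100)
```

Sorted input forces every insert to walk the entire right spine, which is exactly the linked-list worst case described above.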

Hope that helps.

EDIT 2: Note, best case performance is rarely talked about. Usually you are concerned with how an algorithm performs in the average case (i.e. how is it going to behave most of the time), and the worst case (i.e. if I get a crappy data set to work with, what's the worst this thing is going to do).

6. I think you're confusing the notion of "best case", "average case" and "worst case" with the notion of asymptotic growth rates.

Originally Posted by monkey_c_monkey
so if the best case is O(n log2 n) then it means this is the worst it can do, but wouldn't that be wrong since we can do worse, which is quadratic time.
The thing is, if a function is in O(n log n), then the function is also in O(n^10000). That is, if you state an upper bound, then anything above that upper bound is also an upper bound. So what this means is that the function that describes the best case is bounded above by n log n. Thus it indeed is bounded above by n^2 as well, i.e., it is in O(n^2) too. This does not mean that it cannot also be bounded below by n log n, and indeed I believe that is the case: the function describing the best case is in Theta(n log n), since it is both in O(n log n) and in Omega(n log n).

But for the sake of convenience (the symbols for Theta and Omega are not so straightforward to type, and O will suffice even though big-O's symbol is technically an omicron rather than a Latin O), it is expressed as O(n log n), which is still correct anyway, even if it is not as precise.
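The containment claimed above can be checked directly: n log2 n never exceeds n^2 once n >= 1, so any constant that witnesses an O(n log n) bound also witnesses an O(n^2) bound (a one-line numeric sketch of my own):

```python
import math

# If f(n) <= c * n*log2(n) for large n, then f(n) <= c * n*n too,
# because n*log2(n) <= n*n for every n >= 1.
assert all(n * math.log2(n) <= n * n for n in range(1, 10_000))
```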

7. Despite the fact that cMonkey seems to have gotten his answer, I would like to add something.

What oogabooga said in the 2nd post is described by this.

You should keep the Theta notation in mind too.

Also mind that an algorithm is optimal when O(complexityOfTheAlgorithm) = Ω(complexityOfTheAlgorithm), which means that big Omega is a bound that your algorithm cannot beat!
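The Theta idea mentioned here can be sketched the same way as the earlier bounds (the function f and the constants are my own illustrative choices): if one g works as both an upper and a lower bound on f, then f is in Theta(g):

```python
import math

def f(n):
    return n * math.log2(n) + n   # e.g. some hypothetical cost function

def g(n):
    return n * math.log2(n)

# Upper bound: f(n) <= 2*g(n) for n >= 2, so f is in O(g).
assert all(f(n) <= 2 * g(n) for n in range(2, 5000))

# Lower bound: f(n) >= 1*g(n) for n >= 2, so f is in Omega(g).
assert all(f(n) >= g(n) for n in range(2, 5000))

# Both hold, so f is in Theta(n log n).
```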