Surprisingly, in the many math courses I've been forced to take for my CS degree I haven't come across O(n). At least not that I remember; perhaps it's the context that's throwing me off.
Anyone feel like explaining it?
When you hear "O" you can pretty much substitute "upper bound, disregarding constant factors." For instance, if you say an algorithm is O(n^2), you know that its running time f(n) satisfies 0 <= f(n) <= c * n^2 for some positive constant c.
One detail that I forgot to mention: the bound only has to hold for n greater than some threshold n_0. This way a function can behave like n^3 for n < 10, but if for n >= 10 it behaves like n^2, the bound is O(n^2) (it is also O(n^3), since big-O is only an upper bound).
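Here's a quick numeric sketch of that definition, just to make it concrete (the function names and the checker are my own invention, not standard library stuff):

```python
def f(n):
    """Hypothetical running-time function: behaves like n^3 for
    small n, but like n^2 from n = 10 onward."""
    return n**3 if n < 10 else n**2

def bounded_by(f, g, c, n0, n_max=1000):
    """Check the big-O condition numerically:
    0 <= f(n) <= c * g(n) for every n with n0 <= n <= n_max."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max + 1))

# With the threshold n0 = 10, c = 1 already witnesses f(n) <= c * n^2:
print(bounded_by(f, lambda n: n**2, c=1, n0=10))  # True

# Starting from n0 = 1 the same constant fails (f(2) = 8 > 1 * 4),
# which is exactly why the definition allows an n0 cutoff:
print(bounded_by(f, lambda n: n**2, c=1, n0=1))   # False
```

Of course this only samples finitely many n, so it's a sanity check rather than a proof, but it shows how the constant c and the threshold n_0 play together in the definition.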
Lame, someone actually gave me negative reputation for this thread. Some people are simply jackasses.