I have absolutely no clue how it works, and I can't seem to find any resources to learn the basics of O(log n), O(n), etc.
Anybody care to help?
http://en.wikipedia.org/wiki/Big_O_notation
The basic idea is... "How many operations do you need, in terms of n, in order to complete the algorithm?"
That is pretty much what it is. If you're going through an array linearly looking for something, then in the worst case you have to look at every element. If there are n elements, you require up to n operations (data accesses). Thus you could say it is O(n).
Your Wikipedia/Google skills need work, then. (Granted, the Wikipedia article is rather formal and mathematical.)
ETA: And, for that matter, just about every algorithms book will explain not only what the notation means, but also how to work it out for actual algorithms.