1. ## Heeeelp.. Question...

Suppose that by careful measurement we have discovered a function that describes the precise running time of some algorithm. For each of the following such functions, describe the algorithmic running time in big-Oh notation:

A. 3n^2 + 3n + 7
B. (5*n) * (3 + log n)
C. (5n + 4) / 6
D. 1 + 2 + 3 + ... + n
E. n + log n^2
F. (n + 1) log n / 2

Thank you...

2. hmmm....

homework...........?

3. ## Yes it is ....

But I need some hints, not the whole answer,

because I couldn't understand the question ....

that is all
...

4. Maybe this will help.

Quadratic Time

If the largest term in a formula is no more than a constant times n^2, then the algorithm is said to be big-O of n^2, written O(n^2).
Example: n^2 + 7n

Linear Time

If the largest term in a formula is a constant times n, then the algorithm is said to be big-O of n, written O(n).
Example: 50n + 5

Logarithmic Time

If the largest term in a formula is a constant times a logarithm of n, then the algorithm is big-O of the logarithm of n, written O(log n).

From Data Structures and Other Objects Using C++, Michael Main, Walter Savitch

5. Each expression describes the maximum possible running time for an algorithm.
> A. 3n^2 + 3n + 7
Let's forget about this for a second and work on how to determine O notation. Given the following algorithm:
Code:
for ( i = 0; i < N; i++ ) {
    for ( j = 0; j < N; j++ ) {
        // Stuff
    }
}
for ( k = 0; k < N; k++ ) {
    // Stuff
}
The first set of nested loops is O(N^2) and the second loop is O(N). Together that is O(max(N^2, N)), which is O(N^2). Remember that O notation ignores constants and lower-order terms because they don't matter when the input becomes very large. So once you strip the non-essentials, O notation is very easy to determine.

-Prelude

6. ## Thanks Guys...

I think that should do it ...