1. Complexity

Code:
```m = 0;
for (i = 1; i < n; i = 2 * i)
    m = m + i;```
Is the complexity of the above code N or N^2?

I believe it's N, since the variable m won't affect the running time of the function.

Code:
```i = 1;
while (i < n)
{
    if (i % 2 == 0)
        m = m * i;
    i++;
}```
And I believe the complexity of this function is N.

Code:
```for (i = 0; i < n; i++)
    for (j = 0; j < i; j++)
        printf("i = %d and j = %d\n", i, j);```
The complexity of this function is N^2.

2. No expert here, but I'm going to agree with your analysis.

3. Originally Posted by spikestar
Is the complexity of the above code N or N^2?
I believe its N since the variable m wont effect the time of the function.
Actually less than N, but in fact a nonsensical case, since "N" is not significantly involved. This isn't really a loop over N; it's "m += sqrt(n) + n^2" or something.

Originally Posted by spikestar
Code:
```for (i = 0; i < n; i++)
    for (j = 0; j < i; j++)
        printf("i = %d and j = %d\n", i, j);```
The complexity of this function is N^2.
Nope. The complexity of that code is completely indeterminate.

4. MK27, I'm not sure I quite understand your reply for the first and second codes.

For the third code, wouldn't the first loop run N times and the second run up to N - 1 times,

making it at most N^2 - N, which is O(N^2)?

5. Originally Posted by MK27
Nope. The complexity of that code is completely indeterminate.
No it isn't. You could give a worst/best case bound of N*N, since you're going through (0 to N - 1, 0 to N - 2, ...) over N outer iterations. This can easily be compared to selection sort.

You seem to have it down pat, other than the first. It's far easier if you highlight the O(1) bits and then just go from there. For example, for the second:
Code:
```i = 1;
while (i < n)
{
    if (i % 2 == 0)
        m = m * i;
    i++;
}```
Here the body of the loop (the test, the multiplication, and the increment) can be done in O(1) time, obviously giving O(N).

And the first problem is O(log2 N): you go from 1 to N, doubling i each time.

6. Originally Posted by zacs7
No it isn't. You could give a worst/best case bound of N*N. Since you're going from (0 to N - 1, 0 to N - 2, ...) N times.
Oh. I guess I didn't understand that n == j all the time for some reason.

[edit: except it must be less than i -- my mistake -- sorry to all concerned ]

7. Oh, I understand what I did wrong with the first example. I didn't read the code carefully; I thought it was i++, not i = 2*i.

8. I'd fully agree with zacs7 on all of them.

spikestar, for your first one, each iteration doubles i, so that it approaches n very fast (i.e. exponentially). Say n is 32. On the first iteration i is 1, then 2, then 4, then 8, then 16; then i becomes 32 and it's done. The number of iterations is the base-2 logarithm of n = 32, or lg(n) = lg(32) = 5. Again, as mentioned above, it is lg(n) (where lg is log base 2).

Also, the only real case where an algorithm's complexity cannot be determined is when either the complexity of a single statement cannot be determined, or the input isn't fixed. For example, if you have
Code:
```int n = 42;
int i;

for (i = 1; i < n; i++)
{
    scanf("%d", &i);   /* the input overwrites the loop counter */
}```
It's pretty obvious from this that it's impossible to determine its complexity.

9. Originally Posted by MK27
Nope. The complexity of that code is completely indeterminate.
Unless the runtime is somehow data-dependent, or incorporates randomness, ALL algorithms have a complexity. Precisely what the complexity is may be extremely hard to determine, but it's always there. The case of two nested loops, with the inner loop looping up to a value determined by the outer loop counter, is a pretty classical example of O(N^2).

For instance, bubble sort is structured precisely that way. Nobody is going to claim that bubble sort has an "indeterminate" complexity.

10. Originally Posted by spikestar
Code:
```m = 0;
for (i = 1; i < n; i = 2 * i)
    m = m + i;```
Is the complexity of the above code N or N^2?
Um, no. What made you think it was either of those?
The only variable that can affect the running time of this algorithm is n, and because i reaches n by successive doubling, this code is O(log n).