1. ## Big O Notation

I have some code snippets in which I have to find the run time using big O notation.

code frag 1:

Code:
```c
void E(int array[], int N) {
    int i, j;
    for (i = 1; i < N - 5; i++) {
        for (j = i; j < i + 5; j++)
            array[i] += array[j];
        array[i] = array[i] / 5;
    }
}
```

I understand the simpler ones such as O(N) and O(N^2), but this seems to be slightly more complex.

Could someone explain how we would arrive at an answer for this?

2. You need to count how many times you will add things in. How many times will each for loop run?

3. say if N was 10, the outside loop would run 5 times, and the inside loop would also run 5 times, so would it be O(5N)?

4. array subscripts start at 0 btw

5. Originally Posted by bigparker
say if N was 10, the outside loop would run 5 times, and the inside loop would also run 5 times, so would it be O(5N)?
5 * 10 isn't 25, so "O(5N)" doesn't describe those 25 steps. More to the point, constant factors are dropped in big-O notation anyway, so O(5N) is just written O(N).

The point of big O is: how does the number of steps change when N changes?

6. N is the number of elements. You need to determine what is done for each element -- but not with constants such as 10; express it in terms of N, like "n²" or "n · log(n)".

7. Hint: if you look carefully, you'll notice that this inner loop
Code:
```c
for (j = i; j < i + 5; j++)
    array[i] += array[j];
```
always executes a constant number of times. Hence, the inner loop is O(1). Then just figure out the asymptotic bound for the outer loop, and multiply it by O(1), i.e. 1, and you'll get the asymptotic bound for the whole function.

(It's like the fundamental counting principle. When you have loops that are one after the other, you can add their running times -- which in the context of big-Oh notation basically means that you can ignore whichever loop is less expensive. For nested loops, you can multiply the running time of inner loops by outer loops to get the overall running time.)

8. If it were "j < N" or something like that, the algorithm would run in O(n²) time, since the inner loop would then do work proportional to N for each of the roughly N outer iterations.