# Average asymptotic run time?


1. ## Average asymptotic run time?

Hi all,

The following code is from one of the programming classes that I used to attend. This function recursively sums the elements in an array. They said that the a priori complexity of this algorithm is O(n), while the average asymptotic run time is 2n. What I want to know is: what is this "average asymptotic run time", how is it calculated, and how is it related to the a priori complexity (big-O notation)? Thanks!

Code:
```
int sum(int array[], int n) {
    if (n <= 0) {
        return 0;
    }
    return array[n - 1] + sum(array, n - 1);
}
```

2. The worst-case asymptotic runtime is O(n) (which is probably what they mean by "a priori complexity": complexity determined by analyzing the algorithm before running it), and the average-case asymptotic runtime is also O(n). In fact, this function always makes exactly n + 1 calls no matter what values are in the array, so best, average, and worst case are all the same here.

To say something like "2n" on its own is, well, partially meaningless; that's probably either your professor being sloppy or your notes being incomplete. The reason "2n" by itself says nothing is that it doesn't state what the "2" counts: 2 microseconds? 2 millennia? It only becomes meaningful with a stated cost model, e.g. "about 2n basic operations" (one comparison plus one addition per element), and big-O notation deliberately hides exactly that constant factor, which is why the class could say both "O(n)" and "2n" about the same function.

"Average-case asymptotic runtime" refers to the run time on a typical (average) input. Some algorithms work reasonably well in the average case but take much more time on certain inputs. For example, quicksort takes O(n log n) time in the average case (a random ordering of elements), but on certain inputs, such as an already-sorted array with a naive pivot choice, it takes Theta(n^2) time.