Average asymptotic run time?

Hi all,

The following code is from one of the programming classes I used to attend. The function recursively sums the elements of an array. They said that the a priori complexity of this algorithm is O(n), while the average asymptotic run time is 2n. What I want to know is: what is this "average asymptotic run time," how is it calculated, and how does it relate to the a priori complexity (the Big-O notation)? Thanks!

Code:

```c
int sum(int array[], int n) {
    if (n <= 0) {
        return 0;
    }
    return array[n - 1] + sum(array, n - 1);
}
```