Now, of course we all know how to find the mathematical average or mean of a set of numbers.

Average = Sum of Numbers / Quantity of Numbers

so the average of 50, 80, and 100 is 76.666...

76.666... = (50 + 80 + 100) / 3
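For anyone who wants to follow along, here's that formula as a quick Python sketch (the function name is just my own):

```python
def average(numbers):
    # Sum of Numbers / Quantity of Numbers
    return sum(numbers) / len(numbers)

print(average([50, 80, 100]))  # 76.66666666666667
```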

However, I was pondering the definition of a mathematical average or mean earlier today, and came upon some questions.

Webster's dictionary defines these words as such:

Average - the numerical result obtained by dividing the sum of two or more quantities by the number of quantities; an arithmetical mean.

Mean - halfway between extremes; in a middle or intermediate position...a number between the smallest and largest values of a set of quantities, obtained by some prescribed method.

Arithmetic Mean - the average obtained by dividing a sum by the number of its addends.

Now we come to my question. What really is an average? All of these definitions just beat around the bush. The definition of "Mean" came the closest to actually defining what an average really is. The other definitions only defined how to find the average of some numbers, and we all know how to do that.

I have always assumed the average of a set of numbers to be a certain number that is in the middle of all the numbers in the set.

For example, in a set of two numbers: 100, 50

The average is 75. The number in the middle of those two numbers is also 75. I have always assumed those two values to be the same.

Let's take another set of numbers: 30, 35

Average = 32.5
Number in the middle of the numbers = 32.5

Using simple logic like that, I always assumed that the average of a set of numbers was the number directly in the middle of the set, and that to find that middle number you added them all up and divided by the quantity of numbers in the set.
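For two numbers, the midpoint and the average really are the same calculation, which a tiny sketch can confirm (midpoint is just my own name for it):

```python
def midpoint(a, b):
    # Start at a and go half the distance to b
    return a + (b - a) / 2

for a, b in [(100, 50), (30, 35)]:
    print(midpoint(a, b), (a + b) / 2)  # midpoint vs. average: always equal
```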

However, things begin to get complicated and confusing once we use sets that have more numbers in them.

For example, take a set of 3 numbers: 70, 80, 100

Using the normal method of finding the arithmetic mean (or average) of those 3 numbers, we get 83.333...

Now I tried to figure out if that was really the MIDDLE or CENTER of those 3 numbers... To do this I tried several methods.

First I noted that the middle of 70 and 80 is 75. Then I noted that the middle of 80 and 100 is 90. That brought me down to 2 numbers, so I found the middle of those two and expected to get the same average as I had with the 3 numbers; however, it was different. The new average (and the middle of those two numbers) was 82.5. That is not 83.333...
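If I've described my own steps right, that first method looks like this in code: keep collapsing adjacent midpoints in the sorted list until one number is left (the function name is my own).

```python
def fold_adjacent_midpoints(nums):
    # Replace the sorted list with the midpoints of each adjacent pair,
    # and repeat until a single number remains.
    nums = sorted(nums)
    while len(nums) > 1:
        nums = [(a + b) / 2 for a, b in zip(nums, nums[1:])]
    return nums[0]

print(fold_adjacent_midpoints([70, 80, 100]))  # 82.5, not 83.333...
```

Looking at it now, I think I can see why it misses: folding this way counts the middle number twice as heavily as the outer ones (70/4 + 80/2 + 100/4 = 82.5), so it is really a weighted average rather than the plain mean.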

Therefore I set off with a different method.

Middle of 70 and 80 = 75
70 and 100 = 85
80 and 100 = 90

75 and 85 = 80
75 and 90 = 82.5
85 and 90 = 87.5

As I looked at these results, I continued the process of recursion to see what I could find out...

80 and 82.5 = 81.25
80 and 87.5 = 83.75
82.5 and 87.5 = 85

81.25 and 83.75 = 82.5
81.25 and 85 = 83.125
83.75 and 85 = 84.375

82.5 and 83.125 = 82.8125
82.5 and 84.375 = 83.4375
83.125 and 84.375 = 83.75

Now, you have probably caught on to what I am doing here. Instead of just finding the middle of adjacent numbers in a sorted set of numbers, I am finding the middle between all numbers in a set of numbers.
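Here is a sketch of that whole process in Python, using itertools.combinations to get every pair:

```python
from itertools import combinations

def pairwise_midpoints(nums):
    # Midpoint of every pair in the set, not just adjacent pairs
    return [(a + b) / 2 for a, b in combinations(nums, 2)]

gen = [70, 80, 100]
for _ in range(5):
    gen = pairwise_midpoints(gen)
    print(gen)
# [75.0, 85.0, 90.0]
# [80.0, 82.5, 87.5]
# [81.25, 83.75, 85.0]
# [82.5, 83.125, 84.375]
# [82.8125, 83.4375, 83.75]
```

(With three numbers there happen to be exactly three pairs, so the set stays the same size every round; with four or more numbers it would grow each round, since n numbers make n(n - 1)/2 pairs.)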

As a result, each time I recurse and do it again, the numbers as a whole drift towards what the arithmetic mean should be.

After recursing all those times, I decided to find the arithmetical mean of my new numbers. With the first method I had tried, my new arithmetical mean was 82.5. To my surprise, however, with this method my arithmetical mean stayed constant at 83.333..., the same as it had originally been.
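Checking that in code, assuming the same setup as above, the mean of every generation really does come out identical:

```python
from itertools import combinations
from statistics import mean

gen = [70, 80, 100]
for _ in range(5):
    gen = [(a + b) / 2 for a, b in combinations(gen, 2)]
    print(mean(gen))  # 83.33333333333333 every single round
```

And there is a tidy reason the mean can never move: each of the n numbers shows up in exactly n - 1 of the pairs, so the midpoints add up to (n - 1)/2 times the original sum, and dividing by the n(n - 1)/2 pairs gives back exactly the original sum over n. At the same time, every midpoint lands between the current smallest and largest numbers, so the spread shrinks each round and everything gets squeezed toward that one unmoving value.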

Having analyzed my method of doing this, I have come to a realization of what an average or arithmetical mean really is. It is the number in the middle of ALL the numbers in the set taken together. Therefore my first assumption was correct.

I was just analyzing my work and seeing what I came up with.



So what are everyone's thoughts on this?