My assignment is to create a 10 by 10 table that stores 100 random numbers between 0 and 99,999, and then calculate the mean, variance, and standard deviation of those numbers.
I have created the table and calculated the mean, but I'm having trouble figuring out how to calculate the variance of the random numbers. Also, when I run the program, the mean is usually between 0 and 400. This doesn't seem right; shouldn't the mean be somewhere in the 5-digit range?
This is my code so far.
long int seed;                                  /* seed entered by the user */
printf("Enter an integer number: ");
scanf("%ld", &seed);                            /* read the seed */
srand(seed);                                    /* seed the random number generator */
printf("\n\n\tResults of Programming Assignment 4 (Business Option)\n");
printf("\t\tList of 100 Randomly Generated Numbers\n\n");
sum += rand();                                  /* rand() returns a value between 0 and RAND_MAX */
printf("\n\nArithmetic Mean : %g", mean);