My assignment is to create a 10 by 10 table that stores 100 random numbers between 0 and 99,999, and then calculate the mean, variance, and standard deviation of those numbers.

I have created the table and calculated the mean, but I'm having trouble figuring out how to calculate the variance of the random numbers. Also, when I run the program, the mean is usually between 0 and 400. That doesn't seem right; shouldn't the mean be somewhere in the five-digit range?
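
(For the variance I'm assuming the population formula, i.e. variance = (1/N) * sum of (x_i - mean)^2 over all N = 100 values, with the standard deviation being the square root of that. Please correct me if the assignment expects the sample formula that divides by N - 1 instead.)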

This is my code so far.

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    long int seed;
    int i;
    double sum;
    double mean;

    /* seed the generator from user input */
    printf("Enter an integer number: ");
    scanf("%ld", &seed);
    srand(seed);

    printf("\n\n\tResults of Programming Assignment 4 (Business Option)\n");
    printf("\t\tList of 100 Randomly Generated Numbers\n\n");

    /* print the 100 random values */
    for (i = 0; i < 100; i++)
        printf("%5d\t", rand());

    /* this is where I try to total the values and compute the mean */
    sum += rand();
    mean = sum / 100;

    printf("\n\nArithmetic Mean : %g\n", mean);

    return 0;
}
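
From what I've read, I think I need to actually store the values in a 10 by 10 array, add them up inside the loop, and then make a second pass over the array for the variance. Below is a rough sketch of what I'm thinking of trying; the ROWS/COLS macros and the two-pass approach are just my own guesses, I'm assuming the population variance (dividing by 100, not 99), and I'm using rand() % 100000 to push the values into the 0 to 99,999 range (even though RAND_MAX can be as small as 32,767, in which case the top of that range is never reached). Is this on the right track?

/* Sketch of what I think the corrected version might look like.
   Assumptions (please correct me if they are wrong):
   - population variance (divide by N), not sample variance (N - 1)
   - rand() % 100000 is acceptable for limiting values to 0..99,999,
     even though on platforms where RAND_MAX is 32767 the upper part
     of that range can never appear */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define ROWS 10
#define COLS 10

int main(void)
{
    int table[ROWS][COLS];          /* the 10-by-10 table of values   */
    double sum = 0.0;               /* running total for the mean     */
    double sum_sq_diff = 0.0;       /* running total for the variance */
    double mean, variance, std_dev;
    long int seed;
    int r, c;

    printf("Enter an integer number: ");
    scanf("%ld", &seed);
    srand((unsigned int)seed);

    /* fill the table, print it 10 values per row, and sum as we go */
    for (r = 0; r < ROWS; r++) {
        for (c = 0; c < COLS; c++) {
            table[r][c] = rand() % 100000;   /* 0 .. 99,999 (see note above) */
            sum += table[r][c];
            printf("%6d ", table[r][c]);
        }
        printf("\n");
    }

    mean = sum / (ROWS * COLS);

    /* second pass: accumulate squared deviations from the mean */
    for (r = 0; r < ROWS; r++)
        for (c = 0; c < COLS; c++)
            sum_sq_diff += (table[r][c] - mean) * (table[r][c] - mean);

    variance = sum_sq_diff / (ROWS * COLS);   /* population variance */
    std_dev = sqrt(variance);

    printf("\nArithmetic Mean    : %g\n", mean);
    printf("Variance           : %g\n", variance);
    printf("Standard Deviation : %g\n", std_dev);

    return 0;
}

I realize the variance could probably also be accumulated in a single pass using the sum of the squares, but I've read that approach can lose precision, which is why I went with two passes here.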