I get an "undefined" error when compiling this. Everything else works when I take out the standard deviation calculation, but the error comes back when I put it back in.
I also need help with the user input. My professor wants it done a certain way and I just can't figure out how. This is an example of the input:
1.0 5.5 2.9 -1
After that, the program is supposed to count how many numbers there are and calculate the rest of the statistics. The -1 is a sentinel that tells the program not to count anything beyond it.
One more thing: I can't figure out why the loop at the end isn't working. When I answer "y", it just prints the summary again instead of going back to "Enter numbers". I wrote it the same way as her notes and it still doesn't work.
#include <stdio.h>
#include <math.h>   /* needed for sqrt(); on some systems you must also link with -lm */

int main(void)
{
    char answer;

    do {
        int n = 0;
        double a = 0;
        double max = 0;       /* assumes inputs are non-negative */
        double min = 9999;    /* assumes inputs are below 9999 */
        double sum = 0;
        double vsum = 0;      /* sum of squares; this was never declared, which is likely the "undefined" error */
        double mean, vari, devi;

        printf("Enter numbers: \n");
        /* read values until the -1 sentinel (or end of input) */
        while (scanf("%lf", &a) == 1 && a != -1) {
            if (a >= max)
                max = a;
            if (a <= min)
                min = a;
            sum += a;
            vsum += a * a;
            n++;              /* count only the values before -1 */
        }

        mean = sum / n;
        vari = (vsum / n) - (mean * mean);
        devi = sqrt(vari);

        printf("You entered %i numbers \n", n);
        printf("Maximum value entered: %g \n", max);
        printf("Minimum value entered: %g \n", min);
        printf("Sum of all the values read: %g \n", sum);
        printf("Mean of all values: %g \n", mean);
        printf("Variance of all values: %g \n", vari);
        printf("Standard Deviation of all values: %g \n", devi);

        printf("\nWould you like to factor another (Y/N)? ");
        scanf(" %c", &answer);  /* leading space skips the leftover newline */
    } while (answer != 'N');

    return 0;
}