I've recently received an assignment to write a program that demonstrates the difference between global and local variables.
Code:
#include <stdio.h>
#include <math.h>
# define PRINT(x,y,z) printf("running average is %d / %d = %.3f Floor is %.0f Ceiling is %.0f\n", #x, #y, #z, floor(z), ceil(z))
double runningAverage(int sum, int numInput)
{
    static int numinput = 0;   /* persists across calls */
    static double avg = 0;
    static int tempsum = 0;    /* running total of all inputs */

    numinput++;
    tempsum = tempsum + sum;
    avg = (double)tempsum / numinput;
    printf("running average is %d / %d = ", tempsum, numinput);
    printf("%.3f Floor is %.0f Ceiling is %.0f\n", avg, floor(avg), ceil(avg));
    //PRINT(tempsum, numinput, count);
    return 0;
}
int main()
{
    int count = 0;   /* must be initialized before count++ below */
    int num;

    printf("enter number (-1 to quit): ");
    scanf("%d", &num);
    while (num != -1)
    {
        count++;
        runningAverage(num, count);
        printf("enter number (-1 to quit): ");
        scanf("%d", &num);
    }
    return 0;
}
The program is supposed to read numbers until -1 is entered. On each iteration it updates the running average and prints its floor and ceiling.
The macro doesn't work, so it's commented out. My main problem, though, is figuring out how to make the function work as double runningAverage(int input) instead of the two-argument version I currently have, without using global variables.
In what ways can I achieve this? I can write the same program with a void function, but I don't know how to tackle it otherwise. Any help would be greatly appreciated.