Code:
    #define CENTS_PER_DOLLAR 100

    int avg_min_cost;
    avg_min_cost = (wkday_min + night_min + wkend_min) / CENTS_PER_DOLLAR;
    printf("Average minute cost: $ %.2lf\n", (double)avg_min_cost);
I'm not sure whether I cast the int to a double correctly in the code above.
Here are the specifications:
"Note that since you are dividing by 100, and not 100.0
You will have to cast your int variable to doubles in your printf statements before dividing by CENTS_PER_DOLLAR"
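If I'm reading the spec right, I think it wants the cast applied to the int variable before the division, inside the printf call itself. Here is a sketch of what I mean (the variable names are mine from above, and the per-minute cent values are just placeholders so it runs):

    #include <stdio.h>

    #define CENTS_PER_DOLLAR 100

    int main(void)
    {
        /* placeholder per-minute costs in cents, just so the example compiles */
        int wkday_min = 52, night_min = 31, wkend_min = 60;

        /* keep the total as an int, still in cents... */
        int avg_min_cost = wkday_min + night_min + wkend_min;

        /* ...and cast to double BEFORE dividing, so the division happens in
           floating point instead of truncating like int / int does */
        printf("Average minute cost: $ %.2f\n",
               (double)avg_min_cost / CENTS_PER_DOLLAR);

        return 0;
    }

Is that the right way to read it?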
PS: When dividing by 100.0 the program works fine, but when dividing by 100 the numbers come out wrong. However, I'm required to divide by 100... what am I missing? lol
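For context, here's a tiny check of the difference I'm seeing between 100 and 100.0 (the value is made up):

    #include <stdio.h>

    int main(void)
    {
        int cents = 143;                        /* made-up total in cents */
        printf("%d\n",   cents / 100);          /* prints 1    : int / int truncates */
        printf("%.2f\n", cents / 100.0);        /* prints 1.43 : int / double */
        printf("%.2f\n", (double)cents / 100);  /* prints 1.43 : cast first, then divide by the int */
        return 0;
    }

So it looks like the cast is how you get the 100.0 behaviour while still literally dividing by 100. Is that right?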