Hello,
Thank you for looking at my issue.
I am wondering why my decimals always come out as .00 even when they should be .50 or similar. Below are the code and a screenshot of the program in action.
Code:
#include <stdio.h>

int main(void)
{
    int first_number, second_number;
    float average;

    printf("Enter the first number: ");
    scanf("%d", &first_number);
    printf("Enter the second number: ");
    scanf("%d", &second_number);

    /* this is the line that always seems to produce .00 */
    average = (first_number + second_number) / 2;

    printf("\naverage is %.2f\n", average);
    return 0;
}