# Thread: Average of integers with 2 accurate decimals

1. ## Average of integers with 2 accurate decimals

Hello,

Thank you for looking at my issue.

I am wondering why my decimals are always .00 even if they are supposed to be .50 or something etc. Next is the code and a screenshot of the program in action.

Code:
```
#include <stdio.h>

int main(void)
{
int first_number, second_number;
float average;

printf("Enter the first number:");
scanf("%d", &first_number);

printf("Enter the second number: ");
scanf("%d", &second_number);

average = ( first_number + second_number ) / 2;

printf("\naverage is %.2f", average);

return 0;
}
```
2. Because you are evaluating an integer expression: 1/2 (with integers) is 0.
Change to:
Code:
`average = ( first_number + second_number ) / 2.0f;`
The trailing `2.0f` promotes the sum to `float` before the division is performed. 3. A warning:

There is the matter of precision to consider... a 32-bit int has 31 value bits, but a float has only a 24-bit significand. This means that if you enter 17777777 twice, you won't get 17777777 as the average:
Code:
```
$ cat test.c
#include <stdio.h>

int main( void )
{
int a, b;

scanf("%d %d", &a, &b );
printf( "%.2f\n", ( a + b ) / 2.0f );
}

$ ./test <<< '17777777 17777777'
17777776.00
```
In that case, you should change the type to double, which has 53 bits of precision (the constant 2.0f needs to lose the 'f' at the end).

PS: If you aren't using any functions from "math.h", you don't need to include that header. 4. Thank you! Excellent, and thanks for the warning as well!