# Thread: Newbie C Programmer - Why am I getting a logic error in my simple program?

1. ## Newbie C Programmer - Why am I getting a logic error in my simple program?

Hello sir, I started learning C programming a few days ago and I am a total newbie. I wrote a simple program that prints the average of three integers (input by the user) to one decimal place. My working program is:

Code:
```//Write a Program to request three integers...
//...and print their average to one decimal place.

#include <stdio.h>

int main(void)
{
    int a, b, c;
    printf("\n  Enter the values of a,b,c = ");
    scanf("%d %d %d", &a, &b, &c);
    double d = a + b + c;
    printf("\n  The Average of a,b,c to one decimal place is = %.1f \n", d / 3);
    return 0;
}```
This program runs fine and gives correct results. But when I move the statement `double d = a + b + c;` before the first printf call (as in the program below), I get a logic error (example image at the bottom): the result is an absurd value for the input values.

Code:
```#include <stdio.h>

int main(void)
{
    int a, b, c;
    double d = a + b + c;
    printf("\n  Enter the values of a,b,c = ");
    scanf("%d %d %d", &a, &b, &c);
    printf("\n  The Average of a,b,c to one decimal place is = %.1f \n", d / 3);
}```
Can anybody please help me understand what is wrong in the second program?

2. Originally Posted by omani
But when I move the statement `double d = a + b + c;` before the first printf call (as in the program below), I get a logic error (example image at the bottom): the result is an absurd value for the input values.

Code:
```...

int main(void)
{
    int a, b, c;
    double d = a + b + c;
    printf("\n  Enter the values of a,b,c = ");
    scanf("%d %d %d", &a, &b, &c);
    printf("\n  The Average of a,b,c to one decimal place is = %.1f \n", d / 3);
}```
In this program, you are initializing d from the uninitialized variables a, b and c. Since you never gave a, b and c values before that line, their contents are indeterminate (they could be anything). They might hold zeros or garbage values, so the result of adding the three together and then averaging them is unpredictable.

3. Code:
```    double d = a + b + c;
    printf("\n  Enter the values of a,b,c = ");
    scanf("%d %d %d", &a, &b, &c);
    printf("\n  The Average of a,b,c to one decimal place is = %.1f \n", d / 3);```
What you're suggesting here is that d should always be the sum of a, b and c,
so that if any of them changes, d is re-computed automatically.
C isn't one of these kinds of languages -> Declarative programming - Wikipedia

C is imperative. Everything is done in exactly the order you write it in your program.
So if you get the order of statements wrong, your program breaks.

4. Originally Posted by c99tutorial
In this program, you are initializing d from the uninitialized variables a, b and c. Since you never gave a, b and c values before that line, their contents are indeterminate (they could be anything). They might hold zeros or garbage values, so the result of adding the three together and then averaging them is unpredictable.
Thanks a lot sir, your simple explanation completely cleared up my confusion.

5. Originally Posted by Salem
Code:
```    double d = a + b + c;
    printf("\n  Enter the values of a,b,c = ");
    scanf("%d %d %d", &a, &b, &c);
    printf("\n  The Average of a,b,c to one decimal place is = %.1f \n", d / 3);```
What you're suggesting here is that d should always be the sum of a, b and c,
so that if any of them changes, d is re-computed automatically.
C isn't one of these kinds of languages -> Declarative programming - Wikipedia

C is imperative. Everything is done in exactly the order you write it in your program.
So if you get the order of statements wrong, your program breaks.