1. Global Variable!

Hello Guys!

I know that the initial value of a global variable is zero. Is that correct?

But what's wrong with this code? x is defined as a global variable, but its value in f1 is not zero! Why? Please help.

Code:
```#include <stdio.h>
#include <conio.h>

int x;
void f1 (void);

int main ()
{
    x = 100;
    f1 ();
    printf ("\n in main x is: %d", x);
    getch ();
    return 0;
}

void f1 (void)
{
    printf ("\n in f1 x is : %d\n", x);
}```

2. Originally Posted by alireza beygi
Hello Guys!

I know that the initial value of a global variable is zero. Is that correct?

But what's wrong with this code? x is defined as a global variable, but its value in f1 is not zero! Why? Please help.

Code:
```#include <stdio.h>
#include <conio.h>

int x;
void f1 (void);

int main ()
{
    x = 100;
    f1 ();
    printf ("\n in main x is: %d", x);
    getch ();
    return 0;
}

void f1 (void)
{
    printf ("\n in f1 x is : %d\n", x);
}```
It is all about variable scope. Variables declared in an inner block may hide variables declared in an outer block. So, when you initialise x = 100 in the inner block (i.e. main), the value of x is taken as 100 within that block.

Now, as C reads top-down, when it enters main it 'sees' x as 100, and then f1 is called, where you have to print the value of x.

Now, by rule, the variable with the narrower scope is always used. Here x = 100 has narrower scope (i.e. main) than the global variable x. So it takes x as 100, and f1 prints a value of 100.

Now, do one thing: put the line x = 100 after f1(); in main. You will see that the value in f1 is now printed as 0, and the next statement as 100.

Because, again, as C reads top-down, when it comes to main and sees the call to f1();, at that point the only value of x it knows is 0 (the value of the global variable). So f1 prints 0.

Then it sees x = 100, and from that point within that block the value of x becomes 100, which it prints in the second printf statement.

3. You Are The Man!

4. Originally Posted by alireza beygi
Hello Guys!

I know that the initial value of a global variable is zero. Is It correct?
Yes, according to 6.7.8 (10) of the C99 standard. But, as always, there might be compiler-specific options which override this. Therefore you cannot assume things like this; always initialize your variables.

5. Originally Posted by glennik
Yes, according to 6.7.8 (10) of the C99 standard. But, as always, there might be compiler-specific options which override this. Therefore you cannot assume things like this; always initialize your variables.
If you go down that road, you can't assume anything. The whole reason a standard exists is so that you can assume things. Once you assume that every feature of C might be modified, how on earth would you write code?

6. Originally Posted by joybanerjee39
it is all about variable scope.

variables declared in an inner block may hide variables declared in an outer block

so, when you are initialising the value of x =100 , in the inner block (ie. main) the value of x is taken as 100 within that block.
The code has nothing to do with variable scope, and the quotation is not relevant here.

Here we have only one global variable; when its value is changed, it is changed everywhere... This is, in my opinion, the main reason NOT to use globals.

The scope sentence is talking about the following:

Code:
```int x; //global

int main()
{
    int x = 100; //local to main - hides the global var

    for (int i = 0; i < 10; i++)
    {
        int x = i; // local to the for block - hides both the global and main's x
    }

    return 0;
}```

7. Originally Posted by vart
The code has nothing to do with variable scope, and the quotation is not relevant here.

Here we have only one global variable; when its value is changed, it is changed everywhere... This is, in my opinion, the main reason NOT to use globals.

The scope sentence is talking about the following:

Code:
```int x; //global

int main()
{
    int x = 100; //local to main - hides the global var

    for (int i = 0; i < 10; i++)
    {
        int x = i; // local to the for block - hides both the global and main's x
    }

    return 0;
}```
You are right, I missed it. It's happening because C reads top-down.

8. Originally Posted by cas
If you go down that then road you can't assume anything. The whole reason a standard exists is so you can assume things. Once you assume that every feature of C might be modified, how on earth would you write code?
No, I disagree. The standard is the lowest common denominator, and there are several systems where this feature is in fact overridden.

Plus, working on legacy systems, the compiler may even follow only the C89 standard...

So, again, never assume things like this.

9. Originally Posted by joybanerjee39
You are right, I missed it. It's happening because C reads top-down.
I think you're misunderstanding, or at least you aren't conveying your understanding with the proper terminology. We typically say a C compiler reads top-down, and it doesn't know about anything it hasn't seen yet. That's compile-time stuff. But in this case, it's run-time stuff that we're concerned with. You can't even say C executes top-down within a given function, since execution is subject to loops, branches, gotos and early returns, which cause control to go upwards, or to some other place in the code altogether. C starts with the first statement in main, then executes from there, using any loops, branches, gotos, function calls, etc. to decide where to go next. It is not linear or top-down.

There is exactly one variable x, and it is global, meaning it is visible everywhere, both in main and in f1. The x used in main and f1 refers to the global because there is no local variable x that "shadows" the global and hides it from view/scope. Since x is set to 100 before f1 is called, the initial value of x (zero) is overwritten with 100. Then f1 is executed and the value of x (100) is printed out. If I moved the body of f1 above main, it would still print 100, even though x was set to 100 at a lower point in the source file.

10. Originally Posted by glennik
No, I disagree. The standard is the lowest common denominator, and there are several systems where this feature is in fact overridden.

Plus, working on legacy systems, the compiler may even follow only the C89 standard...

So, again, never assume things like this.
C89 requires globals to be zeroed, too, so no problem there. Precisely how many implementations have this “don't initialize globals” flag, anyway, and what are they?

As for the standard being the lowest common denominator, I think you're looking at this the wrong way. If you consider that the standard is the lowest common denominator, that means that all C compilers implement what the standard says and possibly more; not that they pick and choose what to implement.

If compilers pick and choose what parts of the standard to implement, what's the point of a standard, anyway?

Again, I have to ask: if you always assume the compiler will not get something right, how can you write code? Do you have a list of things the C standard says are valid but you don't trust? You say “never assume things like this”, but what exactly are “things like this”? How does one know which parts of the standard will be available? Is there a subset of the standard that you assume will always be implemented?

You certainly have to be aware of some issues when it comes to portability, but this is something pretty fundamental. It's kind of like saying that you shouldn't use the * operator to multiply because some compilers might disable it, so you should repeatedly use the + operator instead (although what if the compiler provides an option to break +, too?)

11. Wouldn't the easiest solution be to just zero the variable manually, so that it works with every compiler under every standard? The *one* extra instruction won't make any noticeable difference in the program's execution time.

Also, not that it's specifically relevant, but global variables are hardly ever a good idea unless it's absolutely necessary.