I was experimenting with global variables recently, specifically in a program I'm writing to solve quadratic equations. The issue seems more general, though: it concerns assigning values to the global variables that hold the quadratic coefficients.
I won't post the entire program, but I've narrowed the problem down, and whether the assignment works seems to depend on the variables' type.
Code:
double gA, gB, gC;
...
void getCoefficients(void) {
printf("Enter coefficients:\n");
printf("A: ");
scanf("%f", &gA);
printf("B: ");
scanf("%f", &gB);
printf("C: ");
scanf("%f", &gC);
printf("%f\n%f\n%f\n", gA, gB, gC); // temporary: just here for testing
}
While experimenting, I placed the printf() statement shown above in the function to see the values of the global variables (mainly because I don't have a debugger set up for my IDE), and this is what happens at runtime.
Code:
Output:
Enter coefficients:
A: 2
B: -4
C: -3
0.000000
0.000000
0.000000
x1 = 1.50
Bold indicates the values I input. I'm confused as to why the globals print as 0. When I changed the data type of the globals to float, int, etc., the program worked correctly, so why does declaring them as double cause this logic error? I've searched the forums and haven't found anything. If anyone can make sense of this, or point me to a previous forum post or resource I can use to fix the problem, that would be of great help. Thank you.