I have looked this up in three books and Google has also failed me.
To me, a constant is a value that doesn't change. It's defined with "const", "#define", or by a convention of all capital letters to remind the programmer not to change the value.
What I am seeing is:
float radius = 0.0f;
long Big_Number = 1287600L;
#define PI 3.141f
circumference = 2.0f*PI*radius;
Is the suffix there to keep the literal's type constant? I read that types can be promoted during implicit conversions.
If I guessed correctly, when should it be used?
If I guessed incorrectly, I am completely lost and don't know when to use it.