-
Range of an integer.
As far as I know, the range for an integer is from -32768 to 32768, right? But take a look at this code:
#include <stdio.h>

int main(void)
{
    int n, lar = -32768, count;

    /* read ten integers and keep track of the largest one */
    for (count = 0; count < 10; count++)
    {
        printf("Please enter your %d integer: ", count + 1);
        scanf("%d", &n);
        if (n >= lar)
            lar = n;
    }
    printf("Largest is %d\n", lar);
    return 0;
}
Even when I entered a number outside the range of an integer, the program still worked properly. Why?
-
>the range for an integer is from -32768 to 32768, right
The standard only guarantees that an int has at least 16 bits, which gives the -32768 to 32767 range you're thinking of (note the top end is 32767, not 32768). Different compilers are free to use more bits than that as long as they maintain the 16-bit minimum.
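If you want to see how wide an int actually is with your compiler, something like this will tell you (just a quick sketch using the standard CHAR_BIT macro from <limits.h>):
Code:
#include <limits.h>   /* for CHAR_BIT */
#include <stdio.h>

int main(void)
{
    /* number of bits an int occupies on this particular compiler/platform */
    printf("int is %u bits wide on this system\n",
           (unsigned)(sizeof(int) * CHAR_BIT));
    return 0;
}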
-Prelude
-
If you declare a variable of type int and give it a value larger than its range (which depends on the system you are working on), the value gets cut down until it fits. So if you are working on a 32-bit system and your value needs 45 bits, the top 13 bits simply get thrown away (strictly speaking the result of such a signed overflow is not guaranteed by the standard, but truncation of the high-order bits is what you typically see).
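Something like this shows it (just a sketch; what you get when an out-of-range value is converted to a signed int is implementation-defined, and the long long type and %lld need a C99 compiler):
Code:
#include <stdio.h>

int main(void)
{
    long long big = 1099511627776LL + 42;  /* 2^40 + 42: needs about 41 bits */
    int chopped = (int)big;                /* high-order bits are discarded  */

    printf("original value  : %lld\n", big);
    printf("stored in an int: %d\n", chopped); /* typically prints 42 when
                                                   int is 32 bits wide */
    return 0;
}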
-
Code:
#include <limits.h>
#include <stdio.h>

int main(void)
{
    printf("The highest integer possible is %d\n", INT_MAX);
    return 0;
}
-
Hmmm...I never knew of that constant. Thanks! :D
-
On most x86 computers an int ranges about +/- 2.1 billion, although a long long can range much farther.
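Something like this prints both ranges on your machine (rough sketch; LLONG_MIN, LLONG_MAX, and the %lld format need a C99 compiler):
Code:
#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* typical x86 results: int is roughly +/- 2.1 billion,
       long long is far larger (at least 64 bits) */
    printf("int:       %d to %d\n", INT_MIN, INT_MAX);
    printf("long long: %lld to %lld\n", LLONG_MIN, LLONG_MAX);
    return 0;
}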
-
Or you can just create your own datatypes when you want to use huge numbers.
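For example, here's a very rough sketch of a home-made big number type that just stores decimal digits in an array (the struct name, the digit count, and the helper functions are all made up for illustration):
Code:
#include <stdio.h>
#include <string.h>

#define BIG_DIGITS 64   /* room for numbers up to 64 decimal digits */

/* toy big-number type: decimal digits stored least significant first */
struct bignum {
    unsigned char d[BIG_DIGITS];
};

/* load a bignum from a decimal string */
static void big_from_string(struct bignum *n, const char *s)
{
    size_t len = strlen(s);
    memset(n->d, 0, sizeof n->d);
    for (size_t i = 0; i < len && i < BIG_DIGITS; i++)
        n->d[i] = (unsigned char)(s[len - 1 - i] - '0');
}

/* res = a + b, schoolbook addition with carry */
static void big_add(const struct bignum *a, const struct bignum *b,
                    struct bignum *res)
{
    int carry = 0;
    for (int i = 0; i < BIG_DIGITS; i++) {
        int sum = a->d[i] + b->d[i] + carry;
        res->d[i] = (unsigned char)(sum % 10);
        carry = sum / 10;
    }
}

/* print most significant digit first, skipping leading zeros */
static void big_print(const struct bignum *n)
{
    int i = BIG_DIGITS - 1;
    while (i > 0 && n->d[i] == 0)
        i--;
    for (; i >= 0; i--)
        putchar('0' + n->d[i]);
    putchar('\n');
}

int main(void)
{
    struct bignum a, b, sum;
    big_from_string(&a, "123456789012345678901234567890");
    big_from_string(&b, "987654321098765432109876543210");
    big_add(&a, &b, &sum);
    big_print(&sum);   /* prints 1111111110111111111011111111100 */
    return 0;
}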