Code:
#include <stdio.h>
#include <limits.h>
int main(void)
{
    printf("The size of char in bytes is: %zu.", sizeof(char));
    printf("\nThe size of short in bytes is: %zu.", sizeof(short));
    printf("\nThe size of int in bytes is: %zu.", sizeof(int));
    printf("\nThe size of long in bytes is: %zu.", sizeof(long));
    printf("\nThe size of long long in bytes is: %zu.", sizeof(long long));
    printf("\nThe size of float in bytes is: %zu.", sizeof(float));
    printf("\nThe size of double in bytes is: %zu.", sizeof(double));
    printf("\nThe size of long double in bytes is: %zu\n\n", sizeof(long double));

    long double x = 3.1415L;
    printf("%Lg\n", x);

    return 0;
}
This produces the following output on my machine (Linux, gcc 6.1.0):
Code:
The size of char in bytes is: 1.
The size of short in bytes is: 2.
The size of int in bytes is: 4.
The size of long in bytes is: 8.
The size of long long in bytes is: 8.
The size of float in bytes is: 4.
The size of double in bytes is: 8.
The size of long double in bytes is: 16
3.1415
I would guess the problem is a mismatch between the C runtime library and your compiler: on Windows, gcc uses an 80-bit long double, while the Microsoft C runtime's printf expects a 64-bit one, so "%Lg" prints garbage. Have you tried compiling and running the programs with Microsoft C instead?
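If that mismatch is the cause, one workaround (just a sketch, assuming the runtime handles plain double correctly) is to cast the long double down to double before printing:

Code:
#include <stdio.h>

int main(void)
{
    long double x = 3.1415L;

    /* Cast to double so the argument matches %g even when the
       runtime's printf cannot handle the compiler's long double. */
    printf("%g\n", (double)x);

    return 0;
}

You lose any extra precision beyond double, but the output is at least correct on both runtimes.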
By the way, you really should be using the "%zu" format specifier to print the values returned by sizeof(). sizeof() yields a size_t, which is an implementation-defined unsigned type, not a signed one.
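If your runtime predates C99 and doesn't understand "%zu", a portable fallback (again, only a sketch) is to cast the result of sizeof() to unsigned long and print it with "%lu":

Code:
#include <stdio.h>

int main(void)
{
    /* C99 and later: %zu matches size_t directly. */
    printf("%zu\n", sizeof(int));

    /* Pre-C99 runtimes: cast to unsigned long and use %lu. */
    printf("%lu\n", (unsigned long)sizeof(int));

    return 0;
}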
Jim