Output of VS2008 Express:

Code:
#include <stdio.h>

int main(void)
{
    /* sizeof yields a size_t, so cast before printing with %u */
    printf("%u\n", (unsigned)sizeof(long long int));
    return 0;
}

8
I'm sure it wasn't like that before, if memory serves.
I'm going to do some research on the matter.
C99 specifies that they are typedefs. Clarification:
Quote:
3 These types are optional. However, if an implementation provides integer types with
widths of 8, 16, 32, or 64 bits, no padding bits, and (for the signed types) that have a
two’s complement representation, it shall define the corresponding typedef names.
I tried to print out the size of long int on my 64 bit machine and it gives 8, but on the 32 bit machine it is 4..
And? That's allowed. There is no reason that it should be the same. If you need an integer of a certain size, then you can't rely on the basic types int and long; you need to use the typedefs like int32_t from <stdint.h>. You'll get whatever basic type (short, int, long, etc.) is a 32-bit integer.
Standard C specifies the minimums; the rest is up to the compiler to decide:
Quote:
* minimum value for an object of type long int LONG_MIN -2147483647
* maximum value for an object of type long int LONG_MAX +2147483647
* maximum value for an object of type unsigned long int ULONG_MAX 4294967295
PS. the compiler can decide to extend these ranges:
Quote:
The values given below shall be replaced by constant expressions suitable for use in #if preprocessing directives. Their implementation-defined values shall be equal or greater in magnitude (absolute value) to those shown, with the same sign.
well I guess my problem here is this, on my 32 bit machine, I have:
typedef int int_4;
typedef unsigned int int_u4;
typedef long long int int_8;
and I have to find int_4, int_u4 and int_8 on the 64 bit machine.. however I tested that they are all the same, so I just assign the same value??
If they're the same, what's the problem?
Just let the typedefs be, they're right after all.
I mean I should have int_8 as 8, because I want it to be 8 bytes on a 64 bit machine
So it wasn't actually 8 bytes on the 32-bit machine???
Long long int is 8 bytes. Both on 32 and 64.
What do you mean by "all the same"? You mean int and long long are the same size on your 64-bit machine? I suppose that's possible.
Anyway, I don't think you've ever said what compiler you're using -- but if you poke around your header files, you may find a stdint.h header file which will have the typedefs you need in it. (IOW: you're not supposed to be defining these things yourself, you need to use the types the compiler knows about.) I would expect that you should have a line that defines a int32_t type, which is the four-byte integer you're panting for.
the thing is I want int_8 to be 8 on a 64 bit machine and int_8 to be 4 on a 32 bit machine.. the same thing for int_u8
I also want int_4 to be 4 on a 32 bit machine, but I want int_4 to be 8 on a 64 bit machine..
if you understand what I mean, so I have to find the correct type that would let me do this:
typedef ______ int_u4; (that will give sizeof 4 on a 32 bit machine and 8 on a 64 bit machine)
This really defeats the purpose of using typedefs. You're also giving them really misleading names. So why are you doing this again?
I'm pretty sure that's the complete opposite of what you want, in that it makes no sense.
Edit: The whole point of this typedef thing is that this way you know that your variables are the same size wherever you go. If that's not what you want, then we've wasted this whole thread.