Are int's and long's the same on 32 bit windows?
If your compiler says sizeof(int) == sizeof(long), then yes.
Any Win32 compiler that wants any hope of application compatibility has both int and long at 32 bits.
All the buzzt!
CornedBee
"There is not now, nor has there ever been, nor will there ever be, any programming language in which it is the least bit difficult to write bad code."
- Flon's Law
int is the native register size: 32 bits on a 32-bit machine. It's also 32 bits on a 64-bit machine, which technically goes against the standard, but is done for compatibility reasons, so not too many people gripe or even care.
long is longer than a short; the standard makes no further requirements.
> 32 bits on a 64 bit machine, which technically goes against the standard

Against what standard?
All the buzzt!
CornedBee
The C standard does not state the size of any integer type, only the minimum range each must be able to hold, and that the types must be sized in this order: char <= short <= int <= long <= long long. [So long may not be longer than short, but long must not be SHORTER.]
As you state, on a 64-bit machine int may well be 32 bits, and long can be 32 or 64 bits [in fact, Linux has long as 64 bits, while Windows chose to have long as 32 bits on the x86-64 architecture].
Yes, on Windows you can rely on int and long being 32 bits for the foreseeable future. On other platforms, it's not necessarily true.
--
Mats
Compilers can produce warnings - make the compiler programmers happy: Use them!
Please don't PM me for help - and no, I don't do help over instant messengers.
> Yes, on Windows you can rely on int and long being 32 bits for the near foreseeable future. In other platforms, it's not necessarily true.
Which brings me to the question: what if you want a 32-bit data type, i.e. for reading/writing to files? Would it be best to use the C99 standard stdint.h? Or is there an alternative for C89 programs...
I ask because I've seen countless fwrite(&myint, sizeof(int), 1, fp); calls & the respective freads(), even in "industrial" code.
For C89 and portability, you would have to implement something comparable [although of course not necessarily as comprehensive] to stdint.h yourself. But I'd go for stdint.h until you find an instance where you actually need the C89 compatibility - I'd expect that to be pretty rare these days.
What you should definitely do is use some sort of typedef that includes a size indication, to make sure that IF you ever need to move to another platform where int is a different size than your program expects, you don't have to "search and destroy" all the ints. [Of course, for local variables that are not stored in a file, size generally doesn't matter; you may want to use int for simple stuff where "any integer that can hold a few thousand is OK" and long where you "need an integer that can hold several millions".]
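To make the file-I/O point concrete, here is a sketch of writing and reading a 32-bit value with an explicit type instead of fwrite(&myint, sizeof(int), 1, fp). It assumes C99's <stdint.h>; note that fixing the size still leaves byte order, which storing the bytes one at a time also solves:

```c
/* Portable 32-bit file I/O: explicit size AND explicit byte order
   (little-endian here), so the file format no longer depends on
   the compiler's idea of int. Assumes C99 <stdint.h>. */
#include <stdint.h>
#include <stdio.h>

static int write_u32_le(FILE *fp, uint32_t v)
{
    unsigned char b[4];
    b[0] = (unsigned char)(v & 0xFF);          /* least significant byte first */
    b[1] = (unsigned char)((v >> 8) & 0xFF);
    b[2] = (unsigned char)((v >> 16) & 0xFF);
    b[3] = (unsigned char)((v >> 24) & 0xFF);
    return fwrite(b, 1, 4, fp) == 4 ? 0 : -1;
}

static int read_u32_le(FILE *fp, uint32_t *out)
{
    unsigned char b[4];
    if (fread(b, 1, 4, fp) != 4)
        return -1;
    *out = (uint32_t)b[0] | ((uint32_t)b[1] << 8)
         | ((uint32_t)b[2] << 16) | ((uint32_t)b[3] << 24);
    return 0;
}
```

A file written this way reads back identically on any platform, whatever sizeof(int) happens to be there.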
--
Mats
Hmm, thanks. It's another one of those issues that gets "glossed" over. Never thought of typedef'ing them myself -- adds a big dash of portability, I guess -- which I know I don't need anyway.
> There are stdint.h files to find on the web, so I don't think it's that bad.
Noticed that in some of my research, but I guess this is where autoconf and co. come into it
If you're already doing Windows programming, windows.h defines an absurd number of types that are of known size. WORD is guaranteed to be unsigned 16 bits, DWORD unsigned 32 bits. INT16, INT32 and their UINT variants are also defined somewhere in the depths of the Win32 API. Perhaps even the 64-bit variants.
All the buzzt!
CornedBee
To me, int never made any sense; that's why I use:
long - not int
ulong - not DWORD
short
ushort - not WORD
And a good thing too; my code's easier to read now, and it even looks more organized.
long has a meaning on a 16-bit compiler or a compiler that supports 64-bit integers. For Windows (these days), there is no point in using long.
ulong isn't a standard type; unsigned long is. As above, no point on Windows.
short definitely has a valid point in Windows compilers if you only need 16 bits and a sign.
ushort, again, isn't a standard type; unsigned short is.
But I would also point out that if you use WORD, DWORD, INT16 etc., you are relying on windows.h - which means the code is unportable to other architectures. Defining YOUR OWN types will allow the code to compile on "any" machine whose architecture suits the application, as long as you use your own types. You can, if you see fit, provide your own typedefs that use DWORD, WORD, etc. as the base type. Obviously, if you make extensive use of the Windows API, you should use the original type (e.g. DWORD).
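A sketch of what such project-local typedefs might look like, with a compile-time check so that a port to a platform where the sizes differ fails loudly at build time instead of silently corrupting file formats. The names (s16, u32, the STATIC_ASSERT macro) are made up for this example; on a Windows build you could equally typedef them from WORD and DWORD:

```c
/* Project-local fixed-size types. These mappings assume a typical
   32- or 64-bit compiler (short = 16 bits, int = 32 bits); the
   checks below catch any platform where that assumption breaks. */
typedef short          s16;
typedef unsigned short u16;
typedef int            s32;
typedef unsigned int   u32;

/* C89-compatible compile-time assertion: the array gets a negative
   size (a compile error) if the condition is false. */
#define STATIC_ASSERT(cond, name) typedef char name[(cond) ? 1 : -1]

STATIC_ASSERT(sizeof(s16) == 2, s16_is_2_bytes);
STATIC_ASSERT(sizeof(u16) == 2, u16_is_2_bytes);
STATIC_ASSERT(sizeof(s32) == 4, s32_is_4_bytes);
STATIC_ASSERT(sizeof(u32) == 4, u32_is_4_bytes);
```

Then file-format and wire-format code uses only these names, and moving to a new compiler means adjusting four typedefs rather than hunting down every int.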
--
Mats