1. #define format...

when people do...

this is really an integer, right? so why don't people just write it in decimal, like...

2. 0x0401 is a hexadecimal number (1,025 in decimal)

I can't think of a good reason in that case to use hex, but if you wanted to define, for example, a bitmask such as:
Code:
```#define MASK 0xFFFF // 1111 1111 1111 1111 in binary
// as opposed to
#define MASK 65535 // the same value, but the bit pattern is hidden```
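To make the bitmask point concrete, here's a small sketch (the mask names and helper functions are invented for illustration): each hex digit covers exactly four bits, so byte boundaries are visible in the literal itself.

```c
/* Invented example names: each hex digit spans exactly 4 bits,
 * so byte-sized masks read naturally in hex. */
#define LOW_BYTE_MASK  0x00FF  /* 0000 0000 1111 1111 */
#define HIGH_BYTE_MASK 0xFF00  /* 1111 1111 0000 0000 */

/* Extract the low byte of a 16-bit value. */
static unsigned low_byte(unsigned v)  { return v & LOW_BYTE_MASK; }

/* Extract the high byte, shifted into the low position. */
static unsigned high_byte(unsigned v) { return (v & HIGH_BYTE_MASK) >> 8; }
```

Try writing the same masks as 255 and 65280 and the byte structure disappears.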

3. Who knows. Maybe it just looks 1337. Maybe it's stupid repellant: like,

J. Random Luser: "Oh look! A number! I'll just change it so my hack works."

J. Random Luser: "Oh crap... that must do something important. Better not mess with it. "
*original programmer smiles*

Or maybe it was used in context with other hex numbers where it would suck if you changed bases randomly via macro.

4. Certain features of the Win32 API, for example, require you to use values in a certain range. Often, these ranges are given in hex.

5. of course the best one is

Code:
`#define SOME_NUM 0x05 // dec 5 == hex 5, it's the same freakin number!!!!`

although as citizen says, if you have a bunch of other valid hex values it's easier to keep consistency...

6. Nah, sometimes we need more action:
Code:
`#define JAMES_BOND 007`

7. Originally Posted by laserlight
Nah, sometimes we need more action:
Code:
`#define JAMES_BOND 007`
yes, but macros are EVIL! look, I'll prove it!

Code:
`#define MACRO_OF_THE_BEAST 0x29A // no points for guessing this in decimal`

8. I personally go with
Code:
`#define QUESTION  2B || !2B`

9. I use hex in my constants for the simple reason that I'm decoding a binary file, and over half the numbers I'm working with are hex. It's easier to keep track of them in my head if I read them all in hex than to read some in decimal and some in hex...
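For that kind of work, a small formatting helper keeps everything in one base; this is just a sketch, and `bytes_to_hex` is an invented name:

```c
#include <stddef.h>

static const char HEX_DIGITS[] = "0123456789ABCDEF";

/* Invented helper: writes each byte of buf as two uppercase hex
 * digits into out. out must have room for 2*len + 1 chars;
 * the result is NUL-terminated. */
static void bytes_to_hex(const unsigned char *buf, size_t len, char *out)
{
    for (size_t i = 0; i < len; i++) {
        out[2 * i]     = HEX_DIGITS[buf[i] >> 4];    /* high nibble */
        out[2 * i + 1] = HEX_DIGITS[buf[i] & 0x0F];  /* low nibble */
    }
    out[2 * len] = '\0';
}
```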

10. what is the conversion formula from hex to int? (just wondering, because I just use random hex numbers in my code)

11. Conversion from hex to integer does not make sense. Hex is not a data type but a numbering system.

Binary - base 2
Octal - base 8
Decimal - base 10
Hex - base 16

So hex, binary, decimal, octal all have integers, just represented differently.
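This can be checked directly in C, where the same value can be spelled in any base (the macro names here are invented for illustration):

```c
/* One integer, three spellings -- the compiler stores the same value
 * regardless of the base used in the source text. */
#define DEC_VAL 101    /* decimal */
#define HEX_VAL 0x65   /* hexadecimal: 6*16 + 5 = 101 */
#define OCT_VAL 0145   /* octal (leading 0): 1*64 + 4*8 + 5 = 101 */
```

All three compare equal; only the spelling in the source differs.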

12. so really the compiler treats the macro definition as an integer anyway... so decimal would be fine for a menu ID?

or is it just as easy saying ...

or does it just depend on the programming style of each programmer?

13. 101 and 0x01 are not the same... 101 and 0x65 are the same... sometimes it's easier to work with hex numbers, but most of the time you're better off with decimal.

14. First, if you are writing this you need to decide if the macro is truly necessary, because it probably isn't.
Code:
```const int MENUITEM1 = 1;
const int MENUITEM2 = 2;
/* ... and so on for each menu item */```
Even if it is necessary, write your constant in whatever base you want, as long as you don't confuse anyone who will read your code, and be consistent with the base you choose. Don't define a hex number and then use it with decimals.

15. Chances are that simple base 10 numbers will be sufficient for your needs. If you need to register your own messages, though, you have to put them in a range of IDs specified in hex numbers.

WM_APP

Code:
```
Range                     Description
0 through WM_USER - 1     Messages reserved for use by the system.
WM_USER through 0x7FFF    Integer messages for use by private window classes.
WM_APP through 0xBFFF     Messages available for use by applications.
0xC000 through 0xFFFF     String messages for use by applications.
Greater than 0xFFFF       Reserved by the system for future use.
```
I use this in one of my applications.
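As a sketch of how those ranges get used: a private message is usually defined as an offset from WM_APP (which WinUser.h defines as 0x8000). WM_MYUPDATE is an invented example name.

```c
/* Guard so this sketch also compiles without <windows.h>;
 * in a real Win32 build, WM_APP comes from WinUser.h. */
#ifndef WM_APP
#define WM_APP 0x8000
#endif

/* An invented private message, placed in the range reserved
 * for applications (WM_APP through 0xBFFF). */
#define WM_MYUPDATE (WM_APP + 1)  /* 0x8001 */
```

Since the range boundaries are all given in hex, spelling your own message IDs in hex keeps them easy to compare against the table.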