Hex vs Dec

• 08-19-2009
Ducky
Hex vs Dec
When giving a value to a variable, is there any advantage to giving it in hexadecimal?
Does it save the compiler some calculation cycles, or is it just some kind of
"snobbery" when programmers use hex where they could just use decimal instead?
• 08-19-2009
fronty
In some situations hex is clearer to read, because when you're used to it, you can quickly count in your mind which bytes are 1 and which are 0.
• 08-19-2009
Salem
Xmas and Halloween
Dec 25 == Oct 31
• 08-19-2009
bithub
Quote:

Originally Posted by Ducky
When giving a value to a variable, is there any advantage to giving it in hexadecimal?
Does it save the compiler some calculation cycles, or is it just some kind of
"snobbery" when programmers use hex where they could just use decimal instead?

There is no performance advantage -- you should just use whatever is more readable in the given situation.

For instance, if I wanted to set someone's age to 15:
age = 15; // readable
age = 0xF; // what?

Now let's say we want to set up a mask to get the lowest byte of information from a larger variable:
byte_mask = 255; // What does this mean?
byte_mask = 0xFF; // Makes sense.
• 08-19-2009
Dino
At compile time, who cares?
• 08-19-2009
Elysia
It's all just bits to the computer anyway, so the compiler "transforms" it into the correct opcode, hex, decimal, octal or whatever.
• 08-19-2009
Ducky
Ok, thank you everybody, I understand it better now.

@Salem
Yes, I knew this one. It's a funny one. :-)
• 08-19-2009
cpjust
Quote:

Originally Posted by Salem
Xmas and Halloween
Dec 25 == Oct 31

Huh?

EDIT: Never mind, now I see. I thought you were still talking about hex...
• 08-20-2009
DoctorBinary
Quote:

Originally Posted by Ducky
When giving a value to a variable, is there any advantage to give it in a value hexadecimal