## Decimal vs. Hex

1.

In a lot of source code I've seen, I've noticed that most authors use a lot of hex numbers in their work. What I want to know is: why hex? I've been programming for a little over a year, so I'm familiar with what it is and HOW to use it, but WHY? I've heard that it's faster than decimal because it's a power of 2 (2^4), but with modern compilers does that really make a difference?

Thanks.

2. Hexadecimal numbers are base-16 numbers. They are often used because conversion between binary and hexadecimal is very easy.

0000 - 1
0001 - 2
0010 - 3
0011 - 4
0100 - 5
0101 - 6
0110 - 7
0111 - 8
1000 - 9
1001 - A
1010 - B
1011 - C
1100 - D
1101 - E
1110 - F
1111 - G

3. That's stupid isn't it? :-)

0000 - 0
0001 - 1
0010 - 2
0011 - 3
0100 - 4
0101 - 5
0110 - 6
0111 - 7
1000 - 8
1001 - 9
1010 - A
1011 - B
1100 - C
1101 - D
1110 - E
1111 - F

Dutch mathematics, 0 == 1. :-)
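
Since each hex digit corresponds to exactly one 4-bit group, the corrected table above can be reproduced mechanically. A quick Python sketch (not from the thread, just for illustration):

```python
# Each hex digit maps to exactly one 4-bit group (a "nibble"),
# which is why binary-to-hex conversion is so easy by hand.
for value in range(16):
    print(f"{value:04b} - {value:X}")

# Converting a longer binary string is just a matter of
# translating one nibble at a time:
bits = "110111101010"               # 1101 1110 1010
hex_digits = "".join(f"{int(bits[i:i+4], 2):X}"
                     for i in range(0, len(bits), 4))
print(hex_digits)                   # DEA
```

Going from hex back to binary works the same way in reverse: expand each hex digit into its 4-bit pattern. Decimal has no such digit-by-digit correspondence with binary, which is the real reason hex shows up so much in low-level code.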

4. _Bin___HEX
1111 = 10

5. Can't tell Shiro's a programmer - he can't decide whether to start counting with 0 or 1

6. Originally posted by Goof Program
> _Bin___HEX
> 1111 = 10

It should be:

_Bin___HEX
1111 = F

7. > Can't tell Shiro's a programmer - he can't decide whether to start counting with 0 or 1

Well, it is very confusing. I once went to the store around the corner to buy some apples. Counting: 0, 1, 2, 3. So I told the girl in the store that I had 3 apples; she didn't believe me.