Hello!
When studying OpenGL, I noticed that constants are defined like this:
Code:
#define GL_CONSTANT_1 0x01 // 1 = 00000001
#define GL_CONSTANT_2 0x02 // 2 = 00000010
#define GL_CONSTANT_3 0x04 // 4 = 00000100
I also read about bit fields, and, if I have understood them correctly, they work like this:
Code:
typedef struct
{
    unsigned char one   : 1;
    unsigned char two   : 1;
    // ...
    unsigned char eight : 1;
} BitField;
So, if you want to store eight boolean values in just one byte, you can do that with a bit field. (Please correct me if I've misunderstood.)
However, what's the point of using hex in the definitions? Is there any difference between writing:
Code:
#define CONSTANT 0x01
// ... and ...
#define CONSTANT 1
I understand that hex might be easier for a reader who is mentally converting the value to binary, but what is (and I assume this exists) the reason(s) beyond that?
I'd appreciate any help on this matter.