I don't know what "least significant bits" and "most significant bits"
mean...
I was reading about the GetAsyncKeyState() function in the VC++
docs; it says that if the least significant bit is set, the key was
pressed after the previous call. If the most significant bit is set,
the key is down.
What are they talking about? What's the difference between
the least and most significant bits?
Also, about LOWORD and HIWORD... How do these work? 'The
LOWORD macro retrieves the low-order word from the given
32-bit value.'
#define LOWORD(l) ((WORD) (l))
That's typecasting (l), right? But what specifically is happening?
What's a low-order word? Isn't a WORD just an unsigned int
typedef? And aren't unsigned ints already 32 bits? So what's
happening?
#define HIWORD(l) ((WORD) (((DWORD) (l) >> 16) & 0xFFFF))
And that one looks really complicated. What exactly is happening
here?
Thanks a lot.