Just wondering: when doing a #define, what difference does it make if there is a dot after the numeric constant? I was reading some code and saw the #define line below, and I couldn't understand why the comment says that bit 32 of 1572864 has place value 1. 1572864 in binary is 11 followed by 19 zeros, isn't it? So how come bit 32 is 1? Is it because of the dot after 1572864?
#define UNITBIT32 1572864. /* 3*2^19; bit 32 has place value 1 */