# Thread: #define with a dot?

1. ## #define with a dot?

hi there:

just wondering: when doing a #define, what difference does it make if there is a dot after the decimal value? i was reading some code and saw the following #define line and i could not understand why it says that 1572864's bit 32 has place value 1. 1572864 in binary is 11 followed by 19 zeros, isn't it? so how come bit 32 is 1? is it because of the dot after 1572864?

#define UNITBIT32 1572864. /* 3*2^19; bit 32 has place value 1 */

many thanks

CHUN

2. The point means that the number is floating point
All of these
2.4 , .4 , 2.
are floating-point constants (type double in C).
Without the '.' it would be an int

3. yes, i just realised that. thanks

but i am still puzzled by the code i am trying to understand. the 1572864. means that it's a 32-bit floating-point number. in floating-point numbers, bit 32 (what's the convention of counting bits? do we start with bit 1 or bit 0?) is the sign bit, and it's 0 when the value is positive, otherwise it's 1. in the case of 1572864., it's a positive number, so the sign bit should be 0, right? how come it says "bit 32 has place value 1"? unless i misunderstood this comment completely.

cheers

CHUN

4. How are you viewing the fact that the bit is set or not? You are correct that according to the floating-point standard (IEEE 754), the most significant bit is used to determine the sign.

For instance, in memory - negative 1.0 looks like:

00 00 80 BF

while positive 1.0 looks like:

00 00 80 3F

Since my machine is little endian, the most significant byte is the last one listed. As you can see they differ only by the most significant bit of that byte.

5. //Double
#define PI 3.14159

//Float
#define PI 3.14f

6. hi there:

thanks for all your replies, i think i begin to understand this line now. i think what the comment meant is that it indicates the bit/byte order whenever UNITBIT32 is used. "bit 32 has the place value 1" means that it's in little-endian order, so the sign bit is the rightmost bit (bit 0), and as a result, the leftmost bit (bit 31) will have the place value of 1.

am i correct?

#define UNITBIT32 1572864. /* 3*2^19; bit 32 has place value 1 */

many thanks

CHUN

7. if you want to examine how numbers are represented in your machine you can also try this sort of thing.

Code:
```
#include <stdio.h>

int main(void)
{
    union
    {
        unsigned int i;   /* unsigned so printing with %x is well defined */
        float f;
    } conv;

    conv.f = 123.4f;
    printf("%x\n", conv.i);
    return 0;
}
```
The difference is that this does the conversion at runtime, whereas the #define method is compile time.