Originally posted by Hammer
Technically not correct, imho. It will store a binary representation of a decimal 1 in an int variable, which will require lots of preceding binary 0's to pad the int. Therefore, we are storing it as 0000000....1 And how many digits is that?!
Anyway, pointless argument really. We're both right, in our own way.

Yeah, I thought of that after I typed it. Unless you're on a "little endian" system, in which case the bytes in memory would read as something horrible like:
1000000000000000
Or, no, it's something screwed up because of the way it flops the byte ordering, something like:
0000000100000000
I don't recall off the top of my head, but it's ugly anyway.
Quzah.