I was messing around with casting a single character as an integer, and I've been getting bizarre results. If I do this:

Code:
char myChar = (char)200;
unsigned int result;
result = (unsigned int)myChar;

result now has a value of about 4.3 billion (all bits set to 1, except the last 8, which hold whatever 200 equals). Why is it defaulting to setting all the other bits to 1? Shouldn't they be 0? It makes it harder to do this.
What I'm ultimately trying to do is save the value of the character to a text file, but I can't save it as a raw character, because it might be one that would mess up the file (like a carriage return, or a null character, or something). I do not want to save the file in binary mode, so saving it as an integer is the only way I can come up with to do it.
Basically, I was using a character for bitwise operations to set certain flags, and now I want to be able to save them.
I suppose when I reload the file I could cast the 4.3-billion number back to a character, but I have no idea what that would do, since a character doesn't have that many bits.