I'm just making sure I understand how this works. If you have a byte representing an unsigned value, then you can have 256 different values (2 raised to the 8th power), giving a range of 0 to 255.
Now if you have a signed char, the two's complement system uses half of the bit patterns for negative values and half for non-negative values, correct? What is the maximum absolute value of a signed char (or any variable that is only a single byte)? If you divide 256 / 2 you get 128, so does that mean the range is -128 to 127, with 127 the largest positive value?
I'm pretty sure I understand this stuff, but I've been having problems with the standard file IOstreams, and I'm wondering if something like this may be the source of the error.