Casting char as int problem


  1. #1
    Registered User
    Join Date
    Sep 2001
    Posts
    28

    Casting char as int problem

    I was messing around with casting a single character as an integer, and I've been getting bizzare results. If I do this:

    Code:
    char myChar = (char)200;
    unsigned int result;
    
    result = (unsigned int)myChar;
    Result now has a value of about 4.5 billion (all bits set to 1, except the last 8, which hold whatever 200 became). Why is it setting all the other bits to 1? Shouldn't they be 0? That makes this harder to do.

    What I'm ultimately trying to do is save the value of the character to a text file, but I can't save it as a character, because it might be a character that will mess up the file (like a carriage return, or null character, or something). I do not want to save it in binary mode, so saving it as an integer is the only way I can come up with to do it.

    Basically, I was using a character for bitwise operations to set certain flags, and now I want to be able to save them.

    I suppose when I reload the file I could cast the 4.5 billion number back to a character, but I have no idea what that would do, since a char doesn't have that many bits.
    -Grunt (Malek)

  2. #2
    Pursuing knowledge confuted's Avatar
    Join Date
    Jun 2002
    Posts
    1,916
    does the line (char)200 do what you want it to do? try displaying myChar and see what you get.
    Away.

  3. #3
    Registered User
    Join Date
    Nov 2001
    Posts
    1,348
    One solution is an ostringstream object. Instantiate an ostringstream and insert the value into it as "200"; then call str() to get "200" back as a string.

    Kuphryn
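
    Kuphryn's suggestion might look like this minimal sketch (the variable names are illustrative, and the value is widened through an unsigned char first so no sign bit leaks in):

    ```cpp
    #include <iostream>
    #include <sstream>
    #include <string>

    int main()
    {
        unsigned char myChar = 200;               // the flag byte to be saved
        std::ostringstream oss;
        oss << static_cast<unsigned int>(myChar); // insert as a number, not as a character
        std::string text = oss.str();             // "200" -- safe to write to a text file
        std::cout << text << std::endl;
        return 0;
    }
    ```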

  4. #4
    Registered User
    Join Date
    Sep 2001
    Posts
    28
    Originally posted by blackrat364
    does the line (char)200 do what you want it to do? try displaying myChar and see what you get.
    The (char)200 works fine. If I do (char)86, it displays the correct letter for that ASCII character. The number 200 was just an example; if I printed that out it might be a junk character, or a beep, or something.

    I think I found a solution, that appears to work. Here's what I ended up doing:

    Code:
    char myChar = (char)200;
    unsigned int result;
    
    result = (unsigned int)myChar & 255;
    Basically I just did a bitwise AND with all 1's in the byte that I wanted. It seems to work for now, although I'm still curious as to why it was so messed up without doing this.

    I thought that maybe I needed to initialize result to 0, in case the assignment only overwrote the one byte that it needed to and left the other bytes (3 others in my case) as whatever randomly got assigned to them. But initializing result to 0 didn't help either.
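
    The full save/reload round trip that the mask enables could be sketched like this (stringstreams stand in for the actual text file here; the 255 mask and the names are just illustrative):

    ```cpp
    #include <cassert>
    #include <sstream>

    int main()
    {
        char myChar = (char)200;                           // flag byte; may read as negative if char is signed
        unsigned int result = (unsigned int)myChar & 255u; // mask off the sign-extended upper bits

        // Write as text, then read back and narrow to a char again.
        std::ostringstream out;
        out << result;                                     // "200" -- safe in a text file
        std::istringstream in(out.str());
        unsigned int loaded;
        in >> loaded;
        char restored = (char)loaded;                      // the low byte survives the round trip

        assert(result == 200u);
        assert((unsigned int)(unsigned char)restored == 200u);
        return 0;
    }
    ```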
    -Grunt (Malek)

  5. #5
    Registered User
    Join Date
    Feb 2003
    Posts
    595
    Exactly how did that error show up? Are you sure you don't have some other error someplace?

    I just tried this code with 3 compilers:

    Code:
    #include <iostream>
    #include <fstream>
    using namespace std;
    
    int main()
    {
        char myChar = (char)80;
        unsigned int result;
        ofstream out("result.txt");
    
        cout << myChar << endl;
    
        result = (unsigned int)myChar;
    
        cout << result << endl;
        out << result << endl;
    
        out.close();
    
        return 0;
    }
    The screen output of all three was:

    P
    80

    and the output file "result.txt" contains just
    80

    just as expected.

  6. #6
    carry on JaWiB's Avatar
    Join Date
    Feb 2003
    Location
    Seattle, WA
    Posts
    1,972
    I don't understand why you are typecasting anything.

    doing result = myChar; and outputting result will give you an integer, since result is an integer. Also myChar = (char) 80 doesn't make sense to me since it is a char... am I missing something?
    "Think not but that I know these things; or think
    I know them not: not therefore am I short
    Of knowing what I ought."
    -John Milton, Paradise Regained (1671)

    "Work hard and it might happen."
    -XSquared

  7. #7
    Registered User
    Join Date
    Sep 2001
    Posts
    28
    Originally posted by JaWiB
    I don't understand why you are typecasting anything.

    doing result = myChar; and outputting result will give you an integer, since result is an integer. Also myChar = (char) 80 doesn't make sense to me since it is a char... am I missing something?
    I was type casting the 80 as a char since 80 is an integer, although I don't recall if the compiler complained about it or not, so it may not have been necessary.

    As for R.Stiltskin (and possibly others, if they just didn't post about it) not getting the error, perhaps I hit a bug with either Windows XP or Microsoft Visual C++ 5.0, as that is what I am using. When I wrote the result to file, I got the 4+ billion number, and I still get it. It doesn't appear to be an output problem, because when stepping through in debug mode trying to figure out what was wrong, I got the same bizarre number.

    <edited>
    I think maybe I misunderstood your question about casting the char as 80. By typing

    Code:
    myChar = (unsigned int)80;
    what this does is set myChar to the ASCII character denoted by the number 80. I believe it is an uppercase letter; I don't recall which one offhand.
    Last edited by Malek; 05-01-2003 at 10:10 PM.
    -Grunt (Malek)

  8. #8
    Disturbed Boy gustavosserra's Avatar
    Join Date
    Apr 2003
    Posts
    244

    ??? very strange

    Code:
    #include <stdio.h>
    #include <conio.h>
    
    int main(){
    
      char myChar = 200;
      int result = 0;
    
      result = myChar;
      
      printf( "%d\n" , result );
      
      getch();
      
      return 0;  
    }
    The output for this one is -56. I think this is because the bit pattern of 200, interpreted as a signed char, is the two's-complement representation of -56. When narrowing to a smaller number of bytes, the compiler makes a conversion that can interpret the 200 as -56. I don't know if this is correct, but it seems logical to me... any ideas?
    Nothing more to tell about me...
    Happy day =)

  9. #9
    carry on JaWiB's Avatar
    Join Date
    Feb 2003
    Location
    Seattle, WA
    Posts
    1,972
    myChar = (unsigned int)80;
    Well I don't know if it makes sense to me (too lazy to think), but I think you had myChar = (char)80 before
    ???
    oh well it doesn't really matter anyways (i think...)
    "Think not but that I know these things; or think
    I know them not: not therefore am I short
    Of knowing what I ought."
    -John Milton, Paradise Regained (1671)

    "Work hard and it might happen."
    -XSquared

  10. #10
    Registered User
    Join Date
    Feb 2003
    Posts
    595
    Yeah, ascii 80 is capital P.

    Apparently, a char is treated as a signed number (but only 1 byte as opposed to 4 bytes for an int). So there's no problem with ASCII characters up to 127, but when you put in 200, it's stored as binary 11001000, which as an 8-bit signed number is -56.

    Then, when you cast it to an int, it looks like the compiler is doing the equivalent of the assembly instructions cbw (convert byte to word) and cwd (convert word to doubleword), preserving the negative sign by filling all the leading bits with 1's.

    Then, it interprets the result 1111 1111 1111 1111 1111 1111 1100 1000 (still -56) as an unsigned int, which is 4294967240. Look familiar?

    I don't know why the compiler handles it this way even though you want it to be unsigned, but that's apparently what it is doing.
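
    That arithmetic can be checked directly; this sketch assumes a signed 8-bit char and a 32-bit unsigned int, as on the compiler above:

    ```cpp
    #include <cassert>
    #include <iostream>

    int main()
    {
        char myChar = (char)200;                    // stored as -56 when char is signed 8-bit
        unsigned int result = (unsigned int)myChar; // sign-extended to 32 bits, then reinterpreted

        std::cout << result << std::endl;           // 4294967240, i.e. 2^32 - 56
        assert(result == 4294967240u);
        return 0;
    }
    ```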

    Anyway, the simple fix is forget about char and unsigned int. Just declare myChar to be an int to begin with, and when you want to print it, cast it as char, like this:

    Code:
    #include <iostream>
    #include <fstream>
    using namespace std;
    
    int main()
    {
    int i, myChar;
    ofstream out("result.txt");

    for (i = 0; i < 6; i++)
    {
        myChar = 40 * i;
        cout << "(char)myChar: " << (char)myChar << endl;
        out << myChar << endl;
    }

    out.close();

    return 0;
    }
    Last edited by R.Stiltskin; 05-02-2003 at 12:23 AM.

  11. #11
    Registered User
    Join Date
    Sep 2001
    Posts
    28

    Thanks :)

    Alright, thanks everyone, that clears things up. I'll just use one of the workarounds to make it work that way I want it to. Thanks for the help.
    -Grunt (Malek)

  12. #12
    pronounced 'fib' FillYourBrain's Avatar
    Join Date
    Aug 2002
    Posts
    2,297
    I think what you're after is probably an unsigned char.

    Values from 0 up to 255 would be allowed.
    "You are stupid! You are stupid! Oh, and don't forget, you are STUPID!" - Dexter

  13. #13
    Cat
    Guest
    Yeah, I'd wager an unsigned character would fit the bill just fine. The problem is that:

    char c = (char) 200;

    is the same as:

    char c = (char) -56;

    And so what it's doing is sign-extending to 32 bits, then storing into the integer. To prevent sign extension, the *source* must be declared as unsigned.
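
    A quick sketch of that point: when the source is unsigned, the widening conversion zero-extends instead of sign-extending, so no masking is needed:

    ```cpp
    #include <cassert>

    int main()
    {
        unsigned char c = 200;   // unsigned source: no sign bit to extend
        unsigned int result = c; // widens by zero-extension

        assert(result == 200u);
        return 0;
    }
    ```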

  14. #14
    Registered User
    Join Date
    Nov 2002
    Posts
    126
    Yep, you have to use an unsigned char to get values higher than 127. Also, if you're gonna use casts, use C++ casts...

    Code:
    unsigned char c = static_cast<unsigned char>(200);
    unsigned int result = static_cast<unsigned int>(c);
    
    cout <<result <<endl;

  15. #15
    Registered User
    Join Date
    Sep 2002
    Posts
    417
    might also look into wide characters if you want really high values
