Thread: Cast to char from int

  1. #1
    Registered User
    Join Date
    Feb 2010
    Posts
    1

    Cast to char from int

    Hello everybody, I'm new here. I have a little question.
    Given this code:

    Code:
    int *a;
    char *c;
    c=malloc(sizeof(char));
    a=malloc(sizeof(int));
    
    *(a) = 71;
    c = a;
    printf("%c\n",*c);
    the output is "G" ( ASCII for 71 ) .
    The problem is that i was expecting some other value , because a is stored on 4 bytes , and c on 1 byte , so if a was 00000000 00000000 00000000 01110001 (for example ) , c should be 00000000 (the first byte) and the output should be different .
    Can you please tell me what am i missing ?
    Thank you

    LE: Little endian. I need more coffee so I can stop posting lame questions.
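
    For anyone who finds this later, a minimal sketch (assuming a 4-byte int and an 8-bit char) that dumps every byte of the int; on a little-endian machine the first byte stored is the least significant one, which is why *c already holds 71:

    Code:
    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        int *a = malloc(sizeof(int));
        *a = 71;

        /* Look at the int's storage one byte at a time. */
        unsigned char *p = (unsigned char *)a;
        for (size_t i = 0; i < sizeof(int); i++)
            printf("byte %zu: %d\n", i, p[i]);
        /* Typical little-endian output: 71 0 0 0 */

        free(a);
        return 0;
    }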
    Last edited by TeeGee; 02-22-2010 at 12:17 PM.

  2. #2
    Registered User
    Join Date
    Mar 2009
    Posts
    399
    When you assign an int to a char, the compiler truncates the value to the LSB (Least Significant Byte) of the int, so the byte order doesn't really matter in that case; the compiler knows the endianness of the system.

    71 is small enough to fit in the LSB, so no data is lost in this case.

    EDIT: Didn't see he was using pointers, so truncation doesn't apply here.
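
    To make the distinction concrete, here is a small sketch (assuming an 8-bit char and a 32-bit int) contrasting value conversion with reading through a char pointer:

    Code:
    #include <stdio.h>

    int main(void)
    {
        int n = 0x12345678;

        /* Value conversion: keeps only the low byte,
           regardless of endianness. */
        unsigned char v = (unsigned char)n;
        printf("value conversion:     0x%02X\n", (unsigned)v);  /* always 0x78 */

        /* Pointer reinterpretation: reads whichever byte is
           stored first, so the result depends on endianness. */
        unsigned char *p = (unsigned char *)&n;
        printf("first byte in memory: 0x%02X\n", (unsigned)*p); /* 0x78 little endian, 0x12 big endian */

        return 0;
    }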
    Last edited by Memloop; 02-23-2010 at 05:19 AM.

  3. #3
    Registered User
    Join Date
    Apr 2009
    Posts
    66
    You will feel the difference when you give it input greater than 127. If you want to do this kind of thing, especially when you want to play with individual bits and bytes, look into structure bit fields.

    For example:
    Code:
    struct name {
        unsigned int a : 1;   /* 1-bit field */
        unsigned int c;
    };
    Do something like this.
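
    A minimal sketch of how such a struct behaves (the field names are just placeholders):

    Code:
    #include <stdio.h>

    struct name {
        unsigned int a : 1;   /* 1-bit field: can only hold 0 or 1 */
        unsigned int c;       /* full unsigned int */
    };

    int main(void)
    {
        struct name s;

        s.a = 71;             /* truncated to the low bit: 71 & 1 == 1 */
        s.c = 71;

        printf("a = %u, c = %u\n", (unsigned)s.a, s.c);   /* prints a = 1, c = 71 */
        return 0;
    }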

  4. #4
    Registered User
    Join Date
    Mar 2007
    Posts
    416
    Quote Originally Posted by Alexander jack View Post
    You will feel the difference when you'll give the input more than 127.
    An unsigned character is in the range from 0-255, as is a signed character (although not always seen as an integral value). One byte is 2^8 which is 256 total values.
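
    For the exact ranges on a given implementation, <limits.h> spells them out (on most systems CHAR_BIT is 8, so signed char is -128 to 127 and unsigned char is 0 to 255):

    Code:
    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        printf("bits per char : %d\n", CHAR_BIT);
        printf("signed char   : %d to %d\n", SCHAR_MIN, SCHAR_MAX);
        printf("unsigned char : 0 to %d\n", UCHAR_MAX);
        return 0;
    }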

  5. #5
    ATH0 quzah's Avatar
    Join Date
    Oct 2001
    Posts
    14,826
    Quote Originally Posted by scwizzo View Post
    An unsigned character is in the range from 0-255, as is a signed character (although not always seen as an integral value). One byte is 2^8 which is 256 total values.
    Which really doesn't have anything to do with the endian issue.

    How does 128 entered in a signed char differ from when it is entered into a signed [short|long] int?


    Quzah.
    Hope is the first step on the road to disappointment.

  6. #6
    Registered User
    Join Date
    Feb 2010
    Posts
    36

    Casting:

    Your understanding is wrong. Actually, the size of char is 8 bits, which is 1 byte, and int is 4 bytes. When you assign an integer value to a character, it converts the integer to the corresponding character value. In your example, you gave 71 to the character, and the ASCII character for 71 is G.

    To understand it better, assign values larger than 255.

    If you assign 305 to a char, the value printed will be 1, because the char reaches its maximum of 255 and then wraps around into the next cycle, so the stored value is 305 - 256 = 49, which is the ASCII code for '1'. In your code, explicitly cast the pointer in the assignment statement to avoid warnings:

    c = (char *) a;
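
    A quick sketch of that wraparound (using unsigned char to keep the conversion well defined; for a plain signed char an out-of-range assignment gives an implementation-defined result):

    Code:
    #include <stdio.h>

    int main(void)
    {
        unsigned char ch = 305;   /* reduced modulo 256 -> 49 */

        printf("%c\n", ch);       /* prints '1' (ASCII 49) */
        printf("%d\n", ch);       /* prints 49 */
        return 0;
    }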

  7. #7
    Registered User
    Join Date
    Sep 2007
    Posts
    1,012
    Quote Originally Posted by vivekraj View Post
    Your understanding is wrong. Actually, the size of char is 8 bits, which is 1 byte, and int is 4 bytes. When you assign an integer value to a character, it converts the integer to the corresponding character value. In your example, you gave 71 to the character, and the ASCII character for 71 is G.

    To understand it better, assign values larger than 255.

    If you assign 305 to a char, the value printed will be 1, because the char reaches its maximum of 255 and then wraps around into the next cycle, so the stored value is 305 - 256 = 49, which is the ASCII code for '1'. In your code, explicitly cast the pointer in the assignment statement to avoid warnings:

    c = (char *) a;
    First, a char is a byte, but a byte is not necessarily 8 bits. An int is not necessarily 4 bytes. A char must be at least 8 bits, and an int must be at least 16 bits; these are the only size requirements for char and int.

    The original code is not about converting an int to a char. It's about pointing a char* to an int. When you point a char* to any object, you're able to examine the object byte by byte. So in the example, c points to the first byte of an int. The original poster had it right when he edited the post: it's all about the endians.

    On a little-endian system, the least significant byte comes first; on a big-endian system, the most significant byte comes first. The output will differ depending on how a particular system represents integers. The code is simply not about conversion between different-sized integer types; it's about the representation of values.
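
    As a concrete illustration, a minimal sketch that inspects the first byte of a known value to tell the two byte orders apart:

    Code:
    #include <stdio.h>

    int main(void)
    {
        unsigned int x = 1;
        unsigned char *p = (unsigned char *)&x;

        /* On a little-endian system the least significant byte
           is stored first, so *p is 1; on big endian it is 0. */
        if (*p == 1)
            printf("little endian\n");
        else
            printf("big endian\n");

        return 0;
    }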
