1. ## Typecasting

When we typecast a char to an int, do we end up with the lower-order bits or the higher-order ones?

2. Suppose char consists of 8 bits and int of 16 (for my own convenience). At its simplest, the cast will do the following (MSB -> LSB):
Code:
```
Char:      --------xxxxxxxx
Cast int:  00000000xxxxxxxx
```
The x's are don't-care terms; each can be either 0 or 1. (Strictly, the zero-fill shown holds for an unsigned char; a signed char is sign-extended instead.)

3. I am sorry, I meant the other way round: the final answer is in a char.
So can I assume that in this case the higher-order bits will be lost?

4. Yeah. Sorry, the mistake was mine. In that case the value is truncated from the bottom up: the low-order bits are kept and the high-order bits are discarded.

5. Code:
```
int arr[3] = {2, 3, 4};
char *p;
p = (int *)(p + 1);

printf("%d", *p);
```
Now *p has the value 0. Why?

6. Well, why not? p could have any value at all, given that you started with an uninitialized pointer and then incremented it by one.

7. OK,
```
p = arr; // missed that
```

8. Now if your question is "why do I get 0 now", you have to remember how pointer arithmetic works: p+1 moves over one byte (one char, since that's what p points to), which means it now points to the second byte of your first int. If your int is four bytes long (and it probably is), then no matter which endianness you've got, that second byte is assuredly zero.

9. OK. Let's assume int is 4 bytes. So int arr[3] occupies the same storage as char char_arr[12]: every 4 bytes in char_arr[12] represent one int of arr[3]. So p+1 means the second byte of char_arr, which is the second byte of the first int of arr.
The first int of arr is 2, that is 0...0 0...0 0...0 00000010. So p+1 refers to the second byte, which is 00000000. That is 0.
The (int *) cast you do makes no sense; it is useless in this context, because the addition has already happened on the char pointer.
Try
Code:
```
char *p = (char *)arr;
p = (char *)((int *)p + 1);
// or, more simply
int *tmp = (int *)p;
tmp = tmp + 1;
p = (char *)tmp;
```
EDIT: Compiled and ran it to be sure. Both my examples give 3, as expected. Note the difference in the parentheses between your example and mine: you do p+1 and THEN cast to int*, whereas I cast p first and then do +1.

10. Thanks, all of you.
Very grateful.