I have this simple program below:
Code:
#include <stdio.h>
#include <limits.h>

unsigned int rightrot(unsigned int x, unsigned int n)
{
    /* calculate the number of bits in the type */
    size_t s = sizeof(x) * CHAR_BIT;
    size_t p;

    /* limit the shift count to the range 0 .. (s - 1) */
    if (n < s)
        p = n;
    else
        p = n % s;

    /* if either is zero, the original value is unchanged */
    if ((0 == x) || (0 == p))
        return x;

    return (x >> p) | (x << (s - p));
}

int main(void)
{
    unsigned int val;
    unsigned int pos;

    val = 0xFF94;
    pos = 5;
    printf("%u\n", rightrot(val, pos));
    return 0;
}
The result it prints is 2684356604 on my 32-bit computer. The result I expect is as follows:
0xFF94 is 0000000000000000 1111111110010100 in binary.
Shift it right by 5:
0000000000000000 0000011111111100
Then take that result and shift it left by 27 (s is 32 and p is 5, so the difference is 27):
1111111110000000 0000000000000000
Now we use bitwise or:
  0000000000000000 0000011111111100
| 1111111110000000 0000000000000000
-----------------------------------
  1111111110000000 0000011111111100
That in decimal is 4286580732. So how does the program come up with 2684356604?