I did some searching on this forum in reference to endianness, and some of the posts are quite entertaining (to quote brewbuck: of thigh-slapping hilarity).
Anyway, the point made in a few of them is that endianness is not about left or right, but about least significant and most significant. I'll buy that.
But who forgot to tell the people who defined the bit shift operators? They are defined as shifting left and shifting right, not toward significance or away from significance (whatever that would actually mean in English).
My Mac is an Intel, and I have determined it is little-endian. I am processing data from an IBM mainframe, and that data is big-endian. Writing the conversions for integers, I have figured this all out - no big deal in the end (no pun intended).
But back to the bit shifting operators. It appears we get to think "big-endian" at all times when using the bit shift operators, and ignore the actual endianness of the machine. Is this correct? (For bit shifting of integers, that is.)