Okay... I'm kind of confused...

I am sitting here at work, kind of bored. I have finished all my work, so I decided to program for a while just for fun.

Anyway, as most of us know, on a big-endian machine the most significant byte is stored at the lowest address, so the 32-bit integer 0xFFFF would be laid out in memory like this:

00 00 FF FF

On a little-endian machine, the least significant byte is stored at the lowest address, so that same number would be stored as:

FF FF 00 00
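
For what it's worth, here's a quick sketch of how the storage order could be checked directly (just an untested sketch, and it assumes unsigned int is 32 bits): store the value, then walk its bytes with an unsigned char pointer.

[code]
#include <stdio.h>

int main(void)
{
    unsigned int n = 0xFFFF;   /* assuming unsigned int is 32 bits */
    unsigned char *p = (unsigned char *)&n;
    size_t i;

    /* print each byte from the lowest address to the highest */
    for (i = 0; i < sizeof n; i++)
        printf("%02X ", (unsigned)p[i]);
    printf("\n");   /* on little endian this should print: FF FF 00 00 */

    return 0;
}
[/code]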

Well, my machine here at work is a Win2k box running on an Intel processor, so it's little endian, because Intel processors are little endian. Or at least according to this website they are: http://www.netrino.com/Publications/...ndianness.html

What confuses me is this: I output the unsigned integer 0xFFFF bit by bit from left to right, and it came out like this:

00000000 00000000 11111111 11111111

All I did was bit-shift and output in a loop (a rough sketch of the loop is below). Well, since Intel machines are little endian, isn't that backwards? Shouldn't it be outputting:

11111111 11111111 00000000 00000000

Or am I just not thinking straight right now...?
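
In case it matters, here's roughly what my loop looked like (reconstructed from memory, so treat it as a sketch; it assumes unsigned int is 32 bits):

[code]
#include <stdio.h>

int main(void)
{
    unsigned int n = 0xFFFF;
    int i;

    /* shift each bit down to position 0 and mask it off,
       printing from the most significant bit to the least */
    for (i = 31; i >= 0; i--) {
        printf("%u", (n >> i) & 1u);
        if (i % 8 == 0)
            printf(" ");   /* space between bytes for readability */
    }
    printf("\n");   /* prints: 00000000 00000000 11111111 11111111 */

    return 0;
}
[/code]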

[edit]
Maybe this should have gone on the tech board... I dunno, I just posted it here. If it belongs on the tech board, feel free to move it there.
[/edit]