I have a question regarding endianness.
I have this piece of code that I'm compiling on a Pentium 4 machine, and it doesn't behave like I expect it to.
Code:
void CREARECOMANDA(char adresaSlave, char tipComanda,
                   unsigned short adresaRegistru, unsigned short numarCuvinte)
{
    unsigned short crc;

    comanda[0] = adresaSlave;
    comanda[1] = tipComanda;
    comanda[2] = adresaRegistru >> 8;  /* high byte first */
    comanda[3] = adresaRegistru;       /* low byte (implicit truncation) */
    comanda[4] = numarCuvinte >> 8;    /* high byte first */
    comanda[5] = numarCuvinte;         /* low byte */
    crc = CRC16(comanda, 6);
    comanda[6] = crc;                  /* low byte first this time */
    comanda[7] = crc >> 8;             /* high byte */
}
"comanda" is an array of unsigned char. The thing is, from what I've read on the internet, Intel processors use little-endian byte order. So placing foo = 0x0A0B into two one-byte values would be
Code:
bar[0] = foo & 255;        /* low byte first */
bar[1] = (foo >> 8) & 255; /* high byte second */
But the code I just wrote does it the other way around. The funny thing is it doesn't do so for all the values: "crc" is of type unsigned short, and "CRC16" returns unsigned short, yet its bytes get split low-byte-first, unlike the unsigned shorts before it. I can't figure out why the code behaves like it does. What am I missing?
P.S. I'm using LabWindows/CVI as IDE.