I am writing a data buffer that will handle data to be sent out over a socket stream. So, needless to say, it will be in big-endian format, since that is the network byte-order standard.
The following code puts this data in correctly:
Code:
buffer_add_int16(sendbuff, 0xBBCC);
If I look at the buffer byte by byte, the bytes are in the order 0xBB 0xCC, which is what I want.
The problem comes along when I try to read it out of the buffer. My system is little endian.
Here is my function for reading out a WORD-sized (16-bit) datatype:
Code:
#include <endian.h>     /* __BYTE_ORDER */
#include <arpa/inet.h>  /* ntohs() */

uint16_t buffer_get_int16(buffer_t *buffer)
{
    uint16_t ret;

    buffer_get_bytes(buffer, buffer->buffer_position, &ret, 2);
    buffer->buffer_position += 2;

    /* Swap only when the buffer's byte order differs from the host's. */
    if (__BYTE_ORDER == buffer->buffer_byte_order)
        return ret;
    else
        return ntohs(ret);
}
#include <string.h>  /* memcpy() */

void *buffer_get_bytes(buffer_t *buffer, uint16_t offset, void *data, uint16_t length)
{
    /* Copy raw bytes out of the buffer; no byte-order conversion happens here. */
    memcpy(data, buffer->buffer_data + offset, length);
    return data;
}
I have checked: __BYTE_ORDER == buffer->buffer_byte_order is false, so ret is sent through ntohs. But doing that seems to give me the value back in big endian, which is what I don't want.
Really, all I am asking is: does something happen under the hood that automatically converts the value to little endian when I pull it out of the buffer?