I am designing a program in Xcode on my Mac, written in C. The program needs to receive power usage data from a wireless node and insert that data into my database. The connection uses RS-232 serial communication. Every 5 seconds a node sends 2 bytes, which my program must check and combine to extract the power usage data. The bytes are organized as follows:

Byte:  X N P _ _ _ _ _
Bit #: 7 6 5 4 3 2 1 0

X - not used
N - identifies 1 of 2 nodes
P - identifies upper or lower byte, 1 = upper and 0 = lower
The remaining bits are combined to get the power usage data.

Power usage = bits 0 through 4 of the upper byte combined with bits 0 through 4 of the lower byte (the upper byte's bits form the high half of the value).
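
To make the layout concrete, here is a small stand-alone example of that combination, with the upper byte's data bits as the high 5 bits and the lower byte's data bits as the low 5 bits. The sample bytes 0x75 and 0x4A are made up purely for illustration:

Code:
#include <stdio.h>
#include <stdint.h>

/* Combine the low 5 bits of the upper and lower bytes into one raw value. */
static int combine(uint8_t upper, uint8_t lower)
{
    return ((upper & 0x1f) << 5) | (lower & 0x1f);
}

int main(void)
{
    uint8_t upper = 0x75;  /* 0111 0101: N = 1, P = 1 (upper), data = 10101 */
    uint8_t lower = 0x4A;  /* 0100 1010: N = 1, P = 0 (lower), data = 01010 */

    /* 10101 01010 in binary = 682 */
    printf("raw usage = %d\n", combine(upper, lower));
    return 0;
}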

I am receiving the bytes, but my code is not converting the power usage data correctly. I am working with an engineering team who designed and built the nodes. They are developing everything on Windows and have a similar test program written in C#. Their communication works fine, and I modeled my byte conversion after theirs (not the syntax, but the bitwise operations). However, my program only outputs 0.0 for the power usage data.

Can anyone see where I am going wrong here? Thanks in advance.

Serial Comm Connection Initialization

Code:
uint8 chout; // serial input data
uint8 nodeBuff[2]; // array to hold node byte data
nodeBuff[0] = 0x00;
nodeBuff[1] = 0x00;
int mainfd = 0; // File descriptor
int nodeNum; // keeps track of which node data refers to
int usageData; // holds raw power usage data
double powerData; // holds power usage data to be sent to database
struct termios options;

mainfd = open_port(); // Get file descriptor for port
fcntl(mainfd, F_SETFL, FNDELAY); // Put the port into non-blocking read mode
tcgetattr(mainfd, &options); // Get the current options for the port
	
// Set the baud rates to 9600
cfsetispeed(&options, B9600);
cfsetospeed(&options, B9600);
    
// Enable the receiver and set local mode
options.c_cflag |= (CLOCAL | CREAD);
options.c_cflag &= ~PARENB; // No parity
options.c_cflag &= ~CSTOPB; // 1 stop bit
options.c_cflag &= ~CSIZE;
options.c_cflag |=  CS8; // Select 8 data bits
options.c_cflag &= ~CRTSCTS; // Disable hardware flow control
	
// Enable data to be processed as raw input
options.c_lflag &= ~(ICANON | ECHO | ISIG);
	
// Set the new options for the port
tcsetattr(mainfd, TCSANOW, &options);
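One detail worth spelling out about the setup above: fcntl(mainfd, F_SETFL, FNDELAY) puts the descriptor into non-blocking mode, so a later read() returns immediately whether or not a byte has arrived. The snippet below is only a sketch of how such a read can be checked; the helper name read_one_byte is made up for illustration.

Code:
#include <errno.h>
#include <stdint.h>
#include <unistd.h>

/* Read one byte from a descriptor opened in non-blocking (FNDELAY) mode.
   Returns 1 if a byte was stored in *out, 0 if no data was available yet,
   and -1 on end-of-file or a real error. */
static int read_one_byte(int fd, uint8_t *out)
{
    ssize_t n = read(fd, out, 1);
    if (n == 1)
        return 1;                                   /* got a byte */
    if (n == -1 && (errno == EAGAIN || errno == EWOULDBLOCK))
        return 0;                                   /* nothing arrived yet */
    return -1;                                      /* EOF or error */
}
The read loop itself follows.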

Code:
while (1){
		
	read(mainfd, &chout, sizeof(chout)); // Read character from ABU
		
	if (chout != 0){
			
		// checks for upper byte flag
		if (((chout & 0x20) == 0x20)) {
			nodeBuff[0] = chout;
		} 
			
		// else it is the lower byte
		else{
			nodeBuff[1] = chout;
				
			// combine the upper and lower power usage data
			usageData = (nodeBuff[0] & 0x1f) << 5; // upper half
			usageData = usageData + (nodeBuff[1] & 0x1f); // lower half
			powerData = usageData * 1.83255;
				
			// check for node number
			if ((nodeBuff[0] & 0x40) == 0x40) {
				nodeNum = 1; // data is for node 1
				// add data to database
			}
			else {
				nodeNum = 0; // data is for node 0
				// add data to database
			}
		}
	}
		
	chout = 0; // reset serial input data variable
	usleep(10); // brief pause (usleep takes microseconds, so this is 10 us, not 10 ms)
	
}
I get both bytes, but the power usage always ends up 0.0... If this isn't enough information, I'll be happy to post more for you.
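
In case it helps, a quick way to see exactly what is arriving on the wire is to dump each received byte in hex before any masking is done. This helper is only a debugging sketch (the name dump_byte is made up), meant to be called right after the read() in the loop above:

Code:
#include <stdio.h>
#include <stdint.h>

/* Debugging aid: print a received byte in hex so the X, N, P and data
   bits can be checked by eye before any shifting or masking is done. */
static void dump_byte(uint8_t b)
{
    printf("received byte = 0x%02x\n", b);
}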

Thanks!