Greetings,
I have been having an issue with some code that uses 98% of the CPU even when nothing is happening.
I found a post
http://cboard.cprogramming.com/showt...highlight=idle
that describes the same issue; however, my program already calls Sleep whenever there is no data, and the CPU usage is still the same.
The program is eventually going to be an IRC-protocol plugin for an online game, but for right now it's just a standalone application, and finding a way to keep CPU usage low while no data is coming in would be great.
Note: I can't use the Windows message calls like PeekMessage and PumpMessage since the code will be ported later, so any other method that doesn't use the Windows message queue would be great. Besides, it's a console app, so the message queue won't do much good anyway =/.
Here is the current loop code.
Code:
while (1)
{
    if (newinput)
    {
        sprintf(buffer, "%s", input);
        send(cData.sock, buffer, strlen(buffer), 0);
        newinput = false;
        input = "";
    }

    // Get incoming data, one byte at a time, up to end-of-line
    for (i = 0; i < 512; i++) {
        err2 = recv(cData.sock, &ireadbuf[i], 1, 0);
        if ((err2 == 0) || (err2 == SOCKET_ERROR))
            break;  // closed / error / no data pending -- handled below
        if (ireadbuf[i] == '\n') {
            ireadbuf[i] = '\0';
            if ((message = irc->parse(ireadbuf)) != NULL) {
                if (message[strlen(message) - 1] == '\r')
                    message[strlen(message) - 1] = '\0';
                irc->ircOut(message);
            }
            break;
        }
    }

    if ((err2 == 0) || (err2 == SOCKET_ERROR)) {
        if (WSAGetLastError() != WSAEWOULDBLOCK) {
            irc->ircOut("**Connection closed.**\n");
            if (err2 == SOCKET_ERROR) {  // was 'err' -- assuming err2 was meant
                sprintf(buffer, "**Error %i\n", WSAGetLastError());
                irc->ircOut(buffer);
            }
            break;
        } else {
            Sleep(1);  // Idle (note: any partial line read so far is discarded)
            continue;
        }
    }
    Sleep(1);
}
There is an input thread running in the background processing input; however, even with it turned off, the CPU usage is the same.
The Full Messy Source can be found Here
Any advice or help is appreciated.
Also, if you see any eyesores, please point them out, even if it's the whole thing!
Thanks.