Consider the following two looping statements:
a) while (counter <= MAX_COUNT) counter++;
b) while (counter++ <= MAX_COUNT);
When I set counter to 1 and MAX_COUNT to 4 billion (counter and MAX_COUNT are both of type unsigned long), statement b takes a full second longer than statement a to complete on my machine. Does anybody know why? I thought the two loops would have the same effect, but they obviously compile to different machine code. The reason I'm asking is that I'm writing a program with heavy iteration and I need to minimize running time. Thanks.