OK, I understand that a delay built on timeGetTime() will only be approximate, not exact. But if anything, it should be AT LEAST as long as the delay I ask for. My problem is:
I have one
printf("time is: %lu\n", timeGetTime());
statement before and one after executing the following code. (timeGetTime() returns a DWORD, so I'm printing it with %lu rather than %i.) The difference between the first and the second outputs is always much less than 1 second (usually between 500 and 600 ms)...
Code:
int Delay1sec(void)
{
    static unsigned long ultime = timeGetTime() + 1000;

    while (timeGetTime() < ultime)
    {
        Sleep(5);
    }

    ultime = timeGetTime() + 1000;
    return 0;
}
What am I doing wrong?