I'm trying to clamp the framerate of my breakout game to about 50 fps, and I've noticed a significant discrepancy between the hi-res microsecond timer and the normal whole-second timer.

I'm guessing this is because the low-res timer uses the mobo clock, and so is pretty exact, whereas the high-res timer relies on system ticks, which are more variable.

Can anyone confirm this experience? I'm calculating the frame rate crudely, in the drawing function:
Code:
        /* needs <stdio.h> and <time.h> */
        static int reps = -1;
        static time_t t = 0;
        time_t x;

        if (!(reps % 1000)) {                   /* every 1000th frame */
                if (t) {
                        x = time(NULL);
                        if (x > t)              /* guard against dividing by zero */
                                printf("%d frames per second\n",
                                       1000 / (int)(x - t));
                }
                t = time(NULL);
        }
        reps++;
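For comparison, the same counter could read the microsecond timer instead of time(). A minimal sketch of what I mean, assuming POSIX gettimeofday() and that this also sits in the drawing function (not my actual code):
Code:
        /* sketch only; needs <stdio.h> and <sys/time.h> */
        static int frames = 0;
        static struct timeval start = {0, 0};
        struct timeval now;
        double elapsed;

        if (start.tv_sec == 0)
                gettimeofday(&start, NULL);     /* first frame: record start */

        if (++frames == 1000) {
                gettimeofday(&now, NULL);
                elapsed = (now.tv_sec - start.tv_sec)
                        + (now.tv_usec - start.tv_usec) / 1e6;
                printf("%.1f frames per second (hi-res)\n", 1000.0 / elapsed);
                frames = 0;
                start = now;
        }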
The drawing function itself is called from a timer function that uses more static variables and the high-res timer: if less than 20000 usecs have passed since the last frame, the timer waits out the difference. Since the unrestrained frame rate was 200-300 fps, that pause should kick in on every frame, which ought to cap it at 50 fps.
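In rough outline, the pacing logic is something like this (a simplified sketch, assuming gettimeofday() and usleep(); the real code lives in the timer callback):
Code:
        /* sketch only; needs <sys/time.h> and <unistd.h> */
        static struct timeval last = {0, 0};
        struct timeval now;
        long elapsed;

        gettimeofday(&now, NULL);
        if (last.tv_sec) {
                elapsed = (now.tv_sec - last.tv_sec) * 1000000L
                        + (now.tv_usec - last.tv_usec);
                if (elapsed < 20000)
                        usleep(20000 - elapsed);   /* wait out the difference */
        }
        gettimeofday(&last, NULL);                 /* stamp for the next frame */
        /* ...then call the drawing function... */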

But the counter in the drawing function is giving me 80-90 fps.