This may differ from OS to OS, since the hi-res timer is kernel based; e.g., gettimeofday does not exist on my system. I would use time() as the yardstick anyway, since it is a standard function that should track the hardware clock.
Anyway, I've done this test before too:
Code:
#include <stdio.h>
#include <sys/timex.h>
#include <time.h>

int main() {
    struct ntptimeval now;
    long start, umark, dif;
    time_t RTCstart = time(NULL), RTCnow, RTClast = RTCstart;

    ntp_gettime(&now);
    start = now.time.tv_sec;
    umark = now.time.tv_usec;

    while (1) {
        ntp_gettime(&now);
        RTCnow = time(NULL);
        if (RTCnow != RTClast) {             /* RTC ticked over a second */
            dif = now.time.tv_usec - umark;  /* hi-res drift since last tick */
            printf("%ld seconds (RTC: %d) %ld/1000000\n",
                   now.time.tv_sec - start, (int)(RTCnow - RTCstart), dif);
            umark = now.time.tv_usec;
        }
        RTClast = RTCnow;
    }
    return 0;
}
NB: the timer behind <sys/timex.h> is the same one used by nanosleep.
0 seconds (RTC: 0) 4/1000000
0 seconds (RTC: 1) 328752/1000000
1 seconds (RTC: 2) 0/1000000
2 seconds (RTC: 3) 1/1000000
4 seconds (RTC: 4) -999999/1000000
4 seconds (RTC: 5) 999999/1000000
5 seconds (RTC: 6) 0/1000000
6 seconds (RTC: 7) -1/1000000
7 seconds (RTC: 8) 1/1000000
8 seconds (RTC: 9) 0/1000000
9 seconds (RTC: 10) 0/1000000
10 seconds (RTC: 11) -1/1000000
11 seconds (RTC: 12) 0/1000000
12 seconds (RTC: 13) 1/1000000
Once aligned, this shows a maximum discrepancy of 1 microsecond (the ±999999 entries are just tv_usec wrapping across a second boundary, not real drift).
My general point is that the hi-res timer is based on system ticks, so it reflects the limit of resolution you will achieve.