# Thread: calculation timer problem

1. calculation timer problem

Hello,
I am trying to write a program that determines how many calculations it would take to find the determinant of a matrix, and how long it would take to do those calculations. I keep getting 0 seconds for an output. Can anybody see where I may have gone wrong?
Code:
#include <stdio.h>
#include <time.h>

/* Multiplications in a cofactor expansion of an n x n determinant */
double mults_for_det(int n)
{
    if (n == 1)
        return 0;
    else
        return n + n * mults_for_det(n - 1);
}

int main(void)
{
    int n;
    double mults;
    clock_t start, end;

    printf("Enter a number for an nxn matrix.\n");
    scanf("%d", &n);
    start = clock();
    mults = mults_for_det(n);
    end = clock();
    printf("For n = %d, the %g multiplications take %g seconds.\n", n,
           mults, (double)(end - start) / (double)CLOCKS_PER_SEC);
    return 0;
}

2. Print out CLOCKS_PER_SEC. If it's only 1000 (or even 1000000), there isn't enough precision to measure a single execution of your subroutine. Put the call in a loop and execute it a million times, then divide the clock difference by CLOCKS_PER_SEC and by 1000000 to get the time per call.

Also, you need to change the input of your subroutine to double.

3. Oogabooga's solution works. In addition, if you're on x86, you could always use the timestamp counter:
Code:
/* "=A" loads the edx:eax pair into a 64-bit value; 32-bit x86 only */
#define rdtscl(val) asm volatile ("rdtsc" : "=A" (val) : : );
/* ... */
unsigned long long t1, t2;
rdtscl(t1);
mults = mults_for_det(n);
rdtscl(t2);

printf("Cycles = %llu\n", t2 - t1);
But note that the cycle count is not an accurate real-time measure; it's mainly useful for comparing functions when optimising. And don't expect accuracy finer than ~20 cycles, because many factors can influence the readings.

You can *estimate* how long it takes by assuming that 1 cycle is approximately (1/c) nanoseconds, where c is your CPU frequency in GHz. For example, on a 3.06 GHz machine, a cycle is approximately 0.33 nanoseconds.
