Currently I'm studying from 'The C Programming Language' and it asks you to modify some code and then compare the run times. What is the best way to do this?
Code:
#include <windows.h>
#include <stdio.h>

unsigned long start = GetTickCount();
// do whatever it is in here
printf("That took %lu milliseconds\n", GetTickCount() - start);
Thanks for the reply. Any idea how I could do it on OS X? :P
gettimeofday: can be pretty accurate.
clock: portable.
Anything sub-second needs a bit of care to make sure it is done well.
The accuracy and precision of any fast clock you can get at can vary from one machine to another.
Just in case anyone else digs up this thread in the future, here is some example code showing how to use clock():
Code:
#include <stdio.h>
#include <time.h>
#include <math.h>

int main(void)
{
    clock_t start = clock();

    for (long i = 0; i < 100000000; ++i)
        exp(log((double)i));

    clock_t finish = clock();

    printf("It took %ld seconds to execute the for loop.\n",
           (long)((finish - start) / CLOCKS_PER_SEC));

    return 0;
}

outputs:

Code:
It took 23 seconds to execute the for loop.
+1 for using the standard clock() function. Good job, BIOS.
If you're trying to measure something short, like a single statement, you should subtract off the loop overhead, as it can be a significant part of the execution time.
Run the same time measurement on the same loop empty, then subtract that from the timing of the loop with the code under test in it.
You might as well convert the result to time per single pass as well.
If the loop is empty, your compiler may throw it out completely.
Quzah.
Well it does depend on the code under test.
But I was also including the loop counter increment and test. That's also done 100 million times.