When I measure the time of a bunch of statements in my C program the following way, I get 0.000000 sec.
Is the result in seconds? And how can I get the time at higher precision?
printf("\nTime taken : %fs",(double)t/CLOCKS_PER_SEC);
Yes, the result is in seconds (assuming t is obtained from clock()). clock() returns ticks, and CLOCKS_PER_SEC tells you how many ticks make up one second, so you can work out how many ticks correspond to a millisecond and so on.
printf("\nTime taken : %f",(double)t/(CLOCKS_PER_SEC/1000));
That gives you milliseconds; it's just basic arithmetic.
And %f prints a double (a float passed to printf is promoted to double anyway). The trailing s in "%fs" is not part of the conversion specifier; it's just a literal character printed after the number, so it's fine as a unit label.
We can't answer that; we have no idea how you are obtaining the value in t.
Show more code.
Actually my program consists of a few functions, and I did it like this:
I want to print the time at any precision, but it always shows as 0.000000! Please help. Also, my measured time includes the time taken to execute all the functions in main. So is my code correct?
t = clock();
// bunch of statements
t = clock() - t;
printf("\nTime taken : %f ms", (double)t*1000/CLOCKS_PER_SEC);
That's because it doesn't take a second to execute all that! It probably takes less than a millisecond!
The s after %f is just a literal character, so it prints fine; just make sure the label matches the unit: after multiplying by 1000 you are printing milliseconds, so label it ms, not s.
And divide t by the appropriate constant: one second = 1000 milliseconds, and one millisecond = 1000 microseconds.
So t / (CLOCKS_PER_SEC / 1000) gives milliseconds,
and t / (CLOCKS_PER_SEC / 1000 / 1000) gives microseconds.
If you're on Solaris or another UNIX that provides it, you can use gethrtime(), which gives you a high-resolution timestamp. Call gethrtime(), which returns a number of nanoseconds since some arbitrary point in the past. Then execute the statements you want to time and call gethrtime() again. Subtract the two values and you get the elapsed time in nanoseconds. Run "man gethrtime" on such a system; it even gives you an example of how to use it. :)