measuring time of execution

This is a discussion on measuring time of execution within the C Programming forums, part of the General Programming Boards category; Hi, I want to compare time of execution a recursive and nonrecursive program! I tried using function clock() but it ...

  1. #1
    Registered User
    Join Date
    Oct 2004
    Posts
    5

    measuring time of execution

    Hi, I want to compare the execution time of a recursive and a non-recursive program!
    I tried using the clock() function, but it seems my program runs so fast that I always get the result 0.00000.
    I searched the board but didn't find an answer on how to measure very fast executions.
    I wonder if this is a correct way to solve it:

    Code:
    #include <stdio.h> 
    #include <time.h> 
    
    #define MAX 1000000 
    
    long fact(int);
    long fact_rec(int);
    
    int main ( void )
    {
    	long res;
    	int i;
    	double duration;
    	clock_t start,end;
    
    	start=clock();
    	for(i=0;i<MAX;i++)
    		res=fact(10);
    	
    	end=clock();
    
    	/* total seconds for the whole loop; numerically this equals the average
    	   per-call time in microseconds, because MAX == 1e6 and the *1e6 and
    	   /MAX factors cancel */
    	duration=(double)(end-start)/(double)(CLOCKS_PER_SEC);
    	printf("Duration is %lf us!\n",duration);
    	return 0;
    }
    
    
    long fact(int n)
    {
    	int i;
    	long res=1;
    	if(n<=0)
    		return 1;
    	for(i=2;i<=n;i++)	
    		res*=i;
    	return res;
    }
    
    long fact_rec(int n)
    {
    	if(n<=0)
    		return 1;
    	return n*fact_rec(n-1);
    }
    I want to display the time in microseconds!
    I figure that if (double)(end-start)/(double)(CLOCKS_PER_SEC) is the time in seconds, all I need to do is multiply by 1000000, and because of the loop I need to divide by 1000000 to get the average time. I get these results:

    fact(10) => duration 0.45 us
    fact_rec(10) => duration 1.842 us.

    Is this a correct way of measuring time?
    Interestingly, when I run the recursive function fact_rec, I wait 2-3 seconds for the program to finish, yet the displayed result is 1.842 us, i.e. 1.842 millionths of a second, while the program takes at least 3 seconds to complete.
    Why is that happening?

    Thanks!
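    The unit conversion described in the post above can be made explicit. Here is a minimal sketch (the work in the loop is a stand-in, not the poster's fact(); `MAX` and the clock() calls follow the original code):

    ```c
    #include <assert.h>
    #include <stdio.h>
    #include <time.h>

    #define MAX 1000000  /* number of repetitions, as in the original post */

    int main(void)
    {
        volatile unsigned long res = 0;  /* volatile discourages the compiler from deleting the loop */
        clock_t start = clock();
        for (int i = 0; i < MAX; i++)
            res += i;                    /* stand-in for the code being timed */
        clock_t end = clock();

        double total_s = (double)(end - start) / CLOCKS_PER_SEC; /* whole loop, seconds */
        double per_call_us = total_s * 1e6 / MAX;                /* average per iteration, microseconds */

        /* Because MAX == 1e6, the *1e6 and /MAX factors cancel, so total_s and
           per_call_us print the same number - which is why the original code,
           despite printing raw seconds, shows the correct per-call microseconds. */
        printf("Total: %f s, average: %f us per call\n", total_s, per_call_us);
        return 0;
    }
    ```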

  2. #2
    Registered User
    Join Date
    Jan 2002
    Location
    Vancouver
    Posts
    2,220
    http://www.wideman-one.com/gw/tech/d.../wintiming.htm

    edit: one way you could measure it is to loop the process several million times and time the whole thing, then divide by however many times you repeated it.
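    The repeat-and-average approach might look like this as a self-contained sketch (`work()` is a hypothetical stand-in for whatever function is being timed, and the volatile sink keeps the compiler from optimizing the calls away):

    ```c
    #include <assert.h>
    #include <stdio.h>
    #include <time.h>

    #define REPS 1000000L

    /* stand-in for the function under test */
    static long work(int n)
    {
        long r = 1;
        for (int i = 2; i <= n; i++)
            r *= i;
        return r;
    }

    int main(void)
    {
        volatile long sink = 0;          /* prevents the loop from being optimized out */
        clock_t start = clock();
        for (long i = 0; i < REPS; i++)
            sink += work(10);
        clock_t end = clock();

        /* total CPU seconds, scaled to microseconds, divided by the repeat count */
        double avg_us = (double)(end - start) / CLOCKS_PER_SEC * 1e6 / REPS;
        printf("average: %f us per call\n", avg_us);
        return 0;
    }
    ```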
    Last edited by Brian; 12-01-2004 at 12:27 PM.

  3. #3
    Registered User
    Join Date
    Oct 2004
    Posts
    5
    Quote Originally Posted by Brian
    one way you could measure it is to loop the process several million times and time the whole thing, then divide by however many times you repeated it.
    I think I used that. The time is displayed in microseconds. My question is: is this the correct way, and maybe I should use a bigger number for the MAX constant? Is there any other reliable way to achieve this?

  4. #4
    Registered User
    Join Date
    Oct 2001
    Posts
    2,934
    I wait 2-3 seconds for the program to finish and then it displays 1.842 us, i.e. 1.842 millionths of a second, while the program takes at least 3 seconds.
    Why is that happening?
    Well, you're calculating how long it takes to do one fact_rec(), but you're still calling it 1,000,000 times. So you should be waiting 1.842 seconds for it to finish. Why it would take 3 seconds I don't know.

  5. #5
    VOX
    Join Date
    Oct 2004
    Location
    VA
    Posts
    94
    I don't know exactly how you would do this, but another idea is to get the time at program start and the time at program end; subtract start from end for the program's run time.
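    That suggestion can be done with the standard time() function, though its one-second resolution only helps for long-running programs. A minimal sketch:

    ```c
    #include <assert.h>
    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        time_t begin = time(NULL);       /* wall-clock time at program start */

        /* ... the actual work of the program goes here ... */

        time_t finish = time(NULL);      /* wall-clock time at program end */

        /* difftime() returns the elapsed interval in seconds as a double */
        printf("ran for about %.0f s\n", difftime(finish, begin));
        return 0;
    }
    ```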

  6. #6
    and the hat of wrongness
    Salem
    Join Date
    Aug 2001
    Location
    The edge of the known universe
    Posts
    32,688
    For those using gcc on some kind of Pentium processor, see the "optimizing bubble sort" thread.

    For those using Windows operating systems:
    http://msdn.microsoft.com/library/de...ncecounter.asp

  7. #7
    .
    Join Date
    Nov 2003
    Posts
    307
    If you run on Linux or another multi-processing system, your process gets pre-empted by the OS for other processes, so the elapsed (wall-clock) time is longer than the actual running time.

    If you're on unix, try
    Code:
    time <programname>
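    Beyond the shell's `time`, POSIX systems also provide a nanosecond-resolution timer via clock_gettime(), which is not mentioned in the thread but fits the question well. A sketch, assuming a POSIX.1b-compliant system:

    ```c
    #define _POSIX_C_SOURCE 199309L  /* expose clock_gettime under strict C modes */
    #include <assert.h>
    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);   /* monotonic clock, immune to clock adjustments */

        volatile long res = 1;
        for (int i = 2; i <= 10; i++)
            res *= i;                          /* the code being timed: 10! */

        clock_gettime(CLOCK_MONOTONIC, &t1);

        /* elapsed nanoseconds between the two readings */
        double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (t1.tv_nsec - t0.tv_nsec);
        printf("elapsed: %.0f ns\n", ns);
        return 0;
    }
    ```

    Even with this resolution, timing a single call is noisy; combining it with the repeat-and-average idea from earlier in the thread gives more stable numbers.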
