Thread: Time elapsed

  1. #1
    Registered User
    Join Date
    Sep 2007
    Posts
    25

    Time elapsed

    Hey guys, I was wondering if anyone could help me with a quick question.
    I need to find a way to print the time it took for a program to run in seconds. I don't have much experience with time, but here's what I have so far:
    Code:
    void printSolution(int, time_t, time_t);
    
    ...
    
    int main() {
       int numrows = 0, numcols = 0;
       time_t time1, time2;
    
       time(&time1);
       getInput(&numrows, &numcols);
       time(&time2);
       printSolution(numcols, time1, time2);
    }
    
    ...
    
    void printSolution(int numcols, time_t time1, time_t time2) {
       int i;
    
       printf("Solution Vector: ");
       for (i=0; i<numcols; i++) printf("%d", solution[i]);
       printf("\n");
       printf("Time Taken: %d\n", difftime(time1, time2));
    }
    I included <time.h> at the top of the file, and the output I currently get when I run it is:

    "Solution Vector: 000000
    Time Taken: 1195078298"

    I can't quite seem to figure out what this is currently doing. Can anyone help me out?

  2. #2
    Officially An Architect brewbuck's Avatar
    Join Date
    Mar 2007
    Location
    Portland, OR
    Posts
    7,396
    difftime() returns double, so you should print using %f, not %d.
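    Something like this (untested sketch; the work being timed is elided, and note that difftime() wants the later time as its first argument):
    Code:
    #include <stdio.h>
    #include <time.h>
    
    int main(void) {
       time_t start, end;
    
       time(&start);
       /* ... work being timed ... */
       time(&end);
    
       /* difftime() returns seconds as a double; later time first */
       printf("Time Taken: %f seconds\n", difftime(end, start));
       return 0;
    }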

  3. #3
    Registered User
    Join Date
    Sep 2007
    Posts
    25
    Ahha, I knew it was something simple..

    Now, is there a way to have it go into milliseconds or anything lower than seconds? The function runs so fast it usually just says 0...

  4. #4
    Officially An Architect brewbuck's Avatar
    Join Date
    Mar 2007
    Location
    Portland, OR
    Posts
    7,396
    Quote Originally Posted by TitoMB345 View Post
    Ahha, I knew it was something simple..

    Now, is there a way to have it go into milliseconds or anything lower than seconds? The function runs so fast it usually just says 0...
    Try clock(), but there are caveats there too. Google should have some good info.

  5. #5
    Registered User
    Join Date
    Sep 2007
    Posts
    25
    Okay, so my quick exploration into clock() sent me on this path:
    Code:
    clock_t start, end;
    
    ...
    
    int main() {
       int numrows = 0, numcols = 0;
    
       start = clock();
    
       getInput(&numrows, &numcols);
       analyzeInput(numrows, numcols);
    }
    
    ...
    
    void printSolution(int numcols) {
       int i;
       double time_elapsed;
    
       end = clock();
       printf("start = &#37;f, end = %f\n", start, end);
       time_elapsed = ((double) (end-start))/CLOCKS_PER_SEC;
       printf("Solution Vector: ");
       for (i=0; i<numcols;i++) printf("%d", solution[i]);
       printf("\nTime Taken: %.9f seconds\n", time_elapsed);
       exit(0);
    }
    Again, the result is:

    "start = 0.000000, end = 0.000000
    Solution Vector: 101
    Time Taken: 0.000000000 seconds"

    Any ideas?
    Last edited by TitoMB345; 11-14-2007 at 10:36 PM. Reason: misprint..

  6. #6
    Frequently Quite Prolix dwks's Avatar
    Join Date
    Apr 2005
    Location
    Canada
    Posts
    8,057
    difftime() is more portable than doing the calculation yourself.

    How long does the program actually take to execute? If it only takes half of a second or something, then 0.000000 is the right answer.

    If you need more accurate timing, execute the program a few times in succession or see:
    http://cboard.cprogramming.com/showthread.php?t=88612
    http://cboard.cprogramming.com/showthread.php?t=90387
    dwk

    Seek and ye shall find. quaere et invenies.

    "Simplicity does not precede complexity, but follows it." -- Alan Perlis
    "Testing can only prove the presence of bugs, not their absence." -- Edsger Dijkstra
    "The only real mistake is the one from which we learn nothing." -- John Powell


    Other boards: DaniWeb, TPS
    Unofficial Wiki FAQ: cpwiki.sf.net

    My website: http://dwks.theprogrammingsite.com/
    Projects: codeform, xuni, atlantis, nort, etc.

  7. #7
    Officially An Architect brewbuck's Avatar
    Join Date
    Mar 2007
    Location
    Portland, OR
    Posts
    7,396
    Quote Originally Posted by dwks View Post
    difftime() is more portable than doing the calculation yourself.
    difftime() takes a time_t, not a clock_t. So I'm not sure I'd pass clock_t values to difftime().

  8. #8
    Officially An Architect brewbuck's Avatar
    Join Date
    Mar 2007
    Location
    Portland, OR
    Posts
    7,396
    Well, if your process takes less than 1 microsecond, clock() won't measure it correctly. Also, note that just because CLOCKS_PER_SEC is 1 million (for instance), it does NOT mean the actual timer resolution is that high. That's just the unit the result is reported in.

    Also, don't try to test clock() by inserting a call to sleep() or some other form of non-busy waiting. The clock does not tick while your process is sleeping, so that will never show anything.
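    If you're curious what the effective tick actually is on your implementation, a rough busy-waiting sketch like this should show it (plain standard C, untested):
    Code:
    #include <stdio.h>
    #include <time.h>
    
    int main(void) {
       clock_t first = clock();
       clock_t next;
    
       /* spin until clock() reports a different value */
       do {
          next = clock();
       } while (next == first);
    
       printf("CLOCKS_PER_SEC = %ld\n", (long) CLOCKS_PER_SEC);
       printf("smallest observed step = %f seconds\n",
              (double) (next - first) / CLOCKS_PER_SEC);
       return 0;
    }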

  9. #9
    Frequently Quite Prolix dwks's Avatar
    Join Date
    Apr 2005
    Location
    Canada
    Posts
    8,057
    Quote Originally Posted by brewbuck View Post
    Well, if your process takes less than 1 microsecond, clock() won't measure it correctly. Also, note that just because CLOCKS_PER_SEC is 1 million (for instance), it does NOT mean the actual timer resolution is that high. That's just the units it returns it to you.
    As a matter of fact, if your process takes less than 18.2 milliseconds (at least with my implementation of time.h), clock() will still return 0. That's the granularity of clock() on my system, I think anyway. See the first link I posted.

    You can get more accurate timing if you need it, but it's platform-dependent. Are you using Windows or Linux or what?
    dwk


  10. #10
    Registered User
    Join Date
    Sep 2007
    Posts
    25
    Well, I have it take the start time before the getInput() function runs, and getInput() waits for input from the keyboard, so I tried waiting a couple of seconds, and still, nothing came up. I'm not sure.

  11. #11
    Registered User
    Join Date
    Sep 2007
    Posts
    25
    Oh, and I'm running Windows, but the computer it has to compile on is SunOS I believe.. might be Linux, not 100% sure.

  12. #12
    Officially An Architect brewbuck's Avatar
    Join Date
    Mar 2007
    Location
    Portland, OR
    Posts
    7,396
    Quote Originally Posted by dwks View Post
    As a matter of fact, if your process takes less than 18.2 milliseconds (at least with my implementation of time.h), clock() will still return 0. That's the granularity of clock() on my system, I think anyway. See the first link I posted.
    18.2... I remember that in DOS the hardware timer liked to tick 18.2 times per second, which works out to about 55 milliseconds per tick. Are you crossing up your numbers somewhere? I find it unlikely that this 18.2 figure is pure coincidence.

  13. #13
    Registered User VirtualAce's Avatar
    Join Date
    Aug 2001
    Posts
    9,607
    On Windows you can use:

    • timeGetTime() (mmsystem.h)
    • GetTickCount() (Win32 API)
    • QueryPerformanceCounter() (Win32 API)


    timeGetTime() is good but it's not the best solution since it suffers from some issues I won't go into here.

    GetTickCount() is also good, but some have had trouble with it due to integer wraparound or something. I dunno, I don't use it.

    QueryPerformanceCounter() is the best and most accurate option. Once you find the period of this (via a Win32 API call - check the Platform SDK docs) you can compute very small time slices.
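    A bare-bones sketch of the QueryPerformanceCounter() route (the "period" call is QueryPerformanceFrequency(); Windows only, error checking omitted):
    Code:
    #include <stdio.h>
    #include <windows.h>
    
    int main(void) {
       LARGE_INTEGER freq, start, end;
    
       QueryPerformanceFrequency(&freq);   /* counter ticks per second */
    
       QueryPerformanceCounter(&start);
       /* ... work being timed ... */
       QueryPerformanceCounter(&end);
    
       printf("elapsed: %f seconds\n",
              (double) (end.QuadPart - start.QuadPart) / (double) freq.QuadPart);
       return 0;
    }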

  14. #14
    and the hat of int overfl Salem's Avatar
    Join Date
    Aug 2001
    Location
    The edge of the known universe
    Posts
    39,660
    > The clock does not tick when your process is sleeping, so that will never do anything.
    It also doesn't tick much while it's waiting for user input.

    > but the computer it has to compile on is SunOS I believe.. might be Linux
    There is a function called gettimeofday() which returns a struct containing seconds and microseconds. Again, the resolution of the microseconds field would need to be tested to see what it is on your implementation.
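    A minimal sketch of that, assuming the usual POSIX declaration in <sys/time.h> (error checking omitted):
    Code:
    #include <stdio.h>
    #include <sys/time.h>
    
    int main(void) {
       struct timeval t1, t2;
       double elapsed;
    
       gettimeofday(&t1, NULL);
       /* ... work being timed ... */
       gettimeofday(&t2, NULL);
    
       /* seconds plus microseconds, combined into a double */
       elapsed = (t2.tv_sec - t1.tv_sec) + (t2.tv_usec - t1.tv_usec) / 1e6;
       printf("elapsed: %f seconds\n", elapsed);
       return 0;
    }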

    > Now, is there a way to have it go into milliseconds or anything lower than seconds?
    Without resorting to various platform specific API calls, do something like this
    Code:
    t1 = clock();
    for ( i = 0 ; i < 1000 ; i++ ) doWork();
    t2 = clock();
    // average time for 1 iteration is ((double)(t2 - t1) / CLOCKS_PER_SEC) / 1000 seconds,
    // i.e. 1000 times finer resolution than timing a single call
    If you dance barefoot on the broken glass of undefined behaviour, you've got to expect the occasional cut.
    If at first you don't succeed, try writing your phone number on the exam paper.
