is there any "standard" code to measure how long it takes for a program to run? or can someone give me an example of how to time my program?
When no one helps you out. Call google();
Store the value of clock() at the start of your program, and then take the value of clock() at the end of your program. The difference divided by CLOCKS_PER_SEC is the time it took for your code to run in seconds.
Naturally I didn't feel inspired enough to read all the links for you, since I already slaved away for long hours under a blistering sun pressing the search button after typing four whole words! - Quzah
You. Fetch me my copy of the Wall Street Journal. You two, fight to the death - Stewie
The only standard method I know of is accurate only to the second. I would recommend you do a board search, as this has come up fairly regularly and has been discussed quite in depth.
If one-second precision is enough, you can use the time() function to get the time at the beginning of the program and again at the end; the difference is the number of seconds it took to run.
what if i want to time how fast a function gives me a result? vs another function that uses a different algorithm or does some extra work
Since both functions probably execute quite quickly, your best bet is to time how long it takes to run each function 1000 times, and compare those values. Just put the clock() statements before and after the loop.
Also consider profiling your code. That's exactly what profilers are there to do for you.