Thread: Minute Delay Loop

  1. #16
    Registered User
    Join Date
    Dec 2006
    Location
    Canada
    Posts
    3,229
    Also, think about what will happen if the user suspends the computer while your code is sleeping, then wakes it up 2 days later. What if the user has the OS set to automatically sync network time, and the system time actually jumps backward? Make sure your code won't crash or hang in those situations (think about calculations overflowing, wrapping around, etc.).

  2. #17
    Registered User
    Join Date
    Feb 2010
    Posts
    98
    Interesting stuff. I never knew that. It explains things.

    Currently, this particular loop is inside a Thread doing its own task each minute. I'm not sure of the impact of Threads on Sleep, but with all that's been said, I'm not much impressed with Sleep. Of course, since my app is Win32, I could always do a SetTimer and transfer the code to WM_TIMER. Do you know how accurate the cycles are with SetTimer? Does it have the same problem as Sleep?

  3. #18
    Registered User
    Join Date
    Dec 2006
    Location
    Canada
    Posts
    3,229
    All code runs in threads. The "main" code runs in an implicit main thread. Sleeping in the main thread is no different from sleeping in any other thread.

    SetTimer will have exactly the same problem. There is absolutely no way to guarantee maximum delay in user mode (as opposed to kernel mode) on Windows.

    Even if your thread doesn't sleep at all, Windows will still suspend it for random amounts of time regularly to run other things.

    Imagine if such a guarantee were possible - what would happen if someone ran 2 instances of your program, each iteration of the loop took 5 seconds, and you only had 1 CPU? Which one should run at 00?

  4. #19
    Registered User
    Join Date
    Feb 2010
    Posts
    98
    Well, that's not good news. Evidently Windows really sucks then.
    I'm going to have to do a hack then. I'll just Sleep 55 seconds (55 - elapsed) and have a second loop that Sleeps 1/2 second at a time until 00.

    Thanks everyone.

  5. #20
    Registered User
    Join Date
    Dec 2006
    Location
    Canada
    Posts
    3,229
    There's nothing sucky about this. This is just how non-real-time multithreading systems work. It's computationally impossible to give a maximum-delay guarantee to an unlimited number of arbitrary threads, unless you have an infinite number of CPUs and an infinitely fast scheduler.

    I'm going to have to do a hack then. I'll just Sleep 55 seconds (55 - elapsed) and have a second loop to Sleep 1/2 seconds until 00.
    If you really understand the issue here, you'll see that that doesn't help at all, and will probably make it worse in practice.

  6. #21
    Registered User
    Join Date
    Feb 2010
    Posts
    98
    Got any suggestions then? I could use them.

    And by the way, your original code example almost copies my original code.
    It should not be getting 59 or 01 (not to mention 58).
    I mean 00.001-00.999 is a big target for a computer.

    Words like "unlimited" and "infinite" would imply I'm asking my computer to do something NASA related.
    I just want 00 seconds each minute, that's a simple request from a computer. Right?

  7. #22
    Registered User
    Join Date
    Dec 2006
    Location
    Canada
    Posts
    3,229
    And by the way, your original code example almost copies my original code.
    It should not be getting 59 or 01 (not to mention 58).
    I mean 00.001-00.999 is a big target for a computer.
    I don't have time to write out and test the code, so I have no idea.

    I just want 00 seconds each minute, that's a simple request from a computer. Right?
    Unless you are running 1000 instances of your program. If Windows guarantees something, it must be able to do it regardless of the load scenario, and it can't. That's why it's not guaranteed.

  8. #23
    Registered User
    Join Date
    Dec 2006
    Location
    Canada
    Posts
    3,229
    By the way, you are aware that "clock()" may not return wall time?

    On some implementations, it returns CPU time, which is usually less than wall time.

  9. #24
    Registered User
    Join Date
    Feb 2010
    Posts
    98
    Running 1 instance loses 2 seconds every 10 minutes.
    What explains that in my code?
    I'm confused.
    All of the explanations so far would only explain millisecond differences.
    You yourself said 99.9% accuracy should be normal.
    Well, I want that 99.9%.

    Someone please help.
    Thanks.

  10. #25
    Registered User
    Join Date
    Dec 2006
    Location
    Canada
    Posts
    3,229
    Upon further inspection of your program, no, my code is not the same as your code.

    The critical difference is, you only sync to 00 at the beginning, then just add 1 minute at a time. Mine syncs to 00 every minute.

    In your approach, imagine there's a little bit of error, and your loop actually takes 59.9 seconds. That won't be a problem for a few iterations, but the errors will add up, and eventually you'll get huge drift.

    Synchronizing on every iteration is the answer, as in my code (note that I didn't wait for 60 - elapsed_time, I waited for time_till_next_00 - that is the critical difference; this way the error gets zeroed every iteration).

  11. #26
    Registered User
    Join Date
    Feb 2010
    Posts
    98
    Hmm, I knew it had to do with math. Okay. I get it.
    Thanks, very very much.

  12. #27
    C++まいる!Cをこわせ!
    Join Date
    Oct 2007
    Location
    Inside my computer
    Posts
    24,654
    Here's the thing. Any operating system has hundreds of threads to run regularly. To run them all on 2/4/8 processors, it must suspend one thread and execute another. This is called a context switch.
    To drive this, operating systems use a timer interrupt that says "x ms have passed." Interrupts are expensive, because the state of the CPU must be saved so the interrupted thread can be resumed later. This is exactly why Windows won't wake you after exactly n milliseconds: it only checks for expired timers at these regular intervals, whenever it happens to have entered the kernel.
    This is the same for all timers, whether Sleep or SetTimer: firing interrupts more often would reduce the performance of the entire computer, not just your process.

    So the way to actually solve such problems, where a delay of m extra milliseconds is not a disaster, is simply to synchronize or compensate. On every loop, you check how many milliseconds you actually slept versus how many you wanted. On the next sleep iteration, you reduce the sleeping time to compensate for the fact that you overslept last time.
    There is no way around this unless you have a real-time operating system.
    This is not specific to Windows.

  13. #28
    Registered User
    Join Date
    Feb 2010
    Posts
    98
    Different method, same problem.
    I lose 2 seconds every 10 minutes.
    But with this method, it catches the drift and then re-syncs.

    Code:
    #include <windows.h>
    #include <time.h>
    #include <iostream> // cout
    
    using namespace std;
    
    int main() {
        SYSTEMTIME st;
        time_t tSecs;
        int Secs, iStall = 1;
    
        while (true) {
            // Sleep until the next :00 boundary.
            GetLocalTime(&st);
            Secs = 60000 - ((int)st.wSecond * 1000 + (int)st.wMilliseconds);
            Sleep(Secs);
    
            // Print the computed delay and the actual wake-up time.
            tSecs = time(0);
            char timestamp[20];
            struct tm pSecs;
            localtime_s(&pSecs, &tSecs);
            strftime(timestamp, 20, "%X", &pSecs);
            cout << Secs << "\t= " << timestamp << "\n";
    
            // Simulated workload of varying length (0 to ~3 seconds).
            Sleep(753 * (iStall % 5));
            iStall++;
        }
        return 0;
    }
    [Attached screenshot: the program's output, 2011-12-08]

  14. #29
    Registered User
    Join Date
    Feb 2010
    Posts
    98
    Anyone care to run this on their computer?
    I'm suspicious - these results make me think of a computer virus or something else weird.
    I exited everything and re-ran it and got the same thing.
    If someone could run the test, I would be very grateful.

  15. #30
    Registered User
    Join Date
    Dec 2006
    Location
    Canada
    Posts
    3,229
    It looks like your system is adjusting time backward by a few seconds every 10 minutes.

    Did you set your computer up to sync network time? It's possible that your computer clock is fast, and it's syncing by moving back a few seconds every 10 minutes (they do this because changing time backwards by a big leap may break programs).

    You can try writing a function, sleep_until(time), that keeps sleeping the difference between current time and desired time (in a loop), until the desired time is reached.

