
Delay for DOS


  1. #1
    Registered User
    Join Date
    Jul 2011
    Posts
    14

    Delay for DOS

    Hi,
    I'd like to write a delay function for DOS (real DOS, not a Win32 console program). But the important thing is: the function shall not increase the processor load to 100 %. So anything like this:
    Code:
    void delay(int milliseconds)
    {
        clock_t start = clock();
    
        while ((clock() - start) * 1000 / CLOCKS_PER_SEC < milliseconds)
        {
            // Do nothing.
        }
    }
    is completely out of the question, because such a function pushes the processor load to 100 % (50 % on a dual core, 25 % on a quad core, etc.) for the duration of the loop, and I want to avoid that at all costs.
    So, I basically need something that does the same as the Sleep function in Windows (which does not make the processor jump to 100 %), only that I need a DOS version.

    Turbo C++ has a delay function, but it suffers from the mentioned drawback. And it has a sleep function that doesn't increase the processor load, but it can only wait whole seconds, not milliseconds. Also, I need the delay to work with all DOS C++ compilers and therefore cannot rely on a function that only comes with a specific compiler.

    How do I solve my problem? Can this be done with normal C or C++, or do I have to use some assembler code in the function? I'd really appreciate it if you could help me.
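    For what it's worth, the classic BIOS answer to this kind of question is interrupt 15h, function 86h, which asks the BIOS to wait a given number of microseconds while idling the CPU. The following is only a sketch under stated assumptions: it needs a 16-bit DOS compiler with `<dos.h>`, a BIOS (or VDM) that actually implements the service, and `delay_ms` is an illustrative name, not an existing function.

```c
/* Sketch: BIOS "wait" service, INT 15h AH=86h. CX:DX holds the
 * interval in microseconds; the call returns after it elapses
 * (carry flag set if the service is unsupported or busy).
 * Requires a 16-bit DOS compiler such as Turbo C++ or MSC. */
#include <dos.h>

void delay_ms(unsigned int milliseconds)
{
    unsigned long micros = (unsigned long)milliseconds * 1000UL;
    union REGS in, out;

    in.h.ah = 0x86;                         /* BIOS wait function    */
    in.x.cx = (unsigned)(micros >> 16);     /* high word of interval */
    in.x.dx = (unsigned)(micros & 0xFFFF);  /* low word of interval  */
    int86(0x15, &in, &out);
}
```

    Whether a given emulator or VDM honors this without burning CPU time is a separate question, as the rest of the thread shows.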

  2. #2
    and the hat of wrongness Salem's Avatar
    Join Date
    Aug 2001
    Location
    The edge of the known universe
    Posts
    32,761
    HLT
    So instead of doing nothing, you could do
    Code:
        while ((clock() - start) * 1000 / CLOCKS_PER_SEC < milliseconds)
        {
        asm hlt;   /* Borland/Turbo syntax; GCC would be asm("hlt"); */
        }
    It will be as accurate as the fastest external interrupt source on your hardware.

    Also, the asm syntax might vary from one compiler to another, but IIRC the "traditional" old DOS compilers all had similar syntax.
    See SourceForge.net: Compilers - predef if you need help writing #ifdefs for each compiler you have to work with.
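    A sketch of what those #ifdefs might look like; the exact inline-assembly spellings vary by compiler and version, so treat each branch as an assumption to verify against the compiler's manual:

```c
/* Sketch: per-compiler wrappers around the hlt instruction.
 * The inline-asm keyword differs between old DOS compilers;
 * check each compiler's documentation before relying on these. */
#if defined(__TURBOC__)
    #define CPU_IDLE() asm hlt
#elif defined(_MSC_VER)
    #define CPU_IDLE() __asm { hlt }
#else
    #define CPU_IDLE() ((void)0)   /* unknown compiler: no-op */
#endif
```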
    If you dance barefoot on the broken glass of undefined behaviour, you've got to expect the occasional cut.
    If at first you don't succeed, try writing your phone number on the exam paper.
    I support http://www.ukip.org/ as the first necessary step to a free Europe.

  3. #3
    Registered User
    Join Date
    Jul 2011
    Posts
    14
    First of all, thanks for your help. But it still doesn't work; the processor load still goes up to the maximum. Is there maybe a way to call the hlt instruction with a parameter that says how long the program is stopped, so that I only need to call it once per function call?

  4. #4
    and the hat of wrongness Salem's Avatar
    Join Date
    Aug 2001
    Location
    The edge of the known universe
    Posts
    32,761
    Did you test it on a real DOS machine as well, and not some emulator/VM ?

    Because HLT is a privileged instruction, which means if you're running an emulator/VM, there will be a certain amount of trickery and fakery going on.
    And just how are you measuring CPU load when all you have is a full-screen "C:\>" prompt?

  5. #5
    Registered User
    Join Date
    Jul 2011
    Posts
    14
    I just let the program run in Windows XP, so no, I didn't test it on a real DOS machine. But the thing is: If the method only works in a real DOS environment, but not when I use the 16 bit DOS program in Windows, then it's not what I'm looking for anyway. Because the command should of course also work if the DOS program runs under Windows.

    By the way, does anybody know why the sleep function in Turbo C++ (which, unlike delay, doesn't increase processor load) can only wait whole seconds instead of milliseconds?

  6. #6
    and the hat of wrongness Salem's Avatar
    Join Date
    Aug 2001
    Location
    The edge of the known universe
    Posts
    32,761
    If you want to use windows, then write a win32 console program and use Sleep(). But you'll need a new compiler (not that old fossil)

    > Because the command should of course also work if the DOS program runs under Windows.
    Why would you want such a thing?
    The whole point of the emulation layer is to give you functional equivalence for well behaved programs (albeit at the expense of CPU time in some cases).
    It wouldn't be much good if some badly behaved DOS program could still bring down the entire OS, because it had direct access to say the physical hardware.

    If you don't like the native emulator, perhaps another one behaves better (not that I've tried it)
    DOSBox (an x86 emulator with DOS) or VirtualBox.
    Hey, it's even open source, so perhaps you can hack it to behave how you would want it to.

    > By the way, does anybody know why the sleep function in Turbo C++ (which, unlike delay, doesn't increase processor load)
    > can only wait whole seconds instead of milliseconds?
    Because that's the historic specification for the sleep() call.

    Unless you have a real reason for running real DOS code on a real DOS machine, then there is no reason I can see for sticking with 20+ year old compilers for an OS which most people regard as being dead. Certainly, there is no point stressing over the finer detail of why the emulation layer doesn't precisely match the real hardware (it never can).

  7. #7
    Registered User
    Join Date
    Jul 2011
    Posts
    14
    Quote Originally Posted by Salem View Post
    It wouldn't be much good if some badly behaved DOS program could still bring down the entire OS, because it had direct access to say the physical hardware.
    Of course I don't want my program to screw up Windows. I doubt that this is even possible in Windows XP. All I was saying is: If a solution only works on a real DOS machine while it still poses the same problem when the DOS program is run from Windows XP, then it's not the solution I'm looking for.
    (By the way, at the moment we don't even know whether the current problem, that using the hlt instruction still increases processor load, is just a Windows problem because of missing privileges or whether it wouldn't work in real DOS either.)

    I know that the sleep function doesn't increase processor load, so the whole issue doesn't seem to have anything to do with missing privileges. Therefore, there's no need to look for another emulator or anything like that.

    Quote Originally Posted by Salem View Post
    Unless you have a real reason for running real DOS code on a real DOS machine, then there is no reason I can see for sticking with 20+ year old compilers for an OS which most people regard as being dead.
    Well, my idea is to program an old-school game that behaves exactly as it would have if it had been programmed back then. Later, I will maybe write a Windows version, but at the moment, I'd like to program it as a native DOS application. And since a game loop does nothing 99 % of the time:
    Code:
    while (gameIsRunning)
    {
        if (TimeToDrawTheNextFrame())
        {
            ReadInput();
            ProcessGameLogic();
            DrawNextFrame();
        }
        // else do nothing
    }
    I don't want a program that makes the processor run on full power just because it's waiting in an infinite loop. Thus, I want to do the following:
    Code:
    while (gameIsRunning)
    {
        if (TimeToDrawTheNextFrame())
        {
            ReadInput();
            ProcessGameLogic();
            DrawNextFrame();
        }
        else
        {
            delay(1);
                // Unload the processor since
                // there's nothing to do.
        }
    }
    But since delay itself sets the processor to 100 %, I need a function like sleep, except that it has to be able to wait milliseconds.
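    The arithmetic behind that idle branch can be sketched in portable C. The function name `ms_until_frame` and the idea of returning a remaining-time budget are illustrative, not from the thread: given the elapsed time and a fixed frame length, it computes how long the loop may sleep before the next frame is due.

```c
#include <assert.h>

/* Milliseconds until frame number `next_frame` is due, or 0 if it
 * is already overdue. With a fixed frame rate, frame n is due at
 * n * frame_ms milliseconds after the start of the game loop. */
long ms_until_frame(long elapsed_ms, long frame_ms, long next_frame)
{
    long due = frame_ms * next_frame;   /* deadline for that frame */
    return (due > elapsed_ms) ? (due - elapsed_ms) : 0;
}
```

    For example, at 20 FPS (frame_ms = 50), 120 ms into the loop, frame 3 is due at 150 ms, so the loop could hand roughly 30 ms back to the system instead of spinning.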

  8. #8
    Registered User
    Join Date
    May 2010
    Posts
    2,897
    If you want to write a true native DOS game, then why are you worried about the processor usage? DOS is inherently a single-process operating system; it does not multitask. So most of the functions did not worry about using "all" of the resources. If you don't want the peculiarities of the DOS delay() function, then you should use the Windows API functions, or set up a virtual machine with DOS installed and not worry about processor usage.


    Jim

  9. #9
    and the hat of wrongness Salem's Avatar
    Join Date
    Aug 2001
    Location
    The edge of the known universe
    Posts
    32,761
    Like I said - DOSBOX
    Performance - DOSBoxWiki
    You can tune it to behave as a more (or less) capable processor.

    > Well, my idea is to program an old-school game that behaves exactly as it would have if it was programmed back then
    If a game isn't running at 100% CPU all the time, then it isn't going to be keeping the frame rate at the highest level for the best possible user experience.

  10. #10
    Registered User
    Join Date
    Jul 2011
    Posts
    14
    Quote Originally Posted by jimblumberg View Post
    If you want to write a true native DOS game then why are you worried about the processor usage? DOS is inherently a single process operating system it does not multitask.
    It's not about the multitasking stuff. It's about the simple fact that I don't want my program to run on full power in an infinite loop even though it doesn't need to. Did you ever own a PC with a loud fan? I guess you can imagine what a
    Code:
    while (true)
    {
    }
    can do in this case in terms of noise.

    Quote Originally Posted by Salem View Post
    Like I said - DOSBOX
    No. My question was how to stop the program for a certain amount of time in milliseconds, how to implement a decent delay that's not just a simple loop. The solution to that problem is not "Use DOSBox."

    Quote Originally Posted by Salem View Post
    If a game isn't running at 100% CPU all the time, then it isn't going to be keeping the frame rate at the highest level for the best possible user experience.
    I guess I'll use a fixed frame rate anyway. You don't need a "Super Mario"-like game with 500 FPS.

  11. #11
    Registered User
    Join Date
    May 2011
    Location
    Around 8.3 light-minutes from the Sun
    Posts
    1,866
    I believe the problem you are having here is a communication issue. The "DOS" you speak of in Windows is not actually DOS; it is an emulator, hence the Win32 console. In order to run actual DOS you need to do what Salem has suggested. If, on the other hand, you want to run your "DOS" program from within Windows, you are actually running a console program whose behavior will be modified, under the hood of course, by Windows.

    If this is the case, then simply use the Win32 API Sleep() function for your console app. This will not raise CPU usage, since Windows will put your console thread to sleep. Trying to ignore the fact that your program is a console program does not make that fact go away. Thus, if your thread demands to be stalled but does not give control back to Windows, you will see your CPU usage go to 100%.
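    As a sketch of that route (Win32 only; this compiles against <windows.h> and will not build with a 16-bit DOS compiler, and the 100 ms figure is just an illustration):

```c
/* Sketch: a Win32 console program idling with Sleep(). The thread
 * is suspended by the scheduler, so CPU usage stays near zero. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    puts("waiting about 100 ms without spinning...");
    Sleep(100);   /* suspend this thread for ~100 milliseconds */
    puts("done");
    return 0;
}
```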

    EDIT: If you are worried about complexity, it is actually quite easy to implement a DOS-style game using the Win32 API. In fact, I would say it is easier to program a console BitBlt-style game taking advantage of the API than it is to code an original DOS game. A lot of the complexity with memory management is taken out of the equation.
    Last edited by AndrewHunter; 07-05-2011 at 02:24 AM.
    Quote Originally Posted by anduril462 View Post
    Now, please, for the love of all things good and holy, think about what you're doing! Don't just run around willy-nilly, coding like a drunk two-year-old....
    Quote Originally Posted by quzah View Post
    ..... Just don't be surprised when I say you aren't using standard C anymore, and as such,are off in your own little universe that I will completely disregard.
    Warning: Some or all of my posted code may be non-standard and as such should not be used and in no case looked at.

  12. #12
    Registered User
    Join Date
    Jul 2011
    Posts
    14
    Quote Originally Posted by AndrewHunter View Post
    I believe the problem you are having here is a communication issue. The "DOS" you speak of in windows is not actually dos, it is an emulator, hence Win32 console.
    No, there isn't a communication problem, at least not from my side. I know the difference between a Win32 console application and 16-bit DOS programs. And yes, my program really is a DOS program. It's a game that uses mode 13h and that is compiled with Turbo C++ 3.0 or Microsoft Visual C++ 1.52c. It's not a Win32 console application. The fact that I don't have a DOS computer right at hand and test the application under Windows XP (which is still able to run real 16-bit DOS programs) doesn't change the fact that my program is a real DOS program. For example, my program includes lines like:
    Code:
    unsigned char far *screen = (unsigned char far *)MK_FP(0xA000, 0x0000);
    or stuff like
    Code:
    REGS in, out;
    
    in.h.ah = 0x00;
    in.h.al = 0x13;
    int86(intNumber, &in, &out);
    or
    Code:
    void WaitForVBlank()
    {
        // Wait until any in-progress vertical retrace ends...
        while (inp(0x3DA) & 8)
            ;

        // ...then wait for the next retrace to begin.
        while (!(inp(0x3DA) & 8))
            ;
    }
    You see? It's really a DOS program.

    Quote Originally Posted by AndrewHunter View Post
    In order to run actual DOS you need to do what Salem has suggested.
    Not necessarily. Even though Windows XP is not based on DOS anymore, but on Windows NT, it's still capable of running actual DOS applications. And that's what I do: as long as I'm still working on the basic things, I just test the program under my usual operating system, Windows XP. Later, when it comes to performance and all that stuff, I can set up a DOS installation and test it there. But at the moment, I don't need that yet.
    However, it is a DOS program, not a Win32 console application; otherwise I would have already used the Sleep function (I mentioned it in my very first post). But this DOS program, due to a loop, pushes the processor load to the maximum. And since I've seen that Turbo C++'s delay function has the same problem while Turbo C++'s sleep function doesn't, but can only wait full seconds, all I want to know is: how do I implement a delay that works like Turbo C++'s sleep function, but can wait milliseconds instead of seconds?
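    For reference, the traditional pure-DOS approach to roughly this (a sketch, not a guaranteed answer, since hlt may be emulated differently under a VDM, as discussed earlier in the thread; `delay_ms` is an illustrative name) is to watch the BIOS tick counter and halt between interrupts:

```c
/* Sketch: delay built on the BIOS tick counter at 0040:006Ch,
 * which IRQ0 increments roughly 18.2 times per second. hlt idles
 * the CPU until the next hardware interrupt. Granularity is
 * ~55 ms, so this only approximates milliseconds. 16-bit DOS
 * compilers only; the asm spelling below is Turbo C++'s. */
#include <dos.h>

void delay_ms(unsigned long milliseconds)
{
    volatile unsigned long far *ticks =
        (volatile unsigned long far *)MK_FP(0x0040, 0x006C);
    /* Convert milliseconds to 18.2 Hz ticks, rounding up. */
    unsigned long wait  = (milliseconds * 182UL + 9999UL) / 10000UL;
    unsigned long start = *ticks;

    while (*ticks - start < wait)
        asm hlt;   /* sleep until the next interrupt wakes us */
}
```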

    Quote Originally Posted by AndrewHunter View Post
    EDIT: If you are worried about complexity, it is actually quite easy to implement a DOS style game using the Win32 API. In fact I would say it is easier to program a console BitBlt style game taking advantage of the API then it is to code an original DOS game. A lot of the complexity with memory management is taken out of the equation.
    Yes, I could write a Windows game. In fact, I'm writing the game in a way that all platform-independent things are separated from the OS-specific stuff, so that I can easily replace the graphics and input engine later and still have the game behave exactly the same. But at the moment I want to program an original DOS game. And I don't have a problem with anything, except the fact that Turbo C++ has a sloppily implemented delay function and Visual C++ 1.52c doesn't have one at all.

    P.S.: There's actually more memory management with Windows GDI with all their DeleteDC and ReleaseDC and DeleteBitmap and stuff. DirectX isn't much better with all their surfaces and clippers and palettes. In DOS, all I have to do is call a delete[] for each of my sprites and the back buffer when I don't need them anymore.
    Last edited by Erde; 07-05-2011 at 02:52 AM.

  13. #13
    Registered User
    Join Date
    May 2011
    Location
    Around 8.3 light-minutes from the Sun
    Posts
    1,866
    No one is trying to argue with you here, and neither is anyone going to be impressed by screen 13 access commands or the fact that you know the legacy memory address for screen 13. The simple fact is that regardless of how you compiled or wrote your code, its implementation on Windows is such that the behavior is modified so that it can run within the DOS emulator.

    Virtual DOS machines rely on the virtual 8086 mode of the Intel 80386 processor, which allows real mode 8086 software to run in a controlled environment by catching and forwarding to the normal operating system (as exceptions) all operations which involve accessing hardware. The operating system can then perform an emulation and resume the execution of the DOS software.

    VDMs generally also implement support for running 16- and 32-bit protected mode software (DOS extenders), which has to conform to the DOS Protected Mode Interface.

    When a DOS program running inside a VDM needs to access a peripheral, Windows will either allow this directly (rarely), or will present the DOS program with a Virtual Device Driver which emulates the hardware using operating system functions. A VDM will systematically have emulations for the Intel 8259A interrupt controllers, the 8254 timer chips, the 8237 DMA, etc.

  14. #14
    Registered User
    Join Date
    Jul 2011
    Posts
    14
    Quote Originally Posted by AndrewHunter View Post
    No one is trying to argue with you here, neither is anyone going to be impressed with Screen 13 access commands or the fact that you know where the legacy memory address for screen 13 is.
    Did you really think that I posted the code to "impress" anybody? I showed it so that you believe me that I'm really writing a DOS program and that I'm not just too stupid to confuse DOS with Win32 console. I didn't show you the code to "impress" you.

    Quote Originally Posted by AndrewHunter View Post
    This simple fact is regardless of how you compiled or wrote your code it's implementation on Windows is such that the behavior is modified so that it can run within the DOS emulator.
    O.k. And what does that and the quote you posted have to do with my problem? My original question was: "How do I implement a delay function that really halts the program and doesn't just go through a loop until the specified time has come?" Plain and simple. But your descriptions don't do anything to help me with my question. "When a DOS program running inside a VDM needs to access a peripheral, Windows will either allow this directly (rarely), or will present the DOS program with a Virtual Device Driver which emulates the hardware using operating system functions." Fine. And? How does that help me with my delay function problem? If the driver of a car wants to reduce fuel consumption, he doesn't need a text about how the engine works from within. All he needs to know is what he can do to reduce fuel consumption. And my question is the same: I'd like to know what the code looks like that implements a decent delay function. Nothing more, nothing less. Whether the DOS program runs on a DOS machine or under Windows or under DOSBox, and how those environments handle the EXE file, has nothing to do with what my source code will look like. I just need to know how a delay that halts the program would have been implemented in a DOS application (or where I can find concrete information about that topic).

  15. #15
    C++まいる!Cをこわせ!
    Join Date
    Oct 2007
    Posts
    22,915
    I think what everyone here is trying to tell you is:
    It simply isn't possible to make a "sleep" function that works in both Windows and DOS.

    The reason why "halt" isn't working is that Windows probably won't allow it (i.e. emulates it differently than you expect).
    Hence their suggestion about DOSBox to "fix" that problem.
    Quote Originally Posted by Adak View Post
    io.h certainly IS included in some modern compilers. It is no longer part of the standard for C, but it is nevertheless, included in the very latest Pelles C versions.
    Quote Originally Posted by Salem View Post
    You mean it's included as a crutch to help ancient programmers limp along without them having to relearn too much.

    Outside of your DOS world, your header file is meaningless.
