-
Delay
I'm trying to create a delay in my program so you can see when a character is being moved across the screen.
I know it's a for loop and I believe it goes something like this.
int delay;
for (delay; delay<10000000;delay--)
Am I right???
I'm sure something needs changing.
Thanks for your help
-
>>> for (delay; delay<10000000;delay--)
for (delay = 0; delay<1000000; delay++);
That's a way to delay, but not a good one: your process hogs the CPU the whole time. Without more details, though, I won't suggest an alternative.
You set delay to zero initially, and as long as delay is less than the limit you entered, you increment it.
-
I have read your post, and I am struggling with the same problem.
I used this way to delay for some time when I was only programming in QB (FOR i = 1 TO 1000: NEXT i).
It is the simplest way to delay, but it has a major setback:
it is CPU dependent (i.e., on a Pentium the delay is shorter than on, for example, a 386).
If you know any better way (in C/C++), please let me know!
You can mail [email protected], so I can put it on my site ( <url>http://go.to/jtechcpp</url> )
-
If you're using DOS (real DOS), then #include <dos.h>; in there is a function called sleep(x), where x is the number of seconds to delay, I think.
If you're using a Win32 console, then #include <windows.h>; in there is a function called Sleep(x), where x is the number of milliseconds to delay.
-
The reason I didn't suggest this is lack of details, of course. It might also be delay() or Delay() or wait() or Wait() ... !