So how am I supposed to delay it without delaying everything? There must be some way.
Type: Posts; User: nothing909
I've already googled things and I've seen people talking about a delay, which I'm trying to implement, but the only problem I'm having is that it's not only delaying the paddle, it's delaying the ball...
Why can't I just line the paddle's y up with the ball's y, but make the paddle's y slower than the ball when it moves, so sometimes when the player hits the ball back it can't follow it fast enough and it...
It is already coded like that. The ball will always hit the paddle in the centre.
I thought that adding the delay to the paddle increments would make the ball hit it off centre sometimes...
Can you reword that? I don't know what you mean. What should I do?
It's one player against the computer. The part of the code I'm putting the delay in is the computer-controlled racket. Right now, the way the code is without the delay, the y of the racket will follow the y...
I think I've kind of worked out why this is happening, and I'm hoping you can help me fix it.
When I put in a delay of, say, 50000, it is actually delaying like I want it to, but here's the...
If I set the delay to 50000, it will delay everything on the screen instead of just the racket (yR2) as intended.
It's honestly so confusing why it's doing this.
What's happening now is, when I change it to long int, the values no longer get truncated, but when I run the program, nothing will show...
When I put the number at around 40000, I get a warning saying the integer conversion resulted in a change of sign. Do you know what the cause of this might be?
I can't post all the code; there's tons of it, with many different libraries.
The code I provided is AI for a racket. The racket is tracking the y coordinate of the ball and moving with it. I...
Changing it to the maximum amount without truncating doesn't delay the racket movement; in fact, it makes the screen a little laggy.
When I try 500000000000, it says it results in truncation.
Is the number in milliseconds?
Okay, I've got the header file. I'm using this code:
void delay(int n)
{
    /* busy-wait: spins the CPU for n iterations and blocks the whole program */
    volatile int i;
    for (i = 0; i < n; i++)
        ;
}
Sorry, I feel like I'm being a nuisance because of my inexperience with programming, but can you possibly show me how to write this?
#include <unistd.h>
I'm so confused; I've never used dos.h before. I thought creating a delay was an easy little thing.
The way my code is, can you explain how I can create a simple delay without using delay or...
I can't add that; it says "source file cannot be found".
If I remove the unsigned int delay = 1000;
and I just have delay(1000);
it now says on the delay(1000) line that delay is being declared implicitly. What does this mean?
They're all global variables; that isn't the problem.
The code I provided is for a Pong game; it's just delaying the racket movement.
I don't think my compiler has the sleep function; it doesn't work.
Could you please look at my first post with the code I gave and explain why it wouldn't work, because I still don't...
Could you please explain, relating it to my code, how I can use a sleep function? I don't know if this is correct, but if I do:
void AI(void)
{
    unsigned int sleep = 1000;
    if (yR2...
void AI(void)
{
    unsigned int delay = 1000;
    if (yR2 > yBall)
    {
        if (yR2 > RACKET)
        {
            delay(1000);
            yR2--;