Quote Originally Posted by Sang-drax
What do you mean by delay?
Exactly what I said: a delay, to limit the rate at which events occur and to allow users (and, in multi-user networked games, other users) time to react. If you run the physics at full CPU speed, the gameplay will be quite different on a machine with a 3GHz CPU and fast graphics than on one with a 1.2GHz CPU and lower-end graphics.
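
To make that concrete, here is a minimal sketch of a fixed-timestep loop, which is one common way of capping the simulation rate. The names update_world(), render_frame(), and still_running() are made-up placeholders, not from any real engine:

[code]
#include <chrono>
#include <iostream>

// Placeholder gameplay hooks -- purely illustrative stubs.
static int ticks = 0;
void update_world(double /*dt*/) { ++ticks; }            // advance the simulation one step
void render_frame()              { /* draw current state */ }
bool still_running()             { return ticks < 120; } // run ~2 simulated seconds

int main()
{
    using clock = std::chrono::steady_clock;
    const std::chrono::duration<double> step(1.0 / 60.0); // fixed 60 Hz gameplay rate

    auto previous = clock::now();
    std::chrono::duration<double> lag(0.0);

    while (still_running())
    {
        auto now = clock::now();
        lag += now - previous;
        previous = now;

        // Gameplay advances at a fixed rate: a 3GHz box and a 1.2GHz box both
        // simulate the same amount of game time per real second.
        while (lag >= step)
        {
            update_world(step.count());
            lag -= step;
        }

        // Rendering runs as often as the hardware allows; a faster machine just
        // gets a higher frame rate, not faster gameplay.
        render_frame();
    }

    std::cout << "simulated " << ticks << " fixed steps\n";
}
[/code]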

That is why I brought up the thread thing, with rendering at the low end of the priority scale. If rendering is a background task, the main gameplay tells the rendering engine what the current state of affairs is and lets it do its work. On a faster machine, you get a higher frame rate, but the game remains responsive on a slower machine.
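
Something along these lines, as a rough sketch only: the gameplay thread publishes a snapshot of the world state, and a separate render thread draws whatever the latest snapshot is. GameState, draw(), and render_loop() are hypothetical names, and actually lowering the render thread's priority is platform-specific (e.g. SetThreadPriority on Windows, pthread_setschedparam on POSIX), so it is only noted in a comment here:

[code]
#include <atomic>
#include <chrono>
#include <mutex>
#include <thread>

struct GameState { double player_x = 0.0; };  // stand-in for the real world state

std::mutex snapshot_mutex;
GameState snapshot;                    // latest state handed off to the renderer
std::atomic<bool> running{true};

void draw(const GameState&) { /* render the copy; drawing never blocks gameplay */ }

void render_loop()
{
    // In a real game this thread would be given a lower scheduling priority
    // via the platform's thread API.
    while (running)
    {
        GameState local;
        {
            std::lock_guard<std::mutex> lock(snapshot_mutex);
            local = snapshot;          // copy the latest published state
        }
        draw(local);
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }
}

int main()
{
    std::thread renderer(render_loop);

    for (int tick = 0; tick < 120; ++tick)   // gameplay runs at its own fixed rate
    {
        GameState next;
        next.player_x = tick * 0.1;          // stand-in for the real simulation step
        {
            std::lock_guard<std::mutex> lock(snapshot_mutex);
            snapshot = next;                 // publish the new state for the renderer
        }
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }

    running = false;
    renderer.join();
}
[/code]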

When dealing with networked games, user actions and remote communications must be given roughly equal priority, with remote changes processed before the effect of the user's actions is determined. "Fairness" comes into play, and I'm not going to give an Operations Research lecture in this forum to explain what that is and how it's done.
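
For the ordering part only (not the fairness policy itself), a bare-bones sketch might look like this. Event, apply_event(), and the queue names are made up for illustration:

[code]
#include <iostream>
#include <queue>
#include <string>

struct Event { std::string description; };

std::queue<Event> remote_events;   // changes arriving over the network
std::queue<Event> local_input;     // actions from this machine's user

void apply_event(const Event& e)
{
    std::cout << "applying: " << e.description << '\n';  // stand-in for real state changes
}

void process_tick()
{
    // Remote changes first, so the local player's actions are resolved against
    // the most current view of what everyone else has already done.
    while (!remote_events.empty())
    {
        apply_event(remote_events.front());
        remote_events.pop();
    }
    while (!local_input.empty())
    {
        apply_event(local_input.front());
        local_input.pop();
    }
}

int main()
{
    local_input.push({"local: player fires"});
    remote_events.push({"remote: player 2 moved"});
    process_tick();   // the remote change is applied before the local action
}
[/code]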

Note that the same sorts of considerations apply to any interactive program that deals with asynchronous events, such as a terminal emulator (or terminal firmware), an embedded microcontroller, or an avionics display.