Originally posted by Sayeh
For _precise_ timing you need to use an interrupt and a callback with a flag. That's how it's done for most games. Also, game timing isn't that precise-- it's just about speed.
You only need a clock in a game to regulate heartbeat for network activity, or to make sure the code doesn't execute too fast on a faster machine. Other than that, timing doesn't have to be much more than about 1/6th of a second (a 'tick') in most cases.

Actually, that's incorrect. When a game is running at 100 fps (which happens even for year-old games on today's fastest machines), each frame spans 1/100th of a second. To ensure objects move forward only as far as they should in that 1/100th of a second, the timer must be accurate to at least 1/1000th of a second. Suppose it were only accurate to 1/100th of a second: some frames would measure an elapsed time of 0.00 seconds, some 0.01 seconds, and some 0.02 seconds (if the game slows down a bit). Frames that measure 0.00 seconds would redraw objects exactly where they already are, while frames that measure 0.02 seconds would move objects twice as far as frames that measure 0.01 seconds. The resulting choppiness of movement would be unbearable. There are functions to handle high-resolution timing:
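For example, here is a minimal sketch of frame timing on Windows using QueryPerformanceCounter and QueryPerformanceFrequency, which give sub-millisecond resolution (on POSIX systems, clock_gettime with CLOCK_MONOTONIC plays the same role). The velocity value is purely illustrative:

#include <windows.h>
#include <stdio.h>

int main(void)
{
    LARGE_INTEGER freq, prev, now;

    /* Ticks per second of the high-resolution counter. */
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&prev);

    for (int frame = 0; frame < 5; ++frame)
    {
        /* ... render the frame here ... */

        QueryPerformanceCounter(&now);

        /* Elapsed time since the last frame, in seconds,
           with sub-millisecond resolution. */
        double dt = (double)(now.QuadPart - prev.QuadPart)
                  / (double)freq.QuadPart;
        prev = now;

        /* Scale movement by dt so object speed is independent of
           frame rate. 'velocity' is illustrative: 50 units/second. */
        double velocity = 50.0;
        double step = velocity * dt;
        printf("frame %d: dt = %.6f s, moved %.4f units\n",
               frame, dt, step);
    }
    return 0;
}

Because each frame's movement is scaled by the actual measured dt, objects travel at the same on-screen speed whether the machine renders 60 or 200 frames per second.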