I'm curious how people get the frames per second the video card is currently rendering at. I have an "Asteroids"-type clone, done in Direct3D, and I would like it to run at the same speed on every computer: not super speedy on faster machines and not like a turtle on slower ones. How do you guys do this?
What I've done thus far is make a global variable that all my on-screen movement is multiplied by. I don't want people to have to go into the code itself and lower the number for faster computers or raise it for slower ones just to get the game running at a decent speed. I'm thinking that if I can get the FPS the computer is currently running at, I can use it as a factor in all my movement calculations (roughly what I'm imagining is sketched below).
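Here's a rough sketch of the idea, just to show what I mean: measure how long the last frame took (here with the Win32 QueryPerformanceCounter timer) and multiply all movement by that elapsed time instead of by a hand-tuned constant. Names like GetElapsedSeconds, UpdateShip, shipX, and shipVelX are placeholders I made up, not anything from a real engine:

```cpp
#include <windows.h>

LARGE_INTEGER g_freq;      // timer ticks per second (fixed for the session)
LARGE_INTEGER g_lastTime;  // counter value at the end of the previous frame

void InitTimer()
{
    QueryPerformanceFrequency(&g_freq);
    QueryPerformanceCounter(&g_lastTime);
}

// Returns the number of seconds elapsed since the previous call.
float GetElapsedSeconds()
{
    LARGE_INTEGER now;
    QueryPerformanceCounter(&now);
    float seconds = (float)(now.QuadPart - g_lastTime.QuadPart)
                  / (float)g_freq.QuadPart;
    g_lastTime = now;
    return seconds;
}

// Called once per frame: movement = velocity (units per second) * elapsed
// time, so the ship covers the same distance per second at any frame rate.
void UpdateShip(float& shipX, float shipVelX, float dt)
{
    shipX += shipVelX * dt;
}
```

The idea would be that a ship with shipVelX of 100 moves 100 units per second whether the machine renders 30 frames or 300 frames in that second. Is this the right track?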
How do I get the FPS? Or is there maybe another way to make my game run at a constant speed?