-
animation timing
I use timeGetTime to balance my animations across multiple frame rates.
I was having a serious problem with tearing whenever the camera's focus moved. I narrowed it down to the animation rate that was used to move the focus.
My solution was to keep track of the last ten frame draw times and pass an average of those values as the animation rate.
What sort of problems might I run into later on from this?
It works great by the way.
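For reference, here's roughly the shape of what I'm doing each frame (just a sketch, the names are made up for this post; timeGetTime comes from windows.h and needs winmm.lib):
Code:
#include <windows.h>   // timeGetTime (link with winmm.lib)

#define FRAME_HISTORY 10

static DWORD g_lastTime = 0;
static float g_frameTimes[FRAME_HISTORY] = {0};
static int   g_frameIndex = 0;

// Call once per frame; returns the average of the last ten frame times, in seconds.
float updateAnimationRate(void)
{
    DWORD now = timeGetTime();
    float dt = (g_lastTime == 0) ? 0.0f : (now - g_lastTime) / 1000.0f;
    g_lastTime = now;

    g_frameTimes[g_frameIndex] = dt;
    g_frameIndex = (g_frameIndex + 1) % FRAME_HISTORY;

    float sum = 0.0f;
    for (int i = 0; i < FRAME_HISTORY; ++i)
        sum += g_frameTimes[i];
    return sum / FRAME_HISTORY;
}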
-
I would see it having the same problems as using the exact value of timeGetTime, only less often. Perhaps I'm thinking about this the wrong way, but this is how I see it:
Let's say you're running at 60 fps, and you're getting some tearing like you said. By averaging out 10 values, you shouldn't see the same problem until you hit 600 fps.
-
Another suggestion, for a solution that doesn't have those drawbacks, would be something like
Code:
float timeScale(int ideal) {
    // Scale factor for per-frame deltas, relative to the ideal frame rate.
    return ideal / (float)g_FrameRate;
}
where g_FrameRate is the current fps (you don't have to keep track of the last N frame rates).
You would use it like
Code:
playerVelocity += 2.3 * timeScale(g_60FPSisIdeal);
animationTime += 1.5 * timeScale(g_60FPSisIdeal);
etc.
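To make that concrete, the two globals would have to look something like this (the post doesn't show them, so the definitions and the update function below are my guess at what's intended):
Code:
// Assumed globals, not shown above.
const int g_60FPSisIdeal = 60;   // the frame rate the tuning constants (2.3, 1.5, ...) were picked at
int g_FrameRate = 60;            // current measured fps, refreshed once per frame

// Example per-frame update, given the last frame's duration in seconds.
void updateFrameRate(float lastFrameSeconds)
{
    if (lastFrameSeconds > 0.0f)
        g_FrameRate = (int)(1.0f / lastFrameSeconds + 0.5f);
}
At exactly 60 fps, timeScale(g_60FPSisIdeal) comes out to 1.0, so the tuning constants keep their original meaning; the trade-off is that rounding the fps to an integer loses a little precision compared with passing the raw frame time through.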