Tonight I multi-threaded my sound library, and it works very well except when there's a ton of stuff happening in the game.
The first game to use the new library is my asteroids clone. It works great under normal circumstances, up to about 800 to 1000 asteroids on screen. Past that point the explosion sounds begin to lag behind the actual graphical presentation. The main loop runs as fast as possible with a Sleep() thrown in to keep CPU usage down; the game uses around 6 to 9 percent CPU with over 1500 asteroids on screen. Nice, but the sound lag is a bit annoying.
Now for this game the lag won't matter during normal gameplay, since surviving even 600 asteroids on screen is nearly impossible, even with all the ship upgrades.
But moving on to something like my space game (StarX for now), this may be a problem. Do you think I should synchronize the sound with the on-screen presentation using Windows events and so forth, or should the sound engine just be cruising along as fast as possible? During normal gameplay the sound engine works beautifully. I'm not even sure StarX will be pushing as many sounds per second as asteroids does, so this may be a moot point. Perhaps gameplay testing will reveal more.
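One middle ground I've been considering between full event-based sync and free-running playback: timestamp each sound when it's queued, and have the sound thread simply drop anything that has gone stale, so the audio can never trail the visuals by more than a small window. This is just a sketch; the StampedSound type, the NextFreshSound helper, and the 100 ms window are my own invention, not part of my engine as it stands.

```cpp
#include <cassert>
#include <chrono>
#include <deque>

using Clock = std::chrono::steady_clock;

// A queued sound request plus the moment it was queued.
struct StampedSound {
    int id;
    Clock::time_point queued;
};

// Returns the next sound still worth playing, discarding entries that have
// waited longer than maxAge; returns -1 if the queue runs dry. The idea is
// that a late explosion is worse than no explosion at all.
int NextFreshSound(std::deque<StampedSound>& q,
                   Clock::duration maxAge,
                   Clock::time_point now) {
    while (!q.empty()) {
        StampedSound s = q.front();
        q.pop_front();
        if (now - s.queued <= maxAge)
            return s.id;  // fresh enough: play it
        // otherwise stale: skip it and try the next one
    }
    return -1;
}
```

With something like this, even a flood of 1500 asteroids would at worst thin out the explosion sounds rather than let them drift seconds behind the render.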
Ideas? For those of you who have a multi-threaded sound system, do you synchronize your sounds with the render? For reference, here's the flow of my current engine:
1. Game calls PlaySound(soundID)
2. Sound engine adds ID to queue
3. Main loop in the sound engine's thread plays the topmost sound ID in the queue and pops it off immediately. If the queue is empty, the loop does nothing but Sleep() for a bit.
4. Internal play functions then use an internal vector of sounds to send the sound's sample data to the API so it can play the sound.
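The four steps above can be sketched roughly like this, assuming C++11 threads in place of the raw Win32 calls. One note: I swapped the Sleep() poll for a condition variable, so the worker wakes the instant a sound is queued instead of on the next poll tick; the class and method names (and PlayInternal standing in for the real API call) are placeholders of mine, not the actual engine.

```cpp
#include <cassert>
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

class SoundEngine {
public:
    SoundEngine() : running(true), worker(&SoundEngine::Loop, this) {}
    ~SoundEngine() { Shutdown(); }

    // Steps 1-2: the game thread queues an ID and wakes the worker.
    void PlaySound(int soundID) {
        { std::lock_guard<std::mutex> lk(m); pending.push(soundID); }
        cv.notify_one();
    }

    // Drains the queue and joins the worker; safe to call more than once.
    void Shutdown() {
        { std::lock_guard<std::mutex> lk(m); running = false; }
        cv.notify_one();
        if (worker.joinable()) worker.join();
    }

    // Stands in for the internal sound vector of step 4; only read after
    // Shutdown(), when the worker thread has finished writing to it.
    std::vector<int> played;

private:
    // Step 3: pop the front ID and play it. Waiting on the condition
    // variable replaces the Sleep() poll, so an idle engine burns no CPU.
    void Loop() {
        std::unique_lock<std::mutex> lk(m);
        while (true) {
            cv.wait(lk, [this] { return !pending.empty() || !running; });
            if (!running && pending.empty()) return;  // drained, shut down
            int id = pending.front();
            pending.pop();
            lk.unlock();
            PlayInternal(id);  // step 4: hand sample data to the API
            lk.lock();
        }
    }

    // Placeholder for the real API call; here it just records the ID.
    void PlayInternal(int id) { played.push_back(id); }

    std::mutex m;
    std::condition_variable cv;
    std::queue<int> pending;
    bool running;
    std::thread worker;
};
```

Shutdown() deliberately lets the worker finish everything still in the queue before exiting, which matches the drain-then-pop behavior of step 3.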