I have a pseudorandom generator that works in some arbitrary state space S. Obviously, the complexity (apparent entropy) of the output depends on the "size" of S. Unfortunately, for this particular generator, the larger the state space, the larger the range of the integers it produces, so I need to extract 64-bit numbers from a, say, 4096-bit result R (i.e., R < 2^4096).

Now, I could just cycle the generator, take R mod 2^64 as my random number, rinse, repeat. But with that approach I'd potentially be discarding a ton of bits on each run (the actual size of each result is unpredictable, however). I'd really like to use at least some of those unused bits, but I also don't want to risk biasing the output in the process (suppose, for instance, I simply took the modulus and then divided out 2^64 repeatedly until R was exhausted).

Is there an easy way to do this safely, or should I just fuggeddaboudit (i.e., not worth the trouble)?
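For what it's worth, here is a minimal sketch of the "take the modulus and divide out 2^64 repeatedly" idea from the question. The key assumption (which may or may not hold for your generator) is that R is uniform over a power-of-two range such as [0, 2^4096); under that assumption each 64-bit limb is itself uniform and independent, so none of them are biased. If R's range is not a power of two, the highest limbs would be biased and this would not be safe. The names (`extract_words`, `STATE_BITS`) are made up for illustration:

```python
import secrets

STATE_BITS = 4096   # assumed size of the generator's output range: [0, 2**4096)
WORD_BITS = 64
MASK = (1 << WORD_BITS) - 1

def extract_words(r, state_bits=STATE_BITS):
    """Split r (assumed uniform on [0, 2**state_bits)) into
    state_bits // WORD_BITS 64-bit words, low limb first.

    Because state_bits is a multiple of WORD_BITS, this is just a
    base-2**64 digit expansion: each word is uniform on [0, 2**64)
    and independent of the others, so no bits are wasted or biased.
    """
    words = []
    for _ in range(state_bits // WORD_BITS):
        words.append(r & MASK)   # low 64 bits: r mod 2**64
        r >>= WORD_BITS          # divide out 2**64
    return words

# stand-in for the generator's output, for demonstration only
r = secrets.randbits(STATE_BITS)
ws = extract_words(r)
assert len(ws) == STATE_BITS // WORD_BITS
assert all(0 <= w < 2**WORD_BITS for w in ws)
```

Note that the "unpredictable size" of R is not a problem here: a small R simply means its high limbs happen to be zero, which is exactly as likely as any other limb value when R is uniform over the full power-of-two range.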