# Thread: the random number problem

1. <random>

That's MS's documentation of the standard random facilities, which ought to be supported by recent GCC releases as well. It even has the random_device engine, which is supposed to be a non-deterministic source of randomness. (But it isn't in MS's implementation; in fact, it's a lousy engine that apparently just forwards to their rand() functions.)

2. Originally Posted by achitan
Because I heard that calling rand() multiple times will give you a sequence of random numbers that will repeat itself after a while (keeping the same seed). So it was just an exercise in statistics.
But if a number in a rand() sequence never repeats, then that's a condition which makes rand() not a very good random number generator. Anyway, back to Mersenne Twister...
Then you completely misunderstand. Seeding the PRNG once and then making many calls to rand will give you the longest possible sequence before it repeats. Calling srand again at any point will not give you a completely new sequence. What it actually does is jump to a different point within the same sequence. Thus you may immediately jump closer to a part of the sequence you have encountered recently. So by calling srand more than once you can actually end up with output that is less random!

Whatever tests you've done that you think suggest otherwise are wrong. If your jumping around appears to make it take longer to reach a certain value, then in reality you're causing many other values to come up more often instead, and in fact often causing the same run of values to come up again. It's worse, not better, trust me.

3. Originally Posted by iMalc
Calling srand again at any point will not give you a completely new sequence. What it actually does is jump to a different point within the same sequence. Thus you may immediately jump closer to a part of the sequence you have encountered recently. So by calling srand more than once you can actually end up with output that is less random!
No, calling srand again with a different seed should give a "completely new" sequence. Calling it again with the same seed should restart the sequence. An implementation that fails to conform to this has a bug.

4. A good PRNG will have a period very near the capacity of its persistent state -- the number of possible states it could be in. When this is the case, the PRNG can only be said to generate one sequence. Each reseeding starts that sequence at a different point. But reseeding with a random value risks that the new location is shortly prior in the sequence to a previously encountered state. If a PRNG ever enters the same state, it will behave identically to every other time that state was encountered, generating the same numbers thereafter.

A completely new sequence is not possible. Not with a good, periodic PRNG.

5. O_o

Any "PRNG" with finite state not obtaining entropy from a real source basically samples from some known sequence of bits.

Such a "PRNG", one capable of representing many sequences with the same finite state, isn't doing its job as well as it could; such a "PRNG" isn't mutating the entirety of its state.

Soma

6. Originally Posted by laserlight
No, calling srand again with a different seed should give a "completely new" sequence. Calling it again with the same seed should restart the sequence. An implementation that fails to conform to this has a bug.
You're not thinking of the same thing as me. You're thinking of a portion of the output sequence, not of the entire sequence until it repeats.

Every possible PRNG provides a fixed sequence of random numbers. This sequence is likely to be of length 2^32 for a typical LCG implementation, or 2^16 for a bad one. This sequence is dictated by the internal state of the PRNG which in a typical LCG is the 32-bits of the internal seed value. Calling srand again simply sets the seed value (a.k.a internal state) to that which it would be for a different point in that overall sequence. The random number returned from rand is derived from a portion of the internal state.
The generator is only capable of procuding one very long overall sequence, and that is true for every type of PRNG out there, no matter how large its internal state is, simply because it can only hold so many combinations in its internal state. Of course its internal state may be so large that it would take years rather than seconds or milliseconds until it repeats.

You know all this, you just misunderstood me.
I'm not talking about how this time we got a 3 followed by 42 and last time 3 was followed by a 27 or anything like that.

Edit: Actually King Mir explained it far better than I did.

7. I'm not going to profess to be a math wizard here, but I could seed rand, where "rand" simply picked X out of an array of 1-N items, and where the seed decided how many times to shuffle the array, and yes, each seed could give you a sequence.

Quzah.

8. Which is a fine example of a "PRNG" that isn't mutating the entirety of its state.

You could pick an arbitrary number of bits to mix into a secondary set of state in an otherwise straightforward implementation of a classic "PRNG", and return the result of applying that algorithm to the secondary set after also applying it to the primary set, so that the "seed" is neither mutated nor a factor in mutation.

Such a "PRNG", this one or the set version proposed, is guaranteed to perform worse under analysis and to have a smaller period relative to the size of its state than it otherwise would.

Soma

9. Originally Posted by iMalc
You're not thinking of the same thing as me. You're thinking of a portion of the output sequence, not of the entire sequence until it repeats.
Yeah, I was thinking in terms of how srand is defined in the standard. But King Mir's mention of "new location is shortly prior in the sequence to a previously encountered state" clarified that I was off track.

10. Originally Posted by King Mir
A good PRNG will have a period very near the capacity of its persistent state -- the number of possible states it could be in. When this is the case, the PRNG can only be said to generate one sequence. Each reseeding starts that sequence at a different point. But reseeding with a random value risks that the new location is shortly prior in the sequence to a previously encountered state. If a PRNG ever enters the same state, it will behave identically to every other time that state was encountered, generating the same numbers thereafter.
How practical is that? (See below.)
Originally Posted by King Mir
A completely new sequence is not possible. Not with a good, periodic PRNG.
That doesn't make sense to me at all. How is that good, if it only ever generates the same sequence? (See below.)

Consider Pi. Your PRNG gives you the (seed + n)th digit of Pi. You are saying that a good PRNG will only ever grab digits of Pi, that that is all it will ever do, and that it is good to do so? So how practical is it, when I srand( 5000 ), for it to go find the 5000th digit of Pi as its starting point, and then when I call srand( 2000000 ), to go find the 2 millionth digit of Pi?

Sure, Pi only gives you the same numbers every time you look at it, but does that mean it's good for random number generation? Is finding the Nth digit of Pi and handing that out good? And why is it good, since you say that a good PRNG only ever gives you the same sequence? Why does that make it good?

Shouldn't there be an infinite number of numbers like Pi, whose digits never repeat? So why couldn't we seed one of those and get its Nth digit? Why would using Pi over and over, with the same predictable chain, be better than using a completely different sequence?

Quzah.

11. I think that computing the Nth digit of pi requires the state to remember something about each digit of pi before it. That means the memory used by the PRNG will always be increasing. In such a case, yes, it is possible to have a non-periodic PRNG. That's not to say it would necessarily make a better one, though. You would need an infinite non-periodic sequence that follows a uniform distribution. Such a sequence has not been proven to exist, even in the digits of pi.

But generally PRNGs use constant memory. In that case there is a finite number of states the PRNG can be in, which means that eventually the PRNG must return to the same state. A good PRNG will not do this after it is seeded until it has exhausted every possible state, and will not repeat the same partial sequence at any point except as is consistent with statistical randomness. But it will return to the same state. This makes the output necessarily periodic.

But a PRNG will reset when reseeded, keeping no information from the last seed. This means it is no longer guaranteed to exhaust every possible state before returning to the previous seed's initial state.

Also, because they can get away with it, PRNGs tend to have a rather small state, measured in bits, not kilobytes. But some applications need more random numbers than others, so a general-purpose rand function may have too small a state for the degree of randomness needed. The size of the state is also tied to the size of the seed. So if an application needs lots of random numbers, then using a known PRNG with a known state size and seed size provides safer guarantees.
