I need to generate unbiased, uniformly distributed integers in an arbitrarily large range. The random bits I draw from can be expensive to obtain, so I want to use as few of them as possible on average.
The generated numbers should be between 0 and N-1, where N is the given range.
What I am doing now is:
- Calculate the number of bits B needed to cover the range; mathematically, B = ceil(log2(N)).
- Generate B random bits.
- If the generated bits form an integer less than N, return it. Otherwise go back to step 2.
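The steps above can be sketched as follows; `random.getrandbits` stands in for the (expensive) real bit source, and the function name is just illustrative:

```python
import random

def uniform_int(n, rand_bit=lambda: random.getrandbits(1)):
    """Return a uniform integer in [0, n) by rejection sampling.

    rand_bit is any source of fair random bits; each call costs one bit.
    """
    if n <= 0:
        raise ValueError("n must be positive")
    b = (n - 1).bit_length()  # B = ceil(log2(n)); 0 when n == 1
    while True:
        x = 0
        for _ in range(b):          # draw B bits, most significant first
            x = (x << 1) | rand_bit()
        if x < n:                   # accept, otherwise redraw all B bits
            return x
```

Every B-bit pattern is equally likely on each attempt and rejected attempts carry no information into the next one, which is what makes the accepted value exactly uniform.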
The best case is when N is a power of two; then nothing is ever rejected. The worst case is when N is a power of two plus one: the rejection probability on each attempt approaches 1/2 as B grows. That strikes me as wasteful, so I wondered whether there are better algorithms that need fewer bits on average.
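To quantify that waste: each attempt succeeds with probability p = N / 2^B, so the number of attempts is geometric with mean 1/p, and the expected bit cost is B/p. A small helper (the name `expected_bits` is mine) makes the power-of-two-plus-one penalty concrete:

```python
def expected_bits(n):
    """Expected number of random bits consumed by B-bit rejection
    sampling when generating a uniform integer in [0, n)."""
    b = (n - 1).bit_length()   # B = ceil(log2(n))
    p = n / (1 << b)           # acceptance probability per attempt
    return b / p               # geometric: 1/p attempts of b bits each
```

For n = 1024 this gives exactly 10 bits, while for n = 1025 it gives 11 * 2048/1025, roughly 22 bits, against an entropy of only about 10 bits.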
For example, suppose that when the generated bits form an integer >= N, instead of rejecting them I permute the bit positions in a predefined order, hoping to find a permutation whose value is within range, and generate a fresh batch only if none of the permutations succeeds. Would the result still be uniformly distributed?
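One way to probe this idea, without claiming anything about its correctness in general: enumerate all 2^B equally likely draws for one particular permutation (here a left rotation, as an arbitrary example) and count how many draws map to each output. Since rejected draws simply restart the process, the output is uniform if and only if these counts are all equal:

```python
from collections import Counter

def rotate_left(x, b):
    """Rotate a b-bit value left by one position."""
    return ((x << 1) | (x >> (b - 1))) & ((1 << b) - 1)

def output_counts(n):
    """Count, over all 2^B equally likely B-bit draws, which output each
    draw maps to under the rule: accept x if x < n, otherwise try the
    rotated value, otherwise reject (restart)."""
    b = (n - 1).bit_length()
    counts = Counter()
    for x in range(1 << b):
        if x < n:
            counts[x] += 1
        elif rotate_left(x, b) < n:
            counts[rotate_left(x, b)] += 1
    return counts
```

For n = 5, for instance, the draw 0b101 = 5 is out of range but rotates to 0b011 = 3, so the output 3 is reachable in two ways while the others are reachable in one, and the counts come out unequal.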