Friday, September 29, 2017

Is it more efficient to generate a large random number and split it than to generate several smaller ones?

Underlying assumptions: you want to generate x values of n bits each (e.g. 64-bit). That's it. This isn't a question of HOW; I'm interested in the general rule of thumb for MODERN computers (anything as recent as x86 is close enough), preferably on 64-bit architectures. If you want to pull in some stats for your answer, please do, it'll probably help, but this is for Stack Overflow, not stats.stackexchange, so reasoning based on computers, please!

Note that no specific algorithm is assumed here, although we would probably take it to be at the very least a good PRNG whose output bits are all equally random. However, if you want to factor the algorithm into your answer, please do!
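
For concreteness, here is a minimal sketch of the two approaches being compared. C++ and the standard <random> Mersenne Twister engines are purely illustrative choices; the question isn't tied to any particular language or generator:

    #include <cstdint>
    #include <cstdio>
    #include <random>

    // Approach A: one 64-bit draw, split into two 32-bit halves.
    // One generator call produces two output values.
    static void split_draw(std::mt19937_64& rng, std::uint32_t out[2]) {
        std::uint64_t v = rng();                       // single wide call
        out[0] = static_cast<std::uint32_t>(v);        // low 32 bits
        out[1] = static_cast<std::uint32_t>(v >> 32);  // high 32 bits
    }

    // Approach B: two independent 32-bit draws.
    // Two generator calls, one output value each.
    static void direct_draw(std::mt19937& rng, std::uint32_t out[2]) {
        out[0] = static_cast<std::uint32_t>(rng());
        out[1] = static_cast<std::uint32_t>(rng());
    }

    int main() {
        std::mt19937_64 wide(42);
        std::mt19937 narrow(42);
        std::uint32_t a[2], b[2];
        split_draw(wide, a);
        direct_draw(narrow, b);
        std::printf("split:  %u %u\n", static_cast<unsigned>(a[0]), static_cast<unsigned>(a[1]));
        std::printf("direct: %u %u\n", static_cast<unsigned>(b[0]), static_cast<unsigned>(b[1]));
    }

Approach A makes one generator call per two values plus a shift and a truncation; approach B makes one call per value. The rule of thumb I'm after is when the cost of the wider call plus the splitting work beats the cost of the extra call.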



