Tuesday, December 30, 2014

Should I use `rand() % N` or `rand() / (RAND_MAX / N + 1)`?

I was reading the C FAQ and came across a question recommending rand() / (RAND_MAX / N + 1) instead of the more popular rand() % N.


The reasoning is that when N is small, rand() % N uses only a few low-order bits of rand(), and on many implementations the low-order bits are the least random ones.
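
Concretely, for N = 2 the two expressions read from opposite ends of the value: rand() % 2 keeps only bit 0, while the division keeps only the top value bit. Here is a minimal sketch of that, assuming the common RAND_MAX of 0x7fffffff (the standard only requires it to be at least 32767):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int r = rand();

    /* Keeps only bit 0 of r. */
    int low = r % 2;

    /* With RAND_MAX == 0x7fffffff, RAND_MAX / 2 + 1 == 0x40000000,
       so the division keeps only bit 30, the highest value bit of r. */
    int high = r / (RAND_MAX / 2 + 1);

    printf("r = %d, low bit = %d, high bit = %d\n", r, low, high);
    return 0;
}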


I tested both approaches with N = 2 on both Windows and Linux, but could not see a difference.



#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N 2

int main(void)
{
    /* Same seed for both loops so both expressions see the same rand() stream. */
    srand(0);
    printf("rand() %% N:\n");
    for (int i = 0; i < 40; ++i) {
        printf("%d ", rand() % N);
    }
    putchar('\n');

    srand(0);
    printf("rand() / (RAND_MAX / N + 1):\n");
    for (int i = 0; i < 40; ++i) {
        printf("%d ", rand() / (RAND_MAX / N + 1));
    }
    putchar('\n');

    return 0;
}


This is the output on my GNU/Linux machine:



rand() % N:
1 0 1 1 1 1 0 0 1 1 0 1 0 1 1 0 0 0 0 0 1 0 1 1 0 0 0 1 1 1 1 0 0 0 1 1 1 0 1 0
rand() / (RAND_MAX / N + 1):
1 0 1 1 1 0 0 1 0 1 0 1 0 1 1 1 1 1 0 1 0 0 0 1 0 0 0 0 1 0 1 1 1 0 1 1 0 1 0 1


Both alternatives look perfectly random to me. If anything, the second approach seems worse than rand() % N.
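
The difference should only show up when the low-order bits of rand() are of poor quality, which modern implementations such as glibc's apparently avoid. With a deliberately weak generator the problem becomes visible. bad_rand below is a hypothetical stand-in of my own (a power-of-two LCG that returns all of its low bits), not any system's rand(): its low-order bit simply alternates, so bad_rand() % 2 degenerates into 0 1 0 1 ..., while the division-based expression, which effectively reads the top bit, does not collapse like that.

#include <stdio.h>
#include <stdint.h>

#define BAD_RAND_MAX 0x7fffffff

static uint32_t bad_state = 1;

/* Hypothetical weak generator: an LCG modulo 2^32 that returns all of
   its low bits. Bit 0 of such a generator alternates on every call. */
static int bad_rand(void)
{
    bad_state = bad_state * 1103515245u + 12345u;
    return (int)(bad_state & BAD_RAND_MAX);
}

int main(void)
{
    printf("bad_rand() %% 2:\n");
    for (int i = 0; i < 40; ++i) {
        printf("%d ", bad_rand() % 2);
    }
    putchar('\n');

    bad_state = 1;  /* "reseed" so both loops see the same stream */
    printf("bad_rand() / (BAD_RAND_MAX / 2 + 1):\n");
    for (int i = 0; i < 40; ++i) {
        printf("%d ", bad_rand() / (BAD_RAND_MAX / 2 + 1));
    }
    putchar('\n');

    return 0;
}

Real implementations typically either use a better generator or discard the weakest low-order bits, which would explain why my test above shows no visible difference.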


Should I use rand() % N or rand() / (RAND_MAX / N + 1)?




