Friday, June 4, 2021

Why is the average of a million random numbers not in the middle of the range?

So I was writing a script that uses random.randint() in Python, but the exact language probably doesn't matter, since I think most mainstream languages have this "problem". It's like this: I set i to a random 0 or 1, then a million times I add another random 0 or 1 to it and divide the result by two. The output varies wildly from run to run: sometimes it's close to 0, sometimes close to 1, but I would expect it to come out at pretty much 0.5. What is causing this variation?

from random import randint

# Start from a random 0 or 1, then repeatedly mix in another
# random 0 or 1 and halve the running value.
i = randint(0, 1)
for x in range(1000000):
    i = (i + randint(0, 1)) / 2
print(i)
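
For comparison, here is the plain average I had in mind, written out explicitly as the sum of the draws divided by the number of draws; this is just a sketch and the variable names are only illustrative.

from random import randint

# Sum one million random 0/1 draws, then divide by the number of
# draws (an ordinary arithmetic mean rather than repeated halving).
n = 1000000
total = sum(randint(0, 1) for _ in range(n))
print(total / n)  # this should come out very close to 0.5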


