I'm working on a project where I simulate rolling a die and count how many times each face (one, two, three, and so on) comes up. The program rolls the die many times, then calculates the difference between the count of the face that came up the most and the count of the face that came up the least. That difference is divided by the highest count to give a sort of ratio (in percent) of the difference. The whole experiment is then repeated 20 times with an increasing number of rolls, doubling each time: 10, 20, 40, and so on.
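For example, suppose 10 rolls produce the hypothetical counts 4, 1, 2, 1, 1, 1: the most frequent face appeared 4 times and the least frequent 1 time, so the ratio would be 100 * (4 - 1) / 4 = 75%.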
I've been trying to solve this using only iteration, without the help of a list or functions, but haven't found a solution.
I found this online:
import random

k = 10
for i in range(1, 21):          # 20 experiments
    L = [0] * 6                 # one counter per face
    for j in range(k):
        dice = random.randint(0, 5)   # random face index 0..5
        L[dice] += 1
    # percent difference between the most and least frequent face
    ratio = 100 * (max(L) - min(L)) / max(L)
    print(k, ratio)
    k *= 2                      # double the number of rolls
output:
10 100.0
20 60.0
40 54.54545454545455
80 50.0
160 30.0
320 30.76923076923077
640 16.10169491525424
1280 16.170212765957448
2560 12.958963282937365
5120 12.1374865735768
10240 5.161656267725468
20480 3.7442396313364057
40960 1.5852239674229203
81920 3.2765682259107565
163840 1.1792967896920725
327680 0.303523431643232
655360 0.5303078706450406
1310720 0.6217280476551579
2621440 0.24166628560976725
5242880 0.1638989888130076
but it uses a list, so I was wondering: is there a way to write this program using only iteration, without any help from a list or a function?
I would really appreciate the help.
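To make the question concrete, this is the sort of list-free structure I'm imagining: a sketch with six separate counter variables and if/elif chains replacing the list and the max()/min() calls (I'm assuming random.randint and print are still allowed, since there's no way around them):

import random

k = 10
for i in range(1, 21):
    # One counter per face instead of a list.
    c1 = c2 = c3 = c4 = c5 = c6 = 0
    for j in range(k):
        dice = random.randint(1, 6)
        if dice == 1:
            c1 += 1
        elif dice == 2:
            c2 += 1
        elif dice == 3:
            c3 += 1
        elif dice == 4:
            c4 += 1
        elif dice == 5:
            c5 += 1
        else:
            c6 += 1
    # Find the highest count without calling max().
    highest = c1
    if c2 > highest:
        highest = c2
    if c3 > highest:
        highest = c3
    if c4 > highest:
        highest = c4
    if c5 > highest:
        highest = c5
    if c6 > highest:
        highest = c6
    # Find the lowest count without calling min().
    lowest = c1
    if c2 < lowest:
        lowest = c2
    if c3 < lowest:
        lowest = c3
    if c4 < lowest:
        lowest = c4
    if c5 < lowest:
        lowest = c5
    if c6 < lowest:
        lowest = c6
    ratio = 100 * (highest - lowest) / highest
    print(k, ratio)
    k *= 2

It is much more verbose than the list version, which is exactly why I'm asking whether there is a cleaner way to do it under these constraints.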