Tuesday, June 23, 2015

How can I predict, with a low standard deviation, the outcome of a system that uses uniformly distributed random numbers?

I am working with a system that generates integers in the [0, 16] range using Math.random, the ECMAScript random number generator, which returns uniformly distributed numbers in the [0, 1[ range.
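For concreteness, here is a minimal sketch of how such a system might produce its rolls; the exact mapping from Math.random to integers is my assumption, not something stated above:

    // Hypothetical roll generator: Math.random() returns a uniform float
    // in [0, 1[, so multiplying by 17 and flooring yields each integer
    // 0..16 with equal probability.
    function roll() {
      return Math.floor(Math.random() * 17);
    }

    // Example: draw ten rolls.
    const rolls = Array.from({ length: 10 }, roll);
    console.log(rolls);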

I would like to "average" or "predict" the next rolls somehow, so that there is no more randomness but instead an accurate model of how the system behaves.

Because of the uniform distribution, there is going to be a huge variance and a large standard deviation if I simply predict the mean of the [0, 16] range, i.e. 8.

So, is it possible to do better than the mean? My understanding is that this method gives a variance of 24 and a standard deviation of about 4.9, which is pretty terrible when the whole range only spans 17 values.
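Those numbers can be checked directly from the definition of variance for a discrete uniform distribution over 0..16 (each value has probability 1/17); this is just a verification sketch, not part of the system:

    // Mean, variance and standard deviation of a discrete uniform
    // distribution over the integers 0..16.
    const values = Array.from({ length: 17 }, (_, i) => i);
    const mean = values.reduce((s, v) => s + v, 0) / values.length;                    // 8
    const variance = values.reduce((s, v) => s + (v - mean) ** 2, 0) / values.length;  // 24
    const stdDev = Math.sqrt(variance);                                                // ~4.899
    console.log(mean, variance, stdDev);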

I would like to reduce the standard deviation to something like 0.1 if at all possible, knowing that the system I am trying to predict will keep using uniform random numbers: I'm merely trying to predict, not change the system.

I found something called the "standard normal distribution", which can be approximated fairly well by adding together twelve uniform random numbers in [0, 1[ and subtracting 6 from the result. But I'm not sure if or how I can use this. It sounds like it changes the data, but if the system keeps using uniform random numbers, how can converting the uniform distribution to a normal one in my model help predict outcomes with a low standard deviation?
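For reference, here is a small sketch of that twelve-uniform construction: each uniform number contributes a variance of 1/12, so the sum of twelve has variance 1, and subtracting 6 centres it at 0, which is why it approximates a standard normal. This only generates normal-ish samples; it does not by itself change the prediction problem.

    // Approximate a standard normal sample by summing twelve uniforms
    // and subtracting 6 (mean 0, variance 1).
    function approxStandardNormal() {
      let sum = 0;
      for (let i = 0; i < 12; i++) sum += Math.random();
      return sum - 6;
    }

    console.log(approxStandardNormal());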

Thanks if you can help; I hope this is clear enough.

PS: I suck at probability notation and vocabulary.



