I want to know how to model random variables using "basic operations". The only random function I know, at least for C, is rand(), along with srand() for seeding. There are probably packages available online, but let's say I want to implement this on my own. I don't know if there are other very common random functions, but if not, let's just stick with rand() and the C language.
rand() allows me to pseudo-randomly generate an int from 0 to RAND_MAX. I can then use mod to get an int in some range, and mod 2 to choose a sign so that I can get negative numbers. I can also compute (double)rand() / RAND_MAX (the cast avoids integer division) to model values in the interval [0,1], and shift and scale this to model Uniform(a,b).
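For concreteness, a minimal sketch of what I mean (uniform01 and uniform_ab are just names I made up):

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Uniform double in [0,1] -- the cast matters, since plain
   rand() / RAND_MAX is integer division and is almost always 0. */
double uniform01(void)
{
    return (double)rand() / RAND_MAX;
}

/* Uniform(a,b): shift and scale uniform01. */
double uniform_ab(double a, double b)
{
    return a + (b - a) * uniform01();
}

int main(void)
{
    srand((unsigned)time(NULL));       /* seed once */
    int k    = rand() % 10;            /* int in 0..9 (slightly biased
                                          unless 10 divides RAND_MAX+1) */
    int sign = (rand() % 2) ? 1 : -1;  /* random sign via mod 2 */
    printf("%d %d %f\n", k, sign, uniform_ab(-2.0, 3.0));
    return 0;
}
```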
But what I am not sure about is if I can extend this to model any probability distribution and at what point do I have to worry about accuracy especially when dealing with infinities and irrational probabilities. Also, this method is very crude so I would like to know of more standard ways using basic tools if any.
A simple example: I have the random variable X such that Pr(X = 1) = 1/pi and Pr(X = 0) = 1 - 1/pi. Since pi is irrational, I would approximate the probability 1/pi with rand(): choose X = 1 if I get an int from 0 to Round(RAND_MAX * 1/pi). So this is approximating twice, once for pi and another time for rounding.
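Concretely, I imagine something like this (M_PI from math.h is itself only a double approximation of pi, and it is POSIX rather than strict ISO C):

```c
#include <math.h>
#include <stdlib.h>

/* Bernoulli(1/pi) as described above: X = 1 iff rand() lands in
   0..Round(RAND_MAX/pi).  The actual probability of X = 1 is
   (threshold + 1) / (RAND_MAX + 1), which only approximates 1/pi. */
int bernoulli_inv_pi(void)
{
    long threshold = lround((double)RAND_MAX / M_PI);
    return rand() <= threshold;
}
```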
Is there a better approach? How would one go about modeling something more complicated, such as a continuous random variable on the interval (0, infinity), or a discrete random variable with irrational probabilities on a countably infinite set? Would my approach still work, or would I have to worry about rounding errors?
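For the (0, infinity) case, my best guess is the inverse-CDF trick I have seen mentioned, e.g. for Exponential(lambda); this is only my sketch of it, and I am not sure it is the standard way:

```c
#include <math.h>
#include <stdlib.h>

/* Guess at a continuous distribution on (0, infinity):
   inverse-transform sampling for Exponential(lambda).
   If U ~ Uniform(0,1), then -log(1 - U)/lambda is Exponential(lambda). */
double exponential(double lambda)
{
    /* The +1 / +2 offsets keep u strictly inside (0,1),
       so log() never receives 0. */
    double u = (rand() + 1.0) / ((double)RAND_MAX + 2.0);
    return -log(1.0 - u) / lambda;
}
```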
EDIT: Also, how does the pseudo-randomness (rather than true randomness) of rand() change things, and how would I account for those changes?