My problem is this:
I have 1000 machines which turn on after a delay upon receiving the 'on' signal. For each machine, the delay is a number of minutes drawn uniformly at random between 0 and 15.
I have to find out how many machines could be 'on' at any given time after they all receive the on signal at once.
For example, I model a simulation of these machines from time t=0 to t=10. At t=1 the 'on' signal is sent to every machine. I have to approximately model the number of machines that are on at each time step from t=1 to t=10.
My thinking was that since the delay follows a uniform distribution on [0, 15], the expected number of machines on should be directly proportional to the CDF of that distribution, i.e. 1000 * (t - 1) / 15 for times up to 15 minutes after the signal. That should just be a linear relationship, right? (i.e. a straight, diagonal line with minutes 0-15 on the x axis and # of machines on on the y axis).
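To illustrate what I mean, here is a minimal simulation sketch (assuming continuous uniform delays in minutes and whole-minute time steps; the variable names and the fixed random seed are just illustrative) that counts how many machines are on at each step and compares that to the uniform-CDF expectation:

import numpy as np

# Illustrative parameters taken from the question
n_machines = 1000
max_delay = 15.0   # minutes, upper bound of the uniform delay
signal_time = 1.0  # the 'on' signal is sent at t = 1

rng = np.random.default_rng(0)

# Each machine draws one delay uniformly in [0, 15) minutes
delays = rng.uniform(0.0, max_delay, size=n_machines)
on_times = signal_time + delays

# Count how many machines are on at each whole-minute time step t = 0..10
for t in range(0, 11):
    simulated_on = np.count_nonzero(on_times <= t)
    # Expected count from the uniform CDF: 1000 * (t - 1) / 15, clipped to [0, 1000]
    expected_on = n_machines * min(max(t - signal_time, 0.0), max_delay) / max_delay
    print(f"t={t:2d}  simulated={simulated_on:4d}  expected={expected_on:7.1f}")

Since the simulation only runs to t=10, the count never reaches 1000; it rises roughly linearly toward 1000 * 9/15 = 600 by t=10, which matches the straight-line picture above.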
Thank you for your time!