Tuesday, 3 November 2020

random.random not doing what I want

I am trying to generate a synthetic earthquake database in which the number of events $N$ with magnitude in the range $[M, M + \delta M]$ follows $\log_{10}(N) = a - bM$, where $a$ and $b$ are constants.
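In continuous form (taking $M \ge 0$; in my tests I assume $b = 1$ and let $a$ only set the total number of events), this relation corresponds to an exponential-type density and cumulative distribution in $M$:

$$
f(M) = b \ln(10)\, 10^{-bM}, \qquad F(M) = 1 - 10^{-bM}, \qquad M \ge 0 .
$$

Inverting $U = F(M)$ gives $M = -\log_{10}(1 - U)/b$, and since $1 - U$ is uniform whenever $U$ is, $M = -\log_{10}(U)/b$ samples the same distribution.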

I am trying to do this in Python using the "random" module. I know I could (or at least I think I could, as I haven't tried it) use random.expovariate, but I thought I could use random.random with a transformation like:

-math.log10(random.random())
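Put together, a minimal sketch of what I am doing (the helper name is just for illustration, and I take $b = 1$):

import math
import random

def sample_magnitude(b=1.0):
    # Inverse-CDF sampling: with F(M) = 1 - 10**(-b*M) and U uniform
    # on [0, 1), M = -log10(1 - U)/b follows the target distribution.
    # Because 1 - U is also uniform, -log10(U)/b works equally well,
    # which is the transform used above (with b = 1).
    u = random.random()           # uniform on [0.0, 1.0)
    return -math.log10(u) / b     # u == 0.0 is possible but astronomically rare

# two million samples, as in the experiment described below
samples = [sample_magnitude(b=1.0) for _ in range(2_000_000)]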

I ran this for 2,000,000 samples, which I then binned into 0.1-wide magnitude bins and plotted on a log scale.
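The binning and plotting step looked roughly like this (a sketch assuming numpy and matplotlib, reusing the samples list from the snippet above; the variable names are mine):

import numpy as np
import matplotlib.pyplot as plt

mags = np.asarray(samples)                      # magnitudes from the snippet above
edges = np.arange(0.0, mags.max() + 0.1, 0.1)   # 0.1-wide magnitude bins
counts, edges = np.histogram(mags, bins=edges)
centres = 0.5 * (edges[:-1] + edges[1:])

# expected count per bin for b = 1: N_total * (10**(-M) - 10**(-(M + 0.1)))
theory = mags.size * (10.0 ** (-edges[:-1]) - 10.0 ** (-edges[1:]))

plt.semilogy(centres, counts, '.', label='synthetic samples')
plt.semilogy(centres, theory, 'r-', label='theoretical distribution')
plt.xlabel('magnitude M')
plt.ylabel('events per 0.1 bin')
plt.legend()
plt.show()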

[Figure: binned synthetic magnitudes on a log scale]

The red line shows the theoretical distribution used to generate the synthetic samples.

I'm not worried about the variation above x = 4.5; that is due to the small number of points and natural randomness. What I am asking about is the very small (at this scale) variation for the points near x = 0. I plotted the deviation of the synthetic points from the theoretical values (blue dots):

[Figure: difference between the observed and theoretical counts]

As x decreases, the number of events increases exponentially, so the deviation from the theoretical values should decrease, not increase. And the point at x = 0 deviates in the opposite sense.

To try to work out where my problem lies, I wrote code that generated numbers from 0 to 1 with a very fine step and passed each number through the function noted above. The result (the blue dots in the figure above) is purely linear and exactly matches the theoretical values.
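In outline, that check looked like this (a sketch; the step size here is just for illustration):

import numpy as np

# deterministic check of the transform: sweep u over (0, 1) with a fine,
# even step instead of calling random.random()
step = 1e-6
u = np.arange(step, 1.0, step)        # avoid u == 0 exactly
m = -np.log10(u)                      # same transform as above (b = 1)

counts, _ = np.histogram(m, bins=np.arange(0.0, m.max() + 0.1, 0.1))
with np.errstate(divide='ignore'):    # some high-magnitude bins may be empty
    log_counts = np.log10(counts)
# apart from the sparsely populated bins at high magnitude, log_counts
# falls on a straight line in M, matching the theoretical relation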

This indicates that my transformation function and code are fine, so I'm thinking it somehow has something to do with the random number generator?

Would be grateful for any pointers. Thanks.



