I need to calculate the expected value of the distance between two points drawn uniformly at random from the unit square. The problem can be solved analytically, and the answer is approximately 0.521405. What I'm trying to do is reproduce this number with a random number generator in Python, both with and without NumPy. Using the code below:
import math
import random

dist = list()
random.seed()
for each in range(100000000):
    x1 = random.random()
    x2 = random.random()
    y1 = random.random()
    y2 = random.random()
    dist.append(math.sqrt((x1 - x2)**2 + (y1 - y2)**2))
print(round(sum(dist) / float(len(dist)), 6))
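For reference, the exact value I'm comparing against can be computed directly from the standard closed-form result (2 + sqrt(2) + 5*asinh(1)) / 15:

import math

# exact expected distance between two uniform random points in the unit square
exact = (2 + math.sqrt(2) + 5 * math.asinh(1)) / 15
print(exact)   # ~0.5214054331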
The loop above iterates 100 million times in 125 seconds on my machine, but only 4 decimal digits are correct. Then, with NumPy, I wrote the following code:
import time
import numpy as np

dist = list()
start_time = time.time()
np.random.seed()
for each in range(100000000):
    x = np.array((np.random.random(), np.random.random()))
    y = np.array((np.random.random(), np.random.random()))
    dist.append(np.linalg.norm(x - y))
print(round(np.mean(dist), 6))
print(time.time() - start_time)   # elapsed seconds
and it took 1111 seconds to iterate the same 100 million times!
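I suspect (I haven't profiled it) that the time goes into building a hundred million tiny 2-element arrays and calling np.linalg.norm on each one from the Python loop, rather than into NumPy itself. A vectorized sketch that draws all the coordinates in one batch would look something like this; the batch-at-once layout is my assumption of how NumPy is meant to be used, and for n = 10**8 the arrays alone need a few GB of RAM:

import numpy as np

n = 10**8                           # same number of point pairs as above
np.random.seed()
x = np.random.random((n, 2))        # all first points at once, one row per pair
y = np.random.random((n, 2))        # all second points
d = np.linalg.norm(x - y, axis=1)   # all n distances in a single vectorized call
print(round(np.mean(d), 6))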
Since the result was correct only to 4 decimal digits, I tried increasing the number of iterations to 1 billion using the first version, without NumPy. I figured that since each float is at most 64 bits (I'm using 64-bit Python), the list would take roughly 8 GB. However, the program used up 26 GB of memory and crashed with an exception when the list reached 790 million items.
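If I understand CPython's memory layout correctly, that 8 GB estimate misses that a Python list stores 8-byte pointers to separate float objects, and each float object takes 24 bytes on a 64-bit build, so every appended distance costs at least 32 bytes before the list's over-allocation; 790 million items times ~32 bytes is already about 25 GB, which matches what I saw. A quick check:

import sys

print(sys.getsizeof(0.521405))   # 24 bytes per float object on 64-bit CPython
# each list slot adds another 8-byte pointer, plus the list over-allocates:
# 790e6 items * (24 + 8) bytes is roughly 25 GB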
So I'm seeking your advice on the following:
- Is there a way to make use of NumPy's vectorized operations and actually make this faster? (A chunked sketch of the kind of thing I have in mind follows this list.)
- Is there a way to make the program more memory-efficient? I realize that a list is a far more complex data structure than I need for this purpose.
- Am I correct in assuming that in order to get 6 decimal digits right I will need a number of iterations close to 10^12? (Because the standard error of N measurements decreases as 1/sqrt(N).)
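To make the first two questions concrete, this is the kind of chunked approach I have in mind (the chunk size is an arbitrary guess, not something I've tuned): it keeps only one chunk of samples in memory at a time and accumulates a running sum instead of a list, so memory use stays flat no matter how large the sample count gets.

import numpy as np

def mean_distance(n_samples, chunk=10**7):
    # accumulate a running sum of distances, `chunk` pairs at a time,
    # so memory is bounded by the chunk size rather than by n_samples
    np.random.seed()
    total = 0.0
    remaining = n_samples
    while remaining > 0:
        m = min(chunk, remaining)
        p = np.random.random((m, 2))
        q = np.random.random((m, 2))
        total += np.linalg.norm(p - q, axis=1).sum()
        remaining -= m
    return total / n_samples

print(round(mean_distance(10**8), 6))

On the last point, my back-of-the-envelope check (assuming my arithmetic is right): E[d^2] = 1/3 for two uniform points in the unit square, so the per-sample standard deviation is about sqrt(1/3 - 0.521405^2), roughly 0.25, and pushing the standard error below 5*10^-7 needs N around (0.25 / (5*10^-7))^2, i.e. about 2.5*10^11 samples, so 10^12 looks like the right order of magnitude.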
Thanks in advance!