I'm working on an optimization problem; a simplified version of my code is posted below (the original code is too complicated to post for a question like this, and I hope the simplified version reproduces its behaviour closely enough).
My purpose: use the function foo in the function optimization, but foo can take a very long time in some hard situations. So I use multiprocessing to set a time limit on the execution of the function (proc.join(iter_time); the method comes from an answer to this question: How to limit execution time of a function call?).
My problem:
- In the while loop, the generated value for extra is the same every time.
- The length of the list lst is always 1, which means that in every iteration of the while loop it starts from an empty list.
My guess: one possible reason is that each time I create a process, the random seed starts counting from the beginning again, and each time the process is terminated, some garbage-collection mechanism cleans up the memory the process used, so the list is cleared.
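To illustrate my guess, here is a tiny standalone sketch (separate from my code below) that reproduces both symptoms for me: every child prints the same random value, and the parent's list stays empty.

import multiprocessing
import random

random.seed(123)
demo_lst = []  # plays the role of my global lst

def child():
    # each child process sees the RNG seeded with 123 again
    # (inherited on fork, or re-seeded when the module is re-imported on spawn),
    # so this prints the same value every time
    print("child draw:", random.random())
    demo_lst.append(1)  # only the child's own copy of the list is changed

if __name__ == '__main__':
    for _ in range(3):
        p = multiprocessing.Process(target=child)
        p.start()
        p.join()
    print("parent list:", demo_lst)  # still [] in the parent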
My questions:
- Does anyone know the real reason for this behaviour?
- If not with multiprocessing, is there any other way to achieve my purpose while still generating different random numbers? By the way, I have tried func_timeout, but it has other problems that I cannot handle (the workaround I have been experimenting with is sketched after the code below).
import multiprocessing
import random
import time

random.seed(123)
lst = []  # a global list for logging data

def foo(epoch):
    ...
    extra = random.random()
    lst.append(epoch + extra)
    ...

def optimization(loop_time, iter_time):
    start = time.time()
    epoch = 0
    while time.time() <= start + loop_time:
        proc = multiprocessing.Process(target=foo, args=(epoch,))
        proc.start()
        proc.join(iter_time)
        if proc.is_alive():  # if the process is not finished within the time limit
            print("Time out!")
            proc.terminate()

if __name__ == '__main__':
    optimization(300, 2)
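For completeness, here is the workaround I have been experimenting with (only a sketch, not verified on my real problem): the parent draws a fresh seed for every call and passes it to the child, and the child sends its result back through a multiprocessing.Queue, so that the log list lives in the parent.

import multiprocessing
import random
import time

def foo(epoch, seed, queue):
    # re-seed inside the child with a value chosen by the parent,
    # so every call starts from a different RNG state
    random.seed(seed)
    extra = random.random()
    queue.put(epoch + extra)  # report the result back to the parent

def optimization(loop_time, iter_time):
    lst = []  # the log list now lives in the parent process
    queue = multiprocessing.Queue()
    start = time.time()
    epoch = 0
    while time.time() <= start + loop_time:
        seed = random.random()  # the parent's RNG advances every iteration
        proc = multiprocessing.Process(target=foo, args=(epoch, seed, queue))
        proc.start()
        proc.join(iter_time)
        if proc.is_alive():  # not finished within the time limit
            print("Time out!")
            proc.terminate()
            proc.join()
        while not queue.empty():  # collect whatever the child managed to report
            lst.append(queue.get())
        epoch += 1  # my simplified code above omits this
    return lst

if __name__ == '__main__':
    print(optimization(10, 2))

I am not sure whether this is the right way to share results and randomness between the parent and the child, which is part of what I am asking.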