Wednesday, April 1, 2020

Resetting Random Initialization in PyTorch

I want to run some experiments on feedforward neural networks. To make a fair comparison, I need them to have exactly the same random initialization. How can I do this?
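One common approach is to seed PyTorch's random number generator before constructing each network, so that identical architectures come out with identical weights. A minimal sketch, assuming a hypothetical two-layer feedforward model:

    import torch
    import torch.nn as nn

    def make_model():
        # hypothetical architecture, just for illustration
        return nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))

    torch.manual_seed(0)   # fix the RNG state
    model_a = make_model()

    torch.manual_seed(0)   # reset to the same RNG state
    model_b = make_model()

    # both models now start from identical weights
    for p_a, p_b in zip(model_a.parameters(), model_b.parameters()):
        assert torch.equal(p_a, p_b)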

Is there a way to save the initial weights so that I can train a network and then reinitialize it exactly as it was before?
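One way to do this is to snapshot the model's state_dict right after construction and load it back after training. A sketch, again using a hypothetical feedforward model:

    import copy
    import torch.nn as nn

    model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))

    # deep-copy the freshly initialized parameters so training does not overwrite them
    init_state = copy.deepcopy(model.state_dict())

    # ... train the model as usual ...

    # restore the exact initial weights
    model.load_state_dict(init_state)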

I have been trying to save the initial parameters in a list called 'init' and then reassign the parameters, but it did not work:

    i = 0
    for name, param in model.named_parameters():
        param = init[i]
        i += 1

Any suggestions?
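For what it's worth, the loop above fails because param = init[i] only rebinds the local variable param; it never modifies the tensor stored inside the model. Copying in place should behave as intended, assuming init holds tensors saved in the same order as model.named_parameters():

    import torch

    with torch.no_grad():
        for (name, param), saved in zip(model.named_parameters(), init):
            param.copy_(saved)  # in-place copy actually updates the model's tensor

That said, snapshotting and restoring the state_dict, as sketched above, is the more idiomatic way to reset a model to its initial weights.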



