I have a TensorFlow model that uses dropout, which is controlled by the state of tf.random. At the start of my code, I set the global seed using tf.random.set_seed(DEFAULT_SEED). As training progresses, I want to be able to save the state of tf.random, so that if I stop and later resume training, I can load that state and continue from exactly where I left off. For example, np.random has the methods get_state() and set_state(), but I can't find an equivalent for tf.random. Am I missing something, or is this not doable?
NumPy example:

import numpy as np

np.random.seed(1234)
for i in range(10):
    print(np.random.randn())
x = np.random.get_state()
np.random.set_state(x)
and the random stream continues from the same state where we left it.
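The closest thing I've found is tf.random.Generator, whose internal counter is exposed as a tf.Variable, so it can be snapshotted and restored much like np.random's state. A sketch of what I'm after (this uses the explicit-generator API rather than the global op-level seed, so dropout layers would need to draw from this generator for it to apply):

```python
import tensorflow as tf

# Explicit generator instead of the global op-level seed.
gen = tf.random.Generator.from_seed(1234)

# Draw some numbers, advancing the generator's internal counter.
for _ in range(10):
    gen.normal([])

# Snapshot the state (gen.state is a tf.Variable) and draw one more value.
saved_state = gen.state.numpy()
a = gen.normal([]).numpy()

# Restore the saved state; the stream resumes from the same point.
gen.reset(saved_state)
b = gen.normal([]).numpy()

print(a == b)  # the two draws match
```

Since gen.state is an ordinary variable, it can also be captured by a tf.train.Checkpoint alongside the model weights, which would cover the stop-and-resume case.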