Sunday, November 12, 2017

Unused, untrainable variable/constant affects TensorFlow results

Setup

First, some context: I have a TensorFlow model set up to solve a multi-task problem, and I have set the random seed with tf.set_random_seed to make my results reproducible.
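A setup like the one described might look like the following minimal sketch. It uses the TensorFlow 1.x-style API through tf.compat.v1 (matching the era of this post); the layer sizes and the two task heads are hypothetical, purely for illustration:

```python
import tensorflow.compat.v1 as tf  # TF 1.x-style graph-mode API

tf.disable_eager_execution()
tf.set_random_seed(1234)  # graph-level seed, intended to make runs reproducible

# Hypothetical multi-task model: a shared layer feeding two task-specific heads.
x = tf.placeholder(tf.float32, [None, 10])
shared = tf.layers.dense(x, 32, activation=tf.nn.relu)
task_a = tf.layers.dense(shared, 1)  # e.g. a regression head
task_b = tf.layers.dense(shared, 3)  # e.g. a 3-class classification head
```

With only the graph-level seed set (no per-op seeds), the layers' weight initializers draw from random ops whose seeds TensorFlow picks for them.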

At every iteration, I output various evaluation metrics for my model.

Problem

I noticed that if I have an unused, non-trainable TensorFlow variable in the graph, it seems to affect the evaluation metrics written out at each iteration.

The situation is similar if I add an unused TensorFlow constant instead of the variable. In fact, with a constant, the output sequence seems to change yet again.

Note that the only difference is a single line declaring the TensorFlow variable/constant. Also, the sequence of outputs, both with and without the variable, is exactly reproducible, so the random seed is doing its job.

Hypothesis

My guess is that declaring the TensorFlow variable, even though it is unused, somehow affects the random number generator, although I cannot fathom why that would be the case, since the variable is non-trainable and is initialized to a fixed value. Even more surprising is that a TensorFlow constant would affect the random number generator as well.
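One mechanism that would produce exactly this behavior: when only a graph-level seed is set, TensorFlow 1.x derives each random op's effective seed from the graph-level seed together with that op's position in the graph, so inserting any extra op, even one that is never used, can shift the seeds of random ops created after it. Here is a toy Python model of that idea; it is not TensorFlow's actual derivation, and op_seed and build_graph are illustrative names:

```python
import random

def op_seed(graph_seed, op_id):
    # Toy derivation: combine the graph-level seed with the op's
    # position in the graph. TensorFlow 1.x does something analogous
    # when no op-level seed is given (the exact scheme is an
    # implementation detail).
    return hash((graph_seed, op_id)) & 0xFFFFFFFF

def build_graph(graph_seed, extra_unused_op=False):
    op_id = 0
    if extra_unused_op:
        op_id += 1  # the unused variable/constant still consumes an op id
    # A "random op" (e.g. a weight initializer) whose stream depends on its op id.
    rng = random.Random(op_seed(graph_seed, op_id))
    return [rng.random() for _ in range(3)]

a = build_graph(42)
b = build_graph(42)                        # same graph: identical sequence
c = build_graph(42, extra_unused_op=True)  # one extra op: shifted seed
print(a == b)  # True
print(a == c)  # False
```

Under this model, runs are still perfectly reproducible for a fixed graph (a equals b), yet adding a single unused declaration changes the stream (c differs), which matches the behavior described above.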

Can anyone confirm my guess, or point me towards what might be causing the difference in results?



