Tuesday, September 5, 2017

Can I perform Keras training with no randomization?

I'm using a Keras Sequential model where the inputs and labels are exactly the same each run. Keras is using a Tensorflow backend.

I've set the layers' kernel and bias initializers to 'zeros' and disabled batch shuffling during training.

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(128,
                activation='relu',
                kernel_initializer='zeros',
                bias_initializer='zeros'))
...

model.compile(optimizer='rmsprop', loss='binary_crossentropy') 

model.fit(x_train, y_train, 
          batch_size = 128, verbose = 1, epochs = 200, 
          validation_data=(x_validation, y_validation),
          shuffle=False)

I've also tried seeding NumPy's random number generator:

np.random.seed(7) # fix random seed for reproducibility

With all of the above in place, I still get different accuracy and loss values after each training run.

Am I missing something or is there no way to fully remove the variance between trainings?
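For reference, here is the fullest seeding setup I can think of. It is a sketch, not something I've verified end to end: seeding NumPy alone doesn't cover Python's own `random` module, the hash seed, or the TensorFlow graph, and the TF-specific lines (commented out below) assume a 2017-era TensorFlow 1.x backend where `tf.set_random_seed`, `tf.ConfigProto`, and `keras.backend.set_session` are available.

```python
import os
import random

import numpy as np

# 1. Fix Python's hash seed. To take full effect this must be set in the
#    shell before the interpreter starts, e.g.:
#    PYTHONHASHSEED=0 python train.py
os.environ['PYTHONHASHSEED'] = '0'

# 2. Seed the non-TensorFlow random sources Keras can draw from.
np.random.seed(7)
random.seed(7)

# 3. Seed the TensorFlow backend as well (TF 1.x-era API, untested here):
# import tensorflow as tf
# from keras import backend as K
# tf.set_random_seed(7)
# # Multi-threaded ops can reorder floating-point additions, which is
# # itself a source of run-to-run variance, so force a single thread:
# session_conf = tf.ConfigProto(intra_op_parallelism_threads=1,
#                               inter_op_parallelism_threads=1)
# K.set_session(tf.Session(graph=tf.get_default_graph(),
#                          config=session_conf))

# Quick demonstration that reseeding makes NumPy draws repeatable:
np.random.seed(7)
first = np.random.rand(3)
np.random.seed(7)
second = np.random.rand(3)
print(np.array_equal(first, second))
```

Even with all of this, I gather GPU kernels can still be non-deterministic, so exact repeatability may only be achievable on CPU.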



