Sunday, December 15, 2019

A simple neural network gives a different accuracy each time I run the same network. Is that OK?

I think this is due to the random initialization of the weights, but I can't say for sure. The activation function is ReLU for every layer except the last, which uses a sigmoid. A sketch of what I mean is below.
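As a minimal sketch of the idea, assuming a Keras (TensorFlow 2.x) setup; the layer sizes, seed value, and `build_model`/`set_seeds` helpers are illustrative placeholders, not taken from the original post. Without fixing the seeds, every run draws different initial weights, so training can settle at a slightly different accuracy each time; pinning the seeds makes runs reproducible.

```python
# Sketch only: assumes TensorFlow 2.x / Keras; names and sizes are hypothetical.
import random
import numpy as np
import tensorflow as tf

def set_seeds(seed=42):
    # Fix every source of randomness so repeated runs start from the same
    # initial weights and see the same shuffling order.
    random.seed(seed)
    np.random.seed(seed)
    tf.random.set_seed(seed)

def build_model(input_dim):
    # ReLU hidden layers with a sigmoid output, as described in the question.
    return tf.keras.Sequential([
        tf.keras.layers.Dense(32, activation="relu", input_shape=(input_dim,)),
        tf.keras.layers.Dense(16, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])

# Without set_seeds(), each call to build_model() draws different initial
# weights (Glorot uniform by default in Keras), which is one reason the
# final accuracy varies from run to run.
set_seeds(42)
model = build_model(input_dim=20)
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
```

Note that even with fixed seeds, some GPU operations are non-deterministic, so small run-to-run differences can remain; modest variation in accuracy across runs is normal and expected.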



