Saturday, September 23, 2017

Different PyTorch random initialization with the same seed

I am very new to PyTorch, so I apologise if the question is very straightforward. My problem is that I have defined a class net1 and initialised its parameters randomly with a fixed manual seed:

import random

import torch
import torch.nn as nn

random.seed(opt.manualSeed)
torch.manual_seed(opt.manualSeed)
if torch.cuda.is_available():
    torch.cuda.manual_seed_all(opt.manualSeed)

class net1(nn.Module):
    def __init__(self):
        super(net1, self).__init__()
        self.main_body = nn.Sequential(
            # Define the layers ...
        )
    def forward(self, x):
        return self.main_body(x)

# custom weights initialization called on net1
def weights_init(m):
    classname = m.__class__.__name__
    if classname.find('Conv') != -1:
        m.weight.data.normal_(0.0, 0.02)
    elif classname.find('BatchNorm') != -1:
        m.weight.data.normal_(1.0, 0.02)
        m.bias.data.fill_(0)

net1_ = net1()
net1_.apply(weights_init)
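
For what it is worth, reseeding before rebuilding net1 does give me reproducible weights, so the seeding itself seems to work. A minimal sanity-check sketch, reusing the opt.manualSeed from above:

# Sketch: rebuilding net1 after resetting the seed yields identical weights
torch.manual_seed(opt.manualSeed)
netA = net1()
netA.apply(weights_init)

torch.manual_seed(opt.manualSeed)
netB = net1()
netB.apply(weights_init)

# Every corresponding parameter pair is equal when the seed is reset first
for pa, pb in zip(netA.parameters(), netB.parameters()):
    assert torch.equal(pa, pb)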

However, when I add another class net2 to the code:

class net2(nn.Module):
    def __init__(self):
        super(net2, self).__init__()
        self.main_body = nn.Sequential(
            # Define the layers
        )
    def forward(self, x):
        return self.main_body(x)

net2_ = net2()

and instantiate it, the outputs of my graph change, even though net2_ is never used anywhere else and is not connected to my main graph (which is built on net1_). Is this a reasonable outcome?
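
In case it helps, here is a minimal sketch of the kind of effect I am describing, with nn.Linear standing in for my real layers (the layer choice is only an assumption for illustration):

import torch
import torch.nn as nn

torch.manual_seed(0)
a = torch.randn(3)        # first draw after seeding

torch.manual_seed(0)
unused = nn.Linear(4, 4)  # constructed but never used or connected
b = torch.randn(3)        # same call as above, but after building the module

print(torch.equal(a, b))  # prints False: the two draws differ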



