Here is a simple example that gives a different parameter count in PyTorch and torchsummary. It seems that torchsummary does not count bare torch.nn.Parameter attributes, i.e. parameters that do not belong to a layer.
import torch
import torchsummary

class Net(torch.nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        # A bare parameter that is not part of any sub-module.
        self.p = torch.nn.Parameter(torch.zeros(1))
        self.conv1 = torch.nn.Conv1d(1, 2, kernel_size=3)

    def forward(self, x):
        x *= self.p
        x = self.conv1(x)
        return x

def get_n_params(model):
    # Sum the element counts of all registered parameters.
    pp = 0
    for p in model.parameters():
        nn = 1
        for s in p.size():
            nn = nn * s
        pp += nn
    return pp

net = Net()
x = torch.rand([64, 1, 10])
print("number of parameters = ", get_n_params(net))
torchsummary.summary(net, input_size=[[1, 10]])
The two counts disagree, 9 vs. 8:
number of parameters = 9
----------------------------------------------------------------
Layer (type) Output Shape Param #
================================================================
Conv1d-1 [-1, 2, 8] 8
================================================================
Total params: 8
Trainable params: 8
Non-trainable params: 0
----------------------------------------------------------------
Input size (MB): 0.00
Forward/backward pass size (MB): 0.00
Params size (MB): 0.00
Estimated Total Size (MB): 0.00
----------------------------------------------------------------
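
As far as I can tell, the discrepancy comes from how torchsummary counts: it registers forward hooks on sub-modules and reads each module's weight and bias attributes, so a bare torch.nn.Parameter attached directly to the model is never visited. Below is a minimal sketch of a workaround (my own, not part of torchsummary; the names count_params, Scale, and NetWrapped are made up): wrap the bare parameter in a tiny sub-module that exposes it as weight so the hooks can see it. A plain sum over model.parameters() also gives the correct count on its own.

import torch

def count_params(model):
    # model.parameters() yields every registered Parameter,
    # including bare ones like self.p, so nothing is missed.
    return sum(p.numel() for p in model.parameters())

class Scale(torch.nn.Module):
    # Hypothetical wrapper: torchsummary appears to count only the
    # "weight" and "bias" attributes of each hooked sub-module, so
    # exposing the parameter as "weight" should make it visible.
    def __init__(self):
        super().__init__()
        self.weight = torch.nn.Parameter(torch.zeros(1))

    def forward(self, x):
        return x * self.weight

class NetWrapped(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.scale = Scale()
        self.conv1 = torch.nn.Conv1d(1, 2, kernel_size=3)

    def forward(self, x):
        x = self.scale(x)
        x = self.conv1(x)
        return x

print(count_params(NetWrapped()))  # 9

With this wrapping, torchsummary.summary(NetWrapped(), input_size=[[1, 10]]) should report 9 total params, matching get_n_params above.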