Differences between ResNet18 and ResNet50, and the parameter setup of ResNet34

import torch as t
import torch.nn as nn
from torch.nn import functional as F


class ResidualBlock(nn.Module):
    """Submodule: residual block."""
    def __init__(self, inchannel, outchannel, stride=1, shortcut=None):
        super(ResidualBlock, self).__init__()
        self.left = nn.Sequential(
            nn.Conv2d(inchannel, outchannel, 3, stride, 1, bias=False),
            nn.BatchNorm2d(outchannel),
            nn.ReLU(inplace=True),
            nn.Conv2d(outchannel, outchannel, 3, 1, 1, bias=False),
            nn.BatchNorm2d(outchannel))
        self.right = shortcut

    def forward(self, x):
        out = self.left(x)
        residual = x if self.right is None else self.right(x)
        out += residual
        return F.relu(out)


class ResNet(nn.Module):
    """ResNet34. The residual block is implemented as a submodule
    (ResidualBlock above); each layer, built in _make_layer, contains
    several residual blocks."""
    def __init__(self, num_classes=1000):
        super(ResNet, self).__init__()
        # image transformation before the first layer
        self.pre = nn.Sequential(
            nn.Conv2d(3, 64, 7, 2, 3, bias=False),
            nn.BatchNorm2d(64),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(3, 2, 1))
        # four layers with 3, 4, 6 and 3 residual blocks respectively
        self.layer1 = self._make_layer(64, 64, 3)
        self.layer2 = self._make_layer(64, 128, 4, stride=2)
        self.layer3 = self._make_layer(128, 256, 6, stride=2)
        self.layer4 = self._make_layer(256, 512, 3, stride=2)
        # fully connected layer for classification
        self.fc = nn.Linear(512, num_classes)

    def _make_layer(self, inchannel, outchannel, block_num, stride=1):
        """Build a layer containing multiple residual blocks."""
        shortcut = nn.Sequential(
            nn.Conv2d(inchannel, outchannel, 1, stride, bias=False),
            nn.BatchNorm2d(outchannel))
        layers = [ResidualBlock(inchannel, outchannel, stride, shortcut)]
        for i in range(1, block_num):
            layers.append(ResidualBlock(outchannel, outchannel))
        return nn.Sequential(*layers)

    def forward(self, x):
        x = self.pre(x)
        x = self.layer1(x)
        x = self.layer2(x)
        x = self.layer3(x)
        x = self.layer4(x)
        x = F.avg_pool2d(x, 7)
        x = x.view(x.size(0), -1)
        return self.fc(x)


if __name__ == '__main__':
    model = ResNet()
    input = t.randn(1, 3, 224, 224)
    out = model(input)
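The comparison in the title can be made concrete with some simple bookkeeping. ResNet18 and ResNet34 stack two-convolution BasicBlocks, while ResNet50 swaps in three-convolution Bottleneck blocks; the nominal depth is the stem convolution plus all block convolutions plus the final fully connected layer. This is a sketch using the standard block counts from the original ResNet architecture (the `CONFIGS` table and `depth` helper here are my own illustration, not part of the code above):

```python
# Depth bookkeeping for the standard ResNet family.
# BasicBlock contributes 2 convs per block; Bottleneck contributes 3.
# Nominal depth = 1 (stem conv) + convs in all blocks + 1 (final fc).

CONFIGS = {
    "resnet18": ("basic", [2, 2, 2, 2]),       # BasicBlock
    "resnet34": ("basic", [3, 4, 6, 3]),       # BasicBlock, as implemented above
    "resnet50": ("bottleneck", [3, 4, 6, 3]),  # Bottleneck (1x1 -> 3x3 -> 1x1)
}

def depth(name):
    kind, blocks = CONFIGS[name]
    convs_per_block = 2 if kind == "basic" else 3
    return 1 + convs_per_block * sum(blocks) + 1

for name in CONFIGS:
    print(name, depth(name))  # 18, 34, 50
```

Note that ResNet34 and ResNet50 share the same block counts [3, 4, 6, 3]; the extra depth of ResNet50 comes entirely from the Bottleneck design, which also expands the output channels of each block by a factor of 4.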

Published by

风君子

