So where does the "adversarial" part happen? Notice that in the figure the Gradient Reversal Layer (GRL) is placed between G and P, which means the "reversal" step only occurs when gradients are propagated back from P to G. Its effect can be understood as forcing the shared low-level features to ignore the differences between the two domains and instead attend to what the domains have in common.
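To make this concrete, here is a minimal PyTorch sketch of such a layer (my own illustration, not code from the original answer): the forward pass is the identity, and the backward pass multiplies the incoming gradient by a negated constant. The class name GradReverseLayer and its constructor argument are chosen to match the test code quoted below.

import torch

class GradReverseFunction(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)  # identity in the forward pass

    @staticmethod
    def backward(ctx, grad_output):
        # reverse (and scale) the gradient on the way back to G
        return -ctx.lambd * grad_output, None

class GradReverseLayer(torch.nn.Module):
    def __init__(self, lambd=1.0):
        super().__init__()
        self.lambd = lambd

    def forward(self, x):
        return GradReverseFunction.apply(x, self.lambd)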
The test code is as follows:

Grl = GradReverseLayer(1)
x = torch.tensor([1., 2., 3.], requires_grad=True)
y = torch.tensor([4., 5., 6.], requires_grad=True)
z = torch.pow(x, 2) + torch.pow(y, 2)
f = z + x + y
s = 6 * f.sum()
# s = grad_reverse(s, 1)
s = Grl(s)
print(s)
s.backward()
print(x)
print(x.grad)

The output is as follows:

ten...
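Working through the numbers by hand (assuming the GradReverseLayer sketched above, with constant 1): f = x^2 + y^2 + x + y = [22, 36, 54], so s = 6 * 112 = 672 and, before reversal, ds/dx = 6 * (2x + 1) = [18, 30, 42]. The printed results should therefore be s = tensor(672., ...) and x.grad = tensor([-18., -30., -42.]): the forward value is untouched, while the gradient comes back negated.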
As I personally understand it, this is the intuitive explanation of the GRL. P.S. Here is the GRL layer I implemented in PyTorch, posted so that fellow readers can help check it...
My understanding of the gradient reversal below is as follows: the discriminator needs to be able to correctly distinguish the source domain from the target domain. First, consider what the model would look like without a gradient reversal layer. Clearly, the features produced by the feature extractor for the source domain and the target domain would differ substantially, i.e. f_{src} \ne f_{tgt}. That means at classification time, our classifier can only be guaranteed to correctly learn the source domain's...
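To connect this intuition back to the G / P picture above, here is a minimal DANN-style training step (a sketch of my own; the module shapes and the names G, C, P, train_step are hypothetical), reusing the GradReverseLayer sketched earlier:

import torch
import torch.nn as nn

# Hypothetical sizes, for illustration only.
G = nn.Sequential(nn.Linear(10, 16), nn.ReLU())   # shared feature extractor
C = nn.Linear(16, 3)                              # label predictor (source labels only)
P = nn.Sequential(GradReverseLayer(1.0),          # GRL sits between G and P
                  nn.Linear(16, 2))               # domain discriminator

opt = torch.optim.SGD([*G.parameters(), *C.parameters(), *P.parameters()], lr=0.01)
ce = nn.CrossEntropyLoss()

def train_step(x_src, y_src, x_tgt):
    f_src, f_tgt = G(x_src), G(x_tgt)
    # Label loss: only source samples carry class labels.
    label_loss = ce(C(f_src), y_src)
    # Domain loss: P tries to tell source (0) apart from target (1).
    f_all = torch.cat([f_src, f_tgt])
    d_all = torch.cat([torch.zeros(len(x_src), dtype=torch.long),
                       torch.ones(len(x_tgt), dtype=torch.long)])
    domain_loss = ce(P(f_all), d_all)
    # Because of the GRL, minimizing domain_loss trains P to discriminate
    # while pushing G toward domain-invariant features f_src ≈ f_tgt.
    opt.zero_grad()
    (label_loss + domain_loss).backward()
    opt.step()

Without the GRL, the same step would instead train G to make the two domains easier to separate, which is exactly the failure mode described above.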
The same idea as a MATLAB custom layer:

function Z = predict(layer, X)
    Z = X; % Identity
end

function dLdX = backward(layer, X, dLdZ)
    dLdX = -dLdZ; % Reverse gradient
end
end
end

If you want to define the constant you multiply the gradient by, you could make it a property of the custom layer and include that in your back...
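For comparison, the PyTorch sketch earlier handles the same concern by storing the constant as self.lambd on the module and applying it inside backward via ctx.lambd, so the forward pass stays an identity while the backward pass scales the reversed gradient.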