🐛 Describe the bug

Starting with PyTorch version 1.13.0, the backward computation of KLDivLoss produces nan gradients. The same code runs without error in PyTorch 1.12.

import numpy as np
import torch
import torch.nn as nn
torch.autograd.set_d...
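The reporter's script is truncated above, so here is a minimal sketch of the pattern being described: a KLDivLoss forward/backward round trip followed by a check for nan gradients. The shapes, seed, and reduction mode are illustrative assumptions, not the reporter's actual values.

```python
import torch
import torch.nn as nn

# Sketch only: shapes/values are assumptions, not the report's exact data.
torch.manual_seed(0)

x = torch.randn(4, 5, requires_grad=True)          # leaf tensor so x.grad is populated
inp = torch.log_softmax(x, dim=1)                  # KLDivLoss expects log-probabilities
target = torch.softmax(torch.randn(4, 5), dim=1)   # and, by default, probabilities

loss = nn.KLDivLoss(reduction="batchmean")(inp, target)
loss.backward()

# The reported symptom is nan values appearing in the gradient here.
has_nan = torch.isnan(x.grad).any().item()
print("gradient contains nan:", has_nan)
```

Whether the nan actually appears depends on the PyTorch version and the input values; the report states it does from 1.13.0 onward but not in 1.12.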
loss: nan - main_output_loss: nan - aux_output_loss: nan - val_loss: nan - val_main_output_loss: nan - val_aux_output_loss: nan
(4 views, asked 2020-09-11, 1 vote, answer accepted)
torch.autograd.gradcheck.GradcheckError: Jacobian mismatch for output 0 with respect to input 0,
numerical: tensor(nan, device='cuda:0', dtype=torch.float64)
analytical: tensor(nan, device='cuda:0', dtype=torch.float64)
The above quantities relating the numerical and analytical jacobians are ...
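The error above is raised by torch.autograd.gradcheck, which compares the analytical Jacobian of the backward pass against a finite-difference estimate and requires double-precision inputs. A sketch of how such a check is set up against F.kl_div (tensor shapes and the CPU device are assumptions here; the report's failure occurred on cuda:0):

```python
import torch
import torch.nn.functional as F

# Sketch only: shapes and device (CPU) are illustrative assumptions.
# gradcheck needs float64 inputs so finite differences are accurate.
torch.manual_seed(0)

target = torch.softmax(torch.randn(3, 4, dtype=torch.float64), dim=1)
inp = torch.log_softmax(torch.randn(3, 4, dtype=torch.float64), dim=1)
inp.requires_grad_()

def f(log_probs):
    # kl_div with log_target=False (the default): target is probabilities.
    return F.kl_div(log_probs, target, reduction="batchmean")

# Raises GradcheckError (as in the report) when the analytical and
# numerical Jacobians disagree; returns True when they match.
ok = torch.autograd.gradcheck(f, (inp,))
print(ok)
```

Because kl_div is linear in its log-probability input, a correct backward should match the numerical Jacobian closely; nan appearing in both columns, as in the error above, points at nan propagating through the forward or backward evaluation itself rather than a simple tolerance failure.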