In PyTorch Lightning, the current epoch index is exposed through the `self.current_epoch` attribute. Inside the `on_train_epoch_end` method you can read `self.current_epoch` directly, for example: import pytorch_lightning a…
on_train_epoch_end is called after on_epoch_end, which seems incorrect. It is natural to open the epoch scope before the train epoch scope (as is being done currently), in which case the epoch scope should be closed after closing the train epoch scope (which is not currently being done)...
Thanks, Regards, M.Ali. So I got this error when defining the hook as `on_train_epoch_end(self, trainer, pl_module, outputs)`: you need to delete the `outputs` parameter and define the function as `on_train_epoch_end(self, trainer, pl_module)` instead.
Train_epoch_end: {'loss': tensor(0.2575, device='cuda:0'), 'val_loss': tensor(0.0525, device='cuda:0'), 'val_loss_epoch': tensor(0.2291, device='cuda:0'), 'train_loss_epoch': tensor(0.2746, device='cuda:0')}