In PyTorch Lightning, you can read the current epoch through the self.current_epoch attribute. Inside the on_train_epoch_end method, self.current_epoch gives the value of the epoch that just finished, as shown below: import pytorch_lightning a…
on_train_epoch_end is called after on_epoch_end, which seems incorrect. It is natural to open the epoch scope before the train epoch scope (as is being done currently), in which case the epoch scope should be closed after closing the train epoch scope (which is not currently being done)...
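The nesting argument above can be illustrated with a plain-Python sketch (no Lightning API, names are illustrative only): scopes opened in order must be closed in reverse (LIFO) order, so the train epoch scope must close before the enclosing epoch scope.

```python
# Sketch of the LIFO scope discipline being argued for above.
opened = []

def open_scope(name):
    opened.append(name)

def close_scope(name):
    # Closing out of order would leave an inner scope dangling
    assert opened[-1] == name, f"closing {name} before its inner scopes"
    opened.pop()

open_scope("epoch")         # epoch scope opened first
open_scope("train_epoch")   # then the train epoch scope inside it
close_scope("train_epoch")  # so the train epoch scope must close first
close_scope("epoch")        # and the epoch scope closes last
```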
So I got this error when defining the hook as on_train_epoch_end(self, trainer, pl_module, outputs):. You need to delete the outputs parameter and define it this way instead: on_train_epoch_end(self, trainer, pl_module). It seems to be a breaking change in newer versions of PyTorch Lightning...
The behaviour of the callback hooks on_train_epoch_end and on_validation_epoch_end does not match. While on_train_epoch_end can access metrics of the same epoch from the validation_epoch_end method, the opposite is not true. More concretely, when accessing trainer.callback_metrics from on_validation_epoch_end th...