Even with `model_test = CoolSystem(hyperparams_test).load_from_checkpoint('checkpoints/try_ckpt_epoch_1.ckpt')`, PyTorch Lightning is still complaining that `'dict' object has no attribute 'data_dir'`. Am I doing something wrong here?

williamFalcon commented on Mar 7, 2020
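A minimal sketch of the usual pattern on a recent Lightning release, assuming a hypothetical `CoolSystem` whose constructor only takes simple hyperparameters: `load_from_checkpoint` is a classmethod, so it is called on the class itself and rebuilds the module from the hyperparameters stored in the checkpoint; the instance built by `CoolSystem(hyperparams_test)` is discarded either way.

```python
import pytorch_lightning as pl
import torch


class CoolSystem(pl.LightningModule):
    def __init__(self, data_dir: str = "data/", lr: float = 1e-3):
        super().__init__()
        self.save_hyperparameters()           # stores data_dir/lr in the checkpoint
        self.data_dir = self.hparams.data_dir
        self.layer = torch.nn.Linear(32, 2)


# Call the classmethod on the class, not on a freshly constructed instance.
model_test = CoolSystem.load_from_checkpoint("checkpoints/try_ckpt_epoch_1.ckpt")
```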
(checkpoint, *args, **kwargs)
  File "/home/siahuat0727/.local/lib/python3.8/site-packages/pytorch_lightning/core/saving.py", line 174, in _load_model_state
    model = cls(*cls_args, **cls_kwargs)
  File "main.py", line 46, in __init__
    super().__init__(*args, **kwargs)
  ...
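The traceback fails inside `_load_model_state`, where the module is re-instantiated via `cls(*cls_args, **cls_kwargs)`. A hedged sketch, with hypothetical names, of how extra keyword arguments passed to `load_from_checkpoint` are forwarded to `__init__`, which is one way to supply constructor arguments that are not stored in the checkpoint:

```python
import pytorch_lightning as pl
import torch


class MyModel(pl.LightningModule):
    def __init__(self, vocab_size: int, tokenizer=None):
        super().__init__()
        self.save_hyperparameters(ignore=["tokenizer"])  # keep the tokenizer out of the checkpoint
        self.tokenizer = tokenizer
        self.embed = torch.nn.Embedding(vocab_size, 128)


my_tokenizer = object()  # stand-in for a real tokenizer object
# Extra kwargs are forwarded to __init__ when the class is re-instantiated.
model = MyModel.load_from_checkpoint("last.ckpt", tokenizer=my_tokenizer)
```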
How do I modify the code to load the checkpoint and also resume training from it?

Environment
- PyTorch Lightning Version: 1.6.5
- Torch: 1.13.0
- Python version: 3.8
- CUDA Version: 11.4
- GPUs: 4x NVIDIA A100-SXM4-40GB
- DeepSpeed: 0.9.1

More info
No response
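A sketch of the resume path on the 1.6-style API described above: `load_from_checkpoint` only restores the weights, while passing `ckpt_path` to `Trainer.fit` also restores optimizer, scheduler, and loop state. The module class, checkpoint path, and DeepSpeed strategy name here are assumptions for illustration.

```python
import pytorch_lightning as pl

model = MyLightningModule()  # hypothetical LightningModule
trainer = pl.Trainer(
    accelerator="gpu",
    devices=4,
    strategy="deepspeed_stage_2",  # assumed DeepSpeed configuration
    max_epochs=20,
)
# Resuming through fit() restores optimizer/scheduler/loop state as well.
trainer.fit(model, ckpt_path="checkpoints/last.ckpt")
```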
The checkpoint itself is loaded with the given `map_location`, but this doesn't seem to affect the model that is created and returned later. https://github.com/Lightning-AI/lightning/blob/fd4697c62c059fc7b9946e84d91625ecb6efdbe5/src/lightning/pytorch/core/saving.py#L51-L92
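A short sketch of how to observe the behaviour being described and work around it, assuming a hypothetical `MyModel`; the explicit `.to(...)` is only needed while the returned module ignores `map_location`.

```python
import torch

model = MyModel.load_from_checkpoint("last.ckpt", map_location="cuda:0")  # MyModel is hypothetical
print(next(model.parameters()).device)    # shows where the weights actually ended up
model = model.to(torch.device("cuda:0"))  # explicit move as a workaround
```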
Bug description
I've been using pytorch_lightning for quite a while. Recently, I've started using the newly proposed imports such as:

import lightning as L

pytorch-lightning==2.0.9.post0
torch==2.1.0

What version are you seeing the problem on? ...
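A minimal sketch of the unified import style mentioned above (available since lightning 2.0), where the PyTorch-specific classes are re-exported from the top-level `lightning` package:

```python
import lightning as L
import torch


class LitModel(L.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(4, 1)


trainer = L.Trainer(max_epochs=1)  # same Trainer as pytorch_lightning.Trainer
```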
50 changes: 26 additions & 24 deletions pytorch_lightning/core/saving.py

@@ -52,7 +52,6 @@ class ModelIO(object):
    def load_from_checkpoint(
        cls,
        checkpoint_path: str,
        ...
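To make the diff fragment easier to follow, here is a heavily simplified sketch of what a classmethod loader of this shape typically does; it is illustrative only, not the library's actual implementation, and the class name is made up.

```python
import torch


class CheckpointLoaderMixin:
    """Illustrative mixin; real LightningModules get this behaviour from ModelIO."""

    @classmethod
    def load_from_checkpoint(cls, checkpoint_path: str, map_location=None, **kwargs):
        # Read the checkpoint, rebuild the module from the stored hyperparameters,
        # then restore the weights.
        checkpoint = torch.load(checkpoint_path, map_location=map_location)
        hparams = checkpoint.get("hyper_parameters", {})
        model = cls(**{**hparams, **kwargs})  # extra kwargs override stored hparams
        model.load_state_dict(checkpoint["state_dict"])  # cls is expected to be an nn.Module subclass
        return model
```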
pip3 install --upgrade git+https://github.com/PyTorchLightning/pytorch-lightning.git

Contributor Author sshleifer commented on Jun 8, 2020 (edited)
Tried that, got a better traceback but no solution:

KeyError: 'Trying to restore training state but checkpoint contains only the model. This is probably due to...
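That error text usually indicates the checkpoint was written with weights only, so there is no optimizer/trainer state left to restore. A hedged sketch, for recent Lightning versions, of the callback configuration that keeps full training state (directory and monitored metric are assumptions):

```python
from pytorch_lightning.callbacks import ModelCheckpoint

checkpoint_cb = ModelCheckpoint(
    dirpath="checkpoints/",   # assumed output directory
    monitor="val_loss",       # assumed metric name
    save_top_k=1,
    save_weights_only=False,  # keep optimizer/scheduler state so training can resume
)
```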
- Trainer now calls `on_load_checkpoint()` when resuming from a checkpoint ([#1666](https://github.com/PyTorchLightning/pytorch-lightning/pull/1666))

## [0.7.5] - 2020-04-27

4 changes: 4 additions & 0 deletions pytorch_lightning/trainer/training_io.py ...
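A small sketch of the hook pair this changelog entry refers to: a LightningModule can stash extra state at save time and read it back when the Trainer resumes (the key and values here are made up for illustration).

```python
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def on_save_checkpoint(self, checkpoint: dict) -> None:
        checkpoint["my_extra_state"] = {"seen_samples": 123}  # hypothetical extra state

    def on_load_checkpoint(self, checkpoint: dict) -> None:
        self.seen_samples = checkpoint.get("my_extra_state", {}).get("seen_samples", 0)
```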
Option 2: Have a separate method `Trainer.load_from_checkpoint_at_url('http://')`

Resources
We can use this under the hood: https://pytorch.org/docs/stable/hub.html#torch.hub.load_state_dict_from_url

Any thoughts on which one is better? @PyTorchLightning/core-contributors
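A sketch of how the linked `torch.hub` helper could back either option: it downloads and caches the file, after which the weights are applied as usual. The URL and the pre-existing `model` object are assumptions.

```python
import torch

checkpoint = torch.hub.load_state_dict_from_url(
    "https://example.com/checkpoints/model.ckpt",  # hypothetical URL
    map_location="cpu",
    progress=True,
)
# A Lightning checkpoint nests the weights under "state_dict"; a file produced by
# torch.save(model.state_dict()) is already the state dict itself.
state_dict = checkpoint.get("state_dict", checkpoint)
model.load_state_dict(state_dict)  # `model` is assumed to exist already
```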
- Fixed issue where `Model.load_from_checkpoint("checkpoint.ckpt", map_location=map_location)` would always return model on CPU ([#17308](https://github.com/Lightning-AI/lightning/pull/17308))

## [2.0.1] - 2023-03-30

13 changes: 9 additions & 4 deletions src/lightning/pytorch/...