You receive this warning because use_reentrant was not explicitly specified when calling torch.utils.checkpoint. According to the PyTorch documentation and the warning message itself, the default value of use_reentrant will change in a future release, so it is best to set it explicitly to avoid surprises. 4. Provide a solution: how to use the use_reentrant parameter correctly, or take other steps to avoid this warning. To avoid this warning and...
causing a 'use_reentrant=False' warning. The issue was resolved by removing the unnecessary initialization, ensuring that the gradient-checkpointing parameters were handled properly.
According to newer PyTorch releases, you now need to set use_reentrant explicitly, because the default will change from use_reentrant=True to use_reentrant=False in the near future. In transformers.models.llama.modeling_llama, the forward pass invokes checkpointing like this:

    layer_outputs = torch.utils.checkpoint.checkpoint(
        create_custom_forward(decoder_layer),
        ...
    )
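The fix described above can be sketched as follows. This is a minimal, self-contained example, not the actual LLaMA code: the linear layer and tensor sizes are made up for illustration, but the key point is passing use_reentrant explicitly to torch.utils.checkpoint.checkpoint, which silences the warning.

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint

# Illustrative stand-in for a decoder layer (not the real LLaMA module).
layer = nn.Linear(16, 16)
x = torch.randn(4, 16, requires_grad=True)

# Explicitly pass use_reentrant. False is the recommended (future default)
# non-reentrant implementation; without this keyword, newer PyTorch versions
# emit the deprecation warning from the question.
out = checkpoint(layer, x, use_reentrant=False)

# Backward works as usual; the layer's forward is recomputed during backward
# instead of keeping activations in memory.
out.sum().backward()
print(x.grad is not None)
```

The same keyword can be passed wherever the warning originates, e.g. in a custom forward that wraps each decoder layer in torch.utils.checkpoint.checkpoint.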