Regarding the question "cannot import name 'is_torch_npu_available' from 'transformers.utils'", here are a few possible causes and solution steps. Check the version of the transformers library: is_torch_npu_available is not included in every release of transformers, so first confirm that the version you have installed provides this function. You can check the currently installed transforme...
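As a concrete way to perform that version check programmatically (a minimal sketch; installed_version is a hypothetical helper, not part of transformers):

```python
from importlib import metadata

def installed_version(pkg: str):
    """Return the installed version string of pkg, or None if it is absent."""
    try:
        return metadata.version(pkg)
    except metadata.PackageNotFoundError:
        return None

# Prints the installed transformers version, or None if it is not installed.
print(installed_version("transformers"))
```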
is_torch_npu_available and is_torch_tpu_available are not equivalent, I'm afraid. I suspect that your transformers version is a bit too old, as it seems like is_torch_npu_available does not yet exist in it. Could you run pip show transformers to see what version you're on? For reference, Sentence...
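If upgrading transformers is not an option right away, one way to stay compatible with older releases is to guard the import and fall back to a stub. This is a sketch under the assumption that a machine running a transformers version without the helper also has no NPU:

```python
try:
    from transformers.utils import is_torch_npu_available
except ImportError:
    # Older transformers releases predate this helper; assume no NPU.
    def is_torch_npu_available() -> bool:
        return False

# Safe to call regardless of the installed transformers version.
print(is_torch_npu_available())
```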
        self.device = torch.device("cuda")
    elif torch.backends.mps.is_available():
        self.device = torch.device("mps")
    elif is_torch_npu_available():
        self.device = torch.device("npu")
    else:
        self.device = torch.device("cpu")

(The stray duplicate line "elif is_torch_npu_available:" in the original snippet appears to be diff residue from the fix that changed the bare, always-truthy reference into a proper function call; see commit 2eb4360.)
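The precedence in that snippet (CUDA, then MPS, then NPU, then CPU) can be sketched as a plain function; pick_device and its boolean flags are hypothetical stand-ins for the real availability checks:

```python
def pick_device(cuda: bool = False, mps: bool = False, npu: bool = False) -> str:
    """Mirror the device-selection order used in the snippet above."""
    if cuda:
        return "cuda"
    if mps:
        return "mps"
    if npu:
        return "npu"
    return "cpu"

# On a machine where only an NPU is available:
print(pick_device(npu=True))  # -> npu
```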
Revert "Update flag_models.py to fix is_torch_npu_available as a func…" (Verified, 2eb4360)
Fixes all remaining calls (581f2ed)

MeTaNoV (Contributor, Author) commented on Feb 21, 2024:
@staoxiao now you can merge again, I fixed all the calls since it was a bit more than 1 :) 👍...