Check whether the CUDA versions match: make sure the CUDA Toolkit version you have installed matches the one required by the flash_attn_2_cuda library you are trying to import. You can check the CUDA version by running `nvcc --version`. If the versions do not match, download and install the correct CUDA Toolkit version. Confirm that libcudart.so.11.0 is installed correctly: libcudart.so.11.0 is the CUDA 11.0 runtime library...
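As a reference, here is a minimal diagnostic sketch (not from the original post) that prints the CUDA version PyTorch was built against, the local nvcc version, and whether the CUDA 11 runtime library named in the error can be loaded. It assumes PyTorch is installed and nvcc is on PATH:

```python
import ctypes
import subprocess

import torch

# CUDA version this PyTorch build was compiled against
print("torch.version.cuda:", torch.version.cuda)

# CUDA version of the local nvcc toolchain (assumes nvcc is on PATH)
print(subprocess.run(["nvcc", "--version"], capture_output=True, text=True).stdout)

# Check whether the CUDA 11 runtime library from the error message can be loaded
try:
    ctypes.CDLL("libcudart.so.11.0")
    print("libcudart.so.11.0 loaded OK")
except OSError as exc:
    print("libcudart.so.11.0 could not be loaded:", exc)
```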
Hello, it's OK to import flash_attn, but it fails when importing flash_attn_cuda. I installed flash_attn from pip. I have tried reinstalling torch and flash_attn and it still does not work. Details: the versions reported by nvcc -V and torch.version.cuda are both 11.7 and compatible. Please help me...
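A quick way to narrow this down is to try importing both modules and print where each one resolves from. This is only a diagnostic sketch; the module names (flash_attn, flash_attn_cuda) are taken from the post above:

```python
import importlib

# flash_attn is the pure-Python package; flash_attn_cuda is the compiled CUDA extension.
for name in ("flash_attn", "flash_attn_cuda"):
    try:
        mod = importlib.import_module(name)
        print(name, "imported OK from", getattr(mod, "__file__", None))
    except ImportError as exc:
        print(name, "failed to import:", exc)
```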
I found I was unable to import flash_attn_cuda after running python setup.py install. --- details --- I ran python setup.py install with a prefix pointing to the root dir of flash-attention. I also set PYTHONPATH=$PWD, i.e. the absolute path of the root dir of flash-attention. Any...
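One thing worth checking in this setup (an assumption about the cause, not a confirmed fix) is whether PYTHONPATH=$PWD makes Python pick up the un-built source tree instead of the package installed by setup.py. A small sketch to see where flash_attn actually resolves from:

```python
import importlib.util
import sys

# Show the search path and where flash_attn would be loaded from; if this points
# into the source checkout rather than site-packages, the un-built sources may be
# shadowing the installed extension.
print("sys.path:")
for p in sys.path:
    print(" ", p)

spec = importlib.util.find_spec("flash_attn")
print("flash_attn resolves to:", spec.origin if spec else "not found")
```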
Method 1: Change the code in llava.__init__: comment out `from .model import LlavaLlamaForCausalLM` and use `from .model.language_model.llava_llama import LlavaLlamaForCausalLM` instead. This may then raise the next error: ImportError: /home/linjl/anaconda3/envs/sd/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-...
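For clarity, the edit described above, written out as it would appear in llava/__init__.py (paths exactly as given in the snippet):

```python
# llava/__init__.py
# Original package-level import, commented out as described above:
# from .model import LlavaLlamaForCausalLM

# Import directly from the concrete module instead:
from .model.language_model.llava_llama import LlavaLlamaForCausalLM
```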
Build cuda_11.8.r11.8/compiler.31833905_0. I got this error: >>> import flash_attn;flash_attn.__version__ Traceback (most recent call last): File "<stdin>", line 1, in <module> File "/usr/local/lib/python3.10/dist-packages/flash_attn/__init__.py", line 3, in <module> ...
Thanks for sharing your amazing work, I was excited to give it a try. I tried to follow the steps, and after building the kernel package in /models/csrc/ and running the code I am getting an error as if the package does not exist. I am not sure if I am missing anything in between. Should...