Check that the CUDA versions match: make sure the CUDA Toolkit version you have installed matches the version required by the flash_attn_2_cuda library you are trying to import. You can check the CUDA version with:

```bash
nvcc --version
```

If the versions do not match, you need to download and install the correct version of the CUDA Toolkit.

Confirm that libcudart.so.11.0 is installed correctly: libcudart.so.11.0 is the CUDA 11.0 runtime library...
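As a quick sanity check, the sketch below (assuming torch is installed; it only reports information, it does not fix anything) prints the CUDA version PyTorch was built against, the toolkit version reported by nvcc, and whether the loader can find a CUDA runtime library:

```python
# Diagnostic sketch only: report the CUDA versions in play and whether the
# CUDA runtime library can be located. Uses only the standard library and torch.
import ctypes.util
import subprocess

import torch

print("torch.__version__ :", torch.__version__)
print("torch.version.cuda:", torch.version.cuda)  # CUDA version torch was built against

try:
    out = subprocess.run(["nvcc", "--version"], capture_output=True, text=True).stdout
    print(out.strip())
except FileNotFoundError:
    print("nvcc not found on PATH")

# Whether the dynamic loader can locate a CUDA runtime library at all
# (for CUDA 11 this is expected to resolve to something like libcudart.so.11.0).
print("cudart found as:", ctypes.util.find_library("cudart"))
```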
```
import flash_attn_2_cuda as flash_attn_cuda
ImportError: /usr/local/lib/python3.10/dist-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN2at4_ops5zeros4callEN3c108ArrayRefINS2_6SymIntEEENS2_8optionalINS2_10ScalarTypeEEENS6_INS2_6LayoutEEENS6_INS2_6Devic...
```
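The undefined symbol in that message is a C++-mangled PyTorch operator, which usually means the flash_attn_2_cuda extension was built against a different torch than the one currently installed. A small diagnostic sketch, not a fix: the symbol string below is a placeholder you replace with the full symbol from your own traceback, and c++filt (from binutils) may not be installed on every system:

```python
# Sketch: demangle the missing symbol and list the installed versions.
# Replace the placeholder with the full symbol from your own ImportError.
import subprocess
from importlib.metadata import version

undefined_symbol = "<paste the full mangled symbol from the traceback here>"

try:
    demangled = subprocess.run(
        ["c++filt", undefined_symbol], capture_output=True, text=True
    ).stdout.strip()
    print("demangled:", demangled)  # typically an at::_ops::... operator from libtorch
except FileNotFoundError:
    print("c++filt (binutils) not available")

# Distribution names as recorded by pip; "flash_attn" is how the package
# names itself in setup.py.
print("torch      :", version("torch"))
print("flash_attn :", version("flash_attn"))
```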
Hello, it's OK to import flash_attn, but it goes wrong when importing flash_attn_cuda. I installed flash_attn from pip. I have tried re-installing torch and flash_attn, and it still does not work. Details: the versions reported by nvcc -V and torch.version.cuda are both 11.7 and compatible. Please help me...
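Since the pure-Python package can import cleanly while the compiled extension still fails, it can help to test the module names separately. A hedged sketch (to my knowledge flash-attention 1.x ships flash_attn_cuda and 2.x ships flash_attn_2_cuda, so only one of the two extensions is expected to exist):

```python
# Sketch: try the Python package and the compiled extension(s) individually,
# so the failing piece is unambiguous.
import importlib

for mod in ("flash_attn", "flash_attn_cuda", "flash_attn_2_cuda"):
    try:
        importlib.import_module(mod)
        print(f"{mod}: OK")
    except ImportError as exc:
        print(f"{mod}: FAILED ({exc})")
```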
Method 1: change the import in llava/__init__.py:

```python
# from .model import LlavaLlamaForCausalLM
from .model.language_model.llava_llama import LlavaLlamaForCausalLM
```

This may then raise the next error: ImportError: /home/linjl/anaconda3/envs/sd/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-...
I found I was unable to import flash_attn_cuda after running python setup.py install. --- details --- I ran python setup.py install with a prefix pointing to the root dir of flash-attention. I also set PYTHONPATH=$PWD, the absolute path of the root dir of flash-attention. Any...
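One thing worth ruling out in this setup (a sketch under the assumption that the source checkout is still on PYTHONPATH): with PYTHONPATH pointing at the flash-attention source tree, Python may resolve flash_attn from the un-built checkout instead of the installed copy, so checking where the modules actually come from narrows it down:

```python
# Sketch: report where flash_attn (and the compiled extension, if present)
# resolve from, to spot a source checkout shadowing the installed package.
import importlib.util

for mod in ("flash_attn", "flash_attn_2_cuda"):
    spec = importlib.util.find_spec(mod)
    print(mod, "->", spec.origin if spec else "not found")
```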
Thanks for sharing your amazing work; I was excited to give it a try. I tried to follow the steps, and after building the kernel package in /models/csrc/ and running the code, I am getting an error as if the package does not exist. I am not sure if I am missing anything in between. Should...