`>>> from flash_attn import flash_attn_qkvpacked_func, flash_attn_func` resulted in:

```
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: cannot import name 'flash_attn_qkvpacked_func' from 'flash_attn' (/usr/local/lib/python3.10/dist-packages/flash_at...
```
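This kind of ImportError usually means the installed flash-attn predates the 2.x top-level API, since `flash_attn_qkvpacked_func` and `flash_attn_func` are exported from the package root only in flash-attn 2.x. A minimal diagnostic sketch, assuming the 2.x package layout (the version-check logic here is illustrative, not taken from the thread):

```python
# Sketch: report the installed flash-attn version and verify the 2.x top-level API.
# Assumes flash-attn 2.x exposes flash_attn_qkvpacked_func at the package root.
import importlib.metadata

try:
    version = importlib.metadata.version("flash-attn")
    print(f"Installed flash-attn version: {version}")
except importlib.metadata.PackageNotFoundError:
    raise SystemExit("flash-attn is not installed in this environment")

try:
    # flash-attn >= 2.0 top-level API
    from flash_attn import flash_attn_qkvpacked_func, flash_attn_func
except ImportError:
    # Older (pre-2.x) builds use different function names and module paths,
    # so upgrading is usually simpler than adapting the call sites.
    raise ImportError(
        f"flash-attn {version} does not export flash_attn_qkvpacked_func; "
        "consider upgrading to a 2.x release"
    )
```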
Do you have any suggestions on this? Is `FlashAttnQKVPackedFunc` numerically unstable? Thank you very much! Looking forward to your reply.

**Contributor:** Thanks for the report. The function should be numerically stable. Which commit of FlashAttention are you using? On which GPU? What are the dim...
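One way to gather the details the contributor asks for, and to sanity-check stability, is to compare the fused kernel against a plain fp32 PyTorch attention reference. A minimal sketch, assuming flash-attn 2.x with the `(batch, seqlen, 3, nheads, headdim)` packed layout (the shapes and tolerances here are illustrative assumptions, not from this thread):

```python
# Sketch: compare flash_attn_qkvpacked_func against an unfused fp32 reference.
# Assumes flash-attn 2.x; qkv is packed as (batch, seqlen, 3, nheads, headdim).
import torch
from flash_attn import flash_attn_qkvpacked_func

torch.manual_seed(0)
batch, seqlen, nheads, headdim = 2, 512, 8, 64
qkv = torch.randn(batch, seqlen, 3, nheads, headdim,
                  device="cuda", dtype=torch.float16)

out_flash = flash_attn_qkvpacked_func(qkv, dropout_p=0.0, causal=False)

# Reference: unfused attention computed in float32.
q, k, v = qkv.float().unbind(dim=2)                # (batch, seqlen, nheads, headdim)
q, k, v = (t.transpose(1, 2) for t in (q, k, v))   # (batch, nheads, seqlen, headdim)
scores = q @ k.transpose(-2, -1) / headdim ** 0.5
out_ref = (scores.softmax(dim=-1) @ v).transpose(1, 2)

diff = (out_flash.float() - out_ref).abs().max()
print(f"max abs diff vs fp32 reference: {diff:.3e}")
print(f"GPU: {torch.cuda.get_device_name()}, output dtype: {out_flash.dtype}")
```

Reporting the flash-attn version, GPU name, and the tensor dimensions alongside this diff is typically enough for maintainers to reproduce the issue.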
- [x] Implement `zigzag_ring_flash_attn_qkvpacked_func` [issue#2](https://github.com/zhuzilin/ring-flash-attention/issues/2)
- [x] Implement `stripe_flash_attn_qkvpacked_func`
- [ ] Implement `zigzag_ring_flash_attn_varlen_qkvpacked_func`
- [x] Implement `zigzag_ring_flash_at...
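A rough usage sketch of the implemented `zigzag_ring_flash_attn_qkvpacked_func`, assuming the qkvpacked variants mirror `flash_attn_qkvpacked_func` with each rank holding its shard of the sequence; the exact signature and the zigzag sharding pattern should be checked against the repository:

```python
# Sketch: multi-GPU call of zigzag_ring_flash_attn_qkvpacked_func (launch with torchrun).
# Assumes the interface mirrors flash_attn_qkvpacked_func and that each rank holds
# a local sequence shard; the zigzag chunk layout of real data is not shown here.
import torch
import torch.distributed as dist
from ring_flash_attn import zigzag_ring_flash_attn_qkvpacked_func

dist.init_process_group("nccl")
rank, world_size = dist.get_rank(), dist.get_world_size()
torch.cuda.set_device(rank)

batch, total_seqlen, nheads, headdim = 1, 4096, 8, 64
local_seqlen = total_seqlen // world_size  # each rank's shard of the sequence

qkv_local = torch.randn(batch, local_seqlen, 3, nheads, headdim,
                        device="cuda", dtype=torch.bfloat16, requires_grad=True)

# Causal attention over the full sequence via the zigzag ring schedule.
out_local = zigzag_ring_flash_attn_qkvpacked_func(qkv_local, causal=True)
out_local.sum().backward()

dist.destroy_process_group()
```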
```
feat = flash_attn.flash_attn_varlen_qkvpacked_func(
AttributeError: module 'flash_attn' has no attribute 'flash_attn_varlen_qkvpacked_func'
```
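As with the ImportError above, this AttributeError usually points at a flash-attn build that predates the varlen packed API. A minimal sketch of the call on a 2.x install, assuming the packed `(total_tokens, 3, nheads, headdim)` layout with int32 cumulative sequence lengths (the toy lengths are illustrative):

```python
# Sketch: variable-length packed attention with flash-attn 2.x.
# Assumes qkv for all sequences is concatenated along the token dimension and
# sequence boundaries are given as int32 prefix sums in cu_seqlens.
import torch
from flash_attn import flash_attn_varlen_qkvpacked_func

nheads, headdim = 8, 64
seqlens = [37, 128, 5]                       # three sequences of different lengths
total_tokens = sum(seqlens)

qkv = torch.randn(total_tokens, 3, nheads, headdim,
                  device="cuda", dtype=torch.float16)
cu_seqlens = torch.tensor([0, *torch.cumsum(torch.tensor(seqlens), 0).tolist()],
                          device="cuda", dtype=torch.int32)
max_seqlen = max(seqlens)

feat = flash_attn_varlen_qkvpacked_func(qkv, cu_seqlens, max_seqlen,
                                        dropout_p=0.0, causal=True)
print(feat.shape)  # expected (total_tokens, nheads, headdim)
```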