ik_llama.cpp/ggml
Kawrakow dc023bc3be Zen4 Flash Attention (#32)
* Zen4 flash attention: moving useful parts from the kq_fused_softmax branch

* Add flash attention with soft-cap and fix D = 256 case

* Flash attention refinements

* Update FlashAttn comment

---------

Co-authored-by: Iwan Kawrakow <iwan.kawrakow@gmail.com>
2024-09-01 16:08:21 +03:00
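The second bullet above mentions flash attention with soft-cap. As background, "soft-cap" here refers to logit soft-capping: squashing each attention score into a bounded range via `cap * tanh(score / cap)` before the softmax, as used by models such as Gemma-2. The following is a minimal pure-Python sketch of that idea for a single query; the function name, argument layout, and cap value are illustrative only and are not the actual ggml/ik_llama.cpp API.

```python
import math

def softcapped_attention(q, ks, vs, cap=50.0):
    """Single-query attention with logit soft-capping (illustrative sketch).

    q: list[float] of length d; ks, vs: lists of rows (one per key/value).
    Each scaled dot-product score is smoothly bounded to (-cap, cap)
    via cap * tanh(score / cap) before the softmax.
    """
    d = len(q)
    scale = 1.0 / math.sqrt(d)
    # raw dot-product scores, scaled by 1/sqrt(d)
    scores = [scale * sum(qi * ki for qi, ki in zip(q, k)) for k in ks]
    # soft-cap: smoothly limits each logit to the open interval (-cap, cap)
    scores = [cap * math.tanh(s / cap) for s in scores]
    # numerically stable softmax over the capped scores
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    # weighted sum of value rows
    return [sum(w * v[j] for w, v in zip(weights, vs))
            for j in range(len(vs[0]))]
```

In a fused flash-attention kernel the same cap-then-softmax step is folded into the tiled score computation rather than applied to a materialized score matrix, which is what makes supporting it inside the fused kernel a distinct piece of work.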