ikawrakow / ik_llama.cpp
Mirror of https://github.com/ikawrakow/ik_llama.cpp.git, synced 2026-01-26 17:20:01 +00:00
ik_llama.cpp / ggml (at commit 42e4c612433be387144da4e4c27d33e2b7edd1be)

Latest commit 42e4c61243 by firecoperana (2025-12-05 16:42:14 +01:00): CUDA: Fix FA for Pascal GPU (#1036)
Co-authored-by: firecoperana <firecoperana>
Name            Latest commit                                        Date
cmake           Merge mainline llama.cpp (#3)                        2024-07-27 07:55:01 +02:00
include         Hadamard transforms for K-cache - CPU only (#1033)   2025-12-04 06:51:11 +01:00
src             CUDA: Fix FA for Pascal GPU (#1036)                  2025-12-05 16:42:14 +01:00
.gitignore      Merge mainline llama.cpp (#3)                        2024-07-27 07:55:01 +02:00
CMakeLists.txt  Enable fusion by default (#939)                      2025-11-11 10:35:48 +02:00