ikawrakow / ik_llama.cpp
Mirror of https://github.com/ikawrakow/ik_llama.cpp.git, last synced 2026-01-26 09:09:50 +00:00.
ik_llama.cpp / ggml

Latest commit: f0fb76da64c5175fa23225d4d4195b6689a2e04c by Kawrakow, 2026-01-24 07:05:48 +02:00
Better GLM-4.7-Flash long context TG performance (#1182)
* Better GLM-4.7-Flash long context TG performance
* Handle quantized cache
| Name           | Last commit                                              | Date                       |
|----------------|----------------------------------------------------------|----------------------------|
| cmake          | Merge mainline llama.cpp (#3)                            | 2024-07-27 07:55:01 +02:00 |
| include        | Remove llamafile remnants (#1179)                        | 2026-01-22 13:20:23 +02:00 |
| src            | Better GLM-4.7-Flash long context TG performance (#1182) | 2026-01-24 07:05:48 +02:00 |
| .gitignore     | Merge mainline llama.cpp (#3)                            | 2024-07-27 07:55:01 +02:00 |
| CMakeLists.txt | Remove llamafile remnants (#1179)                        | 2026-01-22 13:20:23 +02:00 |