ikawrakow / ik_llama.cpp
Mirror of https://github.com/ikawrakow/ik_llama.cpp.git (synced 2026-02-09 16:00:12 +00:00)
ik_llama.cpp / ggml (at commit 2fcf407ab33d6ef53a28286db439d8410d672bad)
Latest commit: f44844b328 by Iwan Kawrakow, "Looks like with this change it is working with tensor overrides", 2025-12-16 18:48:42 +00:00
..
cmake            Merge mainline llama.cpp (#3)                                            2024-07-27 07:55:01 +02:00
include          Command line option to set max. extra VRAM that the scheduler can use    2025-12-16 18:48:42 +00:00
src              Looks like with this change it is working with tensor overrides          2025-12-16 18:48:42 +00:00
.gitignore       Merge mainline llama.cpp (#3)                                            2024-07-27 07:55:01 +02:00
CMakeLists.txt   Enable fusion by default (#939)                                          2025-11-11 10:35:48 +02:00