ik_llama.cpp/common
firecoperana 7978f04996 Add vision support in llama-server (#901)
* server: add support for vision model

* webui: add support for vision model

* server : remove hack for extra parallel slot #10187

* llama : fix KV shift for qwen2vl #13870

* add no-context-shift parameter

---------

Co-authored-by: firecoperana <firecoperana>
2025-11-05 10:43:46 +02:00