ik_llama.cpp/examples/server/webui
firecoperana 7978f04996 Add vision support in llama-server (#901)
* server: add support for vision model

* webui: add support for vision model

* server : remove hack for extra parallel slot #10187

* llama : fix KV shift for qwen2vl #13870

* add no-context-shift parameter

---------

Co-authored-by: firecoperana <firecoperana>
2025-11-05 10:43:46 +02:00