Mirror of https://github.com/ikawrakow/ik_llama.cpp.git (synced 2026-05-01 03:41:53 +00:00)
Add vision support in llama-server (#901)
* server: add support for vision model
* webui: add support for vision model
* server : remove hack for extra parallel slot #10187
* llama : fix KV shift for qwen2vl #13870
* add no-context-shift parameter

Co-authored-by: firecoperana <firecoperana>
@@ -57,8 +57,6 @@ add_library(${TARGET} STATIC
    chat-parser.cpp
    chat-parser.h
    common.cpp
    chat.h
    chat.cpp
    sampling.h
    sampling.cpp
    console.h