ik_llama.cpp/examples
hksdpc255 e1c4c4a495 Fix Anthropic Messages API (#1136)
* server: stop processing the prompt when client disconnects
  - implement generator-based API for task results
  - update httplib.h to 0.27.0
  - fix embedding error
  - stop prompt processing when disconnected

* Port upstream https://github.com/ggml-org/llama.cpp/pull/18551

* add back anthropic

* Fix merge issue caused by the GitHub web UI

---------

Co-authored-by: firecoperana <firecoperana>
2026-01-13 08:37:29 +02:00