ik_llama.cpp/examples
Samuel Oliveira Alves 557b674f63 Add llama_context to MTP (#1601)
* wip: separate llama_context for MTP with graph reuse

* wip: fix KV cache desync with separate MTP context

* refactor: remove dead mtp logic code, encapsulate KV mirroring

* mtp-context: derive args directly from the main model's context

* mtp: fix kv cache positions

* clean small comments

* minor refactor for context shift
2026-04-09 15:33:56 +02:00