ik_llama.cpp/github-data/issues/614 - Feature Request_ port no-mmproj-offload.md
2025-07-23 13:31:53 +02:00

#614 - Feature Request: port no-mmproj-offload

Author erazortt
State Open
Created 2025-07-15
Updated 2025-07-16

Description

Prerequisites

  • I am running the latest code. Mention the version if possible as well.
  • I carefully followed the README.md.
  • I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
  • I reviewed the Discussions, and have a new and useful enhancement to share.

Feature Description

Please port over the `--no-mmproj-offload` flag.
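For context, in mainline llama.cpp this flag keeps the multimodal projector (mmproj) weights in system RAM while the main model layers are offloaded to the GPU. A sketch of a typical invocation (binary name and model paths are illustrative, not from this issue):

```shell
# Hypothetical example: offload LLM layers to the GPU (-ngl 99)
# while keeping the vision projector on the CPU (--no-mmproj-offload),
# trading slower image encoding for lower VRAM use.
llama-server \
  -m models/some-vision-model-Q4_K_M.gguf \
  --mmproj models/some-vision-model-mmproj-f16.gguf \
  -ngl 99 \
  --no-mmproj-offload
```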

Motivation

This helps save VRAM, and since I use the vision model quite seldom, I can wait a little longer when I do use it.

Possible Implementation

No response


💬 Conversation

👤 ikawrakow commented on 2025-07-16 at 09:19:35:

There is no vision support at all in ik_llama.cpp; see my response in #615.