ik_llama.cpp/github-data/issues/433 - Feature Request_ CORS support.md
2025-07-23 13:31:53 +02:00


#433 - Feature Request: CORS support

Author KCS-Mack
State Open
Created 2025-05-18
Updated 2025-05-18

Description

Prerequisites

  • I am running the latest code. Mention the version if possible as well.
  • I carefully followed the README.md.
  • I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
  • I reviewed the Discussions, and have a new and useful enhancement to share.

Feature Description

The original llama.cpp added a flag to enable CORS:

1e6a2f12c6

However, I don't see that flag in ik_llama (great work by the way, love this project!).

Are there any plans to enable CORS in the future?

Motivation

I use an application endpoint that requires CORS to interact. It works with llama.cpp via the --public-domain flag.

Possible Implementation

No response


💬 Conversation

👤 ubergarm commented on 2025-05-18 at 15:39:10:

You could use any reverse proxy (e.g. nginx, Caddy) to add the CORS headers yourself.
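A minimal sketch of that workaround with nginx, not from the thread itself: it assumes the ik_llama.cpp server is listening on localhost:8080, exposes it on port 8081, and attaches permissive CORS headers (tighten `Access-Control-Allow-Origin` for production use).

```nginx
# Hypothetical nginx config: proxy the ik_llama.cpp server (assumed to be
# on 127.0.0.1:8080) and add CORS headers so browser clients can call it.
server {
    listen 8081;

    location / {
        # Answer CORS preflight (OPTIONS) requests directly, without
        # forwarding them to the backend.
        if ($request_method = OPTIONS) {
            add_header Access-Control-Allow-Origin "*";
            add_header Access-Control-Allow-Methods "GET, POST, OPTIONS";
            add_header Access-Control-Allow-Headers "Authorization, Content-Type";
            return 204;
        }

        # Attach the CORS header to normal responses as well
        # ("always" keeps it on error responses too).
        add_header Access-Control-Allow-Origin "*" always;
        proxy_pass http://127.0.0.1:8080;
    }
}
```

The port numbers and the wildcard origin are placeholders; in practice you would restrict the allowed origin to the domain serving your web application.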

Also, someone created a wrapper/reverse-proxy to support tool calling and other OpenAI-style endpoint features: https://github.com/ikawrakow/ik_llama.cpp/discussions/403#discussioncomment-13098276