
ExUI

This is a simple, lightweight browser-based UI for running local inference using ExLlamaV2.

Overview of features

  • Friendly, responsive and minimalistic UI
  • Persistent sessions
  • Multiple instruct formats
  • Speculative decoding
  • Supports EXL2, GPTQ and FP16 models
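Speculative decoding, listed above, uses a small draft model to propose several tokens cheaply and lets the full model verify them, so output stays identical to target-only decoding while running faster. A minimal greedy sketch of the idea (not ExLlamaV2's actual implementation; `target` and `draft` are hypothetical callables mapping a token sequence to the next token):

```python
def speculative_decode(target, draft, prompt, n_spec=4, max_new=16):
    """Greedy speculative decoding sketch: draft proposes, target verifies."""
    seq = list(prompt)
    generated = 0
    while generated < max_new:
        # 1. Draft model proposes n_spec tokens cheaply.
        proposal, ctx = [], list(seq)
        for _ in range(n_spec):
            tok = draft(ctx)
            proposal.append(tok)
            ctx.append(tok)
        # 2. Target verifies each proposed token in turn (a real engine
        #    does this verification in a single batched forward pass).
        for tok in proposal:
            if target(seq) != tok:
                break  # mismatch: discard the rest of the draft
            seq.append(tok)
            generated += 1
            if generated >= max_new:
                return seq
        # 3. Take one token from the target itself, so progress is
        #    guaranteed and output matches target-only decoding.
        seq.append(target(seq))
        generated += 1
    return seq
```

When the draft agrees with the target often, several tokens are accepted per target step; when it never agrees, decoding degrades gracefully to one target token per step.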

Screenshots

(Screenshots of the chat interface)

Running locally

First, clone this repository and install requirements:

git clone https://github.com/turboderp/exui
cd exui
pip install -r requirements.txt

Then run the web server with the included server.py:

python server.py

Your browser should open automatically at the default IP/port. Config and sessions are stored in ~/exui by default.

Prebuilt wheels for ExLlamaV2 are available here. Installing the latest version of Flash Attention is recommended.
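One common route, assuming the PyPI packages match your CUDA/PyTorch setup (verify the exact wheel against the ExLlamaV2 releases for your environment):

```shell
# Install ExLlamaV2 from PyPI (prebuilt wheels for common configurations)
pip install exllamav2

# Install Flash Attention; this may build from source if no wheel matches
pip install flash-attn --no-build-isolation
```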

More to come

Stay tuned.
