theroyallab/tabbyAPI
Mirror of https://github.com/theroyallab/tabbyAPI.git (synced 2026-05-12 00:41:22 +00:00)
tabbyAPI/backends at commit 9ebbe06f2961953cc08bd9ba6a8e4e65ef6a8be3

Latest commit: turboderp, 9ebbe06f29
"exllamav3: Supply max_chunk_size when loading model"
2026-04-18 13:20:12 +02:00
Contents:

exllamav2/
    Last commit: OAI endpoints: Add option to suppress header after reasoning start token (e.g. Gemma4's "thought\n")
    2026-04-12 04:12:53 +02:00

exllamav3/
    Last commit: exllamav3: Supply max_chunk_size when loading model
    2026-04-18 13:20:12 +02:00

infinity/
    Last commit: Model: Add proper jobs cleanup and fix var calls
    2025-04-24 21:30:55 -04:00

base_model_container.py
    Last commit: OAI endpoints: Add option to suppress header after reasoning start token (e.g. Gemma4's "thought\n")
    2026-04-12 04:12:53 +02:00