mirror of
https://github.com/theroyallab/tabbyAPI.git
synced 2026-04-22 07:19:07 +00:00
Dependencies: Update torch, exllamav2, and flash-attn

Torch: 2.6.0
ExllamaV2: 0.2.8
Flash-attn: 2.7.4.post1

CUDA wheels are now 12.4 instead of 12.1, so the feature names need to be migrated over.

Signed-off-by: kingbri <8082010+kingbri1@users.noreply.github.com>
@@ -52,8 +52,6 @@ def supports_paged_attn():
            "using the following command:\n\n"
            "For CUDA 12.1:\n"
            "pip install --upgrade .[cu121]\n\n"
            "For CUDA 11.8:\n"
            "pip install --upgrade .[cu118]\n\n"
            "NOTE: Windows users must use CUDA 12.x to use flash-attn."
        )
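The hunk above touches an error message inside a version gate (`supports_paged_attn`) that tells users which pip extra to install. As a hedged illustration only (not tabbyAPI's actual code), a check like this could detect whether a new enough flash-attn is present before enabling paged attention; the helper names and the `2.7.4` minimum (taken from the commit's pinned `2.7.4.post1`) are assumptions:

```python
from importlib.metadata import version, PackageNotFoundError


def parse_version(text: str) -> tuple:
    # Keep only the leading numeric dotted components,
    # e.g. "2.7.4.post1" -> (2, 7, 4).
    parts = []
    for piece in text.split("."):
        if piece.isdigit():
            parts.append(int(piece))
        else:
            break
    return tuple(parts)


def has_flash_attn(minimum: str = "2.7.4") -> bool:
    # Hypothetical gate: True only if flash-attn is installed
    # and meets the minimum version.
    try:
        installed = version("flash_attn")
    except PackageNotFoundError:
        return False
    return parse_version(installed) >= parse_version(minimum)
```

Comparing tuples of integers sidesteps the pitfall of comparing version strings lexically (where `"2.10"` would sort before `"2.7"`).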