tabbyAPI/requirements.txt
kingbri a51889bdb8 Requirements: Update Flash Attention
Bump to version 2.3.3.

Signed-off-by: kingbri <bdashore3@proton.me>
2023-11-18 22:28:24 -05:00


fastapi
pydantic >= 1, < 2
PyYAML
progress
sse_starlette
uvicorn
# Wheels
https://github.com/bdashore3/flash-attention/releases/download/2.3.3/flash_attn-2.3.3+cu122-cp311-cp311-win_amd64.whl; platform_system == "Windows" and python_version == "3.11"
https://github.com/bdashore3/flash-attention/releases/download/2.3.3/flash_attn-2.3.3+cu122-cp310-cp310-win_amd64.whl; platform_system == "Windows" and python_version == "3.10"
https://github.com/Dao-AILab/flash-attention/releases/download/v2.3.3/flash_attn-2.3.3+cu122torch2.1cxx11abiFALSE-cp311-cp311-linux_x86_64.whl; platform_system == "Linux" and platform_machine == "x86_64" and python_version == "3.11"
https://github.com/Dao-AILab/flash-attention/releases/download/v2.3.3/flash_attn-2.3.3+cu122torch2.1cxx11abiFALSE-cp310-cp310-linux_x86_64.whl; platform_system == "Linux" and platform_machine == "x86_64" and python_version == "3.10"
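The trailing `; platform_system == … and python_version == …` clauses on the wheel URLs are PEP 508 environment markers: pip evaluates each one against the current interpreter and only installs the wheel whose marker matches, so each platform/Python combination pulls exactly one flash-attn build. As a sketch, the same evaluation can be reproduced with the third-party `packaging` library (which pip vendors internally); the marker string below is copied from the Windows cp311 line above:

```python
from packaging.markers import Marker

# Marker taken from the first wheel line in this requirements file.
marker = Marker('platform_system == "Windows" and python_version == "3.11"')

# evaluate() uses the running interpreter's environment by default;
# passing a dict overrides selected keys, which lets you test other
# platforms without being on them.
print(marker.evaluate({"platform_system": "Windows", "python_version": "3.11"}))  # True
print(marker.evaluate({"platform_system": "Linux", "python_version": "3.11"}))    # False
```

Because the markers on the two Windows lines and the two Linux lines differ only in `python_version`, at most one URL is ever selected per install; on any unmatched platform (e.g. macOS) pip skips all four wheels.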