tabbyAPI/requirements.txt
kingbri e740b53478 Requirements: Update Flash Attention 2
Bump to 2.3.6

Signed-off-by: kingbri <bdashore3@proton.me>
2023-12-03 01:56:29 -05:00


fastapi
pydantic >= 1, < 2
PyYAML
progress
uvicorn
# Wheels
# Flash Attention 2. If these wheels don't work, comment them out, uncomment the old wheels, and run `pip install -r requirements.txt`
# Windows FA2 from https://github.com/jllllll/flash-attention/releases
https://github.com/jllllll/flash-attention/releases/download/v2.3.6/flash_attn-2.3.6+cu121torch2.1cxx11abiFALSE-cp311-cp311-win_amd64.whl; platform_system == "Windows" and python_version == "3.11"
https://github.com/jllllll/flash-attention/releases/download/v2.3.6/flash_attn-2.3.6+cu121torch2.1cxx11abiFALSE-cp310-cp310-win_amd64.whl; platform_system == "Windows" and python_version == "3.10"
# Linux FA2 from https://github.com/Dao-AILab/flash-attention/releases
https://github.com/Dao-AILab/flash-attention/releases/download/v2.3.6/flash_attn-2.3.6+cu122torch2.1cxx11abiFALSE-cp311-cp311-linux_x86_64.whl; platform_system == "Linux" and platform_machine == "x86_64" and python_version == "3.11"
https://github.com/Dao-AILab/flash-attention/releases/download/v2.3.6/flash_attn-2.3.6+cu122torch2.1cxx11abiFALSE-cp310-cp310-linux_x86_64.whl; platform_system == "Linux" and platform_machine == "x86_64" and python_version == "3.10"
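The trailing `; platform_system == … and python_version == …` clauses are PEP 508 environment markers: pip evaluates them at install time and skips any line whose marker is false, so each machine downloads only the wheel matching its OS, architecture, and Python version. As a rough sketch (not pip's actual implementation), the marker on the Linux cp311 line corresponds to a check like this using only the standard library:

```python
import platform
import sys

# Values pip exposes as environment markers (stdlib equivalents shown;
# this is an illustrative sketch, not pip's real marker evaluator).
platform_system = platform.system()    # e.g. "Linux" or "Windows"
platform_machine = platform.machine()  # e.g. "x86_64"
python_version = f"{sys.version_info.major}.{sys.version_info.minor}"

# Mirror of the marker on the Linux cp311 wheel above:
# platform_system == "Linux" and platform_machine == "x86_64" and python_version == "3.11"
linux_cp311 = (
    platform_system == "Linux"
    and platform_machine == "x86_64"
    and python_version == "3.11"
)
print(f"Linux x86_64 / Python 3.11 wheel selected: {linux_cp311}")
```

Because every marker here names a specific Python minor version, users on versions other than 3.10 or 3.11 get no flash-attn wheel at all and would need to build from source.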