requirements.txt (29 lines (27 loc) · 1.05 KB)
coloredlogs==15.0.1
gradio>=5.31.0
matplotlib==3.7.1
numpy>=1.25.0
Pillow==10.3.0
Requests>=2.32.4
scikit-image==0.22.0
torch_summary==1.4.5
tqdm==4.66.3
pytorch_fid==0.3.0
fastapi==0.115.6
tensorboard==2.19.0
tensorboardX==2.6.1
transformers==4.56.1
# To use flash attention, install flash-attn.
# Build it in your own environment: pip install flash-attn --no-build-isolation
# or download a prebuilt flash-attn .whl from GitHub: https://github.com/Dao-AILab/flash-attention/releases/tag/v2.8.2
# Optional (not installed by default):
# flash-attn==2.8.2
# To install the GPU build of torch, use, for example:
# pip install torch==1.13.0+cu116 torchvision==0.14.0+cu116 -f https://download.pytorch.org/whl/torch_stable.html
# More torch version details: https://pytorch.org/get-started/previous-versions/#linux-and-windows-25
# All previous versions: https://pytorch.org/get-started/previous-versions
# [Note] torch must be >= 1.10.0
# Recommended install selector: https://pytorch.org/get-started/locally/
torch>=1.10.0
torchvision>=0.10.0
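
As a usage sketch, the comments in this file describe a two-step setup: install the pinned dependencies, then optionally add flash-attn. Assuming the file is saved as `requirements.txt` in the working directory, the commands might look like:

```shell
# Install the pinned dependencies from this file.
pip install -r requirements.txt

# Optional: flash attention. Building locally requires a CUDA toolchain;
# alternatively, install a prebuilt wheel from the flash-attention
# GitHub releases page linked in the comments above.
pip install flash-attn --no-build-isolation
```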