Commit 1bbebcc

Edit README to mention bf16 support
1 parent: de19de7

1 file changed

README.md

Lines changed: 2 additions & 2 deletions
@@ -23,8 +23,8 @@ PYTHONPATH=$PWD python benchmarks/benchmark_flash_attention.py
 
 FlashAttention currently supports:
 1. Turing or Ampere GPUs (e.g., A100, RTX 3090, T4, RTX 2080).
-2. fp16.
-3. Head dimensions 16, 32, 64, 128 (bwd requires A100).
+2. fp16 and bf16 (bf16 requires Ampere GPUs).
+3. Head dimensions 16, 32, 64, 128 (head dim 128 backward requires A100).
 
 Our tentative roadmap:
 1. [Jun 2022] Make package pip-installable.
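The support matrix this commit updates is easy to encode as a pre-flight check. Below is a minimal PyTorch sketch; the helper name `check_flash_attn_support` and the checks themselves are illustrative assumptions based on the constraints listed in the diff, not part of the FlashAttention API.

```python
import torch

def check_flash_attn_support(q: torch.Tensor) -> None:
    """Hypothetical pre-flight check mirroring the constraints in this diff;
    not part of the FlashAttention API."""
    # Turing is SM 7.5; Ampere is SM 8.x (e.g., A100 is 8.0, RTX 3090 is 8.6).
    cc = torch.cuda.get_device_capability(q.device)
    assert cc >= (7, 5), "FlashAttention requires a Turing or Ampere GPU"
    if q.dtype == torch.bfloat16:
        # The bf16 path mentioned in this commit requires Ampere.
        assert cc >= (8, 0), "bf16 requires an Ampere GPU"
    else:
        assert q.dtype == torch.float16, "only fp16 and bf16 are supported"
    # Head dimension is the last axis of q/k/v. Note the further caveat from
    # the diff: head dim 128 backward additionally requires an A100.
    assert q.shape[-1] in (16, 32, 64, 128), "head dim must be 16/32/64/128"
```

Running such a check before invoking the attention kernel would surface an unsupported dtype/architecture combination as a clear error rather than a kernel failure.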
