flash-attn
PyPI page
Home page
Author: Tri Dao
Summary: Flash Attention: Fast and Memory-Efficient Exact Attention
Latest version: 2.8.3
Required dependencies: einops | torch
Downloads last day: 29,214
Downloads last week: 261,403
Downloads last month: 1,099,872
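
A minimal usage sketch, assuming the package has been installed (e.g. pip install flash-attn) on a machine with a CUDA-capable GPU and the torch/einops dependencies listed above. The flash_attn_func entry point and the (batch, seqlen, nheads, headdim) fp16/bf16 layout follow the project's documented interface; check the pinned version (2.8.3) for the exact signature.

import torch
from flash_attn import flash_attn_func

batch, seqlen, nheads, headdim = 2, 1024, 8, 64

# flash-attn expects (batch, seqlen, nheads, headdim) tensors in fp16/bf16 on CUDA.
q = torch.randn(batch, seqlen, nheads, headdim, dtype=torch.float16, device="cuda")
k = torch.randn_like(q)
v = torch.randn_like(q)

# Exact (non-approximate) attention computed without materializing the full
# seqlen x seqlen score matrix; causal=True applies a causal mask.
out = flash_attn_func(q, k, v, causal=True)
print(out.shape)  # (batch, seqlen, nheads, headdim)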