Pre-built Flash Attention 2 wheels for Windows. A drop-in replacement for PyTorch attention, providing up to 10x speedup and 20x memory reduction. Compatible with Python 3.10 and CUDA 11.7+. No build required.
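
Below is a minimal usage sketch, assuming the wheel installs the standard `flash_attn` package and that PyTorch 2.x with CUDA support is already present. It calls `flash_attn_func` and compares the result against PyTorch's built-in `scaled_dot_product_attention`; the wheel filename in the comment is only an illustrative pattern, not an actual release artifact.

```python
# Install the downloaded wheel first, e.g. (filename pattern is illustrative):
#   pip install flash_attn-<version>-cp310-cp310-win_amd64.whl
import torch
from flash_attn import flash_attn_func

# flash_attn_func expects (batch, seqlen, nheads, headdim) tensors
# in fp16 or bf16, resident on a CUDA device.
batch, seqlen, nheads, headdim = 2, 1024, 8, 64
q = torch.randn(batch, seqlen, nheads, headdim, device="cuda", dtype=torch.float16)
k = torch.randn_like(q)
v = torch.randn_like(q)

# Fused attention kernel; causal=True applies autoregressive (decoder-style) masking.
out = flash_attn_func(q, k, v, dropout_p=0.0, causal=True)  # (batch, seqlen, nheads, headdim)

# PyTorch reference for comparison (note its (batch, nheads, seqlen, headdim) layout).
ref = torch.nn.functional.scaled_dot_product_attention(
    q.transpose(1, 2), k.transpose(1, 2), v.transpose(1, 2), is_causal=True
).transpose(1, 2)

# Outputs should agree up to fp16 rounding error.
print("max abs difference:", (out - ref).abs().max().item())
```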