https://github.com/Dao-AILab/flash-attention
Flash Attention 2, version 2.8.3, precompiled for the latest ComfyUI (3.50) on Windows.
An attention method that some models require.
I couldn't find a Windows wheel for Python 3.13, PyTorch 2.8.0, CUDA 12.9 (which is what ComfyUI is using), so I compiled one myself.
It's a long, annoying process that requires a beefy PC, so I've saved you some time.
When ComfyUI changes and an updated compile is needed, I'll update this page again if enough people find it useful.
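A precompiled wheel like this only installs cleanly if it matches your Python version and platform exactly. As a minimal sketch, the snippet below checks a wheel filename's PEP 427 tags against the running interpreter before you try `pip install`; the filename used here is illustrative, not the actual file in this download:

```python
import sys
from pathlib import PurePath

# Hypothetical wheel filename following the PEP 427 naming scheme:
# {distribution}-{version}-{python tag}-{abi tag}-{platform tag}.whl
WHEEL = "flash_attn-2.8.3-cp312-cp312-win_amd64.whl"

def wheel_tags(filename):
    """Split a wheel filename into its Python tag and platform tag."""
    stem = PurePath(filename).stem  # drop the .whl suffix
    dist, version, py_tag, abi_tag, plat_tag = stem.split("-")
    return py_tag, plat_tag

def matches_interpreter(filename):
    """True if the wheel's tags match the current Python and OS."""
    py_tag, plat_tag = wheel_tags(filename)
    here = f"cp{sys.version_info.major}{sys.version_info.minor}"
    on_windows = sys.platform == "win32"
    return py_tag == here and plat_tag.startswith("win") == on_windows

print(wheel_tags(WHEEL), matches_interpreter(WHEEL))
```

If the check fails, `pip` would refuse the wheel anyway with a "not a supported wheel on this platform" error; checking the tags first just makes the mismatch (e.g. cp312 wheel on a Python 3.13 install) obvious before you download anything.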
Description
2.8.3, CUDA 12.9, Py3.12, PyTorch 2.8.0, Win
Details
Downloads: 15
Platform: CivitAI
Platform Status: Available
Created: 8/21/2025
Updated: 9/28/2025
Deleted: -
Files
flashAttention2For_flashattention283.zip
Mirrors: CivitAI (1 mirror)
flashAttention2For_flashattention2.zip
Mirrors: CivitAI (1 mirror)