CivArchive
    Flash Attention 2 for ComfyUI - flash_attn-2.8.3+cu130tor

    https://github.com/Dao-AILab/flash-attention

    Flash Attention 2 (v2.8.3), precompiled for the latest ComfyUI (3.50) on Windows.
    An attention implementation that some models require.
    I couldn't find a Windows wheel for the stack ComfyUI is using (Python 3.13, PyTorch 2.10, CUDA 13.0, per the wheel filename below), so I compiled one myself.
    It's a long, annoying process that requires a beefy PC, so I've saved you some time.
    When ComfyUI changes and an updated build is needed, I'll update it here if enough people find it useful.

    Description

    flash_attn-2.8.3+cu130torch2.10-cp313-cp313-win_amd64 for Comfy and others
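    A minimal sketch of how a wheel like this is typically installed and verified. The install command must target the Python interpreter ComfyUI actually runs (portable builds ship their own embedded interpreter; the exact path varies by install and is an assumption here):

    ```python
    # Install the wheel into ComfyUI's Python first, e.g. (path is an assumption):
    #   python -m pip install flash_attn-2.8.3+cu130torch2.10-cp313-cp313-win_amd64.whl
    import importlib.util

    def flash_attn_available() -> bool:
        """Return True if the flash_attn package can be imported."""
        return importlib.util.find_spec("flash_attn") is not None

    if flash_attn_available():
        import flash_attn
        print("flash_attn", flash_attn.__version__)
    else:
        print("flash_attn not installed; ComfyUI falls back to another attention backend")
    ```

    Note that the wheel tags must match exactly: `cp313` requires Python 3.13, `win_amd64` requires 64-bit Windows, and the `+cu130torch2.10` build suffix means pip will refuse or the import will fail if your PyTorch/CUDA versions differ.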

    Other

    Details

    Downloads: 15
    Platform: CivitAI
    Platform Status: Available
    Created: 4/18/2026
    Updated: 4/27/2026
    Deleted: -