Pinned
flash-attention-v2-RDNA3-minimal
A simple Flash Attention v2 implementation with ROCm (RDNA3 GPU, rocWMMA), mainly used for stable diffusion (ComfyUI) in Windows ZLUDA environments.
Python 16