Transformer Engine using FlashAttention V3 #1125

Closed
heavyrain-lzy opened this issue Aug 21, 2024 · 2 comments

@heavyrain-lzy

I find that TE doesn't support FA-V3. There are some errors when I use flash_attn==2.6.3 and transformer_engine==1.9.0 with context parallel enabled in Megatron-LM. Do you have a plan to support it?

@yaox12
Collaborator

yaox12 commented Aug 27, 2024

FA3 support was added in #1019.

@cyanguwa
Collaborator

Context parallel support with FA3 was added in #1232. Please give it a try and let us know if there are any problems.
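
As a minimal sketch (not from the thread): one way to check whether the FA3 build is actually picked up before enabling context parallel in Megatron-LM. The NVTE_DEBUG / NVTE_DEBUG_LEVEL variables and the flash_attn_interface module name are assumptions based on recent Transformer Engine releases, so adjust for your installed version.

```python
import os

# Assumed debug switches: ask Transformer Engine to report which attention
# backend it selects (FA3, FA2, or fused attention).
os.environ["NVTE_DEBUG"] = "1"
os.environ["NVTE_DEBUG_LEVEL"] = "2"

import torch
import transformer_engine.pytorch as te

# The FA3 (Hopper) build is assumed to ship the flash_attn_interface module;
# if it is missing, TE falls back to FA2 or its fused attention backend.
try:
    import flash_attn_interface  # noqa: F401
    print("FlashAttention-3 build found")
except ImportError:
    print("FlashAttention-3 build not found")

# Tiny forward pass to trigger backend selection (default qkv_format is "sbhd",
# i.e. tensors are [seq, batch, heads, head_dim]).
attn = te.DotProductAttention(num_attention_heads=16, kv_channels=64)
q = torch.randn(128, 2, 16, 64, dtype=torch.bfloat16, device="cuda")
k, v = torch.randn_like(q), torch.randn_like(q)
out = attn(q, k, v)
print(out.shape)  # [128, 2, 1024] -> seq, batch, heads * head_dim
```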
