Support fused_attention_qkv for auto_parallel llama #12139

Annotations: 1 warning

Test: succeeded on May 15, 2024 in 28m 8s