[Readme] Add flash mask #9219
Conversation
Thanks for your contribution!
Codecov Report: All modified and coverable lines are covered by tests ✅

Additional details and impacted files:

@@            Coverage Diff             @@
##           develop     #9219      +/-   ##
===========================================
- Coverage    52.96%    52.92%     -0.04%
===========================================
  Files          656       660         +4
  Lines       106258    106870       +612
===========================================
+ Hits         56282     56565       +283
- Misses       49976     50305       +329

☔ View full report in Codecov by Sentry.
@@ -0,0 +1,92 @@
# FlashMask
Suggestion: move this into the llm/ README, and add a column to the capability list there indicating FlashMask support status.
Should FlashMask get a dedicated page instead? FlashMask is not Llama-specific; it has simply only been adapted for Llama so far. If that attention method is a basic function that Transformer-based models can call, then it is general-purpose.
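
As context for the portability question above: the core of FlashMask is a column-wise sparse mask encoding, which is model-agnostic. Below is a minimal NumPy sketch (not code from this PR; all names are illustrative) of how a causal packed-document mask can be encoded as one start-row index per key column, then expanded back to a dense mask to verify the encoding.

```python
import numpy as np

def causal_document_start_rows(doc_lens):
    """For packed documents, return one start-row index per key column:
    query rows >= start_rows[j] are masked for key column j."""
    start_rows = []
    offset = 0
    for n in doc_lens:
        # every column in this document is masked from the document's end on
        start_rows.extend([offset + n] * n)
        offset += n
    return np.array(start_rows)

def expand_to_dense(start_rows, causal=True):
    """Expand the compact encoding to a dense boolean mask
    (True = attention allowed), for checking correctness."""
    n = len(start_rows)
    rows = np.arange(n)[:, None]
    cols = np.arange(n)[None, :]
    allowed = rows < start_rows[None, :]   # above the masked band
    if causal:
        allowed &= rows >= cols            # lower triangle only
    return allowed

if __name__ == "__main__":
    start_rows = causal_document_start_rows([3, 2])  # two packed documents
    print(expand_to_dense(start_rows).astype(int))
    # Prints a block-diagonal causal mask: doc 1 on rows/cols 0-2,
    # doc 2 on rows/cols 3-4.
```

Because the encoding is just per-column row indices, any Transformer-based model could construct it and pass it to a shared attention function, which supports the point that the capability need not be Llama-specific.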
LGTM
PR types
New features
PR changes
Others
Description
Add an example script demonstrating FlashMask usage.
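
For reference, a hedged sketch of what the example's core call might look like. The operator name `paddle.nn.functional.flashmask_attention`, the `startend_row_indices` argument, and the `[batch, seqlen, heads, head_dim]` layout are assumptions drawn from the public FlashMask design and are not verified against this PR; the README added here is the authoritative usage.

```python
import paddle
import paddle.nn.functional as F

# Assumed layout: [batch, seqlen, num_heads, head_dim]
batch, seqlen, heads, head_dim = 1, 4096, 8, 128
q = paddle.randn([batch, seqlen, heads, head_dim], dtype="bfloat16")
k = paddle.randn([batch, seqlen, heads, head_dim], dtype="bfloat16")
v = paddle.randn([batch, seqlen, heads, head_dim], dtype="bfloat16")

# Causal document mask for two packed documents of length 2048 each:
# key columns in a document are masked for query rows at or beyond the
# document's end (assumed index shape [batch, 1, seqlen, 1]).
start_rows = paddle.concat([
    paddle.full([batch, 1, 2048, 1], 2048, dtype="int32"),
    paddle.full([batch, 1, 2048, 1], 4096, dtype="int32"),
], axis=2)

# Assumed operator; see the PR's README for the actual script and flags.
out = F.flashmask_attention(q, k, v,
                            startend_row_indices=start_rows,
                            causal=True)
print(out.shape)
```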