
Does fastmoe support fine-tuning, PagedAttention, FlashAttention and KV cache, mixed precision, etc.? #192

Open
PowerDispatch opened this issue Feb 5, 2024 · 4 comments

Comments

@PowerDispatch

Is your feature request related to a problem? Please describe.
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]

Describe the solution you'd like
A clear and concise description of what you want to happen.

Describe alternatives you've considered
A clear and concise description of any alternative solutions or features you've considered.

Additional context
Add any other context or screenshots about the feature request here.

@laekov
Owner

laekov commented Feb 5, 2024

Yes, yes, yes and yes, yes...
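For reference, mixed precision with a FastMoE layer would be set up the usual PyTorch way. Below is a minimal sketch, assuming a CUDA build of FastMoE and that `FMoETransformerMLP` composes with `torch.cuda.amp` like any other `nn.Module`; the constructor argument names follow the FastMoE README and may differ across versions, and the loss is a dummy for illustration only.

```python
import torch
from fmoe.transformer import FMoETransformerMLP

device = "cuda"  # FastMoE's scatter/gather kernels require a GPU
layer = FMoETransformerMLP(num_expert=4, d_model=64, d_hidden=256).to(device)
optimizer = torch.optim.AdamW(layer.parameters(), lr=1e-4)
scaler = torch.cuda.amp.GradScaler()

x = torch.randn(8, 16, 64, device=device)  # (batch, seq_len, d_model)
with torch.cuda.amp.autocast():            # forward pass in mixed precision
    out = layer(x)
    loss = out.float().pow(2).mean()       # dummy loss, for illustration only
scaler.scale(loss).backward()              # scaled backward to avoid fp16 underflow
scaler.step(optimizer)
scaler.update()
```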

@PowerDispatch
Author

I couldn't find a fine-tuning example. Could you point me in the right direction?

@laekov
Owner

laekov commented Feb 5, 2024

We do not include a fine-tuning example in the repository. Please implement it yourself.
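Since no example ships with the repository, here is a minimal sketch of what fine-tuning with a FastMoE layer might look like. Everything here is illustrative: the toy model, the commented-out checkpoint path, and the choice to freeze all parameters except the MoE experts and the head are assumptions, and the `FMoETransformerMLP` constructor arguments follow the FastMoE README (names may vary across versions).

```python
import torch
from fmoe.transformer import FMoETransformerMLP

device = "cuda"  # FastMoE's kernels are CUDA-only

# Toy stand-in for a pretrained transformer: one projection,
# one FastMoE FFN, and a classification head.
class TinyMoEModel(torch.nn.Module):
    def __init__(self, d_model=64, d_hidden=256, num_expert=4):
        super().__init__()
        self.embed = torch.nn.Linear(d_model, d_model)
        self.moe_ffn = FMoETransformerMLP(
            num_expert=num_expert, d_model=d_model, d_hidden=d_hidden)
        self.head = torch.nn.Linear(d_model, 2)

    def forward(self, x):
        return self.head(self.moe_ffn(self.embed(x)))

model = TinyMoEModel().to(device)
# In real use, load your pretrained weights here, e.g.:
# model.load_state_dict(torch.load("checkpoint.pt"), strict=False)

# Fine-tune only the MoE experts and the head; freeze the rest.
for name, p in model.named_parameters():
    p.requires_grad = name.startswith(("moe_ffn", "head"))

optimizer = torch.optim.AdamW(
    (p for p in model.parameters() if p.requires_grad), lr=1e-5)
loss_fn = torch.nn.CrossEntropyLoss()

for step in range(10):                         # stand-in for a real data loader
    x = torch.randn(8, 16, 64, device=device)  # (batch, seq_len, d_model)
    y = torch.randint(0, 2, (8,), device=device)
    logits = model(x).mean(dim=1)              # mean-pool over the sequence
    loss = loss_fn(logits, y)
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```

For multi-GPU fine-tuning with expert parallelism, FastMoE's distributed wrapper (`fmoe.distributed.DistributedGroupedDataParallel`) would replace plain DDP, but that setup is beyond this sketch.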

@PowerDispatch
Author

OK, thanks.
