
[Hackathon 5th No.73] ToT #7660

Merged
merged 54 commits into PaddlePaddle:develop on Jan 26, 2024

Conversation

ErnestinaQiu
Contributor

  1. Finish the meta/llama2 version.
  2. Add a quick start in README.md.

PR types

Add tree-of-thought.

PR changes

Description

1. Finish the meta/llama2 version (a sketch of the tree-of-thought idea follows below).
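For readers new to the technique this PR adds: tree-of-thought (ToT) has the model propose several candidate next "thoughts" at each step, score them, and keep only the most promising ones before expanding further, instead of committing to a single reasoning chain. The sketch below is a minimal illustration of that breadth-first search loop, not the code added by this PR; the `llm` callable and the prompt formats in `generate_thoughts` / `score_thought` are assumptions.

```python
# Minimal tree-of-thought sketch (illustrative only, not this PR's code).
# `llm` is any callable that maps a prompt string to a completion string.

def generate_thoughts(llm, task, state, n):
    # One model call per candidate continuation; the prompt format is an assumption.
    prompt = f"Task: {task}\nSteps so far:\n{state}\nPropose the next step:"
    return [llm(prompt) for _ in range(n)]

def score_thought(llm, task, state):
    # Ask the model to rate a partial solution 0-10; parsing is deliberately naive.
    reply = llm(f"Task: {task}\nCandidate steps:\n{state}\nRate 0-10, digits only:")
    try:
        return float(reply.strip())
    except ValueError:
        return 0.0

def tree_of_thought_bfs(llm, task, steps=3, breadth=5, keep=2):
    """Expand candidate thoughts level by level, pruning to the best `keep`."""
    frontier = [""]  # partial solutions, starting from the empty one
    for _ in range(steps):
        candidates = [
            state + thought + "\n"
            for state in frontier
            for thought in generate_thoughts(llm, task, state, n=breadth)
        ]
        candidates.sort(key=lambda c: score_thought(llm, task, c), reverse=True)
        frontier = candidates[:keep]
    return frontier[0]
```

In the PR itself the search is driven by paddlenlp's llama2 rather than a generic callable; see the review thread below.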

paddle-bot bot commented Dec 14, 2023

Thanks for your contribution!

@CLAassistant

CLAassistant commented Dec 14, 2023

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.
1 out of 2 committers have signed the CLA.

✅ ErnestinaQiu
❌ root


root seems not to be a GitHub user. You need a GitHub account to be able to sign the CLA. If you already have a GitHub account, please add the email address used for this commit to your account.
You have signed the CLA already but the status is still pending? Let us recheck it.


codecov bot commented Dec 14, 2023

Codecov Report

Attention: 37 lines in your changes are missing coverage. Please review.

Comparison is base (1bfe864) 57.59% compared to head (34c0953) 56.68%.
Report is 123 commits behind head on develop.

| Files | Patch % | Lines |
| --- | --- | --- |
| paddlenlp/transformers/llama/modeling_auto.py | 5.12% | 37 Missing ⚠️ |
Additional details and impacted files
@@             Coverage Diff             @@
##           develop    #7660      +/-   ##
===========================================
- Coverage    57.59%   56.68%   -0.92%     
===========================================
  Files          582      588       +6     
  Lines        86912    89352    +2440     
===========================================
+ Hits         50061    50653     +592     
- Misses       36851    38699    +1848     

☔ View full report in Codecov by Sentry.

@Ligoml Ligoml changed the title from "Hackathon TASK73 ToT" to "[Hackathon 5th No.73] ToT" on Dec 18, 2023
@w5688414
Contributor

w5688414 commented Dec 21, 2023

  1. Delete the torch-based llama code and switch to paddlenlp's llama.
  2. Delete all of the datasets; adding links to the datasets in the README is enough.

1. Delete some unnecessary files according to the comments.
2. Add llama2 from paddlenlp (a minimal usage sketch follows this list).
3. Format the data structure.
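To make the requested swap concrete, here is a minimal, hypothetical sketch of loading llama2 through paddlenlp rather than torch, using paddlenlp's `AutoTokenizer` and `AutoModelForCausalLM`. The model identifier and generation arguments are assumptions to check against the PaddleNLP model zoo, not code taken from this PR.

```python
# Hypothetical sketch: run llama2 via paddlenlp instead of torch.
from paddlenlp.transformers import AutoModelForCausalLM, AutoTokenizer

name = "meta-llama/Llama-2-7b-chat"  # assumed identifier; verify in the model zoo
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForCausalLM.from_pretrained(name)

# return_tensors="pd" asks the tokenizer for paddle tensors rather than torch ones.
inputs = tokenizer("Use 4, 4, 6, 8 to make 24:", return_tensors="pd")
output_ids, _ = model.generate(**inputs, max_length=64)  # returns (ids, scores)
print(tokenizer.batch_decode(output_ids, skip_special_tokens=True))
```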
@ErnestinaQiu
Contributor Author

ErnestinaQiu commented Dec 22, 2023

[image attachment: teaser]

w5688414 previously approved these changes Jan 26, 2024

@w5688414 w5688414 (Contributor) left a comment:

LGTM

@PaddlePaddle PaddlePaddle locked as resolved and limited conversation to collaborators Jan 26, 2024
@PaddlePaddle PaddlePaddle unlocked this conversation Jan 26, 2024
@w5688414 w5688414 (Contributor) left a comment:

LGTM

@w5688414 w5688414 closed this Jan 26, 2024
@w5688414 w5688414 reopened this Jan 26, 2024
@sijunhe sijunhe merged commit cdfa861 into PaddlePaddle:develop Jan 26, 2024
9 of 12 checks passed