
Support evaluation using peft(lora) weight #433

Closed
wants to merge 5 commits into from

Conversation

@SingL3 (Contributor) commented Jul 6, 2023

No description provided.

@SingL3 SingL3 changed the title Support evaluation using non-merged peft(lora) weight Support evaluation using peft(lora) weight Jul 6, 2023
Resolved review thread: TUTORIAL.md
@vchiley vchiley requested review from codestar12 and danbider July 6, 2023 14:50
@danbider (Contributor) left a comment


Overall LGTM, and thanks for this contribution.
My only comment is to first merge PR #435, which supports LoRA with FSDP, and then see whether the lora_merge step is still needed in the tutorial, the YAML, and the Python script.

Resolved review threads: TUTORIAL.md, scripts/eval/eval.py, scripts/eval/eval.py
@SingL3 (Contributor, Author) commented Jul 10, 2023

Hi, @danbider
The original purpose of this PR is actually to evaluate a model using unmerged LoRA weights; that is still not supported.

@danbider (Contributor) commented Jul 10, 2023

> Hi, @danbider The original purpose of this PR is to evaluate model using unmerged lora weight actually, now it is still not supported yet.

Definitely, @SingL3: this is a timely and needed PR for us. I was just questioning whether the "merge lora modules" step is needed for FSDP, since there's another PR that supports FSDP with LoRA.

@SingL3 (Contributor, Author) commented Jul 10, 2023

I see. Maybe I will just delete the merging part; as I said in the TUTORIAL, merging would cause slightly different evaluation results.
Will push later.
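The "slightly different evaluation results" come down to floating-point rounding order: folding the low-rank update BA into the base weight W before the matmul rounds at different points than applying the update separately as Wx + B(Ax). A minimal numpy sketch of the effect (illustrative only, not from this PR; half precision and the shapes here are arbitrary choices to make the rounding visible):

```python
import numpy as np

rng = np.random.default_rng(0)
# Base weight W and a rank-4 LoRA update B @ A, all in float16.
W = rng.standard_normal((16, 16)).astype(np.float16)
A = rng.standard_normal((4, 16)).astype(np.float16)
B = rng.standard_normal((16, 4)).astype(np.float16)
x = rng.standard_normal(16).astype(np.float16)

# Unmerged: apply the base weight and the low-rank update separately.
y_unmerged = W @ x + B @ (A @ x)

# Merged: fold B @ A into W first, rounding the summed weights to
# float16, then do a single matmul with the perturbed weights.
y_merged = (W + B @ A) @ x

# The two paths round at different points, so outputs typically
# differ by a small amount (on the order of float16 ulps).
diff = np.max(np.abs(y_unmerged.astype(np.float32) - y_merged.astype(np.float32)))
print(diff)
```

The two results agree to within float16 rounding, which is harmless for most purposes but enough to shift evaluation metrics slightly, hence the preference for evaluating with the unmerged weights directly.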

@SingL3 (Contributor, Author) commented Jul 11, 2023

Hi @danbider, I have deleted the merging part.

@danbider (Contributor) left a comment


Looks good to me. I'd like @vchiley or @codestar12 to take a look and merge if they see fit.

@SingL3 SingL3 requested a review from vchiley July 13, 2023 07:05
@danbider danbider mentioned this pull request Aug 10, 2023
@SingL3 (Contributor, Author) commented Aug 15, 2023

Closing this because of #515

@SingL3 SingL3 closed this Aug 15, 2023
3 participants