Fix failing test_batch_generation for bloom #25718

Conversation
@@ -449,9 +449,9 @@ def test_batch_generation(self):

        input_sentence = ["I enjoy walking with my cute dog", "I enjoy walking with my cute dog"]

        input_ids = tokenizer.batch_encode_plus(input_sentence, return_tensors="pt", padding=True)

Review comment on the last line above: not actually ids
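(For context on the comment: batch_encode_plus returns a dict-like BatchEncoding, not a tensor of ids, so the name input_ids is misleading here. A minimal sketch; gpt2 is just a stand-in tokenizer, the test itself runs on a bloom checkpoint:)

    from transformers import AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("gpt2")  # stand-in checkpoint
    out = tokenizer.batch_encode_plus(["I enjoy walking with my cute dog"], return_tensors="pt")
    print(type(out).__name__)  # BatchEncoding: a dict-like container, not a tensor of ids
    print(list(out.keys()))    # ['input_ids', 'attention_mask']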
The hunk continues:

        input_ids = input_ids["input_ids"].to(torch_device)

Review comment on the line above: no ids
And the last line of the hunk:

        attention_mask = input_ids["attention_mask"]

Review comment on the line above: no attention_mask in the actual ids
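Put together, that is why the CI failed: after the second line, input_ids has been overwritten by a plain tensor, so the string lookup on the third line blows up. A minimal repro outside the test suite (a sketch; gpt2 again stands in for the bloom checkpoint, .to(torch_device) is omitted, and the exact exception message depends on the torch version):

    from transformers import AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    tokenizer.pad_token = tokenizer.eos_token  # gpt2 defines no pad token by default

    input_sentence = ["I enjoy walking with my cute dog", "I enjoy walking with my cute dog"]
    input_ids = tokenizer.batch_encode_plus(input_sentence, return_tensors="pt", padding=True)
    input_ids = input_ids["input_ids"]            # the BatchEncoding is overwritten by a plain tensor
    attention_mask = input_ids["attention_mask"]  # raises here: a tensor cannot be indexed with a string key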
The fix renames the variable:

    -        input_ids = tokenizer.batch_encode_plus(input_sentence, return_tensors="pt", padding=True)
    +        inputs = tokenizer.batch_encode_plus(input_sentence, return_tensors="pt", padding=True)

Review comment: good name, and avoids the failure.
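Spelled out, the rename presumably ends up as below; only the first rewritten line is visible in the diff above, and the other two lines are a reconstruction of the obvious follow-through:

    inputs = tokenizer.batch_encode_plus(input_sentence, return_tensors="pt", padding=True)
    input_ids = inputs["input_ids"].to(torch_device)
    attention_mask = inputs["attention_mask"]

Keeping the BatchEncoding under its own name (inputs) means both input_ids and attention_mask can be pulled out of it without clobbering anything.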
😅 Nice catch!

I was caught by the failed CI 😢
The documentation is not available anymore as the PR was closed or merged.
fix (Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>)
fix (Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>)
fix (Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>)
What does this PR do?

#25571 changed this test, and there was a tiny issue: see my comments alongside the changes in this PR.

We should probably also rename these variables in the other places they appear, as housekeeping.

Also cc @gante 😄