[Tokenizer] Support reading Tiktoken tokenizer.model. #9215
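For context on what reading a Tiktoken-format tokenizer.model involves: the file is plain text, with one base64-encoded token and its integer rank per line. The sketch below is a minimal, generic illustration of parsing such a file and building a tiktoken Encoding from it; it is not this PR's implementation, and the split pattern and special token are illustrative assumptions.

```python
# Minimal sketch, not this PR's implementation: read a Tiktoken-format
# tokenizer.model (one "<base64 token> <rank>" pair per line) and build an
# Encoding. The pat_str and special token below are illustrative assumptions.
import base64

import tiktoken


def load_tiktoken_ranks(path: str) -> dict[bytes, int]:
    # Same file format that tiktoken.load.load_tiktoken_bpe expects.
    ranks = {}
    with open(path, "rb") as f:
        for line in f:
            if not line.strip():
                continue
            token_b64, rank = line.split()
            ranks[base64.b64decode(token_b64)] = int(rank)
    return ranks


mergeable_ranks = load_tiktoken_ranks("tokenizer.model")
encoding = tiktoken.Encoding(
    name="tokenizer_model",
    # cl100k-style split pattern, used here only as a placeholder.
    pat_str=r"""(?i:'s|'t|'re|'ve|'m|'ll|'d)|[^\r\n\p{L}\p{N}]?\p{L}+|\p{N}{1,3}| ?[^\s\p{L}\p{N}]+[\r\n]*|\s*[\r\n]+|\s+(?!\S)|\s+""",
    mergeable_ranks=mergeable_ranks,
    special_tokens={"<|endoftext|>": len(mergeable_ranks)},
)

print(encoding.encode("hello world"))
```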
Codecov / codecov/patch warnings (patch lines not covered by tests):
paddlenlp/transformers/auto/configuration.py: L217-L220, L237, L249-L250, L256, L259, L262, L265, L449-L450
paddlenlp/transformers/auto/tokenizer.py: L42, L141, L156-L157, L159-L162, L166-L168, L170, L374, L385-L386, L391, L406, L415, L419
Shouldn't this error type be EntryNotFoundError?
This part was already like this before my change.
It was probably written incorrectly back then; this error type can be changed.
If we raise EntryNotFoundError there, then the earlier except clause that catches EntryNotFoundError would no longer be needed. There was presumably a reason it was written this way in the first place (I assume).
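To make the trade-off concrete, here is a minimal sketch of the pattern being discussed. It is not the actual PaddleNLP code: resolve_file is a hypothetical stand-in for the real file-resolution helper, the file name is illustrative, and EntryNotFoundError is assumed to be the one from huggingface_hub.

```python
# Minimal sketch of the error-handling pattern discussed above; names are
# hypothetical and not taken from the PaddleNLP sources.
from huggingface_hub.utils import EntryNotFoundError


def resolve_file(name_or_path: str, filename: str) -> str:
    # Hypothetical helper: the real code would look the file up locally or
    # download it from a hub; here it always reports a missing entry.
    raise EntryNotFoundError(f"{filename} not found for {name_or_path}")


def get_tokenizer_config_file(name_or_path: str) -> str:
    try:
        config_file = resolve_file(name_or_path, "tokenizer_config.json")
    except EntryNotFoundError:
        # Current behaviour: swallow the missing-file error here and signal
        # the failure below with a different, generic error type.
        config_file = None

    if config_file is None:
        # Option A (current): raise a generic error such as RuntimeError.
        # Option B (reviewer's question): raise EntryNotFoundError instead; but
        # then the except clause above, whose only job is to turn that same
        # exception into the None checked here, becomes redundant -- which is
        # the point made in the last comment.
        raise RuntimeError(f"Cannot find tokenizer_config.json for '{name_or_path}'.")
    return config_file
```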