Commit
fix get_offset_mapping error for Ernie tokenizer (#2857)
yingyibiao authored Jul 25, 2022
1 parent 4d18e4d commit e568e07
Showing 1 changed file with 1 addition and 1 deletion.
paddlenlp/transformers/tokenizer_utils.py (1 addition, 1 deletion)
@@ -1320,7 +1320,7 @@ def get_offset_mapping(self, text):
         if text is None:
             return None
         split_tokens = []
-        if self.do_basic_tokenize:
+        if hasattr(self, "basic_tokenizer"):
             for token in self.basic_tokenizer.tokenize(
                     text, never_split=self.all_special_tokens):
                 # If the token is part of the never_split set
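
For context, a minimal, self-contained sketch of why the one-line change helps. The class and method names below are hypothetical stand-ins, not the actual PaddleNLP API: the idea is that a tokenizer in the Ernie family can own a basic_tokenizer without ever defining a do_basic_tokenize attribute, so the old attribute access raises AttributeError, while hasattr() tests for the attribute the code actually goes on to use.

# Sketch only: illustrates the pre-fix vs. post-fix check, not PaddleNLP itself.

class _BasicTokenizer:
    def tokenize(self, text, never_split=None):
        # Crude whitespace split stands in for real basic tokenization.
        return text.split()


class _ErnieLikeTokenizer:
    """Hypothetical tokenizer: has basic_tokenizer, lacks do_basic_tokenize."""

    all_special_tokens = ["[CLS]", "[SEP]"]

    def __init__(self):
        self.basic_tokenizer = _BasicTokenizer()

    def split_for_offsets_old(self, text):
        # Pre-fix logic: raises AttributeError for this tokenizer.
        if self.do_basic_tokenize:
            return self.basic_tokenizer.tokenize(
                text, never_split=self.all_special_tokens)
        return [text]

    def split_for_offsets_new(self, text):
        # Post-fix logic: only depends on the attribute that is actually used.
        if hasattr(self, "basic_tokenizer"):
            return self.basic_tokenizer.tokenize(
                text, never_split=self.all_special_tokens)
        return [text]


tok = _ErnieLikeTokenizer()
print(tok.split_for_offsets_new("hello world"))   # ['hello', 'world']
try:
    tok.split_for_offsets_old("hello world")
except AttributeError as exc:
    print("old check fails:", exc)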
