
[Bug fixes] fix config attributes backward compatibility #4237

Merged
merged 8 commits into from
Dec 26, 2022

Conversation

wj-Mcat
Contributor

@wj-Mcat wj-Mcat commented Dec 26, 2022

PR types

Bug fixes

PR changes

APIs

Description

Fix backward compatibility for config attribute fields.

@paddle-bot

paddle-bot bot commented Dec 26, 2022

Thanks for your contribution!

Comment on lines 273 to 288
def __getattribute__(self, name: str):
    if not self.constructed_from_pretrained_config():
        raise AttributeError(f"'{type(self)}' object has no attribute '{name}'")

    # FIXME(wj-Mcat): for details, please refer to: https://github.com/PaddlePaddle/PaddleNLP/pull/4201#discussion_r1057063402
    # this condition branch code will be removed later.
    if name in self.config.attribute_map:
        logger.warning(f"do not access config from `model.{name}`, you should use: `model.config.{name}`")
        return self.config[name]

    if name in self.config.standard_config_map:
        logger.warning(f"do not access config from `model.{name}`, you should use: `model.config.{name}`")
        return self.config[name]

    return super().__getattribute__(name)

Contributor Author

Both `__getattr__` and `__getattribute__` run into recursive attribute lookup here. In theory `__getattr__` should only be entered when an attribute does not exist, but in my tests it was entered even when the attribute does exist, which is rather strange.

Test code:

from paddlenlp.transformers import BertConfig, BertModel
config = BertConfig.from_pretrained('__internal_testing__/bert')
model = BertModel(config)
a = model.embeddings
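A plausible explanation for the behavior observed above: `paddle.nn.Layer` (like `torch.nn.Module`) keeps sublayers and parameters in internal side dicts rather than in the instance `__dict__`, so normal attribute lookup fails for them and `__getattr__` is entered even though the attribute "exists". A minimal self-contained sketch of that bookkeeping pattern (the `MiniLayer` class below is a hypothetical toy, not PaddleNLP code):

```python
class MiniLayer:
    """Toy stand-in for paddle.nn.Layer's sublayer bookkeeping."""

    def __init__(self):
        # Sublayers live in a side dict, not in self.__dict__.
        object.__setattr__(self, "_sub_layers", {})

    def __setattr__(self, name, value):
        if isinstance(value, MiniLayer):
            # Diverted away from __dict__, just like real Layer modules do.
            self._sub_layers[name] = value
        else:
            object.__setattr__(self, name, value)

    def __getattr__(self, name):
        # Reached whenever normal lookup fails -- which includes every
        # sublayer, since they were diverted into _sub_layers above.
        sub_layers = object.__getattribute__(self, "_sub_layers")
        if name in sub_layers:
            return sub_layers[name]
        raise AttributeError(name)


outer = MiniLayer()
outer.embeddings = MiniLayer()
# `embeddings` is not in __dict__, so it is resolved through __getattr__:
assert "embeddings" not in outer.__dict__
assert isinstance(outer.embeddings, MiniLayer)
```

Under this model, `model.embeddings` always routes through `__getattr__`, so any config fallback placed there fires for sublayers too, which matches the observation in the comment.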

@wj-Mcat wj-Mcat marked this pull request as ready for review December 26, 2022 07:39
try:
    return super(PretrainedModel, self).__getattr__(name)
except AttributeError:
    result = getattr(self.config, name)
Collaborator

So attribute mappings like `num_classes -> num_labels` are now delegated to the underlying `PretrainedConfig`, right?

Contributor Author

The logic here is: if the attribute cannot be found on the model, fall back to fetching it from the config, and emit a warning.
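That fallback can be sketched in isolation as follows (hypothetical `FallbackModel` and `Cfg` names; a simplified stand-in for the actual PaddleNLP implementation, using `warnings` in place of its logger):

```python
import warnings


class FallbackModel:
    """Sketch: resolve missing attributes on self.config with a warning."""

    def __init__(self, config):
        self.config = config

    def __getattr__(self, name):
        # Only reached when normal lookup on the model itself fails.
        # Read config via __dict__ to avoid recursing into __getattr__.
        try:
            value = getattr(self.__dict__["config"], name)
        except (KeyError, AttributeError):
            raise AttributeError(
                f"'{type(self).__name__}' object has no attribute '{name}'"
            )
        warnings.warn(
            f"do not access config from `model.{name}`, "
            f"you should use: `model.config.{name}`"
        )
        return value


class Cfg:
    num_labels = 2


model = FallbackModel(Cfg())
with warnings.catch_warnings():
    warnings.simplefilter("ignore")
    assert model.num_labels == 2
```

Since `__getattr__` only runs after normal lookup fails, genuine model attributes (weights, sublayers resolved by the framework base class) are unaffected; only legacy config-style accesses take the warned fallback path.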

Comment on lines 313 to 315
model = ErnieModel(config)
model.eval()
assert model.num_classes == config.num_labels
Collaborator

Could this be written into the common tests, so that for every model class (e.g. `ErnieModel`, `ErnieForQuestionAnswering`) all entries in `attribute_map` and `standard_config_map` are verified to be mapped correctly? Then configure it as a one-switch turn-on test.
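One shape such a common test could take, sketched with fake classes (the real version would iterate over PaddleNLP's actual model classes and their `attribute_map`/`standard_config_map` entries; all names below are hypothetical):

```python
class FakeConfig:
    """Stand-in for PretrainedConfig with a legacy attribute_map."""

    attribute_map = {"num_classes": "num_labels"}

    def __init__(self):
        self.num_labels = 3

    def __getattr__(self, name):
        # Resolve legacy names through attribute_map, as the real
        # PretrainedConfig is described as doing.
        if name in type(self).attribute_map:
            return getattr(self, type(self).attribute_map[name])
        raise AttributeError(name)


class FakeModel:
    """Stand-in for a model that falls back to its config."""

    def __init__(self, config):
        self.config = config

    def __getattr__(self, name):
        return getattr(self.__dict__["config"], name)


def check_attribute_map(model_cls, config_cls):
    """Assert every legacy name maps to its canonical config value."""
    config = config_cls()
    model = model_cls(config)
    for legacy, canonical in config_cls.attribute_map.items():
        assert getattr(model, legacy) == getattr(config, canonical)


# Parametrize over (model_cls, config_cls) pairs to get the "one-switch"
# coverage suggested above:
for model_cls, config_cls in [(FakeModel, FakeConfig)]:
    check_attribute_map(model_cls, config_cls)
```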

Contributor Author

Sure.

@codecov

codecov bot commented Dec 26, 2022

Codecov Report

Merging #4237 (70a8828) into develop (706971b) will increase coverage by 0.90%.
The diff coverage is 100.00%.

@@             Coverage Diff             @@
##           develop    #4237      +/-   ##
===========================================
+ Coverage    35.08%   35.98%   +0.90%     
===========================================
  Files          408      419      +11     
  Lines        57029    59060    +2031     
===========================================
+ Hits         20007    21254    +1247     
- Misses       37022    37806     +784     
Impacted Files Coverage Δ
paddlenlp/transformers/model_utils.py 73.16% <100.00%> (+0.04%) ⬆️
paddlenlp/transformers/roberta/modeling.py 89.85% <0.00%> (-0.37%) ⬇️
paddlenlp/transformers/feature_extraction_utils.py 27.02% <0.00%> (-0.19%) ⬇️
paddlenlp/transformers/bit/configuration.py 88.57% <0.00%> (ø)
paddlenlp/transformers/image_transforms.py 12.93% <0.00%> (ø)
paddlenlp/utils/initializer.py 48.76% <0.00%> (ø)
paddlenlp/transformers/dpt/configuration.py 86.88% <0.00%> (ø)
paddlenlp/transformers/image_processing_utils.py 25.53% <0.00%> (ø)
paddlenlp/transformers/dpt/modeling.py 91.60% <0.00%> (ø)
... and 13 more


@sijunhe sijunhe changed the title [Bug fixes] fix config comp [Bug fixes] fix config attributes backward compatibility Dec 26, 2022
Collaborator

@sijunhe sijunhe left a comment


lgtm

@sijunhe sijunhe merged commit e6a1bca into PaddlePaddle:develop Dec 26, 2022
@wj-Mcat wj-Mcat deleted the fix-num-classes-comp branch December 27, 2022 05:22