
update optimizer #26711

Merged
@MRXLT merged 22 commits into PaddlePaddle:develop from MRXLT:2.0-qa on Sep 1, 2020

Conversation

@MRXLT (Contributor) commented on Aug 26, 2020

PR types: New features

PR changes: Ops

Describe:

* Fix the error raised when AdamW's step is executed multiple times (see the sketch after this list)
* Replace LearningRateDecay with LRScheduler
* Fix the error in the load op when no params file is present
* Add parameter checks
* Fix an incorrect computation in AdamW
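
The AdamW and scheduler changes are easiest to see in code. Below is a minimal sketch of the intended usage, assuming the paddle 2.0 dygraph API this PR targets; the StepDecay scheduler, shapes, and values are illustrative assumptions, not taken from the diff.

```python
import paddle

# Assumed paddle 2.0 API: an LRScheduler instance is passed as learning_rate,
# replacing the old LearningRateDecay classes.
scheduler = paddle.optimizer.lr.StepDecay(
    learning_rate=0.01, step_size=2, gamma=0.5)

linear = paddle.nn.Linear(10, 10)
opt = paddle.optimizer.AdamW(
    learning_rate=scheduler,
    parameters=linear.parameters(),  # a list of Tensors, not names
    weight_decay=0.01)

# Calling step() several times in a row is the case the fix covers.
for _ in range(3):
    x = paddle.rand([4, 10])
    loss = paddle.mean(linear(x))
    loss.backward()
    opt.step()
    opt.clear_grad()
    scheduler.step()  # advance the learning-rate schedule
```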

Update the optimizer-related documentation:

* change the minimize sample code to the dygraph version
* add a parameter to set_lr
* parameters takes a list[Tensor] rather than a list[Tensor.name] (see the sketch below)
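
A minimal sketch of the documented dygraph usage, again assuming the paddle 2.0 API (the model, shapes, and values are illustrative):

```python
import paddle

linear = paddle.nn.Linear(10, 10)
adam = paddle.optimizer.Adam(
    learning_rate=0.1,
    parameters=linear.parameters())  # list[Tensor], not list[Tensor.name]

x = paddle.rand([4, 10])
loss = paddle.mean(linear(x))
loss.backward()
adam.minimize(loss)  # dygraph-style minimize, called after backward()
adam.clear_grad()

adam.set_lr(0.01)     # set_lr takes the new learning-rate value
print(adam.get_lr())  # 0.01
```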

Chinese docs PR: PaddlePaddle/docs#2424


@paddle-bot-old commented:

Thanks for your contribution!
Please wait for the CI results first. See the Paddle CI Manual for details.

@MRXLT changed the title from 2.0 qa to update optimizer on Aug 26, 2020
Two resolved review threads on python/paddle/optimizer/optimizer.py (outdated).
@MRXLT mentioned this pull request on Aug 28, 2020
@XiaoguangHu01 (Contributor) left a comment:

LGTM

@TCChenlong (Contributor) left a comment:

LGTM

@MRXLT merged commit 1f36d3c into PaddlePaddle:develop on Sep 1, 2020
@MRXLT deleted the 2.0-qa branch on Sep 1, 2020 at 05:29
MRXLT added a commit to MRXLT/Paddle that referenced this pull request Sep 2, 2020
* update doc
* update doc
* fix optimizer sample code
* add default value for adamw weight_decay
* fix adamw
* change LearningRateDecay to _LRScheduler
* fix adamw;notest
* fix load;notest
* remove file
* bug fix
* fix code style
* bug fix
* add ut
* adamw support weight_decay=0
* fix ut
* fix set_lr doc
* fix doc
* change parameters place

MRXLT added a commit that referenced this pull request Sep 4, 2020
* update optimizer (#26711)
* update doc
* update doc
* fix optimizer sample code
* add default value for adamw weight_decay
* fix adamw
* change LearningRateDecay to _LRScheduler
* fix adamw;notest
* fix load;notest
* remove file
* bug fix
* fix code style
* bug fix
* add ut
* adamw support weight_decay=0
* fix ut
* fix set_lr doc
* fix doc
* change parameters place
* fix sample code