
[Docathon][Update Doc No.28] add the LRScheduler #7182

Merged · 2 commits · Mar 31, 2025

3 changes: 3 additions & 0 deletions docs/api_guides/low_level/layers/learning_rate_scheduler.rst
@@ -67,3 +67,6 @@

* :code:`CosineAnnealingWarmRestarts`: Cosine annealing learning rate, i.e. the learning rate varies periodically with the number of steps following a cosine function.
For the related API Reference, please refer to :ref:`cn_api_paddle_optimizer_lr_CosineAnnealingWarmRestarts`

* :code:`LRScheduler`: Base class for learning rate schedules; all concrete schedules inherit from it, and a custom schedule is implemented by overriding the get_lr() method.
For the related API Reference, please refer to :ref:`_cn_api_paddle_optimizer_lr_LRScheduler`
Collaborator

Suggested change
For the related API Reference, please refer to :ref:`_cn_api_paddle_optimizer_lr_LRScheduler`
For the related API Reference, please refer to :ref:`cn_api_paddle_optimizer_lr_LRScheduler`

The _cn_api_paddle_optimizer_lr_CyclicLR and _cn_api_paddle_optimizer_lr_LinearLR references above have the same problem.

Please check the preview link preview-pr-7182 and fix those together as well.
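
For reference, the reST convention behind this suggestion: the leading underscore belongs to the label definition (written as ``.. _cn_api_paddle_optimizer_lr_LRScheduler:``), while the :ref: role refers to the label name without the underscore.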

Contributor Author

Done

@@ -48,3 +48,5 @@ The following content describes the APIs related to the learning rate scheduler:
* :code:`LinearLR`: Linear scaling of the learning rate. That is, the learning rate is first multiplied by start_factor and then increases linearly to the end learning rate. For the related API Reference, please refer to :ref:`api_paddle_optimizer_lr_LinearLR`

* :code:`CosineAnnealingWarmRestarts`: Cosine annealing. That is, the learning rate varies periodically with the number of steps following a cosine function. For the related API Reference, please refer to :ref:`api_paddle_optimizer_lr_CosineAnnealingWarmRestarts`

* :code:`LRScheduler`: Base class for learning rate schedules. All of the concrete strategies above inherit from it, and a custom schedule is implemented by subclassing it and overriding the get_lr() method (see the sketch below). For the related API Reference, please refer to :ref:`api_paddle_optimizer_lr_LRScheduler`
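
As a rough illustration of that pattern (a sketch only, not part of this PR; the ``HalveEveryN`` name and its ``n`` parameter are invented for the example, assuming the public ``paddle.optimizer.lr.LRScheduler`` API)::

    import paddle

    class HalveEveryN(paddle.optimizer.lr.LRScheduler):
        """Hypothetical schedule: halve the base learning rate every ``n`` epochs."""

        def __init__(self, learning_rate, n, last_epoch=-1, verbose=False):
            # Set custom attributes before calling the base __init__,
            # because the base class evaluates get_lr() during construction.
            self.n = n
            super().__init__(learning_rate, last_epoch, verbose)

        def get_lr(self):
            # self.base_lr and self.last_epoch are maintained by LRScheduler.
            return self.base_lr * (0.5 ** (self.last_epoch // self.n))

    scheduler = HalveEveryN(learning_rate=0.1, n=10)
    model = paddle.nn.Linear(4, 4)
    opt = paddle.optimizer.SGD(learning_rate=scheduler, parameters=model.parameters())
    # Call scheduler.step() once per epoch in the training loop so that
    # get_lr() is re-evaluated with the updated last_epoch.

With this wiring, the optimizer reads the current learning rate from the scheduler at each parameter update, which is the same mechanism used by the built-in schedules listed above.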