
Fix docs #7054

Open · wants to merge 6 commits into develop

Conversation

@Xuxuanang (Contributor) commented Feb 24, 2025

Align and revise the API mapping docs.

paddle-bot (bot) commented Feb 24, 2025

Thanks for contributing to the PaddlePaddle docs. The documentation preview is building; once the Docs-New job finishes, it will be available at: http://preview-pr-7054.paddle-docs-preview.paddlepaddle.org.cn/documentation/docs/zh/api/index_cn.html
For more on the preview tool, see the PaddlePaddle docs preview tool guide (飞桨文档预览工具).

@zhwesky2010 (Collaborator) commented:

Align and revise the API mapping docs.

Please link the corresponding PR.

@@ -19,7 +19,7 @@ paddle.distributed.fleet.distributed_optimizer(optimizer, strategy=None)

 | PyTorch         | PaddlePaddle | Notes |
 | --------------- | ------------ | ----------------------------------------------- |
-| optimizer_class | optimizer    | The optimizer. |
+| optimizer_class | optimizer    | The optimizer; only the parameter name differs. |
Collaborator comment on this diff:

This usage is not actually consistent: torch's optimizer_class is a class, and torch builds the optimizer instance itself from the three arguments optimizer_class, args, and kwargs, whereas Paddle's optimizer parameter takes an already-constructed optimizer instance directly.
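To make the difference concrete, here is a minimal sketch of the two calling conventions. It assumes the PyTorch side of this mapping is torch.distributed.optim.DistributedOptimizer (the PyTorch API name is not shown in this diff), and params_rref / model are illustrative placeholders rather than anything from the PR:

# PyTorch: pass the optimizer *class*; the wrapper constructs the instance
# internally from optimizer_class, *args and **kwargs (requires an
# initialized torch.distributed.rpc context).
import torch
from torch.distributed.optim import DistributedOptimizer

dist_optim = DistributedOptimizer(
    torch.optim.SGD,   # optimizer_class: a class, not an instance
    params_rref,       # placeholder: RRefs of the parameters to optimize
    lr=0.05,           # forwarded to torch.optim.SGD(...)
)

# PaddlePaddle: build the optimizer instance yourself, then wrap it
# (fleet.init() would be called beforehand in a real distributed setup).
import paddle
from paddle.distributed import fleet

model = paddle.nn.Linear(4, 4)
opt = paddle.optimizer.SGD(learning_rate=0.05, parameters=model.parameters())
opt = fleet.distributed_optimizer(opt, strategy=None)  # optimizer: an instance

On the Paddle side the user constructs the optimizer and the wrapper receives the instance, while on the torch side construction is deferred to the wrapper, which is why the remark should describe a usage difference rather than just a parameter-name difference.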
