
Commit

fix some
jinyouzhi committed Oct 9, 2023
1 parent 84810bf commit 2a44dfb
Showing 2 changed files with 3 additions and 3 deletions.
python/paddle/distributed/sharding/group_sharded.py (2 additions & 2 deletions)

@@ -77,7 +77,7 @@ def group_sharded_parallel(
     Examples:
         .. code-block:: python
-            >>> # doctest: +REQUIRES(env:distributed)
+            >>> # doctest: +REQUIRES(env:DISTRIBUTED)
             >>> import paddle
             >>> from paddle.nn import Linear
             >>> from paddle.distributed import fleet
@@ -196,7 +196,7 @@ def save_group_sharded_model(model, output, optimizer=None):
     Examples:
         .. code-block:: python
-            >>> # doctest: +REQUIRES(env:distributed)
+            >>> # doctest: +REQUIRES(env:DISTRIBUTED)
             >>> import paddle
             >>> from paddle.nn import Linear
             >>> from paddle.distributed import fleet
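The two hunks above normalize the environment-variable name in the `REQUIRES` doctest directive from lowercase `distributed` to uppercase `DISTRIBUTED`. As a rough sketch (assuming, as the directive's wording suggests, that these examples are gated on whether an environment variable named `DISTRIBUTED` is set), the check the directive performs amounts to a case-sensitive lookup, which is why the directive and the exported variable must agree on case:

```python
import os


def distributed_examples_enabled(environ=None):
    """Hypothetical helper mirroring a REQUIRES(env:DISTRIBUTED) gate.

    The example runs only when the named variable is present and truthy.
    (This is an illustrative sketch, not Paddle's actual implementation.)
    """
    environ = os.environ if environ is None else environ
    # Case-sensitive: exporting lowercase "distributed" would NOT satisfy
    # a check for "DISTRIBUTED".
    return environ.get("DISTRIBUTED", "") not in ("", "0")
```

For example, `distributed_examples_enabled({"distributed": "1"})` is false even though the lowercase variable is set, matching the mismatch this commit fixes.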
@@ -316,7 +316,7 @@ class DistributeTranspiler:
 ...     pserver_program = t.get_pserver_program(current_endpoint)
 ...     pserver_startup_program = t.get_startup_program(current_endpoint,
 ...                                                     pserver_program)
->>> elif role == "TRAINER":
+... elif role == "TRAINER":
 ...     trainer_program = t.get_trainer_program()
 >>> # for nccl2 mode
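The last hunk fixes a doctest prompt: the `elif` clause continues the `if` statement that precedes it, so it must carry the `...` continuation prompt rather than a fresh `>>>` prompt, otherwise the doctest parser treats it as a new, syntactically invalid statement. A minimal standalone illustration (using only the standard `doctest` module, independent of the Paddle code above):

```python
import doctest


def prompt_demo():
    """Continuation prompts in a multi-line doctest statement.

    Every clause of a compound statement after its first line, including
    elif and else, uses the '...' prompt:

    >>> role = "TRAINER"
    >>> if role == "PSERVER":
    ...     mode = "server"
    ... elif role == "TRAINER":
    ...     mode = "trainer"
    >>> mode
    'trainer'
    """


# Run the docstring examples programmatically; zero failures shows the
# '... elif' continuation parses and evaluates as one compound statement.
runner = doctest.DocTestRunner(verbose=False)
for test in doctest.DocTestFinder().find(prompt_demo, "prompt_demo"):
    runner.run(test)
```

Had the `elif` line started with `>>>`, doctest would try to execute `elif role == "TRAINER":` on its own and raise a `SyntaxError`, which is exactly the breakage this commit repairs.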
