| commit | 0b18ed1c4717a55baed652057e57a30b58b0930f | |
|---|---|---|
| author | Andrew Gu <andgu@fb.com> | Thu Feb 29 10:30:22 2024 -0800 |
| committer | PyTorch MergeBot <pytorchmergebot@users.noreply.github.com> | Fri Mar 01 18:40:30 2024 +0000 |
| tree | 66d07446545dca066c29790c9cf23fda73adb32d | |
| parent | f01a23d01b5f87e52ead52f32e29ed644b256cc8 | |
[FSDP] Added warning about unsupported double backwards (#120926)

Pull Request resolved: https://github.com/pytorch/pytorch/pull/120926
Approved by: https://github.com/Skylion007
diff --git a/torch/distributed/fsdp/fully_sharded_data_parallel.py b/torch/distributed/fsdp/fully_sharded_data_parallel.py
index 0a997d7..8152438 100644
--- a/torch/distributed/fsdp/fully_sharded_data_parallel.py
+++ b/torch/distributed/fsdp/fully_sharded_data_parallel.py
@@ -247,6 +247,10 @@
         the all-reduce times over the replication process group for some
         cluster setups.
 
+    .. warning::
+        FSDP does not work with double backwards due to how it registers
+        backward hooks.
+
     Args:
         module (nn.Module):
             This is the module to be wrapped with FSDP.
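The "double backwards" this warning refers to is differentiating through gradients, i.e. a first backward pass run with `create_graph=True` followed by a second backward over the resulting gradients (as in gradient penalties). The sketch below is only an illustration of that pattern and is not part of the commit; the `nn.Linear` model, shapes, and gradient-penalty loss are made up, and the FSDP wrapping is shown in a comment because the unsupported path only arises once the module is wrapped in a distributed job.

```python
# Minimal sketch (assumed example, not from this commit) of the double-backwards
# pattern that FSDP's backward hooks do not support.
import torch
import torch.nn as nn

model = nn.Linear(4, 4)  # in a distributed job this would be FSDP(model)
x = torch.randn(2, 4, requires_grad=True)
loss = model(x).sum()

# First backward with create_graph=True keeps a graph over the gradients.
(grad_x,) = torch.autograd.grad(loss, x, create_graph=True)

# Second backward ("double backwards") differentiates through grad_x.
penalty = grad_x.pow(2).sum()
penalty.backward()  # with an FSDP-wrapped module, this second pass is unsupported
```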