commit bd9ad89a6dcdf8637d2e3b3ccef2ebcd5ba82926
author Andrew Gu <andgu@fb.com> Tue Dec 06 03:21:32 2022 +0000
committer PyTorch MergeBot <pytorchmergebot@users.noreply.github.com> Tue Dec 06 20:13:21 2022 +0000
tree e6991bd9df723d0980585a086ba4e32f84d76cce
parent ce21262808422e3c8814d0244f95a14492c02f11
[FSDP] Fix accidental change in `_test_fsdp_parity` (#90252)

I accidentally changed the semantics of this line when refactoring a while ago. The [previous version](https://github.com/pytorch/pytorch/pull/80873/files#diff-7b5c66f99161fa6a3d9042e80f8c8cc140a64e43445feede46f55e53154f6c3dL635) used to say:

```
if not mixed_precision:
```

which is actually the opposite of

```
if mixed_precision is not None:
```

Pull Request resolved: https://github.com/pytorch/pytorch/pull/90252
Approved by: https://github.com/zhaojuanmao
diff --git a/torch/testing/_internal/common_fsdp.py b/torch/testing/_internal/common_fsdp.py
index 0c40630..405d8a5 100644
--- a/torch/testing/_internal/common_fsdp.py
+++ b/torch/testing/_internal/common_fsdp.py
@@ -1029,7 +1029,7 @@
         # the DDP parameters are in FP16 (from `half()`) while the FSDP
         # parameters are in FP32 (from `summon_full_params()`) and (2) DDP runs
         # the optimizer in FP16 while FSDP runs it in FP32
-        if mixed_precision is not None:
+        if mixed_precision is None:
             self.assertEqual(
                 ddp_params,
                 fsdp_unsharded_params,
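The semantic flip the commit message describes can be checked in isolation. A minimal sketch, assuming `mixed_precision` is either `None` or some truthy config object (a plain `object()` stands in here for a `MixedPrecision` config):

```python
# For a value that is either None or a truthy config object:
#   `not mixed_precision`        is True exactly when the value is None,
#   so it agrees with            `mixed_precision is None`
#   and is the opposite of      `mixed_precision is not None`.
for mixed_precision in (None, object()):
    # the old `if not mixed_precision:` matches the fixed `is None` check
    assert (not mixed_precision) == (mixed_precision is None)
    # and is the negation of the accidentally-introduced check
    assert (not mixed_precision) != (mixed_precision is not None)
```

This equivalence relies on the config object being truthy; a falsy non-`None` value would make `not x` and `x is None` disagree, which is why the explicit `is None` form in the fix is the safer spelling.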