Corrected default learning rate in Adam docstring (#697)

diff --git a/torch/optim/adam.py b/torch/optim/adam.py
index 38756ed..2129de8 100644
--- a/torch/optim/adam.py
+++ b/torch/optim/adam.py
@@ -10,7 +10,7 @@
     Arguments:
         params (iterable): iterable of parameters to optimize or dicts defining
             parameter groups
-        lr (float, optional): learning rate (default: 1e-2)
+        lr (float, optional): learning rate (default: 1e-3)
         betas (Tuple[float, float], optional): coefficients used for computing
             running averages of gradient and its square (default: (0.9, 0.999))
         eps (float, optional): term added to the denominator to improve
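
For reference, a minimal sketch (assuming a standard PyTorch install) showing that the constructor's actual default matches the corrected docstring value of 1e-3:

```python
import torch
import torch.optim as optim

# A throwaway parameter so the optimizer has something to manage.
param = torch.nn.Parameter(torch.zeros(1))

# Construct Adam without passing lr; the constructor default is 1e-3,
# which is what the corrected docstring now states (the old text said 1e-2).
optimizer = optim.Adam([param])
print(optimizer.defaults["lr"])  # 0.001
```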