Fix docstring for nn.Softplus (#70576)

Summary:
Fixes the nn.Softplus docstring issue reported at https://github.com/pytorch/pytorch/issues/70498.
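
For reference, the formula kept in the updated docstring can be sketched in plain Python (a pure-math sketch of the element-wise definition, not PyTorch's actual implementation, which also applies a linear threshold for numerical stability):

```python
import math

def softplus(x, beta=1.0):
    # Softplus(x) = (1/beta) * log(1 + exp(beta * x)),
    # a smooth approximation to ReLU whose output is always positive.
    return (1.0 / beta) * math.log1p(math.exp(beta * x))

# At x = 0 this gives log(2); for large positive x it approaches x itself.
```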

Pull Request resolved: https://github.com/pytorch/pytorch/pull/70576

Reviewed By: VitalyFedyunin

Differential Revision: D33407444

Pulled By: albanD

fbshipit-source-id: 7f1f438afb1a1079d30e0c4741aa609c5204329f
diff --git a/torch/nn/modules/activation.py b/torch/nn/modules/activation.py
index 9901e8f..0f2c34a 100644
--- a/torch/nn/modules/activation.py
+++ b/torch/nn/modules/activation.py
@@ -782,10 +782,8 @@
 
 
 class Softplus(Module):
-    r"""Applies the element-wise function:
-
-    .. math::
-        \text{Softplus}(x) = \frac{1}{\beta} * \log(1 + \exp(\beta * x))
+    r"""Applies the Softplus function :math:`\text{Softplus}(x) = \frac{1}{\beta} *
+    \log(1 + \exp(\beta * x))` element-wise.
 
     SoftPlus is a smooth approximation to the ReLU function and can be used
     to constrain the output of a machine to always be positive.