Update tensorflow/python/keras/activations.py
diff --git a/tensorflow/python/keras/activations.py b/tensorflow/python/keras/activations.py
index 3b9d728..ee8f2c8 100644
--- a/tensorflow/python/keras/activations.py
+++ b/tensorflow/python/keras/activations.py
@@ -204,8 +204,7 @@
   Swish activation function which returns `x*sigmoid(x)`.
-  It is a smooth, non-monotonic function that consistently matches 
-  or outperforms ReLU on deep networks, it is unbounded above and 
-  bounded below & it is the non-monotonic attribute that actually
-  creates the difference.
+  It is a smooth, non-monotonic function that consistently matches
+  or outperforms ReLU on deep networks. It is unbounded above and
+  bounded below.
   
   
   Example Usage:
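For reference, the formula the docstring describes, `x*sigmoid(x)`, can be sketched in plain Python using only the standard library. This is an illustrative standalone sketch, not the TensorFlow implementation (which lives in `tf.nn.swish` and operates on tensors):

```python
import math

def swish(x):
    # swish(x) = x * sigmoid(x)
    # Smooth and non-monotonic; unbounded above, bounded below.
    return x * (1.0 / (1.0 + math.exp(-x)))

# At zero the function passes through the origin,
# and for large positive x it approaches x itself.
print(swish(0.0))   # 0.0
print(swish(10.0))  # ~9.9995
```

For large negative inputs the sigmoid factor drives the output toward zero from below, which is why the function is bounded below despite being unbounded above.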