Make one_hot non-differentiable. (#19430)

Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/19430
ghimport-source-id: 6787473873fdc21400138a4322e17fee8db62607

Differential Revision: D15003382

Pulled By: gchanan

fbshipit-source-id: e9244c7a5f0ad7cd2f79635944a8b37f910231c9
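
For context, a minimal sketch of the behavior this entry governs, using `torch.nn.functional.one_hot` (the Python binding for the `one_hot` op named in the derivatives.yaml entry below). `one_hot` accepts only integer (Long) tensors, which can never require grad, so its output is detached from autograd and carries no `grad_fn`:

```python
import torch
import torch.nn.functional as F

# one_hot takes a LongTensor of class indices; integer tensors cannot
# require grad, so the result participates in no autograd graph.
labels = torch.tensor([0, 2, 1])
out = F.one_hot(labels, num_classes=3)

print(out.requires_grad)  # False: one_hot is non-differentiable
print(out.grad_fn)        # None: no backward node is recorded
print(out.tolist())       # [[1, 0, 0], [0, 0, 1], [0, 1, 0]]
```

Declaring `self: non_differentiable` tells the autograd codegen not to generate a differentiable variant for this op, rather than silently producing zero gradients.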
diff --git a/tools/autograd/derivatives.yaml b/tools/autograd/derivatives.yaml
index 50e3226..d1d8ef7 100644
--- a/tools/autograd/derivatives.yaml
+++ b/tools/autograd/derivatives.yaml
@@ -817,6 +817,9 @@
 - name: t(Tensor self)
   self: grad.t()
 
+- name: one_hot(Tensor self, int64_t num_classes)
+  self: non_differentiable
+
 - name: flip(Tensor self, IntArrayRef dims)
   self: grad.flip(dims)