[ONNX] Update scripting docs (#54634) (#54868)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/54868
* Updating docs for scripting
* Rebase
* Fix formatting
Test Plan: Imported from OSS
Reviewed By: nikithamalgifb
Differential Revision: D27408980
Pulled By: SplitInfinity
fbshipit-source-id: 2b176a5a746c1a2369be1940d84e6491a1ecd015
diff --git a/docs/source/onnx.rst b/docs/source/onnx.rst
index eb17a0f..2aeca59 100644
--- a/docs/source/onnx.rst
+++ b/docs/source/onnx.rst
@@ -201,9 +201,9 @@
%loop_range : Long()):
%2 : Long() = onnx::Constant[value={1}](), scope: LoopModel2/loop
%3 : Tensor = onnx::Cast[to=9](%2)
- %4 : Long(2, 3) = onnx::Loop(%loop_range, %3, %input_data), scope: LoopModel2/loop # custom_loop.py:240:5
+ %4 : Long(2, 3) = onnx::Loop(%loop_range, %3, %input_data), scope: LoopModel2/loop
block0(%i.1 : Long(), %cond : bool, %x.6 : Long(2, 3)):
- %8 : Long(2, 3) = onnx::Add(%x.6, %i.1), scope: LoopModel2/loop # custom_loop.py:241:13
+ %8 : Long(2, 3) = onnx::Add(%x.6, %i.1), scope: LoopModel2/loop
%9 : Tensor = onnx::Cast[to=9](%2)
-> (%9, %8)
return (%4)
@@ -249,9 +249,41 @@
out = model(*inputs)
torch.onnx.export(model, inputs, 'loop_and_list.onnx', opset_version=11, example_outputs=out)
+
+Type Annotations
+----------------
+TorchScript only supports a subset of Python types. You can find more details about type annotations
+`here <https://pytorch.org/docs/stable/jit_language_reference.html#id8>`_.
+
+For optimization purposes, TorchScript only supports variables with a single static type in script functions.
+By default, each variable is assumed to be a Tensor. If an argument to a ScriptModule function is not a Tensor,
+its type should be specified using MyPy-style type annotations.
+
+::
+
+ import torch
+
+ class Module(torch.nn.Module):
+ def forward(self, x, tup):
+ # type: (int, Tuple[Tensor, Tensor]) -> Tensor
+ t0, t1 = tup
+ return t0 + t1 + x
+
+If the type annotation is not specified, the TorchScript compiler fails with a runtime error like the one below.
+
+::
+
+ RuntimeError:
+ Tensor (inferred) cannot be used as a tuple:
+ File <filename>
+ def forward(self, x, tup):
+ t0, t1 = tup
+ ~~~ <--- HERE
+ return t0 + t1 + x
+
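+With the annotation in place, the module compiles cleanly. The snippet below is a minimal
+sketch (the sample inputs are illustrative only):
+
+::
+
+    # Scripting succeeds because the type comment tells the compiler that
+    # tup is a Tuple[Tensor, Tensor] rather than a Tensor.
+    scripted = torch.jit.script(Module())
+    out = scripted(3, (torch.zeros(2, 3), torch.ones(2, 3)))
+
+The scripted module can then be passed to ``torch.onnx.export`` in the same way as the
+examples above.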
+
Write PyTorch model in Torch way
--------------------------------
-
Avoid using numpy
~~~~~~~~~~~~~~~~~
@@ -398,12 +430,12 @@
data[torch.tensor([[1, 2], [2, 3]]), torch.tensor([2, 3])]
data[torch.tensor([2, 3]), :, torch.tensor([1, 2])]
- # Ellipsis
+ # Ellipsis followed by tensor indexing
# Not supported in scripting
# i.e. torch.jit.script(model) will fail if model contains this pattern.
# Export is supported under tracing
# i.e. torch.onnx.export(model)
- data[...]
+ data[..., torch.tensor([2, 1])]
# The combination of above
data[2, ..., torch.tensor([2, 1, 3]), 2:4, torch.tensor([[1], [2]])]
@@ -460,12 +492,12 @@
data[torch.tensor([[1, 2], [2, 3]])] = new_data
data[torch.tensor([2, 3]), torch.tensor([1, 2])] = new_data
- # Ellipsis
+ # Ellipsis followed by tensor indexing
# Not supported to export in script modules
# i.e. torch.onnx.export(torch.jit.script(model)) will fail if model contains this pattern.
# Export is supported under tracing
# i.e. torch.onnx.export(model)
- data[...] = new_data
+ data[..., torch.tensor([2, 1])] = new_data
# The combination of above
data[2, ..., torch.tensor([2, 1, 3]), 2:4] += update
@@ -1020,6 +1052,11 @@
Please checkout `Tracing vs Scripting`_.
+Q: How do I export models with primitive type inputs?
+
+  Support for primitive type inputs will be added in the PyTorch 1.9 release.
+  However, the exporter does not support the conversion of models with string inputs.
+
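+  A minimal sketch of the intended usage, assuming a PyTorch build where this support is
+  available; ``AddValue`` and the output file name are illustrative::
+
+      import torch
+
+      class AddValue(torch.nn.Module):
+          def forward(self, x, val: int):
+              # Adds a Python int to a Tensor input.
+              return x + val
+
+      model = torch.jit.script(AddValue())
+      x = torch.zeros(2, 3)
+      # Passing the int 3 as a model input relies on primitive type input support.
+      torch.onnx.export(model, (x, 3), 'add_value.onnx',
+                        opset_version=11, example_outputs=model(x, 3))
+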
Q: Does ONNX support implicit scalar datatype casting?
No, but the exporter will try to handle that part. Scalars are converted to constant tensors in ONNX.