
ExportOutputSerializer

class torch.onnx.ExportOutputSerializer(*args, **kwargs)

Protocol for serializing an ONNX graph into a specific format (e.g. Protobuf). Note that this is an advanced usage scenario.

serialize(export_output, destination)

Protocol method that must be implemented for serialization.

Parameters
  • export_output (ExportOutput) – The in-memory representation of the exported ONNX model.

  • destination (BufferedIOBase) – A binary IO stream or pre-allocated buffer into which the serialized model should be written.

Example

A simple serializer that writes the exported onnx.ModelProto in Protobuf format to destination:

# xdoctest: +REQUIRES(env:TORCH_DOCTEST_ONNX)
>>> import io
>>> import torch
>>> import torch.onnx
>>> class MyModel(torch.nn.Module):  # Dummy model
...     def __init__(self) -> None:
...         super().__init__()
...         self.linear = torch.nn.Linear(2, 2)
...     def forward(self, x):
...         out = self.linear(x)
...         return out
>>> class ProtobufExportOutputSerializer:
...     def serialize(
...         self, export_output: torch.onnx.ExportOutput, destination: io.BufferedIOBase
...     ) -> None:
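...         # Write the model's Protobuf wire-format bytes directly to the binary stream.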
...         destination.write(export_output.model_proto.SerializeToString())
>>> model = MyModel()
>>> arg1 = torch.randn(2, 2, 2)  # positional input 1
>>> torch.onnx.dynamo_export(model, arg1).save(
...     destination="exported_model.onnx",
...     serializer=ProtobufExportOutputSerializer(),
... )
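
Because serialize only receives the in-memory model and a binary stream, a custom serializer can emit any representation derived from export_output.model_proto, and any class that defines a matching serialize method satisfies the protocol without subclassing. The sketch below is a hypothetical JSON serializer, not part of torch.onnx; it assumes the google.protobuf package (pulled in by the onnx dependency) is installed and reuses model and arg1 from the example above:

>>> from google.protobuf import json_format
>>> class JsonExportOutputSerializer:  # hypothetical example, not part of torch.onnx
...     def serialize(
...         self, export_output: torch.onnx.ExportOutput, destination: io.BufferedIOBase
...     ) -> None:
...         # MessageToJson returns a str; the destination stream is binary, so encode it.
...         json_text = json_format.MessageToJson(export_output.model_proto)
...         destination.write(json_text.encode("utf-8"))
>>> torch.onnx.dynamo_export(model, arg1).save(
...     destination="exported_model.json",
...     serializer=JsonExportOutputSerializer(),
... )

Since ExportOutputSerializer is a Protocol, JsonExportOutputSerializer needs no explicit base class; providing a serialize method with the signature documented above is sufficient.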
