
exportONNXNetwork

Export network to ONNX model format

Export a trained Deep Learning Toolbox™ network to the ONNX™ (Open Neural Network Exchange) model format. You can then import the ONNX model to other deep learning frameworks that support ONNX model import, such as TensorFlow™, Caffe2, Microsoft® Cognitive Toolkit, Core ML, and Apache MXNet™.

Syntax

exportONNXNetwork(net,filename)
exportONNXNetwork(net,filename,Name,Value)

Description


exportONNXNetwork(net,filename) exports the deep learning network net with weights to the ONNX format file filename. If filename exists, then exportONNXNetwork overwrites the file.

This function requires the Deep Learning Toolbox Converter for ONNX Model Format support package. If this support package is not installed, then the function provides a download link.

exportONNXNetwork(net,filename,Name,Value) exports a network using additional options specified by one or more name-value pair arguments.

Examples


Load a pretrained SqueezeNet convolutional neural network. If Deep Learning Toolbox Model for SqueezeNet Network is not installed, then the squeezenet function provides a download link.

net = squeezenet
net = 
  DAGNetwork with properties:

         Layers: [68×1 nnet.cnn.layer.Layer]
    Connections: [75×2 table]

Export the network as an ONNX format file in the current folder called squeezenet.onnx. If the Deep Learning Toolbox Converter for ONNX Model Format support package is not installed, then the function provides a link to the required support package in the Add-On Explorer. To install the support package, click the link, and then click Install.

filename = 'squeezenet.onnx';
exportONNXNetwork(net,filename)

Now, you can import the squeezenet.onnx file into any deep learning framework that supports ONNX import.
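
The same support package also provides the importONNXNetwork function, so one way to check the exported file is to import it back into MATLAB. The following is a minimal sketch; the 'OutputLayerType' value assumes a classification network.

% Round-trip check (sketch): reimport the exported file with
% importONNXNetwork from the same support package.
net2 = importONNXNetwork(filename,'OutputLayerType','classification')

If the import succeeds, net2 is a network object that you can use for prediction in MATLAB.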

Input Arguments


net — Trained network

Trained network, specified as a SeriesNetwork or a DAGNetwork object. You can get a trained network by importing a pretrained network (for example, by using the alexnet function) or by training your own network using trainNetwork.
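
For example, the following sketch trains a small classification network on the digit image sample data that ships with Deep Learning Toolbox and then exports it. The layer sizes, training options, and file name are illustrative choices, not values from this page.

% Train a small network on the bundled digit images, then export it
% (sketch; layer sizes and training options are illustrative only).
[XTrain,YTrain] = digitTrain4DArrayData;
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3,16,'Padding','same')
    reluLayer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];
options = trainingOptions('sgdm','MaxEpochs',4,'Verbose',false);
net = trainNetwork(XTrain,YTrain,layers,options);
exportONNXNetwork(net,'digits_net.onnx')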

filename — Name of file

Name of file, specified as a character vector or string.

Example: 'network.onnx'

Name-Value Pair Arguments

Specify optional comma-separated pairs of Name,Value arguments. Name is the argument name and Value is the corresponding value. Name must appear inside quotes. You can specify several name and value pair arguments in any order as Name1,Value1,...,NameN,ValueN.

Example: exportONNXNetwork(net,filename,'NetworkName','my_net') exports a network and specifies 'my_net' as the network name in the saved ONNX network.

'NetworkName' — Name of ONNX network

Name of ONNX network to store in the saved file, specified as a character vector or string.

Example: 'my_squeezenet'

'OpsetVersion' — Version of ONNX operator set

Version of the ONNX operator set to use in the exported model. If the default operator set does not support the network that you are trying to export, then try a later version. If you export the network using an operator set that the importing framework does not support, then the import can fail.

Example: 6
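
For example, a minimal sketch of exporting with an explicit operator set version (the versions you can specify depend on the installed support package release):

% Export using a specific ONNX operator set version (sketch; supported
% versions depend on the installed support package release).
exportONNXNetwork(net,'squeezenet.onnx','OpsetVersion',6)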

Tips

  • exportONNXNetwork does not export settings or properties related to network training such as training options, learning rate factors, or regularization factors.

  • If you export a network that contains a layer that the ONNX format does not support, then exportONNXNetwork saves a placeholder ONNX operator in place of the unsupported layer and returns a warning. You cannot import an ONNX network with a placeholder operator into other deep learning frameworks. For one way to check whether the export produced this warning, see the sketch after these tips.

    exportONNXNetwork can export only a subset of the built-in Deep Learning Toolbox layers.
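
Assuming that no unrelated warnings occur during export, one way to check for the placeholder-operator warning is to clear and then read the warning state around the call. This is a sketch, not part of the documented interface.

% Check whether the export issued any warning, such as the
% placeholder operator warning (sketch; assumes no unrelated warnings).
lastwarn('');                        % clear the last warning message
exportONNXNetwork(net,'squeezenet.onnx')
warnMsg = lastwarn;
if isempty(warnMsg)
    disp('Export completed without warnings.')
else
    disp(['Export warning: ' warnMsg])
end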


Introduced in R2018a