This is the second day of my participation in the August Text Challenge.

Exporting an MXNet model to ONNX

Open Neural Network Exchange (ONNX) provides an open-source format for AI models. It defines an extensible computation-graph model together with built-in operators and standard data types, so it can serve as an intermediary for converting between AI frameworks. For example, there is no ready-made tool on the market for converting a Caffe model to an MXNet model; instead, we can convert the Caffe model to ONNX, and then convert the ONNX model to MXNet. The conversion incurs only a small loss of precision relative to the original model.

In this tutorial, we will show how to save an MXNet model in the ONNX format using MXNet's model export tool, working from a pre-trained model. The operator coverage and features of mxnet-onnx are updated periodically; visit ONNX Operator Coverage for the latest information.

Preliminary knowledge

To run this tutorial, you need to install the following Python modules:

  • MXNet >= 1.3.0. Install with: pip install mxnet==1.4.0 --user

  • onnx == 1.2.1. Install with: pip install onnx==1.2.1 --user

**Note:** The mxnet-onnx import and export tools follow version 7 of the ONNX operator set, which ships with ONNX v1.2.1.

import mxnet as mx
import numpy as np
from mxnet.contrib import onnx as onnx_mxnet
import logging
logging.basicConfig(level=logging.INFO)

Download a Model from MXNet Model Zoo

We download a pre-trained ResNet-18 ImageNet model from the MXNet Model Zoo. We also download the synset file, which maps prediction indices to human-readable labels.

# Download pre-trained resnet model - json and params by running following code.
path='http://data.mxnet.io/models/imagenet/'
[mx.test_utils.download(path+'resnet/18-layers/resnet-18-0000.params'),
 mx.test_utils.download(path+'resnet/18-layers/resnet-18-symbol.json'),
 mx.test_utils.download(path+'synset.txt')]

We now have the resnet-18 symbol (json), params, and synset files on disk.

MXNet to ONNX exporter API

Let’s describe MXNet’s ‘export_model’ API.

help(onnx_mxnet.export_model)

Help on function export_model in module mxnet.contrib.onnx.mx2onnx.export_model:

export_model(sym, params, input_shape, input_type=<type 'numpy.float32'>, onnx_file_path=u'model.onnx', verbose=False)
    Exports the MXNet model file, passed as a parameter, into ONNX model.
    Accepts both symbol,parameter objects as well as json and params filepaths as input.
    Operator support and coverage - https://cwiki.apache.org/confluence/display/MXNET/MXNet-ONNX+Integration

    Parameters
    ----------
    sym : str or symbol object
        Path to the json file or Symbol object
    params : str or symbol object
        Path to the params file or params dictionary. (Including both arg_params and aux_params)
    input_shape : List of tuple
        Input shape of the model e.g. [(1,3,224,224)]
    input_type : data type
        Input data type e.g. np.float32
    onnx_file_path : str
        Path where to save the generated onnx file
    verbose : Boolean
        If true will print logs of the model conversion

    Returns
    -------
    onnx_file_path : str
        Onnx file path

The ‘export_model’ API can accept MXNet models in one of two ways.

  1. MXNet Sym, Params object:
    • This is useful while training a model: at the end of training, we simply call the ‘export_model’ function with the sym and params objects (plus the other attributes) to save the model in ONNX format.
  2. JSON and params files from MXNet:
    • This is useful if we have pre-trained models and want to convert them to the ONNX format.

Since we have downloaded the pre-trained model files, we will use the ‘export_model’ API by passing the paths to the symbol (json) and params files.

How to use the MXNet to ONNX export API

We will use the downloaded pre-trained model files (sym, params) and define the input variables.

# Downloaded input symbol and params files
sym = './resnet-18-symbol.json'
params = './resnet-18-0000.params'

# Standard ImageNet input - 3 channels, 224 x 224
input_shape = (1, 3, 224, 224)

# Path of the output file
onnx_file = './mxnet_exported_resnet50.onnx'

We have defined the input parameters required by the ‘export_model’ API. Now we are ready to convert the MXNet model to the ONNX format.

# Invoke the export model API. It returns the path of the converted onnx model.
converted_model_path = onnx_mxnet.export_model(sym, params, [input_shape], np.float32, onnx_file)

This API returns the path of the converted model, which you can later use to import the model into other frameworks.

Verify the validity of the ONNX model

Now we can use the ONNX checker tool to verify the validity of the converted ONNX model. The tool validates the model by checking whether its content is a valid protobuf:

from onnx import checker
import onnx

# Load onnx model
model_proto = onnx.load_model(converted_model_path)

# Check if converted ONNX protobuf is valid
checker.check_graph(model_proto.graph)

If the converted protobuf does not conform to the ONNX proto specification, the checker throws an error; in our case the check passes successfully.

This verifies the validity of the exported model’s protobuf. The model is now ready to be imported into other frameworks for inference!