Deep Learning AMI
Developer Guide

Apache MXNet to ONNX to CNTK Tutorial

ONNX Overview

The Open Neural Network Exchange (ONNX) is an open format used to represent deep learning models. ONNX is supported by Amazon Web Services, Microsoft, Facebook, and several other partners. You can design, train, and deploy deep learning models with any framework you choose. The benefit of ONNX models is that they can be moved between frameworks with ease.

This tutorial shows you how to use the Deep Learning AMI with Conda with ONNX. By following these steps, you can train a model or load a pre-trained model from one framework, export this model to ONNX, and then import the model in another framework.

ONNX Prerequisites

To use this ONNX tutorial, you must have access to a Deep Learning AMI with Conda version 12 or later. For more information about how to get started with a Deep Learning AMI with Conda, see Deep Learning AMI with Conda.

Launch a terminal session with your Deep Learning AMI with Conda to begin the following tutorial.

Convert an Apache MXNet (incubating) Model to ONNX, then Load the Model into CNTK

How to Export a Model from Apache MXNet (incubating)

You can install the latest MXNet build into either or both of the MXNet Conda environments on your Deep Learning AMI with Conda.

    • (Option for Python 3) - Activate the Python 3 MXNet environment:

      $ source activate mxnet_p36
    • (Option for Python 2) - Activate the Python 2 MXNet environment:

      $ source activate mxnet_p27
  1. The remaining steps assume that you are using the mxnet_p36 environment.

  2. Download the model files.

    curl -O https://s3.amazonaws.com/onnx-mxnet/model-zoo/vgg16/vgg16-symbol.json
    curl -O https://s3.amazonaws.com/onnx-mxnet/model-zoo/vgg16/vgg16-0000.params
  3. To export the model files from MXNet to the ONNX format, create a new file with your text editor and use the following program in a script.

    import numpy as np
    import mxnet as mx
    from mxnet.contrib import onnx as onnx_mxnet

    converted_onnx_filename = 'vgg16.onnx'

    # Export the MXNet model to the ONNX format via MXNet's export_model API
    converted_onnx_filename = onnx_mxnet.export_model('vgg16-symbol.json',
                                                      'vgg16-0000.params',
                                                      [(1, 3, 224, 224)],
                                                      np.float32,
                                                      converted_onnx_filename)

    # Check that the newly created model is valid and meets the ONNX specification
    import onnx
    model_proto = onnx.load(converted_onnx_filename)
    onnx.checker.check_model(model_proto)

    You may see some warning messages, but you can safely ignore those for now. After you run this script, you will see the newly created .onnx file in the same directory.
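    You may also want to inspect what the converter produced. The ONNX Python API exposes the graph's declared inputs and outputs; the sketch below demonstrates the pattern on a tiny hand-built Identity model (so it runs even before the export step has produced vgg16.onnx), but the same loop works on the model_proto loaded above.

    ```python
    import onnx
    from onnx import helper, TensorProto

    # Build a toy one-node model just to demonstrate graph inspection;
    # with the exported model you would instead use: model = onnx.load("vgg16.onnx")
    x = helper.make_tensor_value_info("data", TensorProto.FLOAT, [1, 3, 224, 224])
    y = helper.make_tensor_value_info("out", TensorProto.FLOAT, [1, 3, 224, 224])
    node = helper.make_node("Identity", ["data"], ["out"])
    graph = helper.make_graph([node], "toy-graph", [x], [y])
    model = helper.make_model(graph)
    onnx.checker.check_model(model)

    # Walk the declared inputs and print each name plus its static shape
    for inp in model.graph.input:
        dims = [d.dim_value for d in inp.type.tensor_type.shape.dim]
        print(inp.name, dims)
    ```

    Running this on the exported VGG-16 model is a quick way to confirm the input shape you passed to export_model was recorded correctly.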

Use an ONNX Model with CNTK

How to Use an ONNX Model for Inference with CNTK

    • (Option for Python 3) - Activate the Python 3 CNTK environment:

      $ source activate cntk_p36
    • (Option for Python 2) - Activate the Python 2 CNTK environment:

      $ source activate cntk_p27
  1. The remaining steps assume you are using the cntk_p36 environment.

  2. Create a new file with your text editor, and use the following program in a script to open the ONNX format file in CNTK.

    import cntk as C

    # Import the ONNX model into CNTK via CNTK's import API
    z = C.Function.load("vgg16.onnx", device=C.device.cpu(), format=C.ModelFormat.ONNX)
    print("Loaded vgg16.onnx!")

    After you run this script, CNTK will have loaded the model.

  3. You may also try running inference with CNTK. First, download a picture of a husky.

    $ curl -O https://upload.wikimedia.org/wikipedia/commons/b/b5/Siberian_Husky_bi-eyed_Flickr.jpg
  4. Next, download a list of classes that works with this model.

    $ curl -O https://gist.githubusercontent.com/yrevar/6135f1bd8dcf2e0cc683/raw/d133d61a09d7e5a3b36b8c111a8dd5c4b5d560ee/imagenet1000_clsid_to_human.pkl
  5. Edit the previously created script to have the following content. This new version uses the image of the husky, runs inference, then looks up the predicted class ID in the file of class labels and prints the result.

    import cntk as C
    import numpy as np
    from PIL import Image
    import pickle

    # Import the model into CNTK via CNTK's import API
    z = C.Function.load("vgg16.onnx", device=C.device.cpu(), format=C.ModelFormat.ONNX)
    print("Loaded vgg16.onnx!")

    # Load the image and preprocess it into the layout the model expects
    img = Image.open("Siberian_Husky_bi-eyed_Flickr.jpg")
    img = img.resize((224, 224))
    rgb_img = np.asarray(img, dtype=np.float32) - 128
    bgr_img = rgb_img[..., [2, 1, 0]]
    img_data = np.ascontiguousarray(np.rollaxis(bgr_img, 2))

    # Run inference and look up the predicted class label
    predictions = np.squeeze(z.eval({z.arguments[0]: [img_data]}))
    top_class = np.argmax(predictions)
    print(top_class)
    labels_dict = pickle.load(open("imagenet1000_clsid_to_human.pkl", "rb"))
    print(labels_dict[top_class])
  6. Then run the script, and you should see a result as follows:

    248 Eskimo dog, husky
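If the output looks wrong, the two NumPy steps most likely to go wrong can be checked in isolation: the image-layout transform and the ranking of the prediction vector. The sketch below is self-contained and uses dummy data (the 10-element score vector is hypothetical, standing in for the model's 1000-dimensional output).

```python
import numpy as np

# Dummy 2x2 RGB "image" in HWC layout with values 0-255, as PIL would return
img = np.arange(12, dtype=np.float32).reshape(2, 2, 3)

shifted = img - 128                              # center pixel values, as in the script
bgr = shifted[..., [2, 1, 0]]                    # swap RGB -> BGR channel order
chw = np.ascontiguousarray(np.rollaxis(bgr, 2))  # HWC -> CHW (channels first)
print(chw.shape)                                 # (3, 2, 2): channels first, as VGG-16 expects

# Hypothetical raw scores standing in for the 1000-class VGG-16 output
scores = np.array([0.1, 2.5, 0.3, 4.2, 1.1, 0.0, 3.3, 0.2, 0.9, 1.7])
top5 = np.argsort(scores)[::-1][:5]              # indices of the 5 largest scores
print(top5.tolist())                             # [3, 6, 1, 9, 4]
```

Swapping np.argmax for the argsort pattern above in the inference script would let you print the top five ImageNet labels instead of only the single best class.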