Running a VGG Model
Overview
This example imports a pretrained VGG model from the ONNX framework, and demonstrates inference.
Description
This example shows how to run an ONNX model using the Qualcomm® Neural Processing SDK. We will perform the following steps:
Set up the ONNX environment for converting the VGG-16 model into a DLC, using snpe-onnx-to-dlc.
Download the ONNX pre-trained VGG model and preprocess input image.
Convert the VGG model to DLC format, using snpe-onnx-to-dlc. Use snpe-dlc-info to visualize the converted network structure.
Execute the network on your Qualcomm® Neural Processing SDK-compatible device using snpe-net-run, and postprocess the result for prediction.
Running The Example
First set up ONNX environment
cd $SNPE_ROOT
source bin/envsetup.sh -o $ONNX_DIR
where $ONNX_DIR is the path to the ONNX installation. The script sets up the following environment variables.
SNPE_ROOT: root directory of the Qualcomm® Neural Processing SDK installation
ONNX_HOME: root directory of the ONNX installation provided
The script also updates PATH, LD_LIBRARY_PATH, and PYTHONPATH environment variables. You should be able to run snpe-onnx-to-dlc -h without error if the environment is set correctly.
Download the ONNX pretrained VGG model.
cd $SNPE_ROOT/examples/Models/VGG
wget https://s3.amazonaws.com/onnx-model-zoo/vgg/vgg16/vgg16.onnx
You can find more information about this model in the ONNX Model Zoo.
Download a sample image, and the label file for the model.
mkdir data
cd $SNPE_ROOT/examples/Models/VGG/data
wget https://s3.amazonaws.com/model-server/inputs/kitten.jpg
wget https://s3.amazonaws.com/onnx-model-zoo/synset.txt
The input image can be of any size, and you can substitute your own image.
Preprocess the image and convert it into a raw file.
Resize to 256x256
Take center crop of 224x224
Normalize
Save as a raw file
cd $SNPE_ROOT/examples/Models/VGG
mkdir data/cropped/
python3 scripts/create_VGG_raws.py -i data/ -d data/cropped/
If you see the following message, the image was preprocessed successfully:
Preprocessed successfully!
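The preprocessing steps listed above can be sketched in plain Python. This is a minimal sketch, not the tutorial's create_VGG_raws.py: it assumes the image has already been decoded and resized to 256x256 (a uniform synthetic image stands in here), and it uses the ImageNet per-channel mean/std commonly used with VGG-16, which may differ from the exact constants in the provided script.

```python
import struct

SIZE, CROP = 256, 224

# Hypothetical stand-in for a decoded RGB image that has already been
# resized to 256x256 (the tutorial's script handles decoding/resizing).
image = [[(124, 117, 104)] * SIZE for _ in range(SIZE)]

# ImageNet per-channel mean/std commonly used for VGG-16 preprocessing;
# the exact constants used by create_VGG_raws.py may differ.
MEAN = (0.485, 0.456, 0.406)
STD = (0.229, 0.224, 0.225)

# Take the 224x224 center crop and normalize each channel.
off = (SIZE - CROP) // 2
values = []
for row in image[off:off + CROP]:
    for pixel in row[off:off + CROP]:
        for c in range(3):
            values.append((pixel[c] / 255.0 - MEAN[c]) / STD[c])

# Save as a little-endian float32 raw file (224*224*3 floats, HWC order).
with open("kitten.raw", "wb") as f:
    f.write(struct.pack(f"<{len(values)}f", *values))
```

The resulting .raw file is just the flat float32 tensor with no header, which is what snpe-net-run consumes.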
Convert the ONNX model into Qualcomm® Neural Processing SDK DLC format.
cd $SNPE_ROOT/examples/Models/VGG
snpe-onnx-to-dlc -i vgg16.onnx -o dlc/vgg16.dlc
You should see the following message:
INFO - INFO_CONVERSION_SUCCESS: Conversion completed successfully
Note
Steps 2 through 5 (downloading the model and data, preprocessing, and conversion) are equivalent to running "python3 $SNPE_ROOT/examples/Models/VGG/scripts/setup_VGG.py"
usage: $SNPE_ROOT/examples/Models/VGG/scripts/setup_VGG.py [-h] -a ASSETS_DIR [-d]

Prepares the VGG assets for tutorial examples.

required arguments:
  -a ASSETS_DIR, --assets_dir ASSETS_DIR
                        directory containing the VGG assets

optional arguments:
  -d, --download        Download VGG assets to VGG example directory
View your DLC model using snpe-dlc-info. Execute
snpe-dlc-info -i dlc/vgg16.dlc
and you will see detailed information about each layer. The tool shows the name, dimensions, and important parameters of each layer, as well as the enabled runtimes.
Run inference: snpe-net-run loads a DLC file, loads the data for the input tensor(s), and executes the network on the specified runtime.
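The input list passed to snpe-net-run is a plain-text file with the path of one preprocessed .raw input file per line. The preprocessing step normally writes it for you; should you need to recreate it, a minimal sketch (assuming the single kitten.raw input used in this tutorial):

```python
from pathlib import Path

# One preprocessed .raw input path per line; paths are resolved relative
# to the directory snpe-net-run is invoked from. This tutorial has a
# single input image (assumption: it was saved as kitten.raw).
raw_files = ["kitten.raw"]
Path("raw_list.txt").write_text("\n".join(raw_files) + "\n")
```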
cd $SNPE_ROOT/examples/Models/VGG/data/cropped
snpe-net-run --input_list raw_list.txt --container ../../dlc/vgg16.dlc --output_dir ../../output
You will see the following:
-------------------------------------------------------------------------------
Model String: N/A
SNPE vX.Y.Z
-------------------------------------------------------------------------------
Processing DNN input(s):
kitten.raw
Postprocess the result for prediction
cd $SNPE_ROOT/examples/Models/VGG
python3 scripts/show_vgg_classifications.py -i data/cropped/raw_list.txt -o output/ -l data/synset.txt
You will see output like the following, which means the example ran successfully:
Classification results
probability=0.351833 ; class=n02123045 tabby, tabby cat
probability=0.315166 ; class=n02123159 tiger cat
probability=0.313086 ; class=n02124075 Egyptian cat
probability=0.012995 ; class=n02127052 lynx, catamount
probability=0.003528 ; class=n02129604 tiger, Panthera tigris
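The postprocessing performed by show_vgg_classifications.py amounts to reading the float32 output tensor, applying a softmax, and printing the top-5 synset labels. A minimal sketch of that logic, using made-up logits and a hypothetical two-entry label table in place of the real network output and data/synset.txt:

```python
import math

NUM_CLASSES = 1000

# Hypothetical logits standing in for the float32 tensor that
# snpe-net-run writes under output/; indices 281 and 282 are given
# made-up high scores so two cat classes dominate.
logits = [0.0] * NUM_CLASSES
logits[281], logits[282] = 9.0, 8.5

# Softmax over the logits (subtract the max for numerical stability).
peak = max(logits)
exps = [math.exp(x - peak) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]

# Hypothetical label table; the real labels come from data/synset.txt,
# one "nXXXXXXXX description" entry per line, indexed by class id.
labels = {281: "n02123045 tabby, tabby cat", 282: "n02123159 tiger cat"}

# Report the top-5 classes by probability, as the tutorial script does.
top5 = sorted(range(NUM_CLASSES), key=probs.__getitem__, reverse=True)[:5]
for idx in top5:
    print(f"probability={probs[idx]:.6f} ; class={labels.get(idx, '(other)')}")
```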