PSNPE Android Tutorial
Prerequisites
The Qualcomm® Neural Processing SDK has been set up following the Qualcomm® Neural Processing SDK Setup.
The Tutorials Setup has been completed.
Introduction
This tutorial walks through the process of integrating the PSNPE (snpe-parallel-run) Java APIs into an Android application. The PSNPE Java APIs are made available as an Android Archive (AAR) file, which application developers include as a dependency of their applications.
Gradle project dependency
allprojects {
    repositories {
        jcenter()
        flatDir {
            // Marks the directory as a repository for
            // dependencies. Place the psnpe-release.aar
            // in the directory below.
            dirs 'libs'
        }
    }
}
...
dependencies {
    // This adds the Qualcomm® Neural Processing SDK, including PSNPE, as a project dependency
    compile(name: 'psnpe-release', ext: 'aar')
    ...
}
PSNPE Java API Overview
Once the PSNPE dependency is added, the PSNPE classes under the com.qualcomm.qti.psnpe package are available on the application classpath. The application first creates a PSNPE object, then uses that object to call the APIs.
// PSNPEManager creates, executes, and releases the PSNPE instance.
import com.qualcomm.qti.psnpe.PSNPEManager;
// This class holds the PSNPE configuration.
import com.qualcomm.qti.psnpe.PSNPEConfig;
// Interface for registration of callback functions.
import com.qualcomm.qti.psnpe.PSNPEManagerListener;
Most applications follow the steps below with a model in DLC format:
Initialize the environment
Set the builder configuration list
Load input data
Execute the object with multiple runtimes
Process the network output
The following sections describe how to implement each step described above.
Configuring a Neural Network
The code excerpt below shows how to set the paths of the native libraries and the configuration file.
PSNPEManager.init(nativeLibPath, runtimeConfigPath + "/model_configs.json")
The configuration file must be named "model_configs.json" and follow the format below:
// Layout of the JSON configuration file
{
    "models": [
        {
            "name": "<Model Name>",
            "modelFile": "<Path of DLC on device, e.g.:
                /storage/emulated/0/Android/data/com.qualcomm.qti.psnpedemo/files/models/classification/inceptionv3.dlc>",
            "executeMode": "<Execution mode for PSNPE: sync, outputAsync, inputOutputAsync>",
            "enableInitCache": "<Set to true to enable the init cache>",
            "platformOptions": "<List of supported platform options. For example: 'unsignedPD:ON'>",
            "bulkSize": <Number of images processed in one pass>,
            "buildConfig": [
                {
                    "runtime": "<Runtime to run PSNPE with: CPU, GPU, GPU_s, GPU_FP16, DSP, AIP, AIP_ACT16>",
                    "batch": <Batch size to set dynamically for the model>,
                    "userBufferMode": "<Buffer type: float or TF8. TF8 is used by default.>",
                    "performanceProfile": "<Performance profile: default, balanced, high_performance,
                        power_saver, system_settings, burst, sustained_high_performance, low_power_saver, high_power_saver, extreme_power_saver, low_balanced>",
                    "enableCPUFallback": "<Set to true to enable CPU fallback>"
                }
            ]
        }
    ]
}
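As a concrete illustration, a minimal configuration for the Inception v3 classification model used by the sample application might look like the following. The values shown here are illustrative choices, not requirements; adjust the model path, runtime, and other fields for your device and model.

```json
{
    "models": [
        {
            "name": "inceptionv3",
            "modelFile": "/storage/emulated/0/Android/data/com.qualcomm.qti.psnpedemo/files/models/classification/inceptionv3.dlc",
            "executeMode": "sync",
            "enableInitCache": "false",
            "bulkSize": 1,
            "buildConfig": [
                {
                    "runtime": "DSP",
                    "batch": 1,
                    "userBufferMode": "float",
                    "performanceProfile": "burst",
                    "enableCPUFallback": "true"
                }
            ]
        }
    ]
}
```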
The code excerpt below shows how to use PSNPEManager to build a neural network.
PSNPEManager.buildFromFile(modelInfo.getModelName())
Synchronous Mode
In synchronous mode, input data loading, neural network propagation, and output postprocessing are processed sequentially. The code excerpt below shows how to load input data and propagate the neural network in synchronous mode.
// handleSize is the number of input images.
for (int i = 0; i < handleSize; i++) {
    File image = imagesList[index++];
    // Image preprocessing
    float[] data = imagePreprocessor.preProcessData(image);
    // Load data into the float buffer.
    if (!PSNPEManager.loadData(data, i)) {
        PSNPEManager.release();
        ...
        return false;
    }
    ...
}
// Propagate in synchronous mode
if (!PSNPEManager.executeSync()) {
    PSNPEManager.release();
    return false;
}
The code excerpt below shows how to get the output in synchronous mode.
// Get the output result
for (int i = 0; i < imageNum; i++) {
    /* Output layout:
     * <image1><image2>...<imageBulkSize>
     * Split the output and handle each image one by one.
     */
    Map<String, float[]> outputMap = PSNPEManager.getOutputSync(i); // 'i' is the image index
    String[] outputNames = PSNPEManager.getOutputTensorNames();
    float[] output = outputMap.get(outputNames[0]);
    ...
}
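For a classification model such as Inception v3, the per-image output tensor is typically a vector of class scores, and post-processing reduces to finding the highest-scoring index. A minimal, PSNPE-independent sketch of such top-1 post-processing is shown below; argMax is a hypothetical helper written for this tutorial, not part of the PSNPE API.

```java
// Hypothetical post-processing helper; not part of the PSNPE API.
public final class TopOne {
    // Returns the index of the largest score, i.e. the top-1 class.
    public static int argMax(float[] scores) {
        int best = 0;
        for (int i = 1; i < scores.length; i++) {
            if (scores[i] > scores[best]) {
                best = i;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        float[] scores = {0.1f, 0.7f, 0.2f};
        System.out.println("Top-1 class index: " + argMax(scores)); // prints 1
    }
}
```

The returned index can then be mapped to a human-readable class name via the label file pushed to the device.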
Output Asynchronous Mode
In output asynchronous mode, input data is loaded the same way as in synchronous mode, but a different Java API, shown in the code excerpt below, is used for neural network execution.
PSNPEManager.executeOutputAsync();
A callback function is used to receive the output data asynchronously.
imageMap = new HashMap<>();
listener = new PSNPEManagerListener() {
    @Override
    public void getOutputCallback(int index, Map<String, float[]> data, int errorCode) {
        resultPostProcessor.addToProcessList(imageMap.get(index), data);
    }

    @Override
    public void onInferenceDone() {
        ... // Signaled after the inference finishes.
    }

    @Override
    public void onOutputProcessDone() {
        ... // Signaled after the last output has been delivered.
    }
};
PSNPEManager.registerPSNPEManagerListener(listener);
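Because the callbacks above fire on background threads, the application thread usually needs a way to wait until all results have arrived before, for example, releasing the network. A minimal, PSNPE-independent sketch of that pattern using standard Java concurrency is shown below; the class and method names are illustrative only, with worker threads standing in for the asynchronous output callbacks.

```java
import java.util.concurrent.CountDownLatch;

// Illustrative pattern only: wait for n asynchronous callbacks to complete.
public final class AsyncWaitDemo {
    // Starts n stand-in "callback" threads and blocks until all have fired,
    // the way an application might block until onInferenceDone is signaled.
    public static boolean awaitAll(int n) throws InterruptedException {
        CountDownLatch done = new CountDownLatch(n);
        for (int i = 0; i < n; i++) {
            // Each delivery counts the latch down once.
            new Thread(done::countDown).start();
        }
        done.await();
        return true;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("All results received: " + awaitAll(3));
    }
}
```

In a real application, the countDown call would live inside getOutputCallback (or onInferenceDone), and the await call on whichever thread must not proceed until inference completes.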
Input/Output Asynchronous Mode
In input/output asynchronous mode, input data loading, neural network propagation, and output postprocessing are all processed asynchronously.
for (int i = 0; i < imagesList.length; i++) {
    ...
    List<String> file = new ArrayList<String>();
    file.add(files.get(i));
    ...
    PSNPEManager.executeInputOutputAsync(file, i, imagesList.length);
}
The callback function below loads the input data.
@Override
public float[] IOAsyncInputCallback(String s) {
    File file = new File(s);
    float[] data = imagePreprocessor.preProcessData(file);
    return data;
}
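The preProcessData method used throughout this tutorial is application code, not a PSNPE API. A minimal, hypothetical sketch of one common preprocessing step, converting packed 8-bit RGB pixels into the normalized float array a classification model typically consumes, could look like the following; real applications would also resize and possibly mean-subtract according to the model's training pipeline.

```java
// Hypothetical preprocessing helper; not part of the PSNPE API.
public final class Preprocess {
    // Converts packed 0xAARRGGBB pixels into a [0, 1] float array in RGB order.
    public static float[] toFloatRgb(int[] pixels) {
        float[] out = new float[pixels.length * 3];
        for (int i = 0; i < pixels.length; i++) {
            int p = pixels[i];
            out[i * 3]     = ((p >> 16) & 0xFF) / 255.0f; // R
            out[i * 3 + 1] = ((p >> 8) & 0xFF) / 255.0f;  // G
            out[i * 3 + 2] = (p & 0xFF) / 255.0f;         // B
        }
        return out;
    }
}
```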
Android Sample Application
The Qualcomm® Neural Processing Android SDK includes a sample PSNPE application that showcases the PSNPE features. This application source code is in:
$SNPE_ROOT/examples/SNPE/android/psnpe-demo
Note that PSNPE provides the following AAR file, which includes the necessary binaries:
psnpe-release.aar: Native binaries compiled with clang using the libc++ STL. Available at $SNPE_ROOT/lib/android/psnpe-release.aar
Set the PSNPE_AAR environment variable to this file name.
To build and run this sample, include the PSNPE AAR as described above and run the following commands. In addition, refer to the model conversion tutorial and prepare the DLC file in advance:
export PSNPE_AAR=psnpe-release.aar
cd $SNPE_ROOT/examples/SNPE/android/psnpe-demo
cp $SNPE_ROOT/lib/android/$PSNPE_AAR app/libs/psnpe-release.aar
adb push psnpe-demo/resources/model_configs.json /storage/emulated/0/Android/data/com.qualcomm.qti.psnpedemo/files/configs/
adb push models/classification/inceptionv3.dlc /storage/emulated/0/Android/data/com.qualcomm.qti.psnpedemo/files/models/classification/
adb push images /storage/emulated/0/Android/data/com.qualcomm.qti.psnpedemo/files/datasets/classificationData/images
adb push labels.txt /storage/emulated/0/Android/data/com.qualcomm.qti.psnpedemo/files/datasets/classificationData/
adb push imagenet_slim_labels.txt /storage/emulated/0/Android/data/com.qualcomm.qti.psnpedemo/files/datasets/classificationData/
Note:
If the Gradle build fails with the error "SDK location not found", set the environment variable ANDROID_HOME to point to your SDK location.
Building the sample code with Gradle requires Java 8.
To run the sample APK, push the DLC model, the JSON file, and the images along with their label files to the paths shown above.
After the build successfully completes, the output APK can be found in the application build folder:
$SNPE_ROOT/examples/SNPE/android/psnpe-demo/app/build/outputs/apk/debug