*Choose Use Case:

Inference One

*Target Device:
Device ID:
*Runtime:
*Architecture:
*Verifier:
Accuracy Threshold:
rtolmargin:
atolmargin:
*Model JSON:
*Model Cpp:
Model Bin:
*NDK Path:
*Device Engine Path:
*Input List:
Input Data Type:
Output Data Type:
Perf Profile:
Profile Comparison:
Perf Threshold: %
*Working Directory:
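The Verifier, Accuracy Threshold, rtolmargin, and atolmargin fields above govern how closely the two inference runs' outputs must agree. A minimal sketch of the usual relative/absolute-tolerance comparison (the exact formula the tool applies is an assumption here, not taken from its documentation):

```python
import numpy as np

def within_tolerance(output, golden, rtol=1e-3, atol=1e-5):
    """Element-wise check: |output - golden| <= atol + rtol * |golden|.

    rtol/atol mirror the rtolmargin/atolmargin form fields; the precise
    formula the verifier uses is an assumption.
    """
    return bool(np.all(np.abs(output - golden) <= atol + rtol * np.abs(golden)))

def pass_rate(output, golden, rtol=1e-3, atol=1e-5):
    """Fraction of elements within tolerance, to compare against the
    Accuracy Threshold field (interpretation assumed)."""
    mask = np.abs(output - golden) <= atol + rtol * np.abs(golden)
    return float(np.mean(mask))
```

A run would pass when `pass_rate(...)` meets or exceeds the configured Accuracy Threshold.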

Inference Two

*Target Device:
Device ID:
*Runtime:
*Architecture:
*Model JSON:
*Model Cpp:
Model Bin:
*NDK Path:
*Device Engine Path:
*Input List:
Input Data Type:
Output Data Type:
Perf Profile:

Inference

*Target Device:
Device ID:
*Runtime:
*Architecture:
*Default Verifier:
Accuracy Threshold:
rtolmargin:
atolmargin:
*Model JSON:
*Model Cpp:
Model Bin:
*NDK Path:
*Device Engine Path:
*Input List:
*Path to Goldens:
Input Data Type:
Output Data Type:
Profile Comparison:
Perf Threshold: %
Perf Profile:
*Working Directory:
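The Input List field points at a file enumerating the input tensors to feed the model. By the convention Qualcomm's on-device inference tools use (an assumption about this particular form), that is a plain-text file with one path to a headerless raw tensor file per line. A sketch of preparing such inputs, with an illustrative 1x224x224x3 float32 shape:

```python
import numpy as np
from pathlib import Path

# Illustrative shape and file names -- not requirements of the tool.
work = Path("work")
work.mkdir(exist_ok=True)

raw_paths = []
for i in range(2):
    # Matches an Input Data Type of float32; .raw files are flat
    # little-endian bytes with no header.
    tensor = np.random.rand(1, 224, 224, 3).astype(np.float32)
    path = work / f"input_{i}.raw"
    tensor.tofile(str(path))
    raw_paths.append(str(path))

# One raw-file path per line, one line per inference.
(work / "input_list.txt").write_text("\n".join(raw_paths) + "\n")
```

The golden outputs referenced by Path to Goldens are typically stored the same way, one raw file per output tensor.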

*Model JSON:
Model Bin:
*Path to Raw:
*Path to Goldens:
*Default Verifier:
Accuracy Threshold:
Perf Threshold: %
rtolmargin:
atolmargin:
*Working Directory:

Save Run Configurations:
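Save Run Configurations presumably persists the form values above for later reuse. A hypothetical serialization as JSON — every key below simply mirrors a form label from this page, and all values are illustrative; the tool's real saved-configuration schema is not documented here:

```python
import json

# Hypothetical schema: keys mirror the form labels, values are examples.
run_config = {
    "use_case": "Inference One",
    "target_device": "local-android",     # illustrative value
    "runtime": "CPU",                     # illustrative value
    "architecture": "aarch64-android",    # illustrative value
    "verifier": "rtolatol",               # illustrative value
    "accuracy_threshold": 0.99,
    "rtolmargin": 1e-3,
    "atolmargin": 1e-5,
    "model_json": "model_net.json",
    "model_cpp": "model_net.cpp",
    "model_bin": "model_net.bin",
    "ndk_path": "/opt/android-ndk",
    "input_list": "input_list.txt",
    "perf_profile": "balanced",
    "perf_threshold_percent": 10,
    "working_directory": "work",
}

with open("run_config.json", "w") as f:
    json.dump(run_config, f, indent=2)
```

Reloading the file with `json.load` would restore the form state for a repeat run.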
