Package examples.onnx.executionproviders

Functions

cpuInference
fun cpuInference(model: OnnxInferenceModel, inputData: FloatArray, n: Int = 10): Long
cudaInference
fun cudaInference(model: OnnxInferenceModel, inputData: FloatArray, n: Int = 10): Long
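
Both helpers follow the same pattern: run n forward passes against the model and return the elapsed wall-clock time in milliseconds. A minimal sketch of the CPU variant, assuming KotlinDL's inferUsing extension, the ExecutionProvider.CPU provider, and a predictSoftly-style call (package paths and exact signatures vary between KotlinDL versions):

import org.jetbrains.kotlinx.dl.api.inference.onnx.OnnxInferenceModel
import org.jetbrains.kotlinx.dl.api.inference.onnx.executionproviders.ExecutionProvider.CPU
import org.jetbrains.kotlinx.dl.api.inference.onnx.inferUsing
import kotlin.system.measureTimeMillis

// Sketch only: time n forward passes on the CPU execution provider.
fun cpuInferenceSketch(model: OnnxInferenceModel, inputData: FloatArray, n: Int = 10): Long =
    model.inferUsing(CPU()) { m ->
        measureTimeMillis {
            repeat(n) { m.predictSoftly(inputData) }
        }
    }

The CUDA variant would differ only in the provider passed to inferUsing, e.g. CUDA().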
main
fun main()
fun main()

This example compares the inference speed of different execution providers: the default CPU provider and CUDA.
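
A hedged sketch of how such a comparison might be driven with the two helpers above; the model path, the load call, and the input size are placeholders and assumptions rather than the example's actual code:

import org.jetbrains.kotlinx.dl.api.inference.onnx.OnnxInferenceModel

fun compareExecutionProviders() {
    // Placeholder model path and input; the real example prepares image data
    // via prepareInputData. The load call is an assumption and differs between
    // KotlinDL versions.
    val model = OnnxInferenceModel.load("pathTo/model.onnx")
    val inputData = FloatArray(3 * 224 * 224)

    model.use {
        println("CPU inference:  ${cpuInference(it, inputData)} ms")
        println("CUDA inference: ${cudaInference(it, inputData)} ms")
    }
}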
multiPoseCudaInference
fun multiPoseCudaInference()

prepareInputData
ssdCudaInference
fun ssdCudaInference()

This example demonstrates how to run inference with an SSD model using the inferAndCloseUsing scope function, which closes the model once the block completes.
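
A minimal sketch of that pattern, assuming KotlinDL's inferAndCloseUsing extension, ExecutionProvider.CUDA, and a predictRaw-style call returning a map of output tensors; the package paths, the load call, and the SSD input shape are assumptions:

import org.jetbrains.kotlinx.dl.api.inference.onnx.OnnxInferenceModel
import org.jetbrains.kotlinx.dl.api.inference.onnx.executionproviders.ExecutionProvider.CUDA
import org.jetbrains.kotlinx.dl.api.inference.onnx.inferAndCloseUsing

fun ssdCudaInferenceSketch() {
    // Placeholder path and input size; the real example obtains an SSD model
    // from a model hub.
    val model = OnnxInferenceModel.load("pathTo/ssd.onnx")
    val inputData = FloatArray(3 * 1200 * 1200)

    // inferAndCloseUsing runs the block on the given execution provider and
    // closes the model once the block completes.
    model.inferAndCloseUsing(CUDA()) { m ->
        val outputs = m.predictRaw(inputData)
        println("Output tensors: ${outputs.keys}")
    }
}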