Package examples.onnx.executionproviders
Functions
cpuInference
fun cpuInference(model: OnnxInferenceModel, inputData: FloatArray, n: Int = 10): Long
cudaInference
fun cudaInference(model: OnnxInferenceModel, inputData: FloatArray, n: Int = 10): Long
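Both cpuInference and cudaInference take a model, an input array, and a repetition count n, and return a Long, which suggests they time repeated predictions. The sketch below shows a plausible shape for such a timing helper; Model is a hypothetical stand-in for OnnxInferenceModel so the sketch runs without the ONNX runtime, and the real functions differ only in which execution provider the model was configured with.

```kotlin
import kotlin.system.measureNanoTime

// Hypothetical stand-in for OnnxInferenceModel, used only to make this
// sketch self-contained; anything with a predict(FloatArray) call works.
fun interface Model {
    fun predict(input: FloatArray): Int
}

// A minimal sketch of a timing helper in the spirit of cpuInference /
// cudaInference: run n predictions on the same input and return the
// elapsed wall-clock time in nanoseconds. The measurement loop itself
// is provider-agnostic; only the model's configuration differs.
fun timedInference(model: Model, inputData: FloatArray, n: Int = 10): Long =
    measureNanoTime {
        repeat(n) { model.predict(inputData) }
    }
```

Comparing providers then amounts to calling the helper twice, once per model, and comparing the two returned durations.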
multiPoseCudaInference
fun multiPoseCudaInference()
This example compares the inference speed of different execution providers.
prepareInputData
fun prepareInputData(modelType: ONNXModels.PoseDetection.MoveNetMultiPoseLighting): FloatArray
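prepareInputData produces the flat FloatArray the pose model consumes. The real function uses the preprocessing defined by the ONNXModels.PoseDetection.MoveNetMultiPoseLighting model type; as an illustration of the general idea, the hypothetical helper below flattens an H x W x 3 pixel grid into a row-major array. The dimensions and channel layout here are illustrative assumptions, not the model's documented contract.

```kotlin
// Hypothetical sketch of input preparation: flatten an H x W x C pixel
// grid into the row-major FloatArray layout a vision model typically
// consumes. The real prepareInputData loads an image and applies the
// model type's own preprocessing pipeline.
fun flattenHwc(pixels: Array<Array<FloatArray>>): FloatArray {
    val h = pixels.size
    val w = pixels[0].size
    val c = pixels[0][0].size
    val out = FloatArray(h * w * c)
    var i = 0
    for (row in pixels) for (px in row) for (v in px) out[i++] = v
    return out
}
```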
ssdCudaInference
fun ssdCudaInference()
This example demonstrates how to run inference with an SSD model using the inferAndCloseUsing scope function.
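The value of inferAndCloseUsing is that the model is closed even if inference throws, like Kotlin's standard use. The sketch below reproduces that resource-scoping pattern with a hypothetical FakeModel so it runs without KotlinDL or an SSD model file; it is an illustration of the pattern, not the library's implementation.

```kotlin
// Hypothetical stand-in model; real code would use an OnnxInferenceModel
// loaded from an SSD model file.
class FakeModel : AutoCloseable {
    var closed = false
        private set
    fun predict(input: FloatArray): Int = input.size
    override fun close() { closed = true }
}

// A minimal sketch of the pattern behind inferAndCloseUsing: run an
// inference block against the model, then close the model whether or
// not the block throws.
fun <M : AutoCloseable, R> M.inferAndClose(block: (M) -> R): R =
    try {
        block(this)
    } finally {
        close()
    }
```

A typical call site passes a lambda that runs the prediction and returns its result, after which the model is guaranteed to be closed.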