OnnxInferenceModel
Inference model built on the ONNX format.
Since 0.3
Constructors
OnnxInferenceModel
Constructs an ONNX inference model from the given model file.
OnnxInferenceModel
Constructs an ONNX inference model from the byte array representing an ONNX model.
OnnxInferenceModel
Constructs an ONNX inference model from the function which returns a byte array representing an ONNX model.
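Taken together, the three overloads above can be sketched as follows (a sketch, assuming the `org.jetbrains.kotlinx.dl.api.inference.onnx` package and a hypothetical local `model.onnx` file):

```kotlin
import java.io.File
import org.jetbrains.kotlinx.dl.api.inference.onnx.OnnxInferenceModel

// Hypothetical path to a serialized ONNX model.
val modelFile = File("model.onnx")

// 1. From the given model file.
val fromPath = OnnxInferenceModel(modelFile.absolutePath)

// 2. From a byte array representing the ONNX model.
val fromBytes = OnnxInferenceModel(modelFile.readBytes())

// 3. From a function that lazily produces the byte array.
val fromLoader = OnnxInferenceModel { modelFile.readBytes() }
```

The function-based overload is useful when reading the model bytes is expensive and should be deferred until the model is actually initialized.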
Functions
initializeWith
open override fun initializeWith(vararg executionProviders: ExecutionProvider)
Initializes the model if it is not initialized yet, or re-initializes it, depending on the provided execution providers.
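For example (a sketch; the `ExecutionProvider` import path and its `CPU`/`CUDA` variants are assumptions about the KotlinDL ONNX module, and `model.onnx` is a hypothetical path):

```kotlin
import org.jetbrains.kotlinx.dl.api.inference.onnx.OnnxInferenceModel
import org.jetbrains.kotlinx.dl.api.inference.onnx.executionproviders.ExecutionProvider.CPU
import org.jetbrains.kotlinx.dl.api.inference.onnx.executionproviders.ExecutionProvider.CUDA

val model = OnnxInferenceModel("model.onnx") // hypothetical model path

// First call initializes the underlying session with the CPU provider.
model.initializeWith(CPU())

// A later call with different providers re-initializes the session.
model.initializeWith(CUDA(), CPU())
```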
predict
Predicts the specific class (label index) for the given input data.
predictRaw
Returns a list of multidimensional arrays with data from the model outputs.
fun <R> predictRaw(inputData: FloatArray, extractResult: (OrtSession.Result) -> R): R
Runs prediction on the given inputData and calls the extractResult function to process the output.
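A sketch of the extractResult pattern, assuming ONNX Runtime's `OrtSession.Result` as the callback argument, a hypothetical `model.onnx` path, and a hypothetical single float-tensor output:

```kotlin
import ai.onnxruntime.OnnxTensor
import org.jetbrains.kotlinx.dl.api.inference.onnx.OnnxInferenceModel

val model = OnnxInferenceModel("model.onnx") // hypothetical model path
val input = FloatArray(224 * 224 * 3)        // hypothetical input size

// extractResult receives the raw session result; here we take the first
// output tensor and copy its contents into a FloatArray.
val logits: FloatArray = model.predictRaw(input) { result ->
    val tensor = result[0] as OnnxTensor
    val buffer = tensor.floatBuffer
    FloatArray(buffer.remaining()).also { buffer.get(it) }
}
```

The result object is only valid inside the callback, which is why the data is copied out before returning.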
predictSoftly
Predicts a vector of class probabilities instead of the specific class returned by the predict method.
open fun predictSoftly(inputData: FloatArray, predictionTensorName: String): FloatArray
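The relation between the two prediction flavours: predict reports the single most likely class, which is the argmax of the probability vector predictSoftly returns. A self-contained illustration with a fixed, hypothetical probability vector:

```kotlin
// Hypothetical soft prediction for a 4-class model, as predictSoftly
// might return it.
val probabilities = floatArrayOf(0.05f, 0.10f, 0.70f, 0.15f)

// predict collapses the soft prediction to the index of the largest entry.
val predictedClass = probabilities.indices.maxByOrNull { probabilities[it] }!!

println(predictedClass)
// prints 2
```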
Properties
inputDataType
inputDimensions
outputDataType
outputShape