Description
When running an ONNX model in ONNX Runtime, there is an optional parameter called "provider_options" that carries settings specific to certain ONNX Runtime execution providers (aka backends). If no options are passed, here is the output of the execution:
https://drive.google.com/file/d/1qKYrpfd-9jEmLTMEY-_rfMGFBBiY_vpk/view?usp=sharing
It does print "node is not supported" for some nodes, but still proceeds to execute; I need to dive deeper into this.
Here is another example of execution, this time with provider_options passed into the onnxruntime session.
bundeled_model_inference.log
This time it just throws an error for the unsupported node and interrupts execution. But the node itself is "Slice", one of the most basic operators, implemented since the first ONNX opset: https://onnx.ai/onnx/operators/onnx__Slice.html. Which does not make any sense to me.