ONNXRuntime Issue #11419 (Unanswered)
FusionModele asked this question in Other Q&A
Replies: 1 comment, 1 reply

Reply:
Can you share how this can be reproduced? What is your input size, and does it match what the model expects?
Question:
I'm trying to run an ONNX model with ONNX Runtime and hit the error below. The ONNX file was exported with opset 11.

self.ort_session = ort.InferenceSession(model_path)

---------log------------------------
RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Exception during initialization: /onnxruntime_src/onnxruntime/core/providers/cpu/tensor/reshape_helper.h:40 onnxruntime::ReshapeHelper::ReshapeHelper(const onnxruntime::TensorShape&, onnxruntime::TensorShapeVector&, bool) gsl::narrow_cast<int64_t>(input_shape.Size()) == size was false. The input tensor cannot be reshaped to the requested shape. Input shape:{1,16074,16074}, requested shape:{1,16074,1}
----------env--------
onnxruntime-1.11.1
onnx-1.11.0
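The log shows why initialization fails: a Reshape node is asked to turn a {1,16074,16074} tensor (258,373,476 elements) into {1,16074,1} (16,074 elements), and a reshape must preserve the total element count. NumPy enforces the same rule; a sketch with smaller stand-in shapes:

```python
import numpy as np

# Stand-in shapes: (1, 4, 4) plays the role of {1,16074,16074},
# (1, 4, 1) the role of {1,16074,1}.
x = np.zeros((1, 4, 4), dtype=np.float32)
try:
    x.reshape(1, 4, 1)  # 16 elements cannot become 4 elements
except ValueError as err:
    print("reshape failed:", err)

# The reshape is only legal when element counts agree.
print("source elements:", x.size)     # 16
print("target elements:", 1 * 4 * 1)  # 4
```

A plausible cause (not confirmed by the log alone) is that the Reshape target shape was baked in at export time for a different input size, so the first step is the one the reply suggests: compare the tensor size fed at inference against the size used when the model was exported.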