[Bug]: RuntimeError: Exception from [CPU] Loop node with name 'Loop.0' at src/plugins/intel_cpu/src/nodes/tensoriterator.cpp:182 #28235
Comments
I also tried converting the model to OpenVINO's XML and BIN files for loading. The model converts successfully, but it still throws the same error during inference.
@fxwfzsxyq Could you share:
@Iffa-Intel I haven't modified the model. I simply used Paddle2ONNX to convert the Paddle model to ONNX, then used the official command to convert the ONNX model to OpenVINO IR. The ONNX model loads and runs fine, but the converted OpenVINO model, while it loads successfully, throws this error during inference. I find it strange that the model converts without complaint but fails during the actual forward pass; it looks like a dimension mismatch.
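For reference, the ONNX-to-IR conversion step described above can also be done through the Python API; a minimal sketch, assuming the exported ONNX file is named slanet_plus.onnx (the file name is an assumption, not taken from the report):

```python
import openvino as ov

# Convert the ONNX model to an in-memory OpenVINO model,
# then serialize it to IR (XML topology + BIN weights).
ov_model = ov.convert_model("slanet_plus.onnx")  # hypothetical file name
ov.save_model(ov_model, "slanet_plus.xml")
```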
@Iffa-Intel Official link to that Paddle model: https://paddle-model-ecology.bj.bcebos.com/paddlex/official_inference_model/paddle3.0b2/SLANet_plus_infer.tar
@fxwfzsxyq From my side, it seems the model has a shape issue, which persists even when I try to convert with a static shape. Is there any specific reason you want to use the OpenVINO IR format (xml and bin) instead of ONNX, given that OpenVINO can read ONNX models directly?
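Reading the ONNX file directly would look roughly like this minimal sketch, again assuming the hypothetical file name slanet_plus.onnx:

```python
import openvino as ov

core = ov.Core()
# OpenVINO's ONNX frontend reads the .onnx file directly; no IR files needed.
model = core.read_model("slanet_plus.onnx")  # hypothetical file name
compiled = core.compile_model(model, "CPU")
```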
@Iffa-Intel My point is that the model works within the ONNX ecosystem, using ONNX Runtime for inference. However, I would like to run it with OpenVINO on the CPU, since that is expected to be faster than ONNX Runtime there. The model's input shape should be (1, 3, 488, 488). I suspect the issue might be due to rounding in OpenVINO's RNN implementation, because this shape is not divisible by 16.
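Pinning the input to that static shape before compiling can be sketched as follows (same hypothetical file name; this does not assert it fixes the bug, only shows how to set the shape):

```python
import openvino as ov

core = ov.Core()
model = core.read_model("slanet_plus.onnx")  # hypothetical file name
# Fix the dynamic input to the static shape the reporter expects.
model.reshape([1, 3, 488, 488])
compiled = core.compile_model(model, "CPU")
```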
@fxwfzsxyq we'll further investigate this and get back to you with a possible workaround.
@Iffa-Intel Sure, looking forward to your reply.
Hi @fxwfzsxyq, have you tried directly using the Paddle model as input to OpenVINO for inference?
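Loading the Paddle inference model directly would look roughly like this sketch, assuming the archive extracts to SLANet_plus_infer/inference.pdmodel (the extracted path is an assumption):

```python
import openvino as ov

core = ov.Core()
# OpenVINO's Paddle frontend loads the inference model file directly.
model = core.read_model("SLANet_plus_infer/inference.pdmodel")  # assumed path
compiled = core.compile_model(model, "CPU")
```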
@yuxu42 This still causes an error, the same one @Iffa-Intel encountered. The error is as follows:
OpenVINO Version
2024.6.0
Operating System
Ubuntu 18.04 (LTS)
Device used for inference
CPU
Framework
ONNX
Model used
slanet_plus
Issue description
Converted a PaddlePaddle model to ONNX, then ran inference with OpenVINO, but got an error.
Step-by-step reproduction
model_path: https://paddle-model-ecology.bj.bcebos.com/paddlex/official_inference_model/paddle3.0b2/SLANet_plus_infer.tar
use paddle2onnx to convert the model to ONNX
inference code:
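The original snippet is not reproduced here; a hypothetical minimal reproduction (model path, dummy input, and preprocessing are assumptions, not the reporter's actual code) might look like:

```python
import numpy as np
import openvino as ov

core = ov.Core()
model = core.read_model("slanet_plus.onnx")  # hypothetical file name
compiled = core.compile_model(model, "CPU")

# Dummy input with the expected layout (1, 3, 488, 488).
dummy = np.random.rand(1, 3, 488, 488).astype(np.float32)
result = compiled([dummy])  # the error below is raised at this call
```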
error:
RuntimeError: Exception from src/inference/src/cpp/infer_request.cpp:245:
Exception from src/bindings/python/src/pyopenvino/core/infer_request.hpp:54:
Caught exception: Exception from src/plugins/intel_cpu/src/node.cpp:746:
[CPU] Loop node with name 'Loop.0' Check 'mem->getShape() == Shape(VectorDims{1})' failed at src/plugins/intel_cpu/src/nodes/tensoriterator.cpp:182
Relevant log output
No response