How do I solve this IndexError in ONNX Model Predict block?

翼
20 Jan 2025
Commented: 24 Jan 2025 at 15:25
I created a sample Simulink model like this with the ONNX Model Predict block.
I want to run the simulation with a simple ONNX model that I created with PyTorch.
I also set the Input and Output tabs of the block parameters as shown in the images below.
Input tab
Output tab
Then I ran the simulation, but I got this error.
This happens even though the number of inputs to the ONNX model is the same...
Does this ring a bell? I can't figure out what I'm missing...
I'd appreciate it if you could give me your advice. How do I solve this?
MATLAB System block 'untitled/ONNX Model Predict/ONNX Model Block' error when calling 'getOutputSizeImpl' method of
'nnet.pycoexblks.
Call to the Python model predict() function 'py.ONNXModelBlock.predict(...)' failed.
The Python error message is: == START OF PYTHON ERROR MESSAGE ==
Python error: IndexError: list index out of range
== END OF PYTHON ERROR MESSAGE ==.
Terminal width or dimension error.
' Output Terminal 1' in 'untitled/ONNX Model Predict/In7' is a 1-dimensional vector with 1 element.
My Python code to create a simple ONNX model (PyTorch)
This is how I created my ONNX model. I just wanted to start with something simple first.
import torch
import torch.nn as nn

class EmptyModel(nn.Module):
    def __init__(self):
        super(EmptyModel, self).__init__()
        # No trainable parameters, but add a linear layer to match Simulink requirements
        self.linear = nn.Linear(7, 2, bias=False)
        with torch.no_grad():
            self.linear.weight.fill_(0.0)

    def forward(self, x):
        # Returns the first two elements of the input as is, without any computation
        return x[:, :2]

model = EmptyModel()
dummy_input = torch.randn(1, 7, dtype=torch.float32)
torch.onnx.export(
    model,
    dummy_input,
    "empty_model.onnx",
    export_params=True,
    opset_version=11,
    do_constant_folding=True,
    input_names=["input"],
    output_names=["output"],
    dynamic_axes={
        "input": {0: "batch_size"},
        "output": {0: "batch_size"},
    },
)
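For reference, a quick way to confirm how many inputs and outputs the exported file actually exposes (a minimal check, assuming onnxruntime is installed in the same Python environment):

import onnxruntime as ort

# Inspect the exported model's signature.
sess = ort.InferenceSession("empty_model.onnx", providers=["CPUExecutionProvider"])
print([(i.name, i.shape) for i in sess.get_inputs()])   # expect one input with a shape like ['batch_size', 7]
print([(o.name, o.shape) for o in sess.get_outputs()])  # expect one output with a shape like ['batch_size', 2]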
Sorry that Japanese is included in my attached images...
I look forward to your answer.
Best,

Answers (2)

Don Mathis on 21 Jan 2025 at 21:29
Edited: 21 Jan 2025 at 21:30
Your PyTorch model (and ONNX model) actually takes only 1 input of shape [N,7], not 7 separate scalar inputs. The Simulink block passes your 7 inputs as 7 separate inputs to the ONNX model. To fix this, you could either concatenate your 7 inputs into a vector, or make a PyTorch model that takes 7 separate inputs.
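For example, here is a minimal sketch of the second option: a toy model that declares 7 separate scalar inputs and 2 separate outputs (the class and names are illustrative, not the exact model from the post):

import torch
import torch.nn as nn

class SevenInputModel(nn.Module):
    """Toy model with 7 separate scalar inputs and 2 separate outputs."""
    def forward(self, x1, x2, x3, x4, x5, x6, x7):
        # Use every input so the exporter keeps all 7 graph inputs,
        # then return the first two values as the two outputs.
        s = torch.cat([x1, x2, x3, x4, x5, x6, x7], dim=1)  # shape [N, 7]
        return s[:, 0:1], s[:, 1:2]

model = SevenInputModel()
dummy_inputs = tuple(torch.randn(1, 1, dtype=torch.float32) for _ in range(7))
torch.onnx.export(
    model,
    dummy_inputs,
    "seven_input_model.onnx",
    opset_version=11,
    input_names=[f"input{i}" for i in range(1, 8)],
    output_names=["output1", "output2"],
)

With a model like this, each of the 7 Simulink signals should map one-to-one to input1 ... input7 on the block's Inputs tab.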
  2 Comments
翼
22 Jan 2025 at 6:27
I made a PyTorch model that takes 7 separate inputs and 2 separate outputs.
This is how I solved it. Thank you!
I'm just curious about something:
I suppose that if I have a larger PyTorch model and use it in the "ONNX Model Predict" block, the simulation performance of the whole Simulink model might lag.
For instance, say you have a vehicle control model like the image below.
Then you replace the "VCU" model block with an ONNX Model Predict block.
It's kind of like switching to an AI surrogate model (built with PyTorch, TensorFlow, etc.).
This is off topic, but what are your thoughts on that?
Do you have any ideas for speeding up the simulation of the ONNX model?
Best,
Don Mathis on 22 Jan 2025 at 13:30
The best you can do to speed up the ONNX block is to make sure your Python installation runs as fast as possible:
  • In MATLAB, use pyenv("ExecutionMode","InProcess").
  • If you have a GPU:
    • make sure your Python environment has onnxruntime-gpu installed,
    • make sure it supports CUDAExecutionProvider,
    • make sure that in Python (outside MATLAB) you see a speedup when using GPU vs. CPU (see the sketch below).
You could also try using the PyTorch Model Predict block directly, instead of converting to ONNX. There may not be a speed difference, but it may be easier to get GPU working in PyTorch.
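For the GPU check in the last bullet, a rough sketch of what that comparison might look like in plain Python (the file name and input name follow the example model from the question; adjust them for your own model):

import time
import numpy as np
import onnxruntime as ort

# 'CUDAExecutionProvider' should appear here if onnxruntime-gpu is set up.
print(ort.get_available_providers())

x = np.random.randn(1, 7).astype(np.float32)

for providers in (["CPUExecutionProvider"], ["CUDAExecutionProvider"]):
    sess = ort.InferenceSession("empty_model.onnx", providers=providers)
    sess.run(None, {"input": x})  # warm-up call
    t0 = time.perf_counter()
    for _ in range(1000):
        sess.run(None, {"input": x})
    print(providers[0], (time.perf_counter() - t0) / 1000, "s per call")

Note that a tiny toy model like this one will not show a GPU speedup; the comparison only becomes meaningful for a larger surrogate model.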



Don Mathis on 22 Jan 2025 at 18:12
Did you get a warning message before the error message that said something like this?
"Warning: Number of inputs specified on the Inputs tab must match the number of inputs specified in the Python model file. "
  3 Comments
Don Mathis on 24 Jan 2025 at 13:06
I expected the software to warn you that your Simulink model was passing too many inputs to your ONNX block.
翼
24 Jan 2025 at 15:25
Thank you for telling me.
I didn't realize that. I wonder if there's something I should work out to handle it...
Do you have any ideas for avoiding passing too many inputs to the ONNX block?
What would you do?

