How do I avoid rapid simulation failures due to large input data sizes?

matthew cammuse on 31 August 2021
Edited: matthew cammuse on 1 November 2021
I use Simulink's rapid simulation (RSim) mode with batch scripts to speed up my simulations and to change the input signal data between runs. I have input signal streams of different sizes, but the simulation fails when the sample count of the input stream is too large. Is there a memory or time limit I can increase to prevent this? The problem is intermittent: the same input signal stream (same size) can pass on one run and fail on another.
4 Columns of 12528 Samples => Pass!
4 Columns of 24528 Samples => Fail!
[status, result] = system(runstr);
>> status = -1073741819
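For context, -1073741819 is the signed form of the Windows status code 0xC0000005 (access violation), which suggests the RSim executable is crashing rather than hitting a configurable memory or time limit. A typical way to drive an RSim executable from MATLAB is sketched below; the model name, MAT-file names, and folder layout are illustrative assumptions (not taken from this thread), and the -i/-o switches are the standard RSim options for supplying Inport data and naming the logged output, as I understand them.

% Sketch: run an RSim-target executable with a new input data file.
% 'mymodel', 'runInput.mat', and 'runOutput.mat' are placeholder names.
exeName = 'mymodel';        % executable built with the rsim system target
inFile  = 'runInput.mat';   % MAT-file holding the Inport signal data
outFile = 'runOutput.mat';  % MAT-file the executable writes results to

% Assemble the command line: -i supplies input data, -o names the output file.
runstr = sprintf('%s -i %s -o %s', exeName, inFile, outFile);

[status, result] = system(runstr);
if status ~= 0
    error('Rapid simulation failed (status %d):\n%s', status, result);
end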

Accepted Answer

matthew cammuse on 1 November 2021
Edited: matthew cammuse on 1 November 2021
To anyone interested,
I resolved my own issue. If you want to run Simulink rapid simulations via batch script, set each Inport block's Signal Type to "auto." If you set it to "complex" because your input data is complex, the simulation executions fail for large input datasets.
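If you prefer to apply this setting from a script rather than through each block dialog, something along the lines of the sketch below should work. The model name is a placeholder, and 'SignalType' is the Inport block's programmatic name for the "Signal type" dialog field; this is offered as an illustration of the fix described above, not as text from the original answer.

% Sketch: set every top-level Inport block's Signal Type to 'auto'.
mdl = 'mymodel';                  % placeholder model name
load_system(mdl);
inports = find_system(mdl, 'SearchDepth', 1, 'BlockType', 'Inport');
for k = 1:numel(inports)
    set_param(inports{k}, 'SignalType', 'auto');   % was 'complex'
end
save_system(mdl);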

More Answers (0)

Release: R2021a
