Reading from a large workspace array slows down my Simulink simulation

4 views (last 30 days)
Michael on 22 May 2019
Answered: Michael on 22 May 2019
I have a large workspace array, roughly 200x500x4000, holding 4000 uint8 images captured with a camera. If I shrink the array to, say, 200x500x8, the simulation takes about 30 seconds per image, but with the full 4000-frame array it takes over 5 minutes per image.
I read this array in a MATLAB Function block in my Simulink model, where I parse the data and output a few pixels at a time to the rest of the model.
What is the recommended way to handle large datasets like this? I tried storing the data in a .mat file and using the matfile function to load parts of it at a time, but when I call matfile inside the MATLAB Function block I get an error that matfile is not supported for code generation. I am running in Normal mode; running in Accelerator mode did not improve things.
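For reference, this is roughly the matfile approach I tried (a minimal sketch; the file and variable names are just what I used, and partial loading assumes the file was saved with -v7.3):

% Load only one frame from disk instead of the whole 200x500x4000 array.
% Assumes images.mat contains a uint8 array named imgs, saved with -v7.3.
m = matfile('images.mat');
k = 1;                      % frame index
frame = m.imgs(:, :, k);    % reads just the 200x500 slice for frame k

This works fine in a plain script, but the same call inside the MATLAB Function block fails because matfile is not supported for code generation.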

Accepted Answer

Michael on 22 May 2019
Solved. I ended up using a Data Store Memory block in Simulink to access my large array as a global variable. Running in Accelerator mode, I got the simulation time down to ~10 seconds per frame.
https://www.mathworks.com/help/releases/R2011b/toolbox/simulink/ug/bsds2rv.html#bsdud7d-1
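In case it helps anyone else, the setup looked roughly like this (a minimal sketch; 'cameraModel', 'imageData', and 'imgs' are placeholder names for my model, the data store, and the workspace array):

% Add a Data Store Memory block that holds the full image array.
load_system('cameraModel');
add_block('simulink/Signal Routing/Data Store Memory', 'cameraModel/imageData');
set_param('cameraModel/imageData', ...
    'DataStoreName', 'imageData', ...
    'InitialValue', 'imgs');    % imgs is the 200x500x4000 uint8 array in the base workspace

Inside the MATLAB Function block I then registered imageData in the Ports and Data Manager with the scope set to Data Store Memory, so the function can index it directly, e.g. px = imageData(row, col, frame), and only pass a few pixels at a time to the rest of the model.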

More Answers (0)

Release: R2019a
