[HDL Coder] Ridiculous RAM usage + How to activate parallel processing?

4 views (last 30 days)
Jan Siegmund on 3 April 2020
Commented: Jan Siegmund on 8 April 2020
In my project, HDL Coder uses a ridiculous amount of RAM. On my 8 GB Windows 10 notebook, it chews through all of it until the graphics crash after about 3 minutes. I now have a 132 GB CentOS 6 server at hand, and 120 GB are occupied after 10 minutes. After 30 minutes the OOM killer kicks in and kills the MATLAB process, because it has used up all memory and swap.
  • Is this RAM usage still sane for HDL Coder?
  • Is there any way I could limit this RAM usage?
  • Can I somehow activate parallel computing in HDL Coder? Only one core seemed to be in use on the server.
I cannot share the project's code, though.
  1 comment
Jan Siegmund on 3 April 2020
Update: I tried to limit MATLAB's memory usage to 50 GB on Linux:
ulimit -v 50000000
But then it crashes with: Exception found: St9bad_alloc (a std::bad_alloc, i.e. a failed memory allocation).
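For reference, `ulimit -v` takes its argument in KiB, so 50000000 corresponds to roughly 50 GB. A minimal sketch of applying the cap in a subshell, so it does not affect the rest of the session (the `matlab` invocation shown in the comment is illustrative, not the poster's actual build command):

```shell
# ulimit -v takes KiB, so 50000000 KiB is roughly 50 GB.
# Setting the limit inside a subshell keeps it local; a process
# launched from the same subshell inherits the cap.
(
    ulimit -v 50000000
    ulimit -v    # confirm the cap before starting MATLAB
    # matlab -nodisplay -r "my_hdl_build"   # illustrative invocation
)
```

MATLAB still aborts with std::bad_alloc under such a cap, because the limit makes allocations fail rather than reducing how much HDL Coder tries to allocate.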


Accepted Answer

Jan Siegmund on 8 April 2020
This memory usage is not sane; it is the result of MATLAB code that is not properly optimized for HDL conversion. I was running code that processes 4K images as a whole.
I expected HDL Coder to serialize the matrix and the processing structures itself, but this has to be done by hand.
Now my images are processed in small 5x5 windows, and HDL Coder does not even exceed 1 GB of RAM during compilation.
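The difference can be sketched in a few lines of Python (the original code is MATLAB and was not shared; `window_mean` here is a hypothetical placeholder for the real per-window kernel). Processing in fixed 5x5 windows means only one small block is live at a time, so intermediate storage scales with the window size rather than the whole 4K frame:

```python
def window_mean(win):
    # Hypothetical per-window kernel: mean of a 5x5 block.
    flat = [p for row in win for p in row]
    return sum(flat) / len(flat)

def process_windowed(img, win=5):
    """Slide a win x win block over img; only O(win^2) data is live per step."""
    h, w = len(img), len(img[0])
    out = []
    for r in range(h - win + 1):
        row = []
        for c in range(w - win + 1):
            block = [img[r + i][c:c + win] for i in range(win)]
            row.append(window_mean(block))
        out.append(row)
    return out

# Tiny demo: a 6x6 constant image yields a 2x2 output of window means.
img = [[1.0] * 6 for _ in range(6)]
result = process_windowed(img)
print(len(result), len(result[0]))  # 2 2
print(result[0][0])                 # 1.0
```

This is the same restructuring HDL Coder benefits from: a per-window loop maps naturally to a streaming hardware pipeline, while a whole-frame matrix operation forces the tool to model the entire image as one intermediate.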
  1 comment
Jan Siegmund on 8 April 2020
However, I would love a future HDL Coder feature that serializes the processing on its own, because only then could one use the same speedy processing on both FPGA and CPU.


More Answers (0)


Release

R2020a
