Out of Memory error

Waseem Abbas, 31 May 2022
Commented: Waseem Abbas, 31 May 2022
I run a file in MATLAB with a medium-sized data set. After executing some of the functions, the following out-of-memory error occurs during normalization. I am working with 4 GB of RAM, but I also hit the same error when I ran the same file on a 16 GB system.
Normalization:
Out of memory.
Error in movie_kernel (line 69)
K = K./(d*d');
Error in movie_build_kernel (line 86)
KXmfull = movie_kernel( xmovies, ... % training feature matrix
Error in movie_train (line 22)
[KK] = movie_build_kernel(isubset,xranges,xmovies, Y0,whichOption);
Error in movie_main (line 747)
[xalpha3] = movie_train(isubset_tra, ... % training indeces,
Why does this error occur, and what is a possible solution for running this on a 4 GB RAM system?
Thanks
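For context, line 69 (`K = K./(d*d')`) normalizes the kernel matrix, and the outer product `d*d'` allocates a second full N-by-N matrix before the division, which is a likely source of the out-of-memory error. The sketch below is illustrative only; it assumes `K` is an N-by-N kernel matrix and `d` is an N-by-1 vector of scale factors (e.g. `d = sqrt(diag(K))`), which is a common convention but is not confirmed by the code shown.

```matlab
% Assumed setup (hypothetical): K is N-by-N, d is N-by-1.
% Original form allocates a second N-by-N matrix for d*d':
%   K = K./(d*d');

% Lower-memory alternative using implicit expansion (R2016b+),
% which never materializes the outer product:
K = K ./ d;    % element (i,j) divided by d(i)
K = K ./ d.';  % element (i,j) divided by d(j)
% Net effect: K(i,j) = K(i,j) / (d(i)*d(j)), same as K./(d*d').
```

Splitting the division into two in-place steps keeps only one N-by-N array alive at a time; if even that does not fit, the same normalization can be applied to `K` one block of columns at a time.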
4 Comments
Walter Roberson, 31 May 2022
https://www.mathworks.com/help/deeplearning/ref/trainingoptions.html
Yes, it will affect the accuracy. The smaller the minibatch size, the more prone training is to step in the wrong direction due to gradient noise, but the easier it is to escape local minima. Papers estimate that the best minibatch size is in the 2 to 32 range.
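The linked `trainingOptions` reference is where the minibatch size is set. The snippet below is a minimal sketch of the trade-off described above; the solver choice and the other parameter values are illustrative, not taken from the question.

```matlab
% Illustrative only: a smaller MiniBatchSize lowers peak memory
% per training iteration at the cost of noisier gradient steps.
opts = trainingOptions('sgdm', ...
    'MiniBatchSize', 16, ...      % in the 2-32 range discussed above
    'MaxEpochs', 30, ...
    'InitialLearnRate', 0.01);
```

Reducing `MiniBatchSize` is the usual first lever when training runs out of memory, before resorting to smaller input sizes or a smaller network.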
Waseem Abbas, 31 May 2022
thank you


Answers (0)

Category: Image Data Workflows
Release: R2022a