
Solve out of memory problem

1 view (last 30 days)
Amelia on 5 Nov 2020
Edited: Ameer Hamza on 5 Nov 2020
Hello, I have a Dell Precision M4800 laptop with an Intel Core i7 processor at 2.8 GHz and 16 GB of RAM. I then installed an SSD and 8 GB of RAM.
When I use it for classification with large-scale data, it runs out of memory.
I tried to solve the problem by increasing the system swap space, changing double variables to single precision, and clearing unused variables, but the problem still appears.
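For example, this is roughly what I did (X and tempData here are just placeholders for my own variables):

X = single(X);   % convert double-precision data to single to halve its memory footprint
clear tempData   % remove intermediate variables that are no longer needed
whos             % check how much memory the remaining workspace variables use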
Is there another solution I can use to solve this problem?
Thanks in advance.

Accepted Answer

Ameer Hamza on 5 Nov 2020
Edited: Ameer Hamza on 5 Nov 2020
This shows that the dataset is too large to fit in the available RAM. The solution is not to read the whole dataset at once. Instead, create an image datastore: https://www.mathworks.com/help/matlab/ref/matlab.io.datastore.imagedatastore.html. Functions such as trainNetwork() accept an image datastore as input. This link will also be useful: https://www.mathworks.com/matlabcentral/answers/291597-best-way-to-deal-with-large-data-for-deep-learning.
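A minimal sketch of that approach (the folder path is a placeholder, and layers and options are assumed to be defined elsewhere, e.g. with trainingOptions):

% Create a datastore that reads images from disk on demand;
% subfolder names are used as the class labels.
imds = imageDatastore('pathToImageFolder', ...
    'IncludeSubfolders', true, ...
    'LabelSource', 'foldernames');

% trainNetwork pulls mini-batches from the datastore during training,
% so the whole dataset never has to fit in RAM at once.
net = trainNetwork(imds, layers, options);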

More Answers (0)
