
Modifying large matrices without loading them completely into memory

3 views (last 30 days)
Moritz on 18 Jun 2015
Commented: Walter Roberson on 18 Jun 2015
Hi,
I'm attempting to modify very large matrices (single precision, 50e3 × 50e3, i.e. about 10 GB each), which are too large to load fully into memory. What data-handling strategy would you recommend? Ideally I would load, say, a 100×100 block, modify it, and write it back. My machine uses an SSD connected via M.2, so it should be relatively fast (though of course nowhere near as fast as RAM). What do you suggest?
Thanks,
Moritz

Answers (2)

Stephen23 on 18 Jun 2015
Edited: Stephen23 on 18 Jun 2015
You should read TMW's own advice on working with big data. In particular, you may find memmapfile to be of significant interest:
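As a minimal sketch of the block-wise strategy the question describes, using memmapfile: this assumes the matrix is stored as raw single-precision values, column-major, in a file named bigmat.dat (the file name and the doubling operation are just for illustration).

```matlab
% Map a 50e3-by-50e3 single-precision matrix stored in a raw binary
% file; 'Writable', true allows in-place modification on disk.
n = 50e3;
m = memmapfile('bigmat.dat', ...
    'Format', {'single', [n n], 'x'}, ...
    'Writable', true);

% Modify a 100-by-100 block; only the touched pages of the file are
% read from and written back to the SSD, not the whole 10 GB matrix.
rows = 1:100;
cols = 201:300;
m.Data.x(rows, cols) = 2 * m.Data.x(rows, cols);

clear m   % release the mapping; pending changes are flushed to disk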
  1 comment
Walter Roberson on 18 Jun 2015
Or, instead of memmapfile, save the matrix to a .mat file with -v7.3 and then use a matfile object to read and write portions of the array.
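A sketch of this matfile approach, assuming a hypothetical file big.mat containing a variable A saved with -v7.3 (the +1 update is just a placeholder for the real modification):

```matlab
% Open a v7.3 (HDF5-based) MAT-file for partial I/O; file and
% variable names are hypothetical. 'Writable', true enables writes.
mf = matfile('big.mat', 'Writable', true);

% If creating from scratch, you can preallocate on disk without
% filling RAM:  mf.A(50e3, 50e3) = single(0);

% Read, modify, and write back one 100-by-100 tile at a time.
[nr, nc] = size(mf, 'A');
blk = 100;
for i = 1:blk:nr
    for j = 1:blk:nc
        r = i:min(i+blk-1, nr);
        c = j:min(j+blk-1, nc);
        tile = mf.A(r, c);        % reads only this tile from disk
        mf.A(r, c) = tile + 1;    % writes only this tile back
    end
end
```

Note that v7.3 files are compressed HDF5, so random block access is generally slower than a raw memory-mapped file, but matfile handles the bookkeeping for you.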



Alessandro on 18 Jun 2015
Did you check out the sparse command?
  1 comment
Moritz on 18 Jun 2015
Yes, I did. However, I believe that only helps if a considerable fraction of the elements are zero; in my case fewer than 5% of the elements are zero.
