large matrix and memory space

8 views (last 30 days)
Massimo Laspia on 20 Jun 2017
Answered: Steven Lord on 20 Jun 2017
Good morning, I need to work with large matrices for simulation purposes. How can I manage these matrices without running into out-of-memory errors? Is there a way to store the matrices and save space?
Thanks
Regards
  2 Comments
KSSV on 20 Jun 2017
What are the dimensions of the matrices you work with, and what exactly is the problem? There is always the option of running your code on chunks of the matrix to avoid memory problems.
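
To illustrate the chunking idea, a minimal sketch that limits large temporaries to one block of columns at a time (the matrix size and the operation are hypothetical):

% Process a large matrix in column chunks so that only one chunk's worth
% of temporaries exists in memory at a time (sizes here are hypothetical).
A = rand(5000, 20000);                      % large input matrix (~0.8 GB)
result = zeros(1, size(A, 2));              % small per-column output
chunk = 2000;                               % columns per chunk
for c = 1:chunk:size(A, 2)
    cols = c:min(c + chunk - 1, size(A, 2));
    B = A(:, cols) .^ 2;                    % large temporary, limited to one chunk
    result(cols) = sum(B, 1);               % keep only the small reduced result
end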
Adam on 20 Jun 2017
doc matfile
can help by keeping the data on disk rather than in memory, but it does have restrictions on how you can access the data and it is obviously slower (though less so than reading from disk manually every time).
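
For example, a minimal sketch of that approach, where the array lives in a MAT-file and only one block of rows is held in RAM at a time (the file name, variable name, and sizes are hypothetical):

% Build and process a large matrix on disk with matfile, block by block.
m = matfile('bigdata.mat', 'Writable', true);
nRows = 20000; nCols = 5000; blockRows = 1000;
m.A = rand(blockRows, nCols);                        % first block creates the variable
for r = blockRows+1:blockRows:nRows
    rows = r:min(r + blockRows - 1, nRows);
    m.A(rows, 1:nCols) = rand(numel(rows), nCols);   % later blocks grow it on disk
end
for r = 1:blockRows:nRows
    rows = r:min(r + blockRows - 1, nRows);
    block = m.A(rows, 1:nCols);                      % read only this block into RAM
    m.A(rows, 1:nCols) = 2 * block;                  % process it and write it back
end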


Answers (2)

Jan on 20 Jun 2017
It depends on what "large" means. If you have a few gigabytes, install more RAM. Having free RAM of at least twice the size of the array is a starting point, but it is a good idea to install even more. Then increase the virtual memory.
If you have an array with terabytes of data, this option has its limits. Then process the array in chunks. In both cases there is no magic switch to save space, unless the array contains a lot of zeros: then a sparse array will help you.
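
As a small sketch of the sparse-array case (the size and density below are hypothetical):

% A mostly-zero matrix stored as sparse keeps only its nonzero entries.
n = 1e5;
S = sprand(n, n, 1e-5);     % ~1e5 nonzeros instead of 1e10 stored doubles
whos S                      % a few MB of memory instead of roughly 80 GB full
x = rand(n, 1);
y = S * x;                  % most linear-algebra operations accept sparse input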
  1 Comment
Massimo Laspia on 20 Jun 2017
thanks



Steven Lord on 20 Jun 2017
As Jan said, it depends on what "large" means. Sparse matrices are one option if your data is sparsely populated. If your matrices are REALLY large, consider some of the tools designed for working with Big Data.
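
One such tool, offered here only as an illustration rather than a specific recommendation from this answer, is a tall array backed by a datastore, which reads the data from disk in chunks (the file pattern and column name below are hypothetical):

% Hedged sketch: tall arrays defer computation and process the data in chunks.
ds = datastore('measurements*.csv');   % points at files too large for memory
t  = tall(ds);                         % tall table; operations are deferred
avgValue = mean(t.Value);              % 'Value' is a hypothetical column name
avgValue = gather(avgValue);           % gather triggers the chunked evaluation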
