Matlab kills itself when my matrix is bigger than 7.5 GB
6 views (last 30 days)
Hi, I need to create a variable of size 20x529x413x672; however, when MATLAB gets to 20x529x413x215 it kills itself (variable size ~7.5 GB). I would like to know if anyone has suggestions for dealing with this huge variable. Would parallel processing help in this case? Or would the easiest way be to split my variable in two?
Any suggestion would be appreciated.
0 comments
Accepted Answer
John D'Errico
2 Jan 2023
Edited: John D'Errico on 2 Jan 2023
No. Parallel processing will not help. In fact, it could make things worse: you have only a fixed amount of RAM, and now 4 or 8 cores would all be using part of it at once. Parallel processing does not give you more memory, and memory is the limiting factor here, NOT processing speed.
You might consider just getting more RAM. A cheap fix these days! Yes, I know. Sometimes it can be difficult to stuff more RAM into a laptop. But people forget that their computer has finite limits.
20*529*413*672
And that is roughly 3 billion elements you need to address. Not a problem, if you have sufficient memory. My computer would probably just manage to do it. But consider that each element in that array uses 8 bytes of RAM.
20*529*413*672*8
So really, you are looking to create an array that uses approximately 23 gigabytes of RAM. Generally, you need at least 2x the size of the largest variable you will create, and 3x is a safer rule of thumb. Don't forget that copies of arrays often need to be made.
What can you do? You can make the problem smaller. You might decide to use single precision for that array. This would cut the memory required in half. Or you might even be able to use int8 or uint8, as sketched below. We have no clue what you are doing with that array. Does the loss in precision matter? Sometimes you can get away with it, but that often requires reworking your code and a strong understanding of numerical methods to avoid any issues that can crop up.
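For concreteness, a minimal sketch of the idea (the uint8 variant assumes your values fit in the range 0..255; only you can judge whether the reduced precision or range is acceptable):
% Preallocate in a smaller numeric class to cut the memory footprint.
A = zeros(20, 529, 413, 672, 'single');  % 4 bytes/element, ~11.7 GB instead of ~23.5 GB
% or, if the values fit in 0..255:
A = zeros(20, 529, 413, 672, 'uint8');   % 1 byte/element, ~2.9 GB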
Depending on what you are doing, there are other tools, like tall arrays, that allow your computer to hold only part of the array in memory at a time.
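For example, if the data lives in files on disk, a tall array works through it in chunks; a hedged sketch (the CSV file pattern and the variable name Var1 are hypothetical):
ds = datastore('mydata_*.csv');   % points at the files; nothing is loaded yet
t  = tall(ds);                    % tall array: MATLAB reads it chunk by chunk
s  = gather(mean(t.Var1));        % operations are deferred until gather()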
You can also do other things, like creating only a small part of that array as you need it. So you might use loops, as in the sketch below. Creative coding can often solve a problem that brute force fails on. But again, we don't know what you are doing with this array, or why it needs to be that big.
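A hedged sketch of that chunked approach (computeSlab is a hypothetical stand-in for however you generate your data): build and consume one 20x529x413 slice per iteration rather than materializing the full 4-D array.
acc = zeros(20, 529, 413);           % small accumulator that does fit in RAM
for k = 1:672
    slab = computeSlab(k);           % hypothetical: produces one 20x529x413 slice
    acc = acc + slab;                % e.g. accumulate, then discard the slice
end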
Too often we solve small problems and see our computer easily handle everything we throw at it. So we make our problems a little bigger, then bigger yet. At some point, everything comes to a massive halt, because that computer really does have finite limits and capabilities. Years ago, I described what I called John's law of computing: computer programs expand to just beyond the current capacities of the computers available to solve them. (I'm sure someone else said it before me. Such is life.) The trick is to learn methods of mathematics, of numerical analysis, of computing to handle these problems with finesse instead of brute force. Or you can just wait a few years until Moore's law catches up to where you are now.
More Answers (1)
MJFcoNaN
2 Jan 2023
Hello,
It is often the RAM that limits variable size, and parallel processing may not help. You can try allowing MATLAB more memory via Preferences > General > Java Heap Memory. If your computer does not have enough RAM, splitting the variable is a good choice.
PS: If you don't need to load the whole huge variable at once later, you can create a .mat file on a more powerful computer and "Access and change variables in MAT-file without loading file into memory" with the matfile function, which may simplify some code even with limited RAM. A sketch follows.
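A minimal sketch of that approach (file and variable names are illustrative); matfile writes a Version 7.3 MAT-file, which supports reading and writing slices without loading the whole variable:
m = matfile('huge.mat', 'Writable', true);
m.A(20, 529, 413, 672) = 0;                % grows A to full size on disk, not in RAM
for k = 1:672
    m.A(:, :, :, k) = rand(20, 529, 413);  % write one slice at a time
end
part = m.A(:, :, :, 1:10);                 % later, read back only what you need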