MATLAB performance handling large arrays
I have code which, depending on how many iterations I choose, may end up with arrays in excess of 5,000,000-by-3. I soon started running into "out of memory" errors because of the individual size of these large matrices.
I initialised my matrices beforehand, so all the memory should have been allocated up front. Even so, I would still sometimes run into memory problems, and, more interestingly, the simulation got gradually slower as it progressed: it eventually reached 100%, but at what seemed like an exponentially decreasing pace.
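By "initialised beforehand" I mean the usual MATLAB preallocation pattern: one zeros call up front, then writing into rows inside the loop. A minimal illustration (the sizes and the per-step update are placeholders, not my actual code):

    N = 5e6;                       % total number of steps
    A = zeros(N, 3);               % one 120 MB allocation up front
    for k = 1:N
        A(k, :) = [k, 2*k, 3*k];   % placeholder for the real per-step values
    end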
I solved it by using smaller arrays: after a set number of steps, I assign those matrices to other fixed matrices (which are not accessed in every loop) and then restart the "looping array". For example, say A is an array accessed in every loop. After the first 100 loops, I assign A to B (which is fixed and not accessed). I clear out A, then use it to fill in steps 101-200, assign that portion to, say, array C, and so on. So A is the only "dynamic" variable here; a sketch of this scheme is below.
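A minimal sketch of that chunking scheme (all names, sizes, and the per-step update here are illustrative, not my actual code):

    nTotal   = 5e5;                  % total number of steps
    chunkLen = 100;                  % steps held in the small working array
    A = zeros(chunkLen, 3);          % the "looping array", reused each chunk
    chunks = cell(ceil(nTotal/chunkLen), 1);   % fixed storage: B, C, ...
    c = 0;
    for k = 1:nTotal
        row = mod(k-1, chunkLen) + 1;
        A(row, :) = rand(1, 3);      % placeholder for the real update
        if row == chunkLen || k == nTotal      % chunk full: park it, reset A
            c = c + 1;
            chunks{c} = A(1:row, :);
            A(:) = 0;                % "clean out" A for the next chunk
        end
    end
    results = vertcat(chunks{1:c});  % reassemble at the end if needed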
This fixed the issue and it runs much better now; I am just curious to know why this would happen. Can anyone shed some light?
Answers (1)
Jan
19 Aug 2012
A [5e6 x 3] double array needs 120 MB of RAM. This should not cause out-of-memory errors. To understand the cause of your problems, we have to see the code.
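For reference, the arithmetic behind that figure: a double occupies 8 bytes, so 5e6 * 3 * 8 = 120e6 bytes, about 120 MB. You can confirm it with whos:

    A = zeros(5e6, 3);
    s = whos('A');
    fprintf('%d bytes\n', s.bytes)   % prints 120000000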