MATLAB doesn't release memory when variables are cleared

I am working with very large data sets (~50 GB) in MATLAB R2015a. When I clear variables, MATLAB frequently does not release the memory. I have tried clear all, clear classes, clear java, and running the Java garbage collector. The only way I can get MATLAB to release the memory is to close and restart MATLAB. Is there a better way?

9 Comments

KSSV on 13 Dec 2016
How are you concluding that MATLAB is not releasing memory after clear?
José-Luis on 13 Dec 2016
I am curious. What sort of data is this? Do you actually need all 50 GB or just part of it? Pre-processing might help you in case of the latter. Can you actually have 50 GB in memory? If yes, that's an impressive computer.
Adam on 13 Dec 2016
Where are your variables stored? E.g. if they are stored somewhere under a GUI, such as on the handles structure, then clear all will not clear them, as they are not in the workspace where clear all runs (assuming you type 'clear all' at the command line; doing this in the middle of program code is a very bad idea!).
Bonnie Tyler on 13 Dec 2016
I'm looking at the total physical memory usage in the Windows Task Manager.
José-Luis on 13 Dec 2016
And what sort of data are these, and what are you doing with them?
Bonnie Tyler on 13 Dec 2016
Adam, the data are in the workspace, although I do use the data in a variety of GUIs. I have this problem even though I have closed all GUIs and figures that have used the data. I've had the problem even with data that has never been used in a GUI or a figure. I've tried clearing individual variables. Sometimes the memory is released when I clear the variables and sometimes it isn't. If the memory isn't released when I clear the individual variable, it isn't released when I clear all.
Bonnie Tyler on 13 Dec 2016
I'm working with large hyperspectral images. I need to do a number of consecutive processing steps on the images. I keep having to save the data, shut down MATLAB, and reload because of out-of-memory errors. It's very frustrating. If I can't find a solution, I will have to move away from MATLAB.
Walter Roberson on 15 Dec 2016
Did you experiment with pack?
Science Machine on 8 Apr 2022
You can write a script to close MATLAB and continue with the next function in a new file, after saving variables to disk. Not ideal for sure, but it seems that, e.g., even shutting down parallel workers does not release a substantial chunk of RAM, and that fully closing is indeed the only option here.


Accepted Answer

dpb on 13 Dec 2016

0 votes

It has more to do with Windows and how its memory-management routines work (or not) regarding when memory that is marked as unused by the application is actually physically released. Also, even though there may be sufficient total free memory, it is free contiguous memory that is limiting when creating arrays; if there isn't sufficient for the job, you're stuck.
There are guidelines to help, and also some more recently introduced techniques you can try--
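One such technique is processing the data in slabs via matfile, so that only a fraction of the cube is ever resident in memory. A minimal sketch (the file names, the variable name cube, and the 500-row block size are made up for illustration; partial loading requires the MAT-file to be saved in -v7.3 format):

```matlab
% Process a large hyperspectral cube stored in a -v7.3 MAT-file in slabs,
% so only one slab is resident in memory at a time.
m = matfile('cube.mat');                 % opens the file; does NOT load data
[nRows, nCols, nBands] = size(m, 'cube');
out = matfile('result.mat', 'Writable', true);
blk = 500;                               % rows per slab; tune to available RAM
for r = 1:blk:nRows
    rr = r:min(r + blk - 1, nRows);
    slab = m.cube(rr, :, :);             % loads only this slab from disk
    out.result(rr, 1:nCols, 1:nBands) = 2*slab;  % placeholder processing step
end
```

Each pass through the loop replaces slab, so peak memory is roughly one slab plus one result slab rather than the whole 50 GB cube.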

More Answers (4)

Sean de Wolski on 13 Dec 2016

1 vote

You can force clear a variable from memory by setting it to empty rather than calling clear on it.
x = [];
vs.
clear x
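To check whether either approach actually changed MATLAB's footprint without relying on Task Manager, the Windows-only memory function can be queried before and after (a sketch; the 1e8-element array is just a stand-in for real data):

```matlab
% Windows-only: compare MATLAB's reported memory use before/after clearing.
x = zeros(1e8, 1);        % ~800 MB of doubles as a test allocation
m1 = memory;              % snapshot while x is alive
clear x
m2 = memory;              % snapshot after clearing
fprintf('MATLAB in use before: %.2f GB, after: %.2f GB\n', ...
    m1.MemUsedMATLAB/2^30, m2.MemUsedMATLAB/2^30);
```

Note that even when MemUsedMATLAB drops, the OS may not immediately show the physical memory as reclaimed, which is the behavior discussed in the accepted answer.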

2 Comments

Bonnie Tyler on 15 Dec 2016
I have tried both setting the variables to [] and clearing, and I have the same problem.
dpb on 15 Dec 2016
Edited: dpb on 15 Dec 2016
They're the same; all an application can do is mark the memory as "unused"; it's up to the OS to reclaim it; see the above.
In the olden days in FORTRAN with nothing but static memory allocation, the "trick" was to allocate a very large chunk of memory and then use it by subscripting within it judiciously.
Perhaps reusing existing memory from one of your processing steps to the next would be a similar possibility here: assign the output of each step to the same variable as previously used. MATLAB may still need to make copies if it can't tell that the memory can be overwritten safely, so it may not help; there again may not be sufficient contiguous memory for the temporary, but it's a tactic you could possibly try.
Alternatively, perhaps it's time for mex files or, if you were able to illustrate specifically what your processing steps are, perhaps with some additional background others may have more efficient processing ideas.
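The variable-reuse tactic above can be sketched as follows (the array size and the two processing steps are invented for illustration):

```matlab
% Reuse one variable across processing steps instead of keeping every
% intermediate result alive at once.
img = rand(4000, 4000, 'single');   % stand-in for one hyperspectral band
img = img - mean(img(:));           % step 1: overwrite in place
img = img / std(img(:));            % step 2: same variable again

% versus the memory-hungry pattern, which keeps three large arrays alive:
%   centered = img - mean(img(:));
%   normed   = centered / std(centered(:));
```

Because each right-hand side still needs a temporary before the assignment completes, this bounds peak usage at roughly two arrays rather than one, but it avoids accumulating a chain of intermediates.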


Adam Danz on 18 Oct 2017

0 votes

Bonnie mentioned that clear all, clear classes, etc. didn't work, but what worked for me was using:
clearvars -global
This immediately reduced the memory devoted to MATLAB from 3.2 GB to 0.7 GB. In my case, one or two GUIs that had been closed were still occupying a lot of memory.
Arwel on 4 Jul 2019

0 votes

Previously, I've just done something like this, which seems to work:
% After deleting your large variable:
evalin('base','save(''myVars'')');
evalin('base','clear');
evalin('base','load(''myVars'')');
Christian Schwermer on 16 Aug 2020
Edited: Christian Schwermer on 19 Aug 2020

0 votes

Hello,
I had a similar problem in my GUI, where I used a cell array as a FIFO buffer to acquire images. Memory usage increased with every session; only closing and restarting MATLAB released it:
bufferSize = 450;
frame_buffer = cell(1, bufferSize);
....
flushdata(VideoInputObj)
delete(VideoInputObj)
frame_buffer(:) = {[]};
clear('frame_buffer')
imaqreset
When I preallocate each cell of the buffer, memory usage stays at a constant, acceptable level. Nevertheless, it wasn't possible to release the memory without restarting:
ROI = VideoInputObj.ROIPosition;
bufferSize = 450;
frame_buffer = cell(1, bufferSize);
frame_buffer(:) = {zeros(ROI(4), ROI(3) ,'uint8')} ;


Asked: 13 Dec 2016
Commented: 8 Apr 2022
