How can I reduce the memory footprint of Matlab

Josh on 16 Jul 2018
Commented: Walter Roberson on 18 Jul 2018
Hi,
I recently ran into a memory issue using Matlab 2017b. I have a 16-core machine with 64 GB of memory. I ran one of my analysis scripts, and there is nothing in it that is particularly memory intensive. When I saw the memory error, I immediately started digging in to understand what went wrong. I could not find anything in my code that was large or data intensive by nature. However, when I looked at the system statistics, more than 90% of the machine's memory was being consumed. I reopened Matlab and took some very basic steps:
1) The freshly opened Matlab instance consumed about 1.7 GB of memory.
2) I then loaded Simulink by typing 'simulink'. With Simulink loaded, 2.6 GB of memory was consumed.
3) I opened a parallel pool by typing p=parpool;, which opens 16 parallel instances of Matlab. Each worker consumed about 1.1 GB. Including the local Matlab instance, a total of 20 GB was consumed.
4) I loaded a model into each worker by typing 'spmd; load_system('MySystem'); end'. 'MySystem' is a very simple model which is essentially empty. At this point, each worker consumed between 1.3 and 2.0 GB, and my total system resource consumption was 30 GB.
5) Within Simulink, I have an s-function block which contains a commercial modeling software package that consumes 1.2 GB per core. At this point, 49.2 GB (76%) was consumed, and no actual data had been loaded.
I am fairly confident that none of my variables comes anywhere close to 15 GB, the memory remaining on the machine. However, that remainder is consumed somehow, and I get a memory error.
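For completeness, here is a minimal sketch of one way the per-worker numbers above could be checked from within Matlab itself. It assumes a Windows host (the built-in memory function is Windows-only) and uses the near-empty 'MySystem' model mentioned above:

% Minimal sketch (assumes Windows; the "memory" function is Windows-only).
% Reports how much memory the client and each worker use after loading the
% near-empty test model.
p = parpool;                               % opens the default 16 workers here
spmd
    load_system('MySystem');               % the essentially empty test model
    m = memory;                            % per-process memory statistics
    workerGB = m.MemUsedMATLAB / 2^30;     % bytes -> GiB
end
c = memory;                                % client-side statistics
fprintf('Client   : %.2f GB\n', c.MemUsedMATLAB / 2^30);
for k = 1:p.NumWorkers
    fprintf('Worker %2d: %.2f GB\n', k, workerGB{k});
end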
At this point, my biggest struggle is that Matlab and Simulink consume such a large amount of system resources before any actual work is done. By the time I get the software set up, 76% of the machine's memory is used. I tried the following things: 1) removing all unnecessary paths, and 2) uninstalling unnecessary toolboxes. Neither of these substantially impacted the memory consumption. Could someone please provide some other ideas about how to reduce the memory footprint of Matlab?
Second, when I created the parallel pool and loaded the model, some workers consumed just 1.3 GB while others consumed 2 GB. This is hard to understand. I would think all workers would consume the same amount of memory; 700 MB per worker is not negligible. Could someone please help me understand why some workers consume more memory than others, at this stage?
Thanks, Josh
6 Comments
OCDER on 18 Jul 2018
Where did you get the "commercial modeling software package"? How about increasing the swap space?
I'm not saying reducing the limit is a permanent solution, but it is the most logical one at the moment, and it also lets you test whether there is a random memory-usage burst in one of the workers.
Other alternatives: you'll either have to wait a long time for MathWorks to redesign parallel computing, wait a while for the "commercial modeling software package that consumes 1.2 GB per core" to release a lower-memory version, avoid Simulink and use Matlab functions somehow, set the swap space high, OR spend >$1500 to get >64 GB of RAM...
Josh on 18 Jul 2018
Edited: Josh on 18 Jul 2018
I wasn't very clear about the "commercial modeling software package". It is a GT-Power solver embedded into a Simulink model via an s-function block. I am not going to get rid of that, or Simulink.
I was able to open Matlab 2014b. Each worker consumed about half of what it does in 2017b. I would imagine that if I looked at GT-Power, its memory footprint would be the same in terms of growth in memory use.
Thanks for the suggestions. I will seek solutions on all three fronts. It is absolutely possible for Matlab and Simulink to reduce their base memory footprint, since it was half the size three years ago. GT-Power likely has some areas it can improve to reduce its memory footprint. I will see what I can do to get more memory. I don't see increasing the swap space as a solution, because whatever ends up in the swap file is going to run very slowly. I would likely be better off reducing the number of workers.
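For reference, here is a rough sketch of sizing the pool to a memory budget instead of opening one worker per core. The per-worker and client costs are the approximate figures from this thread, and the per-worker data allowance is an assumed placeholder, not a measured value:

% Rough sketch: size the pool to a memory budget instead of one worker per core.
workerBaseGB = 1.3;      % worker with the model loaded (figure from this thread)
gtPowerGB    = 1.2;      % GT-Power S-Function cost per core (figure from this thread)
dataGB       = 1.0;      % assumed per-worker working data; adjust to the real job
clientGB     = 2.6;      % client Matlab with Simulink loaded (figure from this thread)
budgetGB     = 0.8 * 64; % leave ~20% of RAM for the OS
nWorkers = min(16, floor((budgetGB - clientGB) / (workerBaseGB + gtPowerGB + dataGB)));
delete(gcp('nocreate')); % close any existing pool first
p = parpool('local', nWorkers);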


Answers (1)

OCDER on 18 Jul 2018
I'd still try increasing the swap space, because IF the program rarely goes over 64 GB, then you get the benefit of 16 cores "most" of the time, and your software will not stop due to a short-term memory overuse error. Since swap space stores the rarely used memory pages first, it may still be okay, depending on whether these programs are holding RAM they rarely touch. You'll still have to test for the optimal number of workers that avoids massive slowdown or crashing. Hopefully, future versions of Matlab and GT-Power will be more memory efficient, or there's a huge sale on RAM somewhere.
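As a rough sketch of the "test for the optimal number of workers" idea, something like the following could step the pool size down and record how much memory the workers hold once the model is loaded. It assumes a Windows host (for the memory function), the near-empty 'MySystem' model from the question, and an arbitrary ~40 GB worker budget:

% Try decreasing pool sizes and record per-worker memory with the model loaded.
for n = 16:-2:4
    delete(gcp('nocreate'));                % close the previous pool
    parpool('local', n);
    clear workerGB                          % drop the Composite tied to the old pool
    spmd
        load_system('MySystem');            % near-empty test model
        m = memory;                         % Windows-only memory statistics
        workerGB = m.MemUsedMATLAB / 2^30;  % GiB used by this worker
    end
    totalGB = 0;
    for k = 1:n
        totalGB = totalGB + workerGB{k};    % gather per-worker usage on the client
    end
    fprintf('%2d workers: %.1f GB across workers\n', n, totalGB);
    if totalGB < 40                         % assumed budget; leaves room for data
        break                               % largest pool that still fits
    end
end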
1 Comment
Walter Roberson on 18 Jul 2018
The behavior of swap space is system dependent.
Pages that have been recently used are probably in the cache rather than on disk.
In the case of a single level swap, it typically would not make sense to somehow prioritize the most recently used pages on disk, as that would imply that the pages on disk would keep getting rewritten as pages get used.
There is some sense in writing recently used pages to the outer edges of the disk, where the platter is moving faster (at least on Constant Angular Velocity drives) and so transfers data faster, so perhaps it could be worthwhile to occasionally run a consolidation / page migration pass.
In a multilevel swap space with tiers of different speeds, then yes, it can make sense to migrate pages. For example, swap on an SSD can be fast but possibly not large enough, so it can make some sense to migrate pages with mid-level use to the SSD. Some disk hardware has this built in, with an SSD cache and automatic migration to the hard drive.
... but that is all system (and hardware) dependent. It is not always true that swap space is arranged MRU.

