
How to average more than 50 3D matrices using nanmean

raheem mian on 14 Nov 2019
Edited: Matt J on 15 Nov 2019
Hi, I am trying to average a lot of 3D matrices using nanmean. I have tried using cat, but my 3D matrices are huge (351x400x400), which uses a lot of memory. Is there a better way to do this?
  7 Comments
Adam Danz on 14 Nov 2019
Edited: Adam Danz on 14 Nov 2019
Hmmmm... concatenating 50 arrays that each have more than 56 million elements isn't going to happen.
Off the bat I can think of a couple ideas.
1) Using 2 loops, you can loop through each file and partially load each 351x400 slice, so you have 50 of those matrices, which would be ~7M data points. If that's still too large, you could partially load each 351x1 column instead. Then you can do element-wise averaging and store the values as you proceed through the loops. That would involve 50 x 400 loops, which isn't a big deal.
2) you can reorganize your data as tall arrays which are designed for large amounts of data.
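Idea 1 can be sketched with matfile, which reads only the requested slice from disk instead of loading the whole volume. This is just an illustration under assumptions: it assumes each file was saved in -v7.3 format (required for efficient partial loading) and stores its volume in a variable named data; the folder and variable names are placeholders.

```matlab
files = dir(fullfile('yourFolder','*.mat'));   % 'yourFolder' is a placeholder
result = zeros(351,400,400);
for k = 1:400                                  % loop over slices
    stack = zeros(351,400,numel(files));       % ~56 MB for 50 files, not 50 full volumes
    for i = 1:numel(files)                     % loop over files
        m = matfile(fullfile('yourFolder',files(i).name));
        stack(:,:,i) = m.data(:,:,k);          % load only slice k from disk
    end
    result(:,:,k) = mean(stack, 3, 'omitnan'); % NaN-aware average across files
end
```

Note that mean(...,'omitnan') does the same job as nanmean without requiring the Statistics and Machine Learning Toolbox.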
raheem mian on 14 Nov 2019
This idea is perfect, thank you!


Accepted Answer

Matt J on 14 Nov 2019
Edited: Matt J on 15 Nov 2019
Here's what I would do, I suppose. It assumes each of your .mat files stores the volume under the name 'a'.
Summation = 0;
NCounter = 0;
files = dir(fullfile('yourFolder','*.mat'));
for i = 1:numel(files)
    S = load(fullfile('yourFolder',files(i).name));
    map = isnan(S.a);            % locate the NaNs in this volume
    S.a(map) = 0;                % zero them so they don't affect the sum
    Summation = Summation + S.a;
    NCounter = NCounter + ~map;  % per-element count of non-NaN values
end
result = Summation./NCounter;
  2 Comments
raheem mian on 14 Nov 2019
Edited: raheem mian on 14 Nov 2019
I like this method too! Thanks, this method seems faster.
Adam Danz on 14 Nov 2019
Yep, this is simple and fast!


More Answers (0)
