MATLAB Answers

Too many .mat files when using Write Tall Table

4 views (last 30 days)
Tylor Slay on 3 Mar 2019
Commented: Tylor Slay on 4 Mar 2019
I have a reasonably large tall table (16940800x5) that was created by using vertcat on about 1600 columns of a matrix, which I am stacking for better performance. However, when I try to write the new stacked tall table, it creates 1600 .mat files, which ends up crashing the program for some reason. Is there a way to reduce the number of snapshots that are created during the write process?

  0 comments

Sign in to comment.


Rick Amos on 4 Mar 2019
As you've likely guessed, vertical concatenation of 1600 arrays results in at least 1600 files from tall/write. This is the case because tall/vertcat is conservative: it assumes all input arguments are truly tall and avoids combining multiple input arguments into the same partition (and so the same file). I'm afraid there isn't a direct way to reduce the number of files.
I am surprised this causes the program to crash. Would it be possible to hear some more details about this?
If the order of the data is not important, and you are working with R2018b, there is an alternative: you can stack the data by interleaving rows with matlab.tall.transform:
tX = tall(..);
tY = matlab.tall.transform(@reshapeToWidth5, tX);

function y = reshapeToWidth5(x)
% Stack each block of 5 columns onto the first 5 columns by interleaving rows,
% i.e. y = [x(1,1:5); x(1,6:10); ...; x(2,1:5); x(2,6:10); ...]
y = reshape(x', 5, [])';
end
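To see what that reshape does, here is a small in-memory sketch (not from the original answer) using a hypothetical 2x10 block, where each row holds two 5-column groups. Because MATLAB's reshape reads elements column-major from the transposed block, each row of x is split into consecutive chunks of 5 and stacked:

```matlab
x = [1:10; 11:20];        % 2x10 block: row 1 is 1..10, row 2 is 11..20
y = reshape(x', 5, [])';  % 4x5: rows are [1:5; 6:10; 11:15; 16:20]
```

The same function then runs block-by-block over the tall array via matlab.tall.transform, so no block ever needs to fit the full 16940800x5 result in memory.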

  1 comment

Tylor Slay on 4 Mar 2019
I misspoke when I said it crashed the program; it threw an error stating that the directory location ran out of memory, which I didn't quite understand because I was working on a 4 TB hard drive, so that wasn't true. I had actually thought about reshaping the tall array, so thank you for your answer. I was unaware of matlab.tall.transform.

Sign in to comment.

More Answers (0)

Sign in to answer this question.
