Appending to a very large file
Hi,
I'm having trouble writing a very large file to disk. I'm appending 64 smaller files (each ~1 GB) into a single giant matrix, so I expect the output file to be ~64 GB, and I'm running into an "Out of memory" error during processing. Is there a more efficient way to do this that doesn't load all of the smaller files into memory before writing one monster file to disk? Is there a way to load one file at a time, append it to the output file, then clear memory and load the next?
Current code looks like this:
clear
close all
clc
% Loop over every channel and stack them into one matrix (one row per channel)
for i = 1:64
    fprintf('i = %d\n', i);
    [Samples, Header] = Nlx2MatCSC(['CSC' num2str(i, '%02.f') '.ncs'], ...
        [0 0 0 0 1], 1, 1, []);
    % Flatten the samples matrix into a single row vector
    temp_2 = reshape(Samples, 1, []);
    if exist('signal_mat', 'var')
        signal_mat = vertcat(signal_mat, temp_2);
    else
        signal_mat = temp_2;
    end
    clear Samples Header temp_2
end
clear i
% Subtract the per-sample median across the 64 channels
fprintf('Demedian data\n');
signal_med = median(signal_mat);
signal_mat_demed = signal_mat - signal_med;
%% Write to file for KS2
fprintf('Write data\n');
fid = fopen('myNewFile.dat', 'w');
fwrite(fid, signal_mat, 'int16');
fclose(fid);
fid = fopen('myNewFile_demed.dat', 'w');
fwrite(fid, signal_mat_demed, 'int16');
fclose(fid);
clear
fprintf('Done\n');
Accepted Answer
Jan
7 Jan 2021
This line is what makes the memory problem grow:
signal_mat = vertcat(signal_mat,temp_2);
In the last iteration, for example, vertcat copies a 63 GB array and a 1 GB array into a new 64 GB array, so that single step needs 63 + 64 = 127 GB of RAM.
Pre-allocating signal_mat once and filling it row by row avoids these repeated copies. Even then you need about 64 + X GB of RAM, where X (perhaps 8 to 20 GB) covers the temporary arrays. That is still a huge signal. How much RAM do you have?
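A minimal sketch of the pre-allocated version, under two assumptions you would have to check: every channel has the same number of samples (taken here from the first file), and the values fit into int16 without scaling. Storing int16 directly also cuts the memory to a quarter of the double version:

[Samples, ~] = Nlx2MatCSC('CSC01.ncs', [0 0 0 0 1], 1, 1, []);
nChannels  = 64;
nSamples   = numel(Samples);                       % assumed equal for all channels
signal_mat = zeros(nChannels, nSamples, 'int16');  % 64 * nSamples * 2 bytes
for i = 1:nChannels
    [Samples, ~] = Nlx2MatCSC(['CSC' num2str(i, '%02.f') '.ncs'], ...
        [0 0 0 0 1], 1, 1, []);
    signal_mat(i, :) = int16(reshape(Samples, 1, []));  % fill one row in place
    clear Samples
end

If even the pre-allocated matrix does not fit, you can do what you describe in the question and write each channel straight to disk. One detail: fwrite stores a matrix column by column, so your current file interleaves the 64 channels sample by sample, and appending one whole channel after another would change that layout. The SKIP argument of fwrite can reproduce the interleaved layout while only one channel is in memory. A sketch, with the same assumptions as above (it also relies on fseek/fwrite extending the file past its current end, which works on the usual platforms):

nChannels = 64;
bpv       = 2;                % bytes per int16 value
fid = fopen('myNewFile.dat', 'w');
for i = 1:nChannels
    [Samples, ~] = Nlx2MatCSC(['CSC' num2str(i, '%02.f') '.ncs'], ...
        [0 0 0 0 1], 1, 1, []);
    data = int16(reshape(Samples, 1, []));
    % First value of channel i goes to byte (i-1)*bpv; every further value
    % skips the 63 slots reserved for the other channels.
    fseek(fid, (i - 1) * bpv, 'bof');
    fwrite(fid, data(1), 'int16');
    fwrite(fid, data(2:end), 'int16', (nChannels - 1) * bpv);
    clear Samples data
end
fclose(fid);

The median subtraction needs all channels of each sample at once, so with the streaming version it would take a second pass over the finished file, e.g. with memmapfile.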
More Answers (0)