xlsread memory problems

Norbert on 28 Mar 2012
I'm trying to import data from a *.xlsx file that is approximately 42 MB in size and has 86400 rows and 52 columns. The first row contains the column headers; the first column contains timestamps.
Since I need not only the data but also the timestamps and the column headers, I'm using the following command:
[num, txt] = xlsread('Filename', 1); % load only the first sheet
Checking the Task Manager, MATLAB's memory usage increases by approximately 1 GB just from having num and txt in the workspace.
I have to read about 1600 more of those files, so you can see that there's going to be a memory problem soon.
The strange thing is that num is only an 86399x52 matrix of double values (excluding the first row and first column). MATLAB should have no problem at all handling a matrix of that size.
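A quick sanity check of what num alone should cost (my own back-of-the-envelope arithmetic: a double takes 8 bytes):
expectedBytes = 86399 * 52 * 8; % rows x columns x bytes per double
fprintf('num alone should take about %.0f MB\n', expectedBytes / 2^20); % roughly 34 MB
So the numeric matrix by itself is nowhere near 1 GB.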
I've already checked a lot of the questions posted here regarding xlsread and out-of-memory error messages, but it seems none of them found a solution yet (most of them were old posts, though). I'm using MATLAB R2012a, Win7 x64, 8 GB RAM.
***UPDATE*** 29 Mar: I've tried something that REALLY confused me. Instead of loading the numeric data and the text (column headers and the timestamps in the first column) all at once with
[num, txt] = xlsread('Filename', 1);
I first imported only the numeric data and then the text separately, which looks as follows:
num = xlsread('Filename', 1); % numeric data only
[~, colheaders] = xlsread('Filename', 1, '1:1'); % text in the first row (column headers)
[~, timestamps] = xlsread('Filename', 1, 'A:A'); % text in the first column (timestamps)
Apparently this takes far less memory (see point 1 below) than reading all the data at once, which makes sense, because the number of cells in txt is much higher than the number of cells in colheaders plus timestamps, even though they contain the same data.
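To illustrate just how much a mostly-empty cell array of that size can weigh (a rough sketch; the exact per-cell overhead depends on the MATLAB version):
% Build a cell array shaped like the full text output of the sheet
c = cell(86400, 52);
c(:) = {''}; % empty strings where the sheet has no text, as in txt
s = whos('c');
fprintf('86400x52 cell array of empty strings: %.0f MB\n', s.bytes / 2^20);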
Up to this point, it all makes sense. But now it's getting weird...
1. Running the three commands above only consumes less memory if they are executed one after another by hand at the command line. If I put them in an m-file and run it, it takes the same amount of memory as [num, txt] = xlsread('Filename', 1);
2. After performing [num, txt] = xlsread('Filename', 1); on a specific file (one of my 1600 files), the workspace with only those two variables in it takes about 1 GB of memory. If I instead run the three commands above (manually, one after another) with a cleared workspace, it takes only about 0.2 GB, and the workspace then contains the variables num, colheaders and timestamps. Now I'm reading just one cell from a random xlsx file:
[x, y] = xlsread('randomfile', 1, 'A1:A1'); % read a single cell
And after that:
clear x y
So again, there's only num, colheaders and timestamps in the workspace. Surprisingly, the amount of memory taken by those variables is now down to almost zero! There must be something wrong with the xlsread() command; it's leaving something open or stored in a system cache, I don't know.
Could anyone explain to me WHY? Or even better: how can I fix this (bug)?
(Memory usage is always compared to MATLAB's memory usage when the workspace is empty and MATLAB is idle.)
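For anyone reproducing this: on Windows, the usage can also be read from inside MATLAB with the built-in memory function instead of the Task Manager:
m = memory; % Windows-only; returns a struct of memory statistics
fprintf('MATLAB is currently using %.0f MB\n', m.MemUsedMATLAB / 2^20);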

Answers (1)

Image Analyst on 28 Mar 2012
Can't you read them in a loop, so that you have only one set in memory at a time (see the sketch below)? Is it really necessary to have all 1600 in memory at the same time?
Product Support: 1107 - Avoiding 'Out of Memory' Errors
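A minimal sketch of that loop, assuming files named data_0001.xlsx through data_1600.xlsx and a placeholder processing step (both are my own illustrations, not from the question):
nFiles = 1600;
results = zeros(nFiles, 1); % keep only a small summary per file
for k = 1:nFiles
    fname = sprintf('data_%04d.xlsx', k); % hypothetical naming scheme
    num = xlsread(fname, 1); % numeric data only, first sheet
    results(k) = mean(num(:)); % replace with the real per-file processing
    clear num % free the large matrix before the next file
end
This way only one file's data is ever in memory at a time.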
1 Comment
Norbert on 28 Mar 2012
Already checked that link.
It may not be necessary to load all of them, of course, but even if it were only 100 of the 1600, that would already be way too much.


