MATLAB Answers

How to efficiently integrate big data without using memory / (How to create big data)

24 views (last 30 days)

Mehmet OZC on 18 Aug 2015
Commented: Mehmet OZC on 19 Aug 2015
  • In a study, I will produce large arrays.
  • Each array will be at least 500 MB in size.
  • Each array will have the same number of rows.
  • The total size of the dataset will be approximately 20 GB or more.
  • Somehow I have to create a single variable/array that contains all of the data and is about 20 GB in size.
matfile seems a good solution. However, as the size of the file increases, it gets slower. How can I handle this problem?
  9 Comments



JMP Phillips on 19 Aug 2015
Edited: Walter Roberson on 19 Aug 2015
Here are some things you could try:
Use the matfile function, which allows you to access and change variables directly in MAT-files without loading them into memory:
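A minimal sketch of that approach, one possible way to build the 20 GB array in ~500 MB blocks. The file name 'bigdata.mat', the variable name 'alldata', and the sizes below are illustrative placeholders, not from the original question:

m = matfile('bigdata.mat', 'Writable', true);   % new files are created as -v7.3
nrows   = 1e5;                                  % rows shared by every array
ncols   = 625;                                  % 1e5-by-625 doubles ~ 500 MB
nblocks = 40;                                   % 40 such blocks ~ 20 GB total
for k = 1:nblocks
    block = rand(nrows, ncols);                 % stand-in for your real data
    m.alldata(1:nrows, (k-1)*ncols+1 : k*ncols) = block;  % write block to disk
end

Writing whole contiguous column blocks like this tends to be more efficient than row-by-row access, since MATLAB stores arrays column-major.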
Structure your data differently: if you are representing the data as doubles, maybe you can afford less precision, e.g. use int32. For example, you can use a scaling factor of 1e4 to represent a double value such as 100.3425 as the integer 1003425.
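For instance, the round trip for that scaling idea looks like this (values taken from the example above):

x       = 100.3425;                 % original double value (8 bytes)
xScaled = int32(round(x * 1e4));    % stored as the integer 1003425 (4 bytes)
xBack   = double(xScaled) / 1e4;    % recover 100.3425 when needed

int32 halves the storage of double, but the scaled values must fit in the int32 range (about ±2.1e9, i.e. about ±2.1e5 at a 1e4 scale).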
  • Use a 64-bit version of MATLAB.
  • Try disabling compression when saving the files, with the -v6 option.
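A hedged example of the uncompressed save (the variable and file name are placeholders): -v6 skips compression, which can make saving large, poorly compressible arrays faster. Note, though, that -v6 files cap each variable at 2 GB and do not support the partial matfile I/O shown above, which requires -v7.3.

A = rand(5000);                     % placeholder ~200 MB array
save('chunk1.mat', 'A', '-v6');     % uncompressed save, often faster to write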
Optimize your PC for your task.
  2 Comments
Mehmet OZC on 19 Aug 2015
In one of the links provided above, I ran across the following code:
example = matfile('example.mat', 'Writable', true);  % open the MAT-file for in-place editing
[nrowsB, ncolsB] = size(example, 'B');               % query the size of B without loading it
for row = 1:nrowsB
    example.B(row, :) = row * example.B(row, :);     % update one row at a time
end
And that solved my problem. Thanks!


More Answers (0)
