Save "big" object to file fails
16 views (last 30 days)
Hi,
I'm working on a project with OOP. There is an object called "database" containing a "big" cell array (nested, with mixed contents).
In this database, I stored some file contents. Until now, with about 2000 files in the database, it could be stored properly with "save", producing a 20 MB file. But after I added another 1000 files, the saving process stops after some time and produces a rudimentary 1 KB .mat file (no error message or anything else).
I tried the "pack" command, but then MATLAB crashed. Of course I could post the log here if desired. I'm using Windows XP SP3, MATLAB v. 7.5.0 (R2007b), and have tried saving the file on several file systems (FAT/NTFS).
Is this a common issue? I couldn't find anything similar out there...
Greetings
0 comments
Answers (5)
Andrea Gentilini
7 May 2012
Try going to File -> Preferences -> General -> MAT-Files and selecting the option "MATLAB Version 7.3 or later". This allows you to save variables in excess of 2 GB.
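The same setting can also be applied per call instead of globally. A minimal sketch, assuming the variable is named `database` as in the question:

```matlab
% Force the HDF5-based v7.3 MAT format for this one save call,
% which supports variables larger than 2 GB.
save('database.mat', 'database', '-v7.3');
```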
Jan
22 Nov 2011
If MATLAB crashes inside the pack command, you have a serious problem with memory management. Do you use user-defined MEX functions?
BTW: although you can create a database using MATLAB, dedicated database programs will do this much better.
0 comments
Vincent
24 Nov 2011
2 comments
Peter O
30 Nov 2011
Hi Vincent, I'm getting the same problem here today. A roughly 300 MB dataset won't save, but the 52 MB version _sometimes_ will. R2011a here. I think the issue, for me, is that we have 7 MB 'profile' spaces on the network for temporary program files, and it's hitting that wall. I'll let you know if I find anything.
Martin Kahn
1 Jul 2018
Hi guys,
Given that this question still gets some views: I just had an issue that sounds very similar (with MATLAB 2018a and Windows 10). When trying to save with "save('filename.mat','myFile')" I just got a 1 KB file. I don't really know the details of why, but this fixed it: "save('filename.mat','myFile','-v7.3')". I guess this is what Andrea suggested? Sorry if it's not helpful...
1 comment
Riccardo Scorretti
23 Sep 2021
Edited: Riccardo Scorretti, 23 Sep 2021
Hi there.
Unfortunately I'm experiencing the same problem (MATLAB 2020b, Linux Fedora 34). As can be observed in the figure below, as soon as serialization is triggered the amount of used memory nearly doubles:

[Figure: memory usage plot showing used memory nearly doubling once the save starts]

It looks as if MATLAB makes a temporary copy of the data to be saved (with option -v7.3, of course), and in some circumstances this ends in an out-of-memory error.
In my case, I was trying to save the whole workspace, which contains many huge variables. To work around the problem, I suggest saving each huge variable separately in its own file, so as to lower the peak temporary memory usage that is apparently required to serialize the data.
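The per-variable workaround above can be sketched as follows; this is a minimal example, not the poster's actual code, and the `<name>.mat` file naming is an assumption:

```matlab
% Sketch: save each base-workspace variable to its own -v7.3 file,
% so only one variable at a time needs to be serialized (lower peak
% memory than saving the whole workspace at once).
vars = whos;                              % list workspace variables
for k = 1:numel(vars)
    name = vars(k).name;
    save([name '.mat'], name, '-v7.3');   % one file per variable
end
```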