I am trying to access data stored in a large MAT file. The file is 72 GB of Simulink simulation data.
Obviously I cannot use the LOAD command on my laptop with 16 GB of RAM. I thought the whole point of the MATFILE command was to allow access to large MAT files without loading them entirely into memory.
But that doesn't seem to be the case.
When I attempt to access the file using the MATFILE command, Matlab behaves as if it were loading all of that data into memory: my memory utilization climbs to 98%, I get an out-of-memory error, and then Matlab silently crashes and exits.
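For reference, this is essentially what I am running (the file name and variable name are placeholders, not the real ones from my simulation):

```matlab
% Open the MAT file without loading it -- this alone should be cheap
m = matfile('simout.mat');

% List the variables in the file; this should not read the data either
whos(m)

% Pull only a slice of one variable; my understanding is that only
% this slice should be read from disk, not the whole variable
chunk = m.simdata(1:10000, :);
```

Even the first line (just constructing the matfile object) seems to trigger the runaway memory usage on my machine.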
So I go back to the big Linux machine that I used to run Simulink and create this file, and run the MATFILE command there. And indeed it looks like Matlab is loading the whole file into RAM. I am hoping to split the file into separate MAT files on that machine, but it is taking a very long time to load the data, and it is also consuming all available RAM.
Which leads to my questions: What is the MATFILE command actually doing? Is this expected behavior? Am I stuck rerunning my simulations and saving the results into separate MAT files? How are truly huge datasets stored and manipulated in Matlab? Evidently it is not with MAT files...