How to read extremely large HDF5 data and resolve out-of-memory issues?
I need to read several datasets from a 2 TB HDF5 file for further computation.
If I simply code it as follows:
variable1 = h5read('path to .h5 file', 'path to dataset');
it would require ~500 GB of array memory.
Is there a good way to solve this problem?
Thanks!
Answers (1)
ROSEMARIE MURRAY
3 May 2022
You could use a fileDatastore with h5read as the read function, which lets you pull in a set amount of data at a time instead of the whole dataset, as sketched below.
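A minimal sketch of that route, assuming R2020a or later for fileDatastore's 'partialfile' read mode (which passes userdata back to your read function between calls, starting out empty); the file name, dataset path, and chunk size are placeholders:

fds = fileDatastore('data.h5', ...
    'ReadFcn', @readChunk, ...
    'ReadMode', 'partialfile');

while hasdata(fds)
    block = read(fds);   % one slab per call, never the full 500 GB array
    % ... process 'block' here, keeping only reduced results ...
end

function [data, userdata, done] = readChunk(filename, userdata)
    % Carries the read position in userdata; assumed empty on the first call.
    dsname = '/group/dataset';      % placeholder dataset path
    chunk  = 10000;                 % slices per read; tune to fit your memory
    if isempty(userdata)
        info     = h5info(filename, dsname);   % query size without reading data
        userdata = struct('dims', info.Dataspace.Size, 'next', 1);
    end
    dims  = userdata.dims;
    n     = min(chunk, dims(end) - userdata.next + 1);
    start = [ones(1, numel(dims)-1), userdata.next];   % full leading dims
    count = [dims(1:end-1), n];                        % n slices, last dim
    data  = h5read(filename, dsname, start, count);
    userdata.next = userdata.next + n;
    done  = userdata.next > dims(end);   % true once the last slab is read
end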
Or you could try this custom datastore from File Exchange: https://www.mathworks.com/matlabcentral/fileexchange/64919-hdf5-custom-file-datastore-for-timeseries-in-matlab
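If you would rather skip the datastore machinery, note that h5read itself accepts optional start and count arguments, so you can sweep one dataset in slabs with a plain loop (file name, dataset path, and chunk size are again placeholders):

% Process a large dataset slab by slab along its last dimension.
filename = 'data.h5';               % placeholder path
dsname   = '/group/dataset';        % placeholder dataset path

info  = h5info(filename, dsname);   % query dimensions without reading data
dims  = info.Dataspace.Size;
chunk = 10000;                      % slices per read; tune to fit your memory

for s = 1:chunk:dims(end)
    n     = min(chunk, dims(end) - s + 1);
    start = [ones(1, numel(dims)-1), s];   % read full extent of leading dims
    count = [dims(1:end-1), n];            % n slices along the last dim
    block = h5read(filename, dsname, start, count);
    % ... compute on 'block' here, keeping only reduced results ...
end

Only one chunk-sized block is ever in memory at a time, so the peak footprint is set by chunk rather than by the 2 TB file.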