How to read extremely large HDF5 data and resolve an out-of-memory issue?

26 views (last 30 days)
Trista Ni on 3 May 2022
Answered: ROSEMARIE MURRAY on 3 May 2022
I need to read several datasets from a 2 TB HDF5 file, which are needed for further computation.
If I simply code it as follows:
variable1 = h5read('path to .h5 file', 'path to dataset')
it would require ~500 GB of array memory.
Is there any good way to solve this problem?
Thanks!
1 comment
Jonas on 3 May 2022
Edited: Jonas on 3 May 2022
Does this help? https://de.mathworks.com/matlabcentral/answers/423693-read-and-divide-hdf5-data-into-chunks
Maybe you could also split the data using Python beforehand; I saw several such scripts when googling.
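The linked thread's chunked-reading idea can be sketched as below. This is a minimal sketch, not the original poster's code: the file name `bigfile.h5`, the dataset path `/data`, the assumption that the dataset is 1-D, and the chunk size are all placeholders you would adapt to the real file. It uses the documented `start`/`count` arguments of `h5read` so that only one slab is resident in memory at a time.

```matlab
% Sketch: read a large 1-D HDF5 dataset in slabs instead of all at once.
% '/data' and 'bigfile.h5' are hypothetical; adjust to your file.
filename = 'bigfile.h5';
dset     = '/data';
info     = h5info(filename, dset);
n        = info.Dataspace.Size(1);   % total number of elements
chunk    = 1e6;                      % elements per read; tune to your RAM

for start = 1:chunk:n
    count = min(chunk, n - start + 1);
    slab  = h5read(filename, dset, start, count);  % read one slab
    % ... reduce or accumulate 'slab' here, then let it be overwritten ...
end
```

For an N-dimensional dataset, `start` and `count` become vectors with one entry per dimension, so you would typically slab along the slowest-varying dimension and pass `Inf` (or the full extent) for the others.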


Answers (1)

ROSEMARIE MURRAY on 3 May 2022
You could use a fileDatastore with the read function h5read, which would allow you to specify a certain amount to read at a time.
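One way this could look, as a hedged sketch rather than a definitive implementation: a `fileDatastore` with `'ReadMode','partialfile'` lets a custom read function return one slab per `read` call and carry its position between calls in `userdata`. The dataset path `/data`, the 1-D shape, and the chunk size are assumptions here, not facts from the question.

```matlab
% Sketch: fileDatastore that reads a hypothetical 1-D dataset '/data'
% in row chunks, so each read() returns one slab of the 2 TB file.
fds = fileDatastore('bigfile.h5', ...
    'ReadFcn', @readChunk, 'ReadMode', 'partialfile');

while hasdata(fds)
    block = read(fds);            % one chunk at a time, not the full array
    % ... process 'block', keep only the reduced result ...
end

function [data, userdata, done] = readChunk(filename, userdata)
    chunkLen = 1e6;                          % elements per read; tune to RAM
    if isempty(userdata), userdata = 1; end  % next element index to read
    info  = h5info(filename, '/data');
    n     = info.Dataspace.Size(1);
    count = min(chunkLen, n - userdata + 1);
    data  = h5read(filename, '/data', userdata, count);
    userdata = userdata + count;             % advance the cursor
    done = userdata > n;                     % tell the datastore when to stop
end
```

The `done` flag is what tells the datastore the file is exhausted; `userdata` is the per-file state the datastore hands back to the read function on each call.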

Release

R2022a
