Deep learning memory issues for BiLSTM with MAT-files
9 views (last 30 days)
Hi,
I have a huge dataset, over 20 GB of numerical time-series data, stored in MAT-files laid out as shown below:
x1_1 : 23×921600 double
...
I have about 250 of these MAT-files. The label (target) for each is a single-row categorical array of 0s and 1s:
y1_1 : 1×921600 categorical
...
I loaded 5 of these files and was able to train and classify them using a BiLSTM network with 3 hidden layers and over 300 neurons, but now I want to run all of them. My setup is very similar to this example; I also tried this, which was no help. I know I should use a datastore of some kind, but I tried most of them [fileDatastore, tall arrays] and couldn't solve the issue. Any ideas how to solve this? Thank you.
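For reference, a fileDatastore approach might look like the sketch below (untested against your data; the folder path is hypothetical, and it assumes each x-file holds one 23×921600 double matrix, roughly 170 MB in memory, so only one file is loaded at a time rather than the whole 20+ GB):

```matlab
% Sketch: stream the predictor MAT-files one at a time with fileDatastore.
% "C:\data" is a placeholder folder; the load/read function assumes each
% file contains a single variable (e.g. x1_1).
ds = fileDatastore("C:\data", ...
        "ReadFcn", @(f) cell2mat(struct2cell(load(f))), ...
        "FileExtensions", ".mat");

while hasdata(ds)
    x = read(ds);   % one 23-by-921600 matrix in memory at a time
    % ... process x (e.g. per-file prediction), then it can be released
end
```

On its own, a plain fileDatastore like this is handy for inference or preprocessing; for training, trainNetwork needs the predictors and responses paired, which is where a custom mini-batch datastore (as in the answer below) comes in.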
0 comments
Answers (1)
Divya Gaddipati
19 Jul 2019
Hi,
You could use a custom mini-batch datastore, such as the sequenceDatastore class from the documentation example, which reads sequence data from the specified folder and obtains the labels from the subfolder names.
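Since your labels are per-timestep categorical vectors rather than subfolder names, a variant of that pattern may fit better. The sketch below (illustrative, not tested on your files) implements the MiniBatchable datastore interface directly; it assumes each MAT-file stores one predictor matrix (e.g. x1_1, 23×921600 double) and one label vector (e.g. y1_1, 1×921600 categorical), and picks them out by variable-name prefix, which you would adapt to your actual naming:

```matlab
% Sketch of a custom mini-batch datastore that loads one MAT-file per
% read, so memory use stays near one file (~170 MB) instead of 20+ GB.
classdef sequenceMatDatastore < matlab.io.Datastore & ...
                                matlab.io.datastore.MiniBatchable
    properties
        Files
        MiniBatchSize
    end
    properties (SetAccess = protected)
        NumObservations
    end
    properties (Access = private)
        CurrentFileIndex
    end

    methods
        function ds = sequenceMatDatastore(folder)
            listing = dir(fullfile(folder, "*.mat"));
            ds.Files = fullfile(folder, string({listing.name}));
            ds.MiniBatchSize = 1;   % each file is one long sequence
            ds.NumObservations = numel(ds.Files);
            ds.CurrentFileIndex = 1;
        end

        function tf = hasdata(ds)
            tf = ds.CurrentFileIndex <= ds.NumObservations;
        end

        function [data, info] = read(ds)
            % Load only the current file; names x*/y* are assumptions.
            S = load(ds.Files(ds.CurrentFileIndex));
            vars = fieldnames(S);
            X = S.(vars{startsWith(vars, "x")});  % 23-by-T predictors
            Y = S.(vars{startsWith(vars, "y")});  % 1-by-T labels
            data = table({X}, {Y}, ...
                "VariableNames", {'Predictors', 'Responses'});
            info = struct;
            ds.CurrentFileIndex = ds.CurrentFileIndex + 1;
        end

        function reset(ds)
            ds.CurrentFileIndex = 1;
        end

        function frac = progress(ds)
            frac = (ds.CurrentFileIndex - 1) / ds.NumObservations;
        end
    end
end
```

You could then pass an instance straight to training, e.g. `net = trainNetwork(sequenceMatDatastore("C:\data"), layers, options);` (folder path hypothetical). If the labels live in separate files from the predictors, pair the two file lists inside the constructor and load both in `read` instead.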