LSTM SequenceLength and Batch Explained
20 views (last 30 days)
Hi Matlab staff
For the current example https://www.mathworks.com/help/deeplearning/examples/time-series-forecasting-using-deep-learning.html
In this example, SequenceLength is 'longest' by default. Does that mean the sequence length is 500? (I don't think it is, as that would make the algorithm much more computationally expensive.) Or does it mean the sequence length is 1? It is unclear how the data is fed into the model. Further, I assume a single batch is used, because changing the mini-batch size does not change the algorithm. Could you elaborate on the sequence length and batch size in this particular example?
I think it is very important for the MATLAB LSTM community that there is a full understanding of how the LSTM in MATLAB is designed.
Cheers, MB
2 comments
John D'Errico
17 Mar 2019
We are not MATLAB staff, although some MathWorks employees may drop in here in their free time.
Accepted Answer
Abhishek Singh
25 Mar 2019
Hi MB Sylvest,
According to the documentation, you can inspect the sequence length and batch size through trainingOptions.
If you run the example and look at the values in its trainingOptions, you'll see that it uses the default sequence length 'longest' (which here means 500, the length of the training sequence) and the default mini-batch size of 128.
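To make the defaults concrete, here is a hedged sketch of what such training options look like. The name-value pairs ('SequenceLength', 'MiniBatchSize', etc.) come from the trainingOptions documentation; the specific values shown are the defaults discussed above, not necessarily the exact settings in the shipped example.

```matlab
% Sketch of training options for an LSTM regression example.
% 'SequenceLength' and 'MiniBatchSize' are shown at their defaults.
options = trainingOptions('adam', ...
    'MaxEpochs',250, ...
    'SequenceLength','longest', ... % default: pad each mini-batch to its longest sequence
    'MiniBatchSize',128, ...        % default mini-batch size
    'GradientThreshold',1, ...
    'InitialLearnRate',0.005);
```

Because this example trains on a single sequence of 500 time steps, 'longest' means that whole 500-step sequence forms one batch (no padding is needed), which is why changing 'MiniBatchSize' has no visible effect here.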
I’ve added the screenshot for your reference:
You may also find these links to be useful:
- https://www.mathworks.com/help/deeplearning/ug/long-short-term-memory-networks.html
- https://www.mathworks.com/help/deeplearning/ref/trainingoptions.html
0 comments
More Answers (0)