
MATLAB Shallow Network Mini-Batch Training

Lynn Marciano on 12 Jul 2018
Commented: Jingjun Liu on 23 Nov 2019
Hello, I have been training my data with the patternnet provided by MATLAB. I really like its functionality and have been seeing great results with it. I have a problem, however: I want to start investigating all the functions that can be adjusted under the hood of the default patternnet, but my dataset is so large that, even connected to a cluster, my model takes at least 10 hours to train. I know there are capabilities for training on the GPU, but after several attempts it says I have no memory for training. I know a mini-batch might be able to compensate for this, but I'm not entirely sure whether I have to create a datastore for the mini-batch to be effective. If anyone has passed a mini-batch into the shallow network inputs and trained on a GPU, please give me some insight on the right direction to go. Thanks in advance.
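One workaround sometimes suggested for shallow networks is to train in chunks, warm-starting each call, since repeated calls to train continue from the current weights rather than reinitializing them. A minimal sketch, not a drop-in fix, assuming X (features-by-samples) and T (targets-by-samples) are already in memory; the hidden layer size and chunk size are placeholders to tune:

% Chunked, warm-started training of a shallow patternnet on the GPU.
net = patternnet(20);                  % hidden layer size is illustrative
net.trainFcn  = 'trainscg';            % scaled conjugate gradient supports GPU
net.divideFcn = 'dividetrain';         % use each whole chunk for training
chunkSize = 50000;                     % tune so one chunk fits in GPU memory
n   = size(X, 2);
idx = randperm(n);                     % shuffle once before chunking
for k = 1:chunkSize:n
    cols = idx(k : min(k + chunkSize - 1, n));
    % Each call continues from the current weights (train does not re-init)
    net = train(net, X(:, cols), T(:, cols), 'useGPU', 'yes');
end

The usual caveat applies: with divideFcn set to 'dividetrain' there is no validation stop, so the epochs per chunk (net.trainParam.epochs) should be capped explicitly.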
  4 Comments
sayak nag on 15 Mar 2019
Please help. I am following your advice, but it seems that regardless of the mini-batch size I specify, the network trains in batch mode, i.e. the number of iterations per epoch is 1.
Jingjun Liu on 23 Nov 2019
Mini-batching does not work with sequenceInputLayer. That's what I found.
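A possible explanation, sketched under the assumption that the data is passed to trainNetwork as sequences: MiniBatchSize counts observations, and with sequence input each observation is one whole sequence, so a single long sequence yields one observation and therefore one iteration per epoch no matter what MiniBatchSize is set to. Illustrative only:

opts = trainingOptions('adam', ...
    'MiniBatchSize', 32, ...           % 32 sequences per iteration
    'MaxEpochs', 10, ...
    'Shuffle', 'every-epoch');
% For mini-batching to take effect, XTrain must contain many observations,
% e.g. a cell array where XTrain{i} is a numFeatures-by-numTimeSteps sequence.
% A single long sequence is one observation, so iterations per epoch is 1.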


Answers (1)

Greg Heath on 20 Dec 2018
If you have a huge dataset, it is often rewarding to just randomly divide it into m subsets, then design (train) with 1 and test on the other m-1. If the subsets are sufficiently large, it is not necessary to use m-fold cross-validation. However, you may want to design more than one net.
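A minimal sketch of this split, assuming X (features-by-samples) and T (targets-by-samples) are in the workspace; m, the hidden layer size, and the variable names are illustrative:

% Randomly partition the columns of X into m near-equal subsets
m = 10;
n = size(X, 2);
group = mod(randperm(n), m) + 1;       % random subset label 1..m per sample

% Design (train) on subset 1
net = patternnet(20);                  % hidden layer size is illustrative
net = train(net, X(:, group == 1), T(:, group == 1));

% Test on the remaining m-1 subsets
Y    = net(X(:, group ~= 1));
perf = perform(net, T(:, group ~= 1), Y)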
Hope this helps.
Thank you for formally accepting my answer
Greg

Release: R2018a
