How to use MATLAB's neural network tool box for minibatch gradient descent?

Ekta Prashnani, 17 February 2016
Commented: Greg Heath, 4 March 2016
Hi,
I want to learn the functional relationship between a set of input-output pairs. Each input is a vector of length 500, and the output is a scalar value. I have 1 million such input-output pairs, and there is not enough disk space to train on this entire batch of data at once (using a GPU).
Is there a way to perform minibatch training in MATLAB? This question has been asked in the past ( http://www.mathworks.com/matlabcentral/answers/254826-matlab-neural-network-mini-batch-training ) but there was no reply.
I am aware of the function "adapt", which updates the network with each incoming input-output pair, but I want to train in minibatches. Are there any options to do so using the MATLAB Neural Network Toolbox?
Please help me out.
Ekta

Accepted Answer

Greg Heath, 18 February 2016
True to his word, Dr. Heath has posted
http://www.mathworks.com/matlabcentral/newsreader/view_thread/344511#943659
Hope this helps
Thank you for formally accepting my answer
Greg
2 comments
Ekta Prashnani, 1 March 2016
This works, but it is significantly slower on my machine, even when I use a Titan X GPU. I am not sure whether this solution utilizes GPU acceleration.
Greg Heath, 4 March 2016
Maybe a good programmer can optimize the code. The logic is straightforward.
Good Luck,
Greg
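As an editorial note on the GPU question (not part of the thread): current versions of the Neural Network Toolbox let `train` itself target the GPU via the documented `'useGPU'` name-value option, which requires the Parallel Computing Toolbox. A minimal sketch, assuming `net` is an already-initialized network and `Xbatch`/`Tbatch` (placeholder names) hold one minibatch:

```matlab
% One minibatch training step, asking train to run on the GPU.
% 'useGPU' is a documented option of train (needs Parallel Computing
% Toolbox); net, Xbatch, Tbatch are illustrative placeholder names.
net = train(net, Xbatch, Tbatch, 'useGPU', 'yes');
```

Whether this is faster than the CPU depends heavily on the minibatch size; very small batches may not amortize the transfer overhead.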


More Answers (1)

Greg Heath, 17 February 2016
There is no problem: train in a loop. However, do not configure or initialize the net between the minibatches of training data.
Hope this helps.
Thank you for formally accepting my answer
Greg
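The train-in-a-loop approach can be sketched as follows. This is an editorial illustration, not Greg's posted newsgroup code; the network size, minibatch size, and variable names are illustrative assumptions.

```matlab
% Minibatch training by calling train repeatedly on slices of the data.
% X is 500 x N (inputs) and T is 1 x N (targets), as in the question.
net = fitnet(10);                    % create the network ONCE
net = configure(net, X, T);          % configure/initialize ONCE, outside the loop
net.trainParam.epochs = 1;           % one pass of train per minibatch
net.trainParam.showWindow = false;   % suppress the training GUI inside the loop

batchSize = 1000;                    % illustrative minibatch size
N = size(X, 2);
for first = 1:batchSize:N
    idx = first : min(first + batchSize - 1, N);
    % Do NOT call configure or init here: that would discard the
    % weights learned from the previous minibatches.
    net = train(net, X(:, idx), T(:, idx));
end
```

Note that each `train` call still runs a full backprop optimization on its slice, so this is not identical to stochastic gradient descent with a fixed learning rate; it simply reuses the weights across slices.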
4 comments
Ekta Prashnani, 17 February 2016 (edited)
Thanks for your reply, Greg!
Sorry, I am confused: several online sources seem to suggest that "In training a neural net, the term epoch is used to describe a complete pass through all of the training patterns." (Source: http://www.cse.unsw.edu.au/~billw/mldict.html#epoch )
Some of the experts from Stanford are saying the same thing: "one epoch means that every example has been seen once." (Source: http://cs231n.github.io/neural-networks-3/, Section "Babysitting the learning process" )
Am I missing something in interpreting the text in the above sources? Or is the epoch perhaps defined differently in MATLAB?
I may be wrong, but I think the confusion here is the difference between an epoch and an iteration. An iteration is completed every time the network parameters are updated (whether using the entire training set or a minibatch of it). An epoch is completed when the network has passed through the entire training set once. (Source: http://deeplearning4j.org/glossary.html, please scroll down to the "Epoch vs. Iteration" section)
Please correct me if I am wrong, looking forward to hearing back from you!
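Concretely, the epoch/iteration distinction works out as follows with the numbers from the original question (1 million examples) and an illustrative minibatch size:

```matlab
% Epoch vs. iteration: an iteration is one parameter update (one
% minibatch); an epoch is one full pass over the training set.
numExamples   = 1e6;     % from the question
batchSize     = 1000;    % illustrative choice, not from the thread
itersPerEpoch = numExamples / batchSize;   % 1000 iterations per epoch
```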
Greg Heath, 18 February 2016
No, it looks like I was wrong. Searching MathWorks for minibatch information, I get the following number of hits:

  Search term    NEWSGROUP    ANSWERS
  minibatch      0            3
  mini-batch     0            6   (includes the 3 above)

I will see if I can structure the guts of a naive minibatch code and post it in the NEWSGROUP.
Greg

