
Adam Optimizer with feedforward neural networks

6 views (last 30 days)
Manos Kav on 30 Apr 2018
Commented: Bob on 18 Nov 2022
Hello, is there any way to use the Adam optimizer to train a neural network with the "train" function? Or a way to use this implementation ( https://www.mathworks.com/matlabcentral/fileexchange/61616-adam-stochastic-gradient-descent-optimization ) to train the network?
Thanks in advance.
  2 Comments
Abdelwahab Afifi on 14 Jun 2020
Have you got an answer?
Bob on 18 Nov 2022
Did any of you get an answer?

Answers (1)

Hrishikesh Borate on 19 Jun 2020
Hi,
It's my understanding that you want to use the Adam optimizer to train a neural network. This can be done with the trainNetwork function by setting the appropriate training options.
For example:
% Load the digit training images and their rotation angles (regression targets)
[XTrain,~,YTrain] = digitTrain4DArrayData;

% Simple regression CNN: one convolutional layer followed by a fully connected output
layers = [ ...
    imageInputLayer([28 28 1])
    convolution2dLayer(12,25)
    reluLayer
    fullyConnectedLayer(1)
    regressionLayer];

% Select the Adam solver by passing 'adam' as the first argument to trainingOptions
options = trainingOptions('adam', ...
    'InitialLearnRate',0.001, ...
    'GradientThreshold',1, ...
    'Verbose',false, ...
    'Plots','training-progress');

net = trainNetwork(XTrain,YTrain,layers,options);

% Evaluate on the test set
[XTest,~,YTest] = digitTest4DArrayData;
YPred = predict(net,XTest);
rmse = sqrt(mean((YTest - YPred).^2))
For more information, refer to trainNetwork.
  1 Comment
Abdelwahab Afifi on 19 Jun 2020
'trainNetwork' is used for deep learning neural networks. But I think he wants to use the Adam optimizer to train a shallow neural network with the 'train' function.
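For what it's worth, one rough workaround (a sketch only, not an official workflow): since 'train' does not offer 'adam' as a training function, you can pull the weight/bias vector out of a shallow network with getwb, run a hand-written Adam update loop, and write the weights back with setwb. The gradient call below uses defaultderiv; check its sign convention in your release before relying on the result, and treat the dataset and hyperparameters as placeholders.
% Rough sketch: full-batch Adam on a shallow network's weight/bias vector
[x,t] = simplefit_dataset;           % small built-in curve-fitting dataset (placeholder)
net = feedforwardnet(10);            % shallow network with 10 hidden neurons
net = configure(net,x,t);            % initialize weights and biases for this data

wb = getwb(net);                     % all weights and biases as one vector
m  = zeros(size(wb));                % Adam first-moment estimate
v  = zeros(size(wb));                % Adam second-moment estimate
alpha = 0.01; beta1 = 0.9; beta2 = 0.999; epsHat = 1e-8;

for k = 1:500
    % Gradient of the performance (mse) with respect to the weights/biases.
    % Flip the sign if your release returns the descent direction instead.
    g = defaultderiv('dperf_dwb',net,x,t);
    m = beta1*m + (1-beta1)*g;
    v = beta2*v + (1-beta2)*g.^2;
    mhat = m/(1-beta1^k);                        % bias-corrected moments
    vhat = v/(1-beta2^k);
    wb = wb - alpha*mhat./(sqrt(vhat)+epsHat);   % Adam update step
    net = setwb(net,wb);
end

perform(net,t,net(x))                % final mse on the training data
This is full-batch Adam without the data division and validation stopping that 'train' normally handles; the File Exchange submission linked in the question wraps the same update rule in a general-purpose optimizer and could be driven in a similar way.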
