
How to use Nadam optimizer in training deep neural networks

4 views (last 30 days)
kollikonda Ashok kumar on 29 March 2023
Answered: Nayan on 5 April 2023
Training_Options = trainingOptions('sgdm', ...
    'MiniBatchSize', 32, ...
    'MaxEpochs', 50, ...
    'InitialLearnRate', 1e-5, ...
    'Shuffle', 'every-epoch', ...
    'ValidationData', Resized_Validation_Data, ...
    'ValidationFrequency', 40, ...
    'ExecutionEnvironment', 'gpu', ...
    'Plots', 'training-progress', ...
    'Verbose', false);

Answers (1)

Nayan on 5 April 2023
Hi,
I assume you want to use the "adam" optimizer in place of "sgdm". You simply need to replace the "sgdm" keyword with "adam".
options = trainingOptions("adam", ...
    InitialLearnRate=3e-4, ...
    SquaredGradientDecayFactor=0.99, ...
    MaxEpochs=20, ...
    MiniBatchSize=64, ...
    Plots="training-progress")
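
A minimal usage sketch (not part of the original answer): the options returned by trainingOptions are then passed to trainNetwork together with a layer array and the training data. The names layers and Resized_Training_Data below are placeholders, assumed to follow the same pattern as the Resized_Validation_Data used in the question.
% 'layers' and 'Resized_Training_Data' are hypothetical placeholders for
% your network architecture and training datastore.
net = trainNetwork(Resized_Training_Data, layers, options);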
