- sgdm
- rmsprop
- adam
- lbfgs
I am working on neural networks with MATLAB R2021a and R2023a, but I get an "invalid solver name" error when using optimization techniques.
I am working on neural networks with MATLAB R2021a and R2023a, but I get an "invalid solver name" error when using optimization techniques such as adadelta, adagrad, trainbr, trainlm, traingdx, trainscg, and trainbfg with an LSTM. How can I solve this problem?
options = trainingOptions('trainbr', ...
'MaxEpochs', 10, ...
'MiniBatchSize', 10, ...
'GradientThreshold', 1, ...
'Verbose', 1, ...
'Plots', 'training-progress');
It gives an "invalid solver name" error.
Accepted Answer
Ayush on 1 January 2024
Edited: Ayush on 1 January 2024
I understand that you are getting an "invalid solver name" error when using optimization techniques such as adadelta, adagrad, trainbr, trainlm, traingdx, trainscg, and trainbfg. Names like trainbr, trainlm, traingdx, trainscg, and trainbfg are training functions for shallow networks, not solvers for trainingOptions. As of the R2023b release, the solver names supported for training deep learning neural networks are the ones listed above: sgdm, rmsprop, adam, and lbfgs.
You may refer to the documentation for more information: https://www.mathworks.com/help/deeplearning/ref/trainingoptions.html#mw_4f311de8-ad53-4201-ac1a-47f66e0de52d
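For the LSTM case in your question, the fix is to swap 'trainbr' for one of the supported solver names. A minimal sketch, reusing your other options with 'adam' (any of the supported solvers would work here):

```matlab
% Replace the unsupported 'trainbr' with a supported solver such as 'adam'.
options = trainingOptions('adam', ...
    'MaxEpochs', 10, ...
    'MiniBatchSize', 10, ...
    'GradientThreshold', 1, ...
    'Verbose', 1, ...
    'Plots', 'training-progress');
```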
Training functions supported for shallow neural networks are listed here: https://www.mathworks.com/help/deeplearning/ref/fitnet.html#bu2w2vc-1-trainFcn
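If you specifically want Bayesian regularization (trainbr) or the other shallow training functions, they are passed to fitnet and trained with train rather than trainingOptions. A minimal sketch with placeholder data (the data sizes and hidden-layer size here are made up for illustration):

```matlab
% trainbr is a training function for shallow networks, used via fitnet/train.
x = rand(3, 100);              % placeholder: 3 inputs, 100 samples
t = rand(1, 100);              % placeholder: 1 target per sample
net = fitnet(10, 'trainbr');   % 10 hidden neurons, Bayesian regularization
net = train(net, x, t);
y = net(x);                    % predictions on the training inputs
```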
Also, you may refer to File Exchange for other optimization techniques: https://www.mathworks.com/matlabcentral/fileexchange/71069-gradient-descent-optimization
Thanks,
Ayush
2 Comments
Ayush on 1 January 2024
Edited: Ayush on 1 January 2024
Hi @Muhammad Shoaib, the built-in MATLAB function trainingOptions only accepts the input types specified in the documentation. However, if you want to define a custom training procedure, you may refer to this link: https://www.mathworks.com/help/deeplearning/ug/train-network-using-custom-training-loop.html
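For unsupported optimizers, a custom training loop gives you full control over the update rule. A minimal sketch following the pattern in that documentation page, using made-up data and a tiny network (the layer sizes, data, and iteration count are placeholders; swap adamupdate for your own update function to implement a custom optimizer):

```matlab
% Minimal custom training loop sketch (requires Deep Learning Toolbox).
layers = [featureInputLayer(4); fullyConnectedLayer(1)];
net = dlnetwork(layers);
X = dlarray(rand(4, 32), "CB");   % placeholder: 4 features, 32 observations
T = dlarray(rand(1, 32), "CB");   % placeholder targets
avgGrad = []; avgSqGrad = [];
for iteration = 1:10
    % Evaluate loss and gradients with automatic differentiation.
    [loss, gradients] = dlfeval(@modelLoss, net, X, T);
    % Apply the optimizer update; replace with a custom rule if desired.
    [net, avgGrad, avgSqGrad] = adamupdate(net, gradients, ...
        avgGrad, avgSqGrad, iteration);
end

function [loss, gradients] = modelLoss(net, X, T)
    Y = forward(net, X);
    loss = mse(Y, T);
    gradients = dlgradient(loss, net.Learnables);
end
```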