What is the parameter minimum performance gradient (trainParam.min_grad) of traingd?
I use the training function "traingd" to train a shallow neural network:
trainedNet = train(net,X,T)
For the training function "traingd": How is the parameter minimum performance gradient (net.trainParam.min_grad) defined?
Since the gradient used in gradient descent is usually a vector, but net.trainParam.min_grad is a scalar value, I am confused.
Is it the change in the performance (loss) between two iterations, and if so: does it refer to the training, validation or testing error?
Thanks in advance!
I use MATLAB 2013 and 2015 with the Neural Network Toolbox.
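For reference, here is a minimal sketch of how this parameter can be set before calling train. The feedforwardnet constructor, the hidden layer size, the random data and the 1e-5 tolerance are illustrative assumptions, not part of the original question:

X = rand(4, 100);                      % example inputs: 4 features, 100 samples
T = rand(1, 100);                      % example targets: 1 output per sample
net = feedforwardnet(10, 'traingd');   % shallow network trained with gradient descent
net.trainParam.min_grad = 1e-5;        % stop once the gradient magnitude drops below this scalar
[trainedNet, tr] = train(net, X, T);   % tr is the training record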
Accepted Answer
Rishabh Mishra on 28 September 2020
Edited: Rishabh Mishra, 28 September 2020
Hi,
Based on your description of the issue, I would state a few points:
- I agree that the gradient is a vector quantity that points in the direction of maximum change of the cost function.
- 'net.trainParam.min_grad' is a scalar (numeric) quantity. The parameter 'min_grad' denotes the minimum magnitude (a scalar) of that gradient vector at which training of the neural network terminates.
- When the magnitude of the gradient falls below 'min_grad', the neural network model is considered optimized, and further training stops. It is therefore not the change in performance between two iterations, but the norm of the gradient of the performance (loss) with respect to the weights and biases, computed on the training data.
For a better understanding, refer to the following links:
Hope this helps.
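To illustrate the points above, here is a small sketch (not part of the original answer) that inspects the training record after training; it assumes net, X and T are defined as in the earlier snippet, and that the training record exposes the tr.gradient and tr.stop fields reported by the toolbox:

[trainedNet, tr] = train(net, X, T);
disp(tr.stop)                          % e.g. 'Minimum gradient reached.' when min_grad triggered the stop
finalGrad = tr.gradient(end);          % scalar magnitude of the gradient at the last epoch
fprintf('final gradient magnitude: %g (min_grad = %g)\n', ...
    finalGrad, trainedNet.trainParam.min_grad);

Conceptually, the test traingd applies after each epoch is: if the scalar norm of the (vector) gradient is smaller than net.trainParam.min_grad, training stops. A scalar is compared with a scalar, which resolves the vector-versus-scalar confusion.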