Weight decay parameter and Jacobian matrix of a neural network
I want to calculate prediction intervals, so I have two direct questions:
- How can I get the weight decay parameter 'alpha' (as in mse + alpha*msw) used when 'trainbr' is the training algorithm?
- How can I get the neural network Jacobian matrix (derivatives of the errors with respect to the weights) calculated during training?
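For the second question, something like the following is what I am after (just a sketch; it assumes the derivative helper defaultderiv that I believe ships with Neural Network Toolbox 7.x, and I have not checked the orientation of the returned matrix):
[x, t] = simplefit_dataset;               % any example data set
net = feedforwardnet(10, 'trainbr');
net = train(net, x, t);
% 'de_dwb' requests derivatives of the errors e = t - y with respect to the
% full weight/bias vector, i.e. the Jacobian; 'dperf_dwb' would give the gradient.
J = defaultderiv('de_dwb', net, x, t);
size(J)                                   % check which dimension is weights vs. errors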
Accepted Answer
Greg Heath
19 Feb 2014
Edited: Greg Heath on 19 Feb 2014
The documentation for trainbr is pretty bad.
help trainbr
doc trainbr
Look at the source code
type trainbr
I am not familiar with it but will take a look when I get time.
Meanwhile, if you make a run, the training record tr contains two parameters
gamk: [1x31 double]
ssX: [1x31 double]
that are involved.
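For example, something along these lines (a sketch only; I have not checked trainbr's source, but if it follows the standard MacKay / Foresee-Hagan Bayesian regularization update, alpha could be recovered from those two fields):
[x, t] = simplefit_dataset;
net = feedforwardnet(10, 'trainbr');
[net, tr] = train(net, x, t);
gamk = tr.gamk;     % effective number of parameters, one value per epoch
ssX  = tr.ssX;      % apparently the sum of squared weights/biases, per epoch
% IF trainbr uses the usual MacKay update (not checked against the source),
% the per-epoch weight-decay parameter would be
alpha = gamk ./ (2*ssX);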
Hope this helps.
Thank you for formally accepting my answer
Greg
More Answers (1)
Platon
21 Feb 2014
1 Comment
Greg Heath
21 Feb 2014
When using the obsolete msereg (or mse with the regularization option), the weight parameters are alpha (the user-specified error weight) and (1-alpha).
However, when using trainbr, the weight parameters alpha and beta are recalculated every epoch. I haven't deciphered the logic yet. It might be faster to search the web.
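For the obsolete route the error weight is simply a network property that you set yourself, so it can be read straight back off the network object. A sketch (assuming the performParam field is still called 'ratio', as in the old msereg documentation; newer releases may warn that msereg is obsolete):
net = feedforwardnet(10);
net.performFcn = 'msereg';          % obsolete, may trigger a warning
net.performParam.ratio = 0.8;       % alpha: msereg = 0.8*mse + (1-0.8)*msw
alpha = net.performParam.ratio;     % recover it later from the (trained) network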