
Weight decay parameter and Jacobian matrix of a neural network

Platon on 19 Feb 2014
Commented: Greg Heath on 21 Feb 2014
I want to calculate prediction intervals, so I have two direct questions:
  1. How can I get the weight decay parameter 'alpha' (mse + alpha*msw) that is used when 'trainbr' is the training algorithm?
  2. How can I get the neural network Jacobian matrix (derivatives with respect to the weights) that is calculated during training?

Accepted Answer

Greg Heath on 19 Feb 2014
Edited: Greg Heath on 19 Feb 2014
The documentation for trainbr is pretty bad.
help trainbr
doc trainbr
Look at the source code
type trainbr
I am not familiar with it but will take a look when I get time.
Meanwhile, if you make a run, the training record tr contains two parameters that are involved:
gamk: [1x31 double]
ssX: [1x31 double]
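For example, a quick way to get at those fields after a run (a minimal sketch; the dataset, the hidden layer size, and the interpretation of the field names are assumptions and may vary by toolbox version):
[x, t] = simplefit_dataset;            % example dataset shipped with the toolbox
net = feedforwardnet(10, 'trainbr');   % 10 hidden neurons, Bayesian-regularization training
[net, tr] = train(net, x, t);          % tr is the training record
gamk = tr.gamk;   % per-epoch values recorded by trainbr (commonly read as the effective number of parameters, gamma)
ssX  = tr.ssX;    % per-epoch sum-of-squares quantity recorded by trainbr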
Hope this helps.
Thank you for formally accepting my answer
Greg

More Answers (1)

Platon on 21 Feb 2014
Thank you. It helps to determine alpha, but not to calculate the neural network Jacobian matrix. I hope that future MATLAB Neural Network Toolbox versions include specific tools for prediction-interval studies.
  1 Comment
Greg Heath on 21 Feb 2014
When using the obsolete msereg, or mse with the regularization option, the weighting parameters are alpha (the specified error weight) and (1 - alpha).
However, when using trainbr, the weighting parameters alpha and beta are recalculated each epoch. I haven't deciphered the logic yet; it might be faster to search the web.
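For the fixed-ratio case, a minimal sketch of setting that weighting through the mse regularization option (the assumption here is that the 'regularization' parameter weights the mean-squared-weight term, i.e. it plays the role of 1 - alpha):
net = feedforwardnet(10);
net.performFcn = 'mse';
net.performParam.regularization = 0.3;   % assumed mapping: perf = 0.7*mse + 0.3*msw, i.e. alpha = 0.7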
