How can I constrain neural network weights?

1 view (last 30 days)
Luke Wilhelm on 7 Dec 2012
Answered: Sara Perez on 12 Sep 2019
I am using the Neural Network Toolbox to create a feedforward network. The input is one 4x1 vector; then there is one 4-neuron hidden layer, one 6-neuron hidden layer, and one 4-neuron output layer. I would like to constrain the final 4x6 matrix of layer weights so that the weight values cannot be negative. I realize that this will probably hurt the network's accuracy, but for the purpose of my research, I would like to see what the results are.
Is it possible to constrain the layer weights in this way? I have found how to set layer weights to a specified value and prevent them from learning using net.layerWeights{i,j}.learn = false;, but not how to allow weights to change while preventing them from becoming negative.
Thanks, Luke
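One common workaround (not a built-in toolbox option, so treat this as a hypothetical sketch) is projected training: train for a short burst, then clamp any negative entries of the constrained weight matrix back to zero, and repeat. Assuming the architecture described above (hidden layers of 4 and 6 neurons, a 4-neuron output layer, and inputs x / targets t as 4xN matrices), it might look like this:

```matlab
% Hypothetical sketch: enforce nonnegative layer weights by projection.
% After each one-epoch training burst, project the 4x6 weight matrix
% between layer 2 (6 neurons) and layer 3 (output, 4 neurons) onto W >= 0.
net = feedforwardnet([4 6]);        % two hidden layers: 4 and 6 neurons
net = configure(net, x, t);         % x: 4xN inputs, t: 4xN targets
net.trainParam.epochs = 1;          % one epoch per call to train
net.trainParam.showWindow = false;
for k = 1:200
    net = train(net, x, t);
    net.LW{3,2} = max(net.LW{3,2}, 0);   % clamp negatives to zero
end
```

Because the projection happens between epochs rather than inside the optimizer, training may converge more slowly than unconstrained training, but the final net.LW{3,2} is guaranteed nonnegative.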
1 Comment
Greg Heath on 9 Dec 2012
One hidden layer is sufficient for a universal approximator.
If the hidden-node activation functions are all odd, flipping the sign of every weight connected to a given hidden node (its input weights, bias, and output weight) will not change the output.
Therefore, if there is only one output node, the task is easy.
Otherwise, it will not work in general.
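The symmetry Greg describes can be checked directly for the default tansig activation, which is odd (tansig(-z) = -tansig(z)). A minimal sketch, assuming a single-hidden-layer network already configured on inputs x and targets t:

```matlab
% Hypothetical check of the sign-flip symmetry for odd (tansig) units:
% flipping one hidden unit's input weights, bias, and output weights
% leaves the network function unchanged.
net = feedforwardnet(4);              % one hidden layer, tansig units
net = configure(net, x, t);
y1 = net(x);
i = 2;                                % pick any hidden unit
net.IW{1,1}(i,:) = -net.IW{1,1}(i,:); % flip its input weights
net.b{1}(i)      = -net.b{1}(i);      % ... its bias
net.LW{2,1}(:,i) = -net.LW{2,1}(:,i); % ... and its output weights
y2 = net(x);                          % y2 equals y1 up to round-off
```

With a single output node this symmetry lets you flip the sign of any negative output weight after training; with multiple outputs, one flip changes a whole column of the output weight matrix at once, which is why the trick does not work in general.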


Answers (2)

R L on 24 Jul 2015
I would like to ask how you set a subset of the layer weights to a specified value while preventing them from learning using net.layerWeights{i,j}.learn = false.
Did you ever solve your question about constraining the weights to have a specified sign while still allowing them to learn? Thanks.

Sara Perez on 12 Sep 2019
You can set the layer property 'WeightLearnRateFactor' to zero, so the weights won't be modified during training.
More info here:
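Note that this freezes the weights entirely rather than constraining their sign. As a hedged sketch of how the property is set, assuming the Deep Learning Toolbox layer API and the architecture from the question:

```matlab
% Hypothetical sketch: WeightLearnRateFactor = 0 freezes a layer's
% weights at their initial values; it does not enforce nonnegativity.
layers = [
    featureInputLayer(4)
    fullyConnectedLayer(4)
    tanhLayer
    fullyConnectedLayer(6)
    tanhLayer
    fullyConnectedLayer(4, 'WeightLearnRateFactor', 0)  % frozen 4x6 weights
    regressionLayer];
```

To freeze the weights at a chosen nonnegative value, you would set the layer's Weights property before training.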
