How to display the weight distribution in the hidden layers of a neural network?
![](https://www.mathworks.com/matlabcentral/answers/uploaded_files/167219/image.jpeg)
I have 8 inputs in the input layer. Now I want to display the weight distribution of these 8 inputs in the hidden layer to observe the importance of the features. To make it clearer, an example is shown in this figure ( https://pasteboard.co/GKCpA6Q.png ). I used MATLAB's `plotwb` function, but it didn't display the weights of every input.
What I actually want to look at are the weights connecting the inputs to the first hidden layer. The larger the weight, the more important the input.
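As a starting point, the input-to-hidden weights can be read directly from the network object rather than plotted with `plotwb`. The sketch below assumes a trained shallow network `net` (e.g. from `fitnet`) with the 8-by-N input matrix `x` and target vector `t`; the variable names are placeholders.

```matlab
% Sketch: inspect input-to-hidden weights directly.
% Assumes x is 8-by-N (8 input features) and t is 1-by-N targets.
net = fitnet(10);            % example network with 10 hidden neurons
net = train(net, x, t);
IW = net.IW{1,1};            % hidden-by-8 matrix of input weights
bar(mean(abs(IW), 1));       % average absolute weight per input feature
xlabel('Input index');
ylabel('Mean |weight|');
```

Note that, as the answer below explains, the magnitudes in `IW` alone are not a reliable importance ranking when the inputs are correlated.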
Answers (1)
Greg Heath
17 Sep 2017
That will not work: it does not account for the correlations between inputs.
The best way to rank correlated inputs is:
1. Use NO HIDDEN LAYERS!
2. Run 10 or more trials (each with different random initial weights) using
a. A single input
b. All inputs except the one in a.
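The trial procedure above can be sketched as follows. This is a minimal, hedged sketch assuming `x` is an 8-by-N input matrix and `t` a 1-by-N target vector, and assuming `fitnet([])` builds a network with no hidden layers (a linear model); adapt the names to your data.

```matlab
% Rank correlated inputs with single-input and leave-one-out trials,
% using a linear (no-hidden-layer) network, averaged over random inits.
nIn = size(x, 1);            % number of input features (8 here)
nTrials = 10;                % trials per configuration
mseSingle  = zeros(nIn, 1);  % avg. error using input i alone
mseWithout = zeros(nIn, 1);  % avg. error using all inputs except i
for i = 1:nIn
    for k = 1:nTrials        % different random initial weights each trial
        % (a) a single input
        net = train(fitnet([]), x(i, :), t);
        mseSingle(i) = mseSingle(i) + ...
            perform(net, t, net(x(i, :))) / nTrials;
        % (b) all inputs except input i
        rest = setdiff(1:nIn, i);
        net = train(fitnet([]), x(rest, :), t);
        mseWithout(i) = mseWithout(i) + ...
            perform(net, t, net(x(rest, :))) / nTrials;
    end
end
% An important input tends to give low mseSingle and high mseWithout.
```

Averaging over several random initializations, as the answer recommends, reduces the chance that a single unlucky training run distorts the ranking.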
Hope this helps.
Thank you for formally accepting my answer
Greg