Neural Network not updating weights after configuration

22 views (last 30 days)
LukasJ on 7 Sep 2020
Answered: LukasJ on 6 Nov 2020
Dear all,
I created a very simple feedforwardnet using the following code:
% set up
numknot = 14;
net = feedforwardnet([numknot]);
% reLU = poslin
net.layers{1}.transferFcn = 'poslin';
net = init(net);
net = configure(net, X, Y);
[rowW, colW] = size(net.iw{1,1});
I then wanted to set the initial input weights for the 14 knots (hidden neurons). The biases are set to zero.
net.iw{1,1} = iwIN1;
% iwIN1 must have the size of net.iw{1,1}, i.e. numknot x (number of inputs), e.g. 14 x 20, with values between -1 and 1
%setting biases to 0
[rowB, ~] = size(net.b{1,1});
net.b{1,1} = zeros(rowB,1);
Unfortunately after I run:
[net, record] = train(net, X, Y,'useParallel','yes','useGPU','only');
the network does only 3 iterations, does not update anything, and has little to no accuracy. The input weights after training (iwIN2) are identical to iwIN1, and the biases are still zero. If I run the code with zeros instead of my weight matrix iwIN1, it does update the weights & biases...
I don't know how to proceed now. Thanks for any help in advance.
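A quick way to check whether training changed anything at all is to snapshot the full weight/bias vector with getwb before and after the call to train. A minimal sketch, reusing the variables above (the parallel/GPU options are omitted here for brevity):
wbBefore = getwb(net); % flattened vector of all weights and biases
[net, record] = train(net, X, Y);
wbAfter = getwb(net);
max(abs(wbAfter - wbBefore)) % 0 means nothing was updated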

Answers (2)

Uday Pradhan on 11 Sep 2020
Edited: Uday Pradhan on 11 Sep 2020
Hi Lukas,
I have tried to implement your neural net with sample data (bodyfat_dataset), and it seems to run fine for the size and shallowness of the network used. I have also attached the code in the answer, do have a look.
A good practice would be to place the line
net = init(net);
after using "configure" function as is suggested here. In addtion you can also try altering the default training parameters by using "net.trainParam". For example, you may set:
net.trainParam.max_fail = 10; %default is 6
and observe how the training and validation losses progress. Hope this helps!
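For reference, a minimal sketch of that ordering, reusing the variables from the question (the parallel/GPU options are left out):
numknot = 14;
net = feedforwardnet(numknot);
net.layers{1}.transferFcn = 'poslin'; % ReLU
net = configure(net, X, Y); % size the net to the data first
net = init(net); % then (re)initialise weights and biases
net.trainParam.max_fail = 10; % default is 6
[net, record] = train(net, X, Y);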
5 Comments
LukasJ on 12 Sep 2020
I don't really want to share my training data. The inputs are normalized to [0, +1], the weights to [-1, +1]. Obviously I cannot use just any initialization weights; that's the purpose of the question.
A colleague of mine fixed it for these weights by putting init() not only after configure() but also after setting the weights.
net = configure(net, X, Y);
% training and validation settings
net.divideParam.trainRatio = 1;
net.divideParam.valRatio = 0;
net.divideParam.testRatio = 0;
net.trainParam.max_fail = 10;
net.iw{1,1} = iwIN;
% net.iw{1,1} = zeros(14,21);
% set biases to 0
[rowB, ~] = size(net.b{1,1});
net.b{1,1} = zeros(rowB,1);
net.trainParam.epochs = 30;
net = init(net);
[net, record] = train(net, X, Y,'useParallel','yes','useGPU','only');
I can, however, set the biases to zero before and after init(), and they will always change.
This simply has to be a MATLAB bug. There is no reason why my weights wouldn't work when any random weights between -1 and 1 do...
Uday Pradhan on 13 Sep 2020
Edited: Uday Pradhan on 13 Sep 2020
Hi Lukas,
Thanks for your detailed reply.
Yes, you are right, the network should be able to train (however loosely) with the weights you have specified as well. As for the solution you have given: the network is not training with iwIN as the initial weights, because calling net = init(net) just before training undoes all the changes you made to the weights. You can verify this by checking the values of net.iw{1} right after net = init(net); unless you have specified a matching initFcn, these will differ from iwIN.
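A minimal way to see this, as a sketch with the variables from the comment above:
net = configure(net, X, Y);
net.iw{1,1} = iwIN;
isequal(net.iw{1,1}, iwIN) % true: the manual weights are in place
net = init(net); % re-runs the configured init functions
isequal(net.iw{1,1}, iwIN) % almost certainly false: iwIN was overwritten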
Next, I was able to reproduce the situation that you faced, i.e. the weights and biases are not updated after training. However, the following approach solved this issue:
net = feedforwardnet(14);
numNN = 10;
NN = cell(1, numNN);
perfs = zeros(1, numNN);
for i = 1:numNN
    fprintf('Training %d/%d\n', i, numNN);
    net = configure(net, xtrain, ttrain);
    net.iw{1,1} = IWIN; % fixed first-layer input weights
    [rowB, ~] = size(net.b{1,1});
    net.b{1,1} = zeros(rowB,1); % first-layer biases start at zero
    NN{i} = train(net, xtrain, ttrain,'useParallel','yes','useGPU','only');
    y2 = NN{i}(xtest); % predictions on the held-out test set
    perfs(i) = mse(NN{i}, ttest, y2);
end
I trained the neural net 10 consecutive times and stored the trained networks as well as their performance metrics. Remember that even though we are initialising the first layer's weights and biases, the net still has a second layer whose weights and biases are initialised by default, so each call to train in this loop starts with a different set of initial weights and biases for the second layer, and with a different division of the dataset into training, validation, and test sets.
After this training, you can pick the network with the lowest MSE of all and check whether its initial weights and biases have been updated. I hope this will help!
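For example, a short sketch of that last step, with perfs and NN from the loop above:
[~, best] = min(perfs); % index of the net with the lowest test MSE
bestNet = NN{best};
bestNet.iw{1,1} % compare against IWIN to see how far training moved the weights
bestNet.b{1} % trained first-layer biases (they started at zero)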



LukasJ on 6 Nov 2020
Dear Uday,
I tried to determine the best network architecture first by looping over the number of neurons in the first and second layers. To ensure the same starting weights and biases for every neuron, I tried to use midpoint initialisation. Unfortunately, this appears not to work either:
net = feedforwardnet([kk jj]);
net = configure(net, X, Y); % size the net to the inputs and outputs
net.layers{1}.transferFcn = 'logsig';
net.layers{2}.transferFcn = 'poslin'; % ReLU = poslin
net.initFcn = 'initlay';
net.layers{1}.initFcn = 'initwb';
net.layers{2}.initFcn = 'initwb';
net.inputWeights{1,1}.initFcn = 'midpoint';
net.layerWeights{2,1}.initFcn = 'midpoint';
net.divideParam.trainRatio = 1;
net.divideParam.valRatio = 0;
net.divideParam.testRatio = 0; % these are checked manually
% training parameters
net.trainParam.max_fail = 10; % tolerances
net.trainParam.epochs = 15; % number of epochs until convergence
net.trainParam.showWindow = false;
[net, tr] = train(net, X, Y,'useParallel','yes','useGPU','only');
NN{kk,jj} = net;
If I check the configured network before training by calling
getwb(net)
I get random weights and biases. Setting fixed values as above always results in the network not training at all. Do you have any guess why the initFcn is not working properly?
kk and jj range from 1 to 10.
Best regards,
Lukas
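One plausible explanation, following the same logic as the earlier comment about init: setting the initFcn properties only selects the initialisation functions; they are not applied until init(net) is called, and configure has already filled the network with random values. A minimal sketch under that assumption:
net = feedforwardnet([kk jj]);
net = configure(net, X, Y);
net.initFcn = 'initlay';
net.layers{1}.initFcn = 'initwb';
net.layers{2}.initFcn = 'initwb';
net.inputWeights{1,1}.initFcn = 'midpoint';
net.layerWeights{2,1}.initFcn = 'midpoint';
net = init(net); % actually applies the midpoint initialisation
getwb(net) % should now show deterministic midpoint values, not random ones
Note also that 'midpoint' gives every neuron in a layer identical weights, and gradient updates keep identical neurons identical, which by itself can make the network appear not to train.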

Release

R2017b
