How to get the gradient with respect to the output of a specific layer in a deep learning model?
Hi,
I have a model, myModel, which is a simple 2-layer model (input size: 3; Layer 1 size: 7; Layer 2 size: 4).
That is, Layer 2's output is a = W2*(W1*x + b1) + b2, a 4-by-N matrix, and the loss is L = -sum(sum(Y .* log(sigmoid(a/100)))), where Y holds the one-hot labels arranged 4-by-N.

From the code below, I now have the gradient dL/dtheta, whose dimension is 60, since there are 60 parameters in the model (i.e. (3+1)*7 + (7+1)*4 = 60). N is the number of data points (i.e. N = 2 in my code).

But what I want is the gradient with respect to Layer 2's output instead of the parameters. That is, I want dL/da, whose dimension would be 4-by-N, since a is 4-by-N.

How can I get dL/da? Many thanks!

My code:
rng(123); % seed for reproducibility
X_ori = [4,163,80; 5,164,75]; % data; #(observations) = 2; #(features) = 3
X = permute(X_ori, [3,4,2,1]); % reshape to 1-by-1-by-3-by-2 for 'SSCB'
dlX = dlarray(X, 'SSCB'); % spatial, spatial, channel, batch
Y_ori = [0, 0, 0, 1; 0, 1, 0, 0]; % data labels (one-hot vectors for 4 classes)
myModel = [
    imageInputLayer([1 1 3], 'Normalization', 'none', 'Name', 'in')
    fullyConnectedLayer(7, 'Name', 'Layer 1')
    fullyConnectedLayer(4, 'Name', 'Layer 2')];
MyLGraph = layerGraph(myModel);
myDLnet = dlnetwork(MyLGraph);
gradients = dlfeval(@modelGradients, myDLnet, dlX, Y_ori);

function gradients = modelGradients(myModel, modelInput, CorrectLabels)
CorrectLabels_transpose = transpose(CorrectLabels); % 4-by-N, matching the network output
[modelOutput, ~] = forward(myModel, modelInput); % Layer 2's output a (4-by-N)
loss = -sum(sum(CorrectLabels_transpose.*log(sigmoid(modelOutput/100))));
gradients = dlgradient(loss, myModel.Learnables); % dL/dtheta over all 60 parameters
end
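A quick way to sanity-check the 60-parameter count (a minimal sketch using the myDLnet defined above):
totalParams = sum(cellfun(@numel, myDLnet.Learnables.Value)) % 7*3 + 7 + 4*7 + 4 = 60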
Thanks
Answers (1)
Athul Prakash
25 November 2019
You could try looking at the source code of the fullyConnectedLayer you're using, and then customizing that code to return the gradients you're looking for.
Alternatively, try creating your own custom layer to achieve this. See the doc below for that:
Hope it helps!
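As another possible route: inside dlfeval, dlgradient can differentiate with respect to any traced dlarray, not just the network's Learnables, so you could ask for the gradient of the loss with respect to the forward output directly. A minimal sketch reusing the names from the question (modelGradientsWrtOutput is a hypothetical helper name):
function gradLayer2 = modelGradientsWrtOutput(myModel, modelInput, CorrectLabels)
% Same loss as in the question, but differentiated with respect to the
% traced Layer 2 output a rather than the learnable parameters.
CorrectLabels_transpose = transpose(CorrectLabels);
[modelOutput, ~] = forward(myModel, modelInput); % a, a traced 4-by-N dlarray
loss = -sum(sum(CorrectLabels_transpose.*log(sigmoid(modelOutput/100))));
gradLayer2 = dlgradient(loss, modelOutput); % dL/da, same 4-by-N shape as a
end
Call it the same way as before:
gradA = dlfeval(@modelGradientsWrtOutput, myDLnet, dlX, Y_ori);
For this particular loss you can also sanity-check the result against the closed form dL/da = -Y .* (1 - sigmoid(a/100)) / 100.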