dlgradient: Error: Value to differentiate must be a traced dlarray scalar.

qi lu on 14 Jan 2021
Answered: Anshika Chaurasia on 18 Jan 2021
I want to train a deep network using automatic differentiation. Is there any solution?
layer2 = [
    imageInputLayer([9 36 1],'Normalization','none','Name','input1-fcc')
    convolution2dLayer([7,7],64,'Name','conv1-fcc')
    batchNormalizationLayer('Name','bn1-fcc')
    reluLayer('Name','relu1-fcc')
    globalAveragePooling2dLayer('Name','pool5-fcc')
    fullyConnectedLayer(1,'Name','fc1')];
lgraph = layerGraph(layer2);
dlnet = dlnetwork(lgraph);
% Input
a = rand(9,36,1,10);
a = dlarray(a,'SSCB');
a_pre = forward(dlnet,a);
% output
b = rand(1,10);
loss = mse(a_pre,b);
gradients = dlgradient(loss,dlnet.Learnables);

Accepted Answer

Anshika Chaurasia on 18 Jan 2021
Hi Qi Lu,
The error occurs because dlgradient must be called inside a function that is evaluated with dlfeval; only then are the dlarray operations traced for automatic differentiation. You can try the following code to compute the gradients, which resolves the error:
layer2 = [
    imageInputLayer([9 36 1],'Normalization','none','Name','input1-fcc')
    convolution2dLayer([7,7],64,'Name','conv1-fcc')
    batchNormalizationLayer('Name','bn1-fcc')
    reluLayer('Name','relu1-fcc')
    globalAveragePooling2dLayer('Name','pool5-fcc')
    fullyConnectedLayer(1,'Name','fc1')];
lgraph = layerGraph(layer2);
dlnet = dlnetwork(lgraph);
% Input
a = rand(9,36,1,10);
a = dlarray(a,'SSCB');
% Evaluate the loss and gradients inside dlfeval so the dlarray
% operations are traced for automatic differentiation.
[loss,gradients] = dlfeval(@compute_gradient,dlnet,a);

function [loss,gradients] = compute_gradient(dlnet,a)
    a_pre = forward(dlnet,a);
    % Placeholder target (random here, as in the question)
    b = rand(1,10);
    loss = mse(a_pre,b);
    gradients = dlgradient(loss,dlnet.Learnables); % automatic gradient
end
Refer to the Deep Learning Toolbox documentation on automatic differentiation for more information.
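Once dlfeval returns the loss and gradients, they are typically used to update the learnable parameters in a custom training loop. Below is a minimal sketch of such a loop using adamupdate (available in the R2020b release tagged on this question); the iteration count, the random inputs and targets, and the helper name modelLoss are placeholders for illustration, not part of the original answer.

% Minimal custom training loop sketch. Assumes dlnet is the dlnetwork
% created above; inputs and targets here are random placeholders.
numIterations = 100;      % placeholder iteration count
averageGrad   = [];
averageSqGrad = [];

for iteration = 1:numIterations
    % Placeholder mini-batch: random inputs and targets.
    X = dlarray(rand(9,36,1,10),'SSCB');
    T = rand(1,10);

    % Evaluate the loss and gradients inside dlfeval so the trace exists.
    [loss,gradients] = dlfeval(@modelLoss,dlnet,X,T);

    % Update the learnable parameters with the Adam optimizer.
    [dlnet,averageGrad,averageSqGrad] = adamupdate(dlnet,gradients, ...
        averageGrad,averageSqGrad,iteration);
end

function [loss,gradients] = modelLoss(dlnet,X,T)
    Y = forward(dlnet,X);
    loss = mse(Y,T);
    gradients = dlgradient(loss,dlnet.Learnables);
end

adamupdate keeps running averages of the gradients and squared gradients, which is why averageGrad and averageSqGrad are passed back in on every iteration; sgdmupdate or rmspropupdate can be used in the same way.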
