
Custom layer with weights and bias (like fullyConnectedLayer)

jaehong kim on 14 Feb 2021
Answered: Malay Agarwal on 28 Feb 2024
Hi! I am currently coding a custom layer.
My custom layer computes weight*input (a vector) + bias.
I'm thinking of configuring the following layers.
Input data = 2x25001
layers = [
    featureInputLayer(2,'Name','in')
    fullyConnectedLayer(64,'Name','fc1')
    tanhLayer('Name','tanh1')
    fullyConnectedLayer(32,'Name','fc2')
    tanhLayer('Name','tanh2')
    fullyConnectedLayer(16,'Name','fc3')
    tanhLayer('Name','tanh3')
    fullyConnectedLayer(8,'Name','fc4')
    tanhLayer('Name','tanh4')
    % fullyConnectedLayer(6,'Name','fc5')
    weightedAdditionLayer(6,1,'add')
    ];
My custom layer code is shown below. However, there is an error:
Layer 'add': Unconnected input. Each layer input must be connected to the output of another layer.
Detected unconnected inputs:
input 'in2'
input 'in3'
input 'in4'
input 'in5'
and 1 other inputs.
Thank you for reading my question.
Any answers are welcome.
classdef weightedAdditionLayer < nnet.layer.Layer
    % Example custom weighted addition layer.

    properties (Learnable)
        % Layer learnable parameters

        % Scaling coefficients
        Weights
        Bias
    end

    methods
        function layer = weightedAdditionLayer(numInputs,numOutputs,name)
            % layer = weightedAdditionLayer(numInputs,numOutputs,name)
            % creates a weighted addition layer and specifies the number
            % of inputs, the number of outputs, and the layer name.

            % Set number of inputs.
            layer.NumInputs = numInputs;

            % Set number of outputs.
            layer.NumOutputs = numOutputs;

            % Set layer name.
            layer.Name = name;

            % Set layer description.
            % layer.Description = "Weighted addition of " + numInputs + ...
            %     " inputs";

            % Initialize layer weights and bias.
            % layer.Weights = rand(1,numInputs);
            layer.Weights = rand(numOutputs,numInputs);
            layer.Bias = zeros(numOutputs,1);
        end

        function Z = predict(layer, varargin)
            % Z = predict(layer, X1, ..., Xn) forwards the input data X1,
            % ..., Xn through the layer and outputs the result Z.

            X = varargin;   % cell array with one entry per layer input
            W = layer.Weights;
            B = layer.Bias;

            % Original elementwise weighted addition:
            % X1 = X{1};
            % sz = size(X1);
            % Z = zeros(sz,'like',X1);
            %
            % % Weighted addition
            % for i = 1:layer.NumInputs
            %     Z = Z + W(i)*X{i};
            % end

            % Stack the inputs along the first (feature) dimension and
            % apply the affine transformation; this assumes the stacked
            % feature dimension equals size(W,2).
            Z = W*cat(1,X{:}) + B;
        end
    end
end
1 Comment
mary john on 29 Oct 2022
Please use the Deep Network Designer app in MATLAB, which lets you connect the layers correctly. You can import and add custom layers in R2021a; earlier versions are not as user friendly.
So I suggest that you use the app.
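For example (a small sketch; "lgraph" is assumed to be a layerGraph variable already in your workspace), the graph can be opened directly in the app and the missing connections drawn interactively:
% Open the layer graph in Deep Network Designer, connect the remaining
% inputs by dragging, then export the edited graph back to the workspace.
deepNetworkDesigner(lgraph)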
All the best!!


Answers (1)

Malay Agarwal on 28 Feb 2024
Hi Jaehong,
I understand that you are using a custom multi-input layer and receiving an error about unconnected inputs when creating a network with the layer.
I am assuming that your custom layer “weightedAdditionLayer” is based on the example shown here: https://www.mathworks.com/help/deeplearning/ug/define-custom-layer-with-multiple-inputs.html.
By default, when you pass a list of layers to the “layerGraph” function, it connects the layers sequentially, as mentioned in the documentation: https://www.mathworks.com/help/releases/R2021a/deeplearning/ref/nnet.cnn.layergraph.html#mw_78f2725b-f3a1-4a52-9c38-4ab635f59970. For example:
lgraph = layerGraph;
layers = [
    imageInputLayer([32 32 3],'Name','input')
    convolution2dLayer(3,16,'Padding','same','Name','conv_1')
    batchNormalizationLayer('Name','BN_1')
    reluLayer('Name','relu_1')];
lgraph = addLayers(lgraph,layers);
figure;
plot(lgraph);
The plot shows that each layer is connected in a sequential manner. In your case, you have specified the number of inputs for the “weightedAdditionLayer” to be 6. When you pass the list of layers "layers" to “layerGraph”, it connects the last “tanhLayer” to one of the inputs “in1” of the “weightedAdditionLayer”:
layers = [
    featureInputLayer(2,'Name','in')
    fullyConnectedLayer(64,'Name','fc1')
    tanhLayer('Name','tanh1')
    fullyConnectedLayer(32,'Name','fc2')
    tanhLayer('Name','tanh2')
    fullyConnectedLayer(16,'Name','fc3')
    tanhLayer('Name','tanh3')
    fullyConnectedLayer(8,'Name','fc4')
    tanhLayer('Name','tanh4')
    % fullyConnectedLayer(6,'Name','fc5')
    weightedAdditionLayer(6,1,'add')
    ];
lgraph = layerGraph(layers);
figure;
plot(lgraph);
The remaining 5 inputs (“in2”, “in3”, “in4”, “in5” and “in6”) are unconnected, which generates the error.
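For instance, the connections that do exist can be inspected on the layer graph built above (a small sketch using the "Connections" property of "layerGraph"):
% The Connections property is a table of Source/Destination pairs;
% only "add/in1" appears as a destination, so inputs "in2" through
% "in6" of the addition layer are still unconnected.
disp(lgraph.Connections)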
The “connectLayers” function needs to be used to connect the unconnected inputs. For example, to create a network which uses the “weightedAdditionLayer” to add the output of two layers:
layers = [
    imageInputLayer([28 28 1],'Name','in')
    convolution2dLayer(5,20,'Name','conv1')
    reluLayer('Name','relu1')
    convolution2dLayer(3,20,'Padding',1,'Name','conv2')
    reluLayer('Name','relu2')
    convolution2dLayer(3,20,'Padding',1,'Name','conv3')
    reluLayer('Name','relu3')
    % Create a weightedAdditionLayer with 2 inputs
    weightedAdditionLayer(2,1,'add')
    fullyConnectedLayer(10,'Name','fc')
    softmaxLayer('Name','softmax')
    classificationLayer('Name','classoutput')];
% This will connect one input of the weightedAdditionLayer with the
% reluLayer just before the weightedAdditionLayer
lgraph = layerGraph(layers);
% Call connectLayers to connect the remaining input
lgraph = connectLayers(lgraph,"relu1","add/in2");
figure;
plot(lgraph);
In the code above:
  • A “weightedAdditionLayer” is created with 2 inputs.
  • A “layerGraph” is created. This connects one of the two inputs “in1” of the “weightedAdditionLayer” with the “reluLayer” just before the “weightedAdditionLayer” in “layers”.
  • “connectLayers” is used to connect the remaining input “in2” of the “weightedAdditionLayer” with the “reluLayer” named “relu1”.
The output of “analyzeNetwork” is attached below:
Please add appropriate calls to “connectLayers” to ensure that all the inputs of the “weightedAdditionLayer” are connected.
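As a rough sketch (the layer names "branch2" through "branch6" are hypothetical placeholders; substitute the layers in your network whose outputs should feed the addition layer, and make sure their output sizes match what the layer's "Weights" expects), the remaining inputs could be connected in a loop:
% "add/in1" is assumed to be connected already by layerGraph.
% Replace "branch2" ... "branch6" with the names of the layers that
% should feed the remaining inputs of the addition layer.
for k = 2:6
    lgraph = connectLayers(lgraph, "branch" + k, "add/in" + k);
end
analyzeNetwork(lgraph)   % verify that no unconnected inputs remain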
Refer to the documentation pages linked above for more information.
Hope this helps!
