Connecting concatenation layer error

Hello everyone. I have an issue: in the following code, I can't connect the concatenationLayer ('concat') to featureAttention and temporalAttention. Would you please help?
Error Message
Caused by:
Layer 'concat': Unconnected input. Each layer input must be connected to the output of another layer.
numFeatures = size(XTrain, 2);
numClasses = numel(categories(YTrain));

% Feature-Level Attention
featureAttention = [
    fullyConnectedLayer(64, 'Name', 'fc_feature_attention')
    reluLayer('Name', 'relu_feature_attention')
    fullyConnectedLayer(1, 'Name', 'fc_feature_weights')
    softmaxLayer('Name', 'feature_attention_weights')
];

% Temporal Attention (not used for Iris dataset, but included for completeness)
temporalAttention = [
    fullyConnectedLayer(numFeatures, 'Name', 'input_sequence')
    lstmLayer(64, 'OutputMode', 'sequence', 'Name', 'lstm_temporal_attention')
    fullyConnectedLayer(1, 'Name', 'fc_temporal_weights')
    softmaxLayer('Name', 'temporal_attention_weights')
];

% Combine into Hierarchical Attention
hierarchicalAttention = [
    featureInputLayer(numFeatures, 'Name', 'input_features') % Input layer for features
    featureAttention
    temporalAttention
    concatenationLayer(1, 2, 'Name', 'concat') % Concatenate feature and temporal attention outputs
];

1 Comment

Matt J on 2 February 2025
Edited: 2 February 2025
We cannot demonstrate solutions because you do not provide numFeatures or numClasses. They are generated from XTrain and YTrain, which we do not have.
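For a self-contained reproduction, one stand-in (an assumption; the asker's actual data is unavailable) is MATLAB's built-in Fisher iris data:

load fisheriris                           % hypothetical stand-in: meas is 150x4, species has 3 classes
XTrain = meas;
YTrain = categorical(species);
numFeatures = size(XTrain, 2);            % 4
numClasses = numel(categories(YTrain));   % 3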


Accepted Answer

Matt J on 2 February 2025
Edited: 2 February 2025

Use connectLayers to make your connections programmatically, or make them manually in the Deep Network Designer app (deepNetworkDesigner).
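For example, a minimal sketch of the connectLayers route using the layer arrays from the question (this shows the wiring only; whether the LSTM branch is appropriate for non-sequence feature input is a separate question):

% Add each piece separately so nothing is auto-connected, then wire it up
lgraph = layerGraph();
lgraph = addLayers(lgraph, featureInputLayer(numFeatures, 'Name', 'input_features'));
lgraph = addLayers(lgraph, featureAttention);
lgraph = addLayers(lgraph, temporalAttention);
lgraph = addLayers(lgraph, concatenationLayer(1, 2, 'Name', 'concat'));

% Route the input into both branches, and both branch outputs into concat
lgraph = connectLayers(lgraph, 'input_features', 'fc_feature_attention');
lgraph = connectLayers(lgraph, 'input_features', 'input_sequence');
lgraph = connectLayers(lgraph, 'feature_attention_weights', 'concat/in1');
lgraph = connectLayers(lgraph, 'temporal_attention_weights', 'concat/in2');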

2 Comments

Matt J on 2 February 2025
Edited: 3 February 2025
If you have R2024a or later, you can simplify things a bit with networkLayer:
% Feature-Level Attention Block (Encapsulated)
featureAttention = networkLayer([
    fullyConnectedLayer(64, 'Name', 'fc_feature_attention')
    reluLayer('Name', 'relu_feature_attention')
    fullyConnectedLayer(1, 'Name', 'fc_feature_weights')
    softmaxLayer('Name', 'feature_attention_weights')
    ], 'Name', 'feature_attention_block');

% Temporal Attention Block (Encapsulated)
temporalAttention = networkLayer([
    fullyConnectedLayer(numFeatures, 'Name', 'input_sequence')
    lstmLayer(64, 'OutputMode', 'sequence', 'Name', 'lstm_temporal_attention')
    fullyConnectedLayer(1, 'Name', 'fc_temporal_weights')
    softmaxLayer('Name', 'temporal_attention_weights')
    ], 'Name', 'temporal_attention_block');
% Add the pieces to an empty layerGraph so nothing is connected yet
% (layerGraph(layers) would connect a layer array in series)
hierarchicalAttention = layerGraph();
hierarchicalAttention = addLayers(hierarchicalAttention, featureInputLayer(numFeatures, 'Name', 'input_features'));
hierarchicalAttention = addLayers(hierarchicalAttention, featureAttention);
hierarchicalAttention = addLayers(hierarchicalAttention, temporalAttention);
hierarchicalAttention = addLayers(hierarchicalAttention, concatenationLayer(1, 2, 'Name', 'concat_attention'));

% Connect the layers
hierarchicalAttention = connectLayers(hierarchicalAttention, 'input_features', 'feature_attention_block');
hierarchicalAttention = connectLayers(hierarchicalAttention, 'input_features', 'temporal_attention_block');
hierarchicalAttention = connectLayers(hierarchicalAttention, 'feature_attention_block', 'concat_attention/in1');
hierarchicalAttention = connectLayers(hierarchicalAttention, 'temporal_attention_block', 'concat_attention/in2');
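A quick way to verify the wiring (not part of the original answer) is to plot the assembled graph:

% Sanity check: both blocks should branch from 'input_features'
% and meet at 'concat_attention'
figure
plot(hierarchicalAttention)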
Fatih on 5 February 2025
Thanks a lot. It worked; it seems I was just going a bit blind. It's solved now that I've checked it thoroughly.


More Answers (0)
