Connecting concatenationLayer error
Hello everyone. I have an issue: in the following code, I can't connect the concatenationLayer 'concat' to featureAttention and temporalAttention. Could you please help?
Error Message
Caused by:
Layer 'concat': Unconnected input. Each layer input must be connected to the output of another layer.
numFeatures = size(XTrain, 2);
numClasses = numel(categories(YTrain));
% Feature-Level Attention
featureAttention = [
    fullyConnectedLayer(64, 'Name', 'fc_feature_attention')
    reluLayer('Name', 'relu_feature_attention')
    fullyConnectedLayer(1, 'Name', 'fc_feature_weights')
    softmaxLayer('Name', 'feature_attention_weights')
];
% Temporal Attention (not used for the Iris dataset, but included for completeness)
temporalAttention = [
    fullyConnectedLayer(numFeatures, 'Name', 'input_sequence')
    lstmLayer(64, 'OutputMode', 'sequence', 'Name', 'lstm_temporal_attention')
    fullyConnectedLayer(1, 'Name', 'fc_temporal_weights')
    softmaxLayer('Name', 'temporal_attention_weights')
];
% Combine into Hierarchical Attention
hierarchicalAttention = [
    featureInputLayer(numFeatures, 'Name', 'input_features') % Input layer for features
    featureAttention
    temporalAttention
    concatenationLayer(1, 2, 'Name', 'concat') % Concatenate feature and temporal attention outputs
];
Accepted Answer
Matt J on 2 Feb 2025
Edited: Matt J on 2 Feb 2025
Use connectLayers to make your connections programmatically, or make the connections manually in deepNetworkDesigner.
2 comments
Matt J on 2 Feb 2025
Edited: Matt J on 3 Feb 2025
% Feature-Level Attention Block (Encapsulated)
featureAttention = networkLayer([
    fullyConnectedLayer(64, 'Name', 'fc_feature_attention')
    reluLayer('Name', 'relu_feature_attention')
    fullyConnectedLayer(1, 'Name', 'fc_feature_weights')
    softmaxLayer('Name', 'feature_attention_weights')
], 'Name', 'feature_attention_block');
% Temporal Attention Block (Encapsulated)
temporalAttention = networkLayer([
    fullyConnectedLayer(numFeatures, 'Name', 'input_sequence')
    lstmLayer(64, 'OutputMode', 'sequence', 'Name', 'lstm_temporal_attention')
    fullyConnectedLayer(1, 'Name', 'fc_temporal_weights')
    softmaxLayer('Name', 'temporal_attention_weights')
], 'Name', 'temporal_attention_block');
% Add the layers to an empty layerGraph so that NO connections are made yet.
% (Passing a layer array to layerGraph would connect the layers sequentially,
% which would occupy the concatenation layer's inputs.)
hierarchicalAttention = layerGraph();
hierarchicalAttention = addLayers(hierarchicalAttention, featureInputLayer(numFeatures, 'Name', 'input_features'));
hierarchicalAttention = addLayers(hierarchicalAttention, featureAttention);
hierarchicalAttention = addLayers(hierarchicalAttention, temporalAttention);
hierarchicalAttention = addLayers(hierarchicalAttention, concatenationLayer(1, 2, 'Name', 'concat_attention'));
% Connect the layers (destination names must match the layer names above)
hierarchicalAttention = connectLayers(hierarchicalAttention, 'input_features', 'feature_attention_block');
hierarchicalAttention = connectLayers(hierarchicalAttention, 'input_features', 'temporal_attention_block');
hierarchicalAttention = connectLayers(hierarchicalAttention, 'feature_attention_block', 'concat_attention/in1');
hierarchicalAttention = connectLayers(hierarchicalAttention, 'temporal_attention_block', 'concat_attention/in2');
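Once the connections are made, it may help to validate the assembled graph. A minimal sketch (assuming R2024a or later, since networkLayer was introduced then):

```matlab
% Converting to a dlnetwork errors if any layer input is still unconnected,
% so it doubles as a check against the original "Unconnected input" error.
net = dlnetwork(hierarchicalAttention);

% Optional: inspect the connections visually in the network analyzer.
analyzeNetwork(net)
```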
More Answers (0)