Implement flatten layer in CNN

Ali Al-Saegh on 5 September 2020
Answered: Aniket on 23 September 2024
How can I implement a flatten layer in a CNN, i.e. transform the 2D feature maps output by a convolution layer into a 1D vector?
5 comments
Ali Al-Saegh on 6 September 2020
I was thinking about that, but it actually confuses me, because a fully connected layer just connects each entry of a feature map to all of the neurons in the next layer.
I still cannot picture how a fully connected layer flattens a feature map!
Thanks for your help!
Jianyu Zhao on 3 November 2021
Hi friend, I ran into this problem too. I tried flattenLayer, but I found it is intended for sequence data. Have you solved this, or do you have any ideas? I didn't know how to do it, so I just switched to PyTorch instead :(


Answers (1)

Aniket on 23 September 2024
I understand from the comments that you're facing issues visualizing how a fully connected (FC) layer flattens a feature map. In a typical CNN architecture, convolutions are used to extract features from the input data. After the convolutional layers, the feature map is flattened and passed to fully connected layers for further processing.
For example, if your feature map after the last convolutional layer looks like this:
[ [a, b],
[c, d] ]
After flattening, it will look like this:
[a, b, c, d]
Consider the below code:
% Define the layers of the network
layers = [
    imageInputLayer([28 28 1], 'Name', 'input') % Input layer for 28x28 grayscale images
    convolution2dLayer(3, 8, 'Padding', 'same', 'Name', 'conv1') % 3x3 convolution with 8 filters
    batchNormalizationLayer('Name', 'batchnorm1')
    reluLayer('Name', 'relu1')
    maxPooling2dLayer(2, 'Stride', 2, 'Name', 'maxpool1') % Max pooling layer
    convolution2dLayer(3, 16, 'Padding', 'same', 'Name', 'conv2') % Another convolution layer
    batchNormalizationLayer('Name', 'batchnorm2')
    reluLayer('Name', 'relu2')
    maxPooling2dLayer(2, 'Stride', 2, 'Name', 'maxpool2') % Another max pooling layer
    flattenLayer('Name', 'flatten') % Flatten layer to convert 2D feature maps to a 1D vector
    % fullyConnectedLayer(10, 'Name', 'fc') % Fully connected layer with 10 neurons
    % softmaxLayer('Name', 'softmax') % Softmax layer for classification
    % classificationLayer('Name', 'output') % Classification layer
    ];
% Analyze the network
analyzeNetwork(layers);
After running this code, analyzeNetwork shows that the flatten layer outputs a 784x1 vector (7 x 7 x 16 = 784, since two 2x2 poolings reduce the 28x28 input to 7x7 with 16 channels).
But if you do not use a flatten layer and connect an FC layer directly, MATLAB automatically flattens the output in the background (the FC layer's Weights matrix is 10 x 784) and uses it for further processing.
Each entry of this implicitly flattened vector is connected to every neuron in the FC layer. Therefore, there is no need to add an explicit flatten layer unless you have a specific reason to do so.
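As a sketch of that alternative (assuming the Deep Learning Toolbox and the same layer sizes as the example above), you can drop the flatten layer and uncomment the classification head, letting the FC layer flatten its 7x7x16 input implicitly:

```matlab
% Same network, but the FC layer does the flattening implicitly.
layers = [
    imageInputLayer([28 28 1], 'Name', 'input')
    convolution2dLayer(3, 8, 'Padding', 'same', 'Name', 'conv1')
    reluLayer('Name', 'relu1')
    maxPooling2dLayer(2, 'Stride', 2, 'Name', 'maxpool1')
    convolution2dLayer(3, 16, 'Padding', 'same', 'Name', 'conv2')
    reluLayer('Name', 'relu2')
    maxPooling2dLayer(2, 'Stride', 2, 'Name', 'maxpool2')
    fullyConnectedLayer(10, 'Name', 'fc') % 7x7x16 input is flattened here; Weights is 10 x 784
    softmaxLayer('Name', 'softmax')
    classificationLayer('Name', 'output')
    ];
analyzeNetwork(layers); % inspect the fc layer to see the 10 x 784 Weights
```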
Refer to the flattenLayer documentation for further details.
If you want to transform a 2D feature map into a 1D vector manually, you can use the reshape function like this:
% Example 2D feature map
featureMap2D = rand(4, 4); % A 4x4 matrix as an example
% Convert the 2D feature map to a 1D column vector
% (reshape reads the matrix in column-major order)
featureMap1D = reshape(featureMap2D, [], 1);
% Display the original and reshaped feature maps
disp('Original 2D Feature Map:');
disp(featureMap2D);
disp('Reshaped 1D Feature Map:');
disp(featureMap1D);
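One caveat worth noting: MATLAB's reshape reads arrays in column-major order, so [a b; c d] flattens to [a; c; b; d], not [a; b; c; d] as in the conceptual example above. If you want row-major order, transpose first. A minimal sketch:

```matlab
% Column-major vs row-major flattening of [a b; c d]
featureMap2D = [1 2; 3 4];

colMajor = reshape(featureMap2D, [], 1);  % stacks columns: [1; 3; 2; 4]
rowMajor = reshape(featureMap2D', [], 1); % transpose first, then stack: [1; 2; 3; 4]

disp(colMajor);
disp(rowMajor);
```

For training a network this usually does not matter, as long as the ordering is consistent between training and inference.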
Refer to the reshape documentation to learn more about this function.
I hope this helps you get on the right track.
