Are there any options to resize/replicate the matrices/vectors between layers of a deep network?
In a deep learning network, I have two branches operating on the same input layer. One branch ends in a fully connected layer whose output is 1×1×N. The other branch ends in a convolutional layer that produces a two-dimensional feature map of size P×Q×S. To proceed with further convolutions on the combined result, I need to replicate the N-dimensional vector across the spatial dimensions to form a P×Q×N array, so that concatenating the two branches gives a P×Q×(N+S) array. Is there a way to replicate a vector into a matrix between deep network layers, analogous to the repmat() function?
In other words, is there any way to concatenate two layers of different width and height by bringing them to a common size inside a deep network?
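For concreteness, this is the operation I am after, written in plain MATLAB outside of any network (the sizes P, Q, N, S below are placeholders):

% Placeholder sizes for illustration only
P = 8; Q = 8; N = 16; S = 32;
vec  = rand(1, 1, N);              % output of the fully connected branch
feat = rand(P, Q, S);              % output of the convolutional branch

rep      = repmat(vec, P, Q, 1);   % replicate the vector to P-by-Q-by-N
combined = cat(3, feat, rep);      % concatenate channels: P-by-Q-by-(S+N)
size(combined)                     % returns [P Q S+N]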
Answers (1)
Delprat Sebastien
25 Jan 2020
I wrote a custom reshape layer for that purpose. Read the custom layer documentation; it is very simple. There is, however, one big limitation: custom layers cannot change the dlarray format. That means you need a convolution layer between the fully connected layer (whose output format is 'SB') and your reshape layer. The convolution layer outputs 'SSCB', so you can then reshape it.
Source: MathWorks Support
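A minimal sketch of such a custom layer, assuming the fully connected branch arrives as a 1-by-1-by-N(-by-batch) array after the intermediate convolution layer. The class name, properties, and target sizes are illustrative and not part of the original answer, and the exact input layout seen by custom layers can vary by release:

classdef replicateLayer < nnet.layer.Layer
    % Hypothetical custom layer that tiles a 1-by-1-by-C input to
    % TargetHeight-by-TargetWidth-by-C so it can be concatenated with a
    % convolutional feature map of the same spatial size.
    properties
        TargetHeight   % P
        TargetWidth    % Q
    end
    methods
        function layer = replicateLayer(p, q, name)
            layer.Name = name;
            layer.TargetHeight = p;
            layer.TargetWidth  = q;
        end
        function Z = predict(layer, X)
            % X is expected as 1-by-1-by-C(-by-batch); repmat tiles only
            % the two spatial dimensions and leaves the rest untouched.
            Z = repmat(X, layer.TargetHeight, layer.TargetWidth, 1, 1);
        end
    end
end

The tiled output and the convolutional branch could then be joined along the channel dimension, for example with depthConcatenationLayer(2, 'Name', 'concat').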