In R2019b I implemented a simple reshape custom layer for Deep Learning Toolbox, which I'd like to use with a dlnetwork.
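For context, here is a minimal sketch of what such a reshape layer might look like (this is a hypothetical illustration, not my exact code; the class name, OutputSize property, and constructor signature are assumptions):

```matlab
classdef reshapeLayer < nnet.layer.Layer
    % Sketch of a custom layer that reshapes its input to a target size,
    % keeping the last (batch) dimension intact.
    properties
        OutputSize   % target spatial/channel size, excluding the batch dim
    end
    methods
        function layer = reshapeLayer(outputSize, name)
            layer.Name = name;
            layer.OutputSize = outputSize;
        end
        function Z = predict(layer, X)
            % Reshape, preserving the trailing batch dimension.
            % If OutputSize has more entries than X has leading dims,
            % the output gains dimensions relative to the input.
            Z = reshape(X, [layer.OutputSize size(X, ndims(X))]);
        end
    end
end
```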
The network passes analyzeNetwork(), but fails with:
Error using dlarray (line 179)
Number of dimension labels must be greater than or equal to the number of dimensions in the input array.
Error in nnet.internal.cnn.layer.util.CustomLayerFunctionalStrategy>@(zi)dlarray(zi,Xdims)
After some digging, it turns out the problem is in R2019b\toolbox\nnet\cnn\+nnet\+internal\+cnn\+layer\+util\CustomLayerFunctionalStrategy.m, in the predict() and forward() functions. There, the dimension labels of the input X are captured via
[expectedType, Xdims, X] = processInput(this, X);
and then those same labels are reapplied to the output of the layer in
Z = cellfun(@(zi) dlarray(zi, Xdims), Z, 'UniformOutput', false);
When the custom layer increases the number of dimensions of the output (as reshape() can do), the error occurs, because the captured labels no longer cover all output dimensions.
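The failure can be reproduced in isolation: dlarray requires at least as many dimension labels as the array has dimensions, which is exactly the check that fires in the error message above (the sizes below are illustrative):

```matlab
% Three labels for a 3-D array: accepted.
z = dlarray(rand(8, 8, 4), 'SSC');

% Three labels for a 4-D array: errors with
% "Number of dimension labels must be greater than or equal to
%  the number of dimensions in the input array."
% This is what happens when Xdims was captured from a lower-dimensional
% input and reapplied to a higher-dimensional layer output.
z = dlarray(rand(8, 8, 4, 2), 'SSC');
```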
Please report this bug/deficiency to MathWorks and suggest a workaround.
PS: This behaviour may exist in other places too.
PPS: Note that both predict() and forward() expect the layer output to be an unlabeled dlarray, and verify this through calls to
this.LayerVerifier.verifyUnlabeledDlarray( 'predict', layer.OutputNames, Z )
this.LayerVerifier.verifyUnlabeledDlarray( 'forward', layer.OutputNames, Z )
so I'm not sure how the extra labels could be supplied by the layer itself.