How can I fix the format mismatch error when converting a multiplication layer from LayerGraph to dlnetwork in the Prune and Quantize Network example?
I am having problems following MATLAB's Prune and Quantize Semantic Segmentation Network example at https://www.mathworks.com/help/deeplearning/ug/prune-and-quantize-semantic-segmentation-network.html
My custom LayerGraph network has a multiplication layer whose two inputs are 64(S) x 64(S) x 64(C) x 1(B) and 64(S) x 1(B). When calling the dlnetwork function in the Prune section, I received an error message about an input format mismatch for this layer:
"Layer 'layer': For function layers with Formattable set to false, all inputs must have the same format. To enable support for multiple inputs with different formats, use a function layer with Formattable set to true."
I tried setting the Formattable property to true and the conversion worked. Here is the prediction function for my custom multiplication layer:
function output = myPredictionFunction(input1, input2)
% Element-wise multiplication with implicit expansion of the smaller input
output = input1 .* input2;
end
Here is the code I use to insert it into the network:
multiplication = functionLayer(@myPredictionFunction ,'Formattable', true);
lgraph = replaceLayer(lgraph,'layer',multiplication,'ReconnectBy','name');
trainedNet = dlnetwork(lgraph);
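For reference, a quick way to sanity-check the converted dlnetwork is to run a forward pass with a formatted dlarray. This is only a minimal sketch; the single 64x64x64 image input assumed below is illustrative and should be replaced with the network's real input size.
% Sketch only: sanity-check the converted dlnetwork with a formatted dlarray.
% The 64x64x64x1 input size is an assumption; use your network's actual input size.
X = dlarray(rand(64,64,64,1,'single'),'SSCB');
Y = predict(trainedNet,X);
size(Y)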
But when using the calibrate() function in the Quantize section,
eqNet = equalizeLayers(prunedNet);
quantizableNet = dlquantizer(prunedNet,ExecutionEnvironment="GPU");
calibrate(quantizableNet,cdsTrain,MiniBatchSize=8);
I got the error:
"Error using dlquantizer/calibrate. For code generation of layer 'layer', 'Formattable' must be false."
I tried setting Formattable back to false and adjusting both inputs of the multiplication layer to 64(S) x 64(S) x 64(C) x 1(B), but the format mismatch error still persists when calling the dlnetwork() function.
Here is the layer I use to adjust the format of the multiplication layer's input:
classdef RepeatValuesLayer < nnet.layer.Layer
    % Custom layer to repeat values along the spatial dimensions

    properties
    end

    methods
        function layer = RepeatValuesLayer(name)
            % Construct a RepeatValuesLayer with the given name
            layer.Name = name;
            layer.Description = "Repeat values along spatial dimensions";
        end

        function Z = predict(~, X)
            % Tile the input 64 times along each of the two spatial dimensions
            Z = repmat(X, [64, 64, 1]);
        end
    end
end
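A minimal sketch of how such a layer could be wired in ahead of the second multiplication input is shown below. The source and destination names 'fc_out' and 'layer/in2' are placeholders for the actual layer names in the graph.
% Sketch only: insert the repeat layer in front of the multiplication's second input.
% 'fc_out' and 'layer/in2' are placeholder names for illustration.
lgraph = addLayers(lgraph, RepeatValuesLayer('repeatValues'));
lgraph = disconnectLayers(lgraph, 'fc_out', 'layer/in2');
lgraph = connectLayers(lgraph, 'fc_out', 'repeatValues');
lgraph = connectLayers(lgraph, 'repeatValues', 'layer/in2');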
Here is the result in the LayerGraph. I don't know what to do to fix this issue.
Answers (2)
Joss Knight
29 March 2024
That's an annoying limitation which is hopefully fixed in current releases. Is there any particular reason why you can't use multiplicationLayer? I presume it's the dimension expansion.
Joss Knight
29 March 2024
For one solution, replace the fully connected layers with convolution layers whose filter size matches the size of their input and whose number of filters equals the number of fully connected outputs. So I think 1x1x64x4 and 1x1x4x64 in your case: basically one 2-D convolution of size 1 with 4 filters and one with 64 filters. That should ensure everything has the same dimensions at the multiplication, and you can use multiplicationLayer too.
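A rough sketch of that suggestion for a squeeze-and-excitation-style scaling branch follows; the layer names, the 'features' source layer, and the 4-channel bottleneck are assumptions based on the sizes mentioned above, not the asker's actual architecture.
% Sketch only: an SE-style branch built from 1-by-1 convolutions instead of
% fully connected layers, so every tensor stays in SSCB format.
% All layer names below, including 'features', are illustrative placeholders.
seBranch = [
    globalAveragePooling2dLayer('Name','se_gap')   % 64x64x64 -> 1x1x64
    convolution2dLayer(1,4,'Name','se_conv1')      % replaces fullyConnectedLayer(4)
    reluLayer('Name','se_relu')
    convolution2dLayer(1,64,'Name','se_conv2')     % replaces fullyConnectedLayer(64)
    sigmoidLayer('Name','se_sigmoid')];
mul = multiplicationLayer(2,'Name','se_multiply'); % replaces the custom function layer
lgraph = addLayers(lgraph, seBranch);
lgraph = addLayers(lgraph, mul);
lgraph = connectLayers(lgraph,'features','se_gap');
lgraph = connectLayers(lgraph,'features','se_multiply/in1');
lgraph = connectLayers(lgraph,'se_sigmoid','se_multiply/in2');
Since both branches now carry the same SSCB format, the format mismatch at dlnetwork conversion should disappear; if your release requires equal (non-broadcast) input sizes at multiplicationLayer, keep an explicit expansion step such as the repeat layer before it.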