Folding batch normalization into preceding convolution

Kai Tan on 17 Feb 2019
I am trying to port a trained MATLAB CNN model to another framework. I would like to get rid of the batch normalization (BN) layers by folding their parameters into the preceding convolution layers. I use the following formulation:
% layer is the BatchNorm layer; w and b are the Weights and Bias of the
% preceding convolution layer
m = layer.TrainedMean;
v = layer.TrainedVariance;
offset = layer.Offset;
scale = layer.Scale;
ep = layer.Epsilon;
% adjust the weights and biases, one output channel at a time
new_w = zeros(size(w), 'like', w);
new_b = zeros(size(b), 'like', b);
for j = 1:size(scale, 3) % the number of output channels
    denom = sqrt(v(1,1,j) + ep);
    % fold the BN scaling into the 4-D convolution weights
    new_w(:,:,:,j) = scale(1,1,j)*w(:,:,:,j)/denom;
    % fold the BN shift into the convolution biases
    new_b(1,1,j) = scale(1,1,j)*(b(1,1,j) - m(1,1,j))/denom + offset(1,1,j);
end
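As a sanity check of the algebra alone, the fold should be exact for a single output pixel. Here is a minimal standalone sketch with made-up placeholder numbers (not values from the actual network):
% Standalone check: BN(conv(x,w) + b) == conv(x, new_w) + new_b for one
% output pixel of one channel. All numbers below are random placeholders.
rng(0);
x = rand(3,3,3);                 % one 3x3 input patch, 3 input channels
w = rand(3,3,3);                 % 3x3 filter for a single output channel
b = rand;                        % convolution bias
m = rand;  v = rand;             % BN trained mean / variance
scale = rand;  offset = rand;  ep = 1e-5;

y_conv   = sum(x(:) .* w(:)) + b;                        % conv output (one pixel)
y_bn     = scale * (y_conv - m) / sqrt(v + ep) + offset; % conv followed by BN

new_w    = scale * w / sqrt(v + ep);                     % folded weights
new_b    = scale * (b - m) / sqrt(v + ep) + offset;      % folded bias
y_folded = sum(x(:) .* new_w(:)) + new_b;                % folded conv output

abs(y_bn - y_folded)             % should be on the order of eps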
However, when I compare the outputs of the BN layer with the outputs of the convolution layer into which the BN parameters were folded, I get different results.
[Figure: left, outputs of the BN layer; right, outputs of the convolution layer with the BN parameters folded in.]
Has anyone successfully done this?
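For reference, one way to package the folded parameters back into a MATLAB convolution layer (for example, to rebuild and verify the network before exporting) might look roughly like the sketch below; the layer index k is an assumption, and the stride/padding are simply copied from the original layer:
% Sketch: wrap the folded parameters in a fresh convolution2dLayer.
% k (index of the original convolution layer in net.Layers) is assumed.
convLayer = net.Layers(k);                      % original convolution layer
folded = convolution2dLayer([size(new_w,1) size(new_w,2)], size(new_w,4), ...
    'Name', [convLayer.Name '_folded'], ...
    'Stride', convLayer.Stride, ...
    'Padding', convLayer.PaddingSize);
folded.Weights = new_w;                         % folded weights from the loop above
folded.Bias    = new_b;                         % folded biases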

Answers (0)

Release: R2018a