Forward function with frozen batch normalization layers
In my application I have both batch normalization and dropout layers, and I would like to perform MC dropout using the forward function. Ideally I would freeze the TrainedMean and TrainedVariance parameters of the batch normalization layers, but I cannot figure out whether this is possible. In my network the batch normalization layers come after the convolution layers, and the dropout layer comes after the recurrent layer. Thank you in advance.
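
One possible workaround, sketched below purely as an assumption-laden example, is to replace each batchNormalizationLayer with a functionLayer (available in R2021b and later) that applies the same normalization as a fixed transform built from the learned Scale, Offset, TrainedMean and TrainedVariance. Because the replacement layer no longer depends on batch statistics, calling forward on the resulting dlnetwork only re-enables dropout, which is what MC dropout needs (predict would disable dropout as well). The names lgraph, dlX and numSamples are illustrative; the sketch assumes the trained statistics are available on the layer objects, while a dlnetwork trained in a custom loop may instead keep them in its State table.

% Sketch (under the assumptions above): freeze batch normalization by
% replacing each batchNormalizationLayer in "lgraph" (a layerGraph with
% the trained parameters populated) with a fixed affine functionLayer.
layers = lgraph.Layers;
for i = 1:numel(layers)
    L = layers(i);
    if isa(L, 'nnet.cnn.layer.BatchNormalizationLayer')
        mu     = L.TrainedMean;
        sigma2 = L.TrainedVariance;
        gamma  = L.Scale;
        beta   = L.Offset;
        ep     = L.Epsilon;

        % Fixed transform equivalent to inference-mode batch normalization.
        frozenBN = functionLayer( ...
            @(X) (X - mu) ./ sqrt(sigma2 + ep) .* gamma + beta, ...
            'Name', L.Name, 'Formattable', true);

        lgraph = replaceLayer(lgraph, L.Name, frozenBN);
    end
end

frozenNet = dlnetwork(lgraph);

% MC dropout: repeated stochastic forward passes. forward keeps dropout
% active, while the replaced layers use only the frozen statistics.
numSamples = 20;                       % illustrative number of MC samples
Y = cell(numSamples, 1);
for k = 1:numSamples
    Y{k} = forward(frozenNet, dlX);    % dlX: formatted dlarray input
end

The element-wise expression relies on TrainedMean, TrainedVariance, Scale and Offset broadcasting along the channel dimension of the formatted activations, which holds for batch normalization after 2-D convolution layers as described in the question.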
