BatchNormalization layer with DropOut layer issue
I'm having issues with the BatchNormalization layer when training deep learning models (UNET, SegNet), in both their 2D and 3D variants.
This layer is consistently the cause of much lower validation accuracy and a large jump in error values at the end of training, which leaves me unable to predict with the model. If I try to load a certain checkpoint, some values needed to use it are missing (the mean, for example).
Is there a way to use both DropOut and BatchNormalization layers in the same model without running into this issue? I'm using MATLAB R2020a; is there perhaps a fix in later versions?
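For context, a minimal sketch of the kind of layer arrangement in question. The layer sizes here are hypothetical, not taken from the poster's network; the ordering reflects the commonly suggested pattern of placing `dropoutLayer` after the `batchNormalizationLayer`/`reluLayer` pair, so the batch statistics are estimated on un-dropped activations. All layer functions are standard Deep Learning Toolbox calls available in R2020a:

```matlab
% Hypothetical 2D block illustrating one way to combine both layers:
% dropout is applied AFTER batch norm + ReLU, not before the normalization.
layers = [
    imageInputLayer([64 64 1])
    convolution2dLayer(3, 16, 'Padding', 'same')
    batchNormalizationLayer          % normalize first...
    reluLayer
    dropoutLayer(0.2)                % ...then drop the normalized activations
    convolution2dLayer(3, 16, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    fullyConnectedLayer(2)
    softmaxLayer
    classificationLayer
];
```

On the missing checkpoint values: `batchNormalizationLayer` stores its running statistics in the `TrainedMean` and `TrainedVariance` properties, and these are only finalized when training completes, which may explain (this is an assumption, not a confirmed diagnosis) why a mid-training checkpoint lacks a usable mean.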
Answers (0)