Overinflated mini-batch Accuracy and Validation Accuracy when training Faster-RCNN
Hi. Can anyone offer any reasoning as to why, when I train a Faster R-CNN detector using transfer learning (e.g. ResNet-18/ResNet-50), the mini-batch accuracy and validation accuracy jump to ~99% immediately after the first 50 iterations, yet when I review the results the network's performance doesn't reflect the mini-batch or validation accuracy (it's always much worse)? I've tried: 1) reducing the training set percentage, 2) increasing the mini-batch size, 3) increasing the validation frequency, 4) changing the max epochs, 5) experimenting with different anchor box sizes. No matter what I try, the training progress, which looks good, doesn't match the final results.
The top figure is what I always see, the middle one is the precision-recall curve, and the last figure is an example of the options I was using, but I've changed a lot of these parameters and don't see a difference.
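For context, a minimal sketch of the kind of training and evaluation flow described above (not the poster's actual code): trainTbl, valDS and testTbl are hypothetical names for a ground-truth table, validation data and a held-out test table, and the option values are placeholders rather than the settings shown in the figures.

% Sketch of a Faster R-CNN transfer-learning run on a ground-truth table
% whose first column is imageFilename and remaining columns are boxes.
options = trainingOptions('sgdm', ...
    'MiniBatchSize', 2, ...               % illustrative value only
    'InitialLearnRate', 1e-3, ...         % illustrative value only
    'MaxEpochs', 10, ...
    'ValidationData', valDS, ...
    'ValidationFrequency', 50, ...
    'Shuffle', 'every-epoch', ...
    'Plots', 'training-progress');

detector = trainFasterRCNNObjectDetector(trainTbl, 'resnet50', options);

% What matters is performance on held-out images, not the mini-batch or
% validation accuracy printed during training.
results = detect(detector, imageDatastore(testTbl.imageFilename));
[ap, recall, precision] = evaluateDetectionPrecision(results, testTbl(:, 2:end));

The average precision from evaluateDetectionPrecision on unseen test images is the number to compare against the training-progress plot; a large gap between the two is the symptom discussed in the answer below.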
Answers (1)
Prince Kumar
7 April 2022
Hi,
This generally happens when your model is memorizing the data instead of learning the underlying pattern. This scenario is called 'overfitting'.
A few things can be tried:
- Lower your learning rate; it is likely too high.
- Use a regularization technique (for example, L2 regularization or dropout).
- Make sure each set has sufficient samples, e.g. a 60/20/20 or 70/15/15 split for the training, validation and test sets respectively.
- Perform k-fold cross-validation.
- Randomly shuffle the data before doing the split; this helps ensure the data distribution is nearly the same across the sets. If your data is in a datastore you can use the 'shuffle' function, otherwise you can use the 'randperm' function (see the sketch after this list).
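As a rough illustration of these suggestions, here is a sketch assuming the ground truth is a table gTruth (a hypothetical name) with an image-filename column followed by box columns; for datastore-based data you would call shuffle on the datastore instead of using randperm, and all option values are illustrative only.

% Shuffle first so train/validation/test share the same distribution.
rng(0);                                   % fixed seed for a reproducible split
n      = height(gTruth);
idx    = randperm(n);
nTrain = round(0.70*n);
nVal   = round(0.15*n);

trainTbl = gTruth(idx(1:nTrain), :);
valTbl   = gTruth(idx(nTrain+1:nTrain+nVal), :);
testTbl  = gTruth(idx(nTrain+nVal+1:end), :);

% Lower the learning rate and add L2 regularization to curb overfitting.
options = trainingOptions('sgdm', ...
    'InitialLearnRate', 1e-4, ...         % reduced from the sgdm default of 0.01
    'L2Regularization', 1e-3, ...         % illustrative value only
    'MiniBatchSize', 4, ...
    'MaxEpochs', 10, ...
    'Shuffle', 'every-epoch', ...
    'ValidationFrequency', 50);

The split percentages and regularization strength are starting points; the key point is that the three sets are drawn from a shuffled pool so that validation and test performance are meaningful.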
Hope this helps!