Evaluate classification/regression performance against noisy annotation
Is there a way to evaluate classification/regression performance while accounting for noisy annotations?
Let's say I have a cat/dog detector and 1000 cat/dog test images. The 1000 images are human-annotated, so annotation errors are likely: some bounding boxes may be incorrect and some class labels may be wrong. Simply drawing a confusion matrix or computing an IoU only compares the detector's predictions against the noisy annotations, and I don't think that is appropriate.
So my questions are
- What's the appropriate way to estimate the bounding-box and class-label error margins in the data set, given that it's not feasible to review every annotation?
- How to incorporate the above annotation error when reporting the performance of the cat/dog detector?
Thanks.
Answers (1)
Himanshu
9 Aug 2024
Hi,
I see that you are trying to evaluate classification/regression performance while accounting for noisy annotations.
To estimate the bounding box (bbox) and class error margin, start by manually verifying a small, randomly selected subset of the data to determine the error rate. Additionally, employ statistical methods like bootstrapping to estimate error margins from this subset.
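As a rough illustration of this sampling-plus-bootstrap step, here is a minimal MATLAB sketch using datasample and bootstrp. The audit results (labelWrong, bboxWrong) are hypothetical placeholders; in practice you would replace them with the outcome of your manual re-check of the sampled images.

```matlab
% Sketch: estimate annotation error rates from a manually audited random subset.
rng(0);                                              % for reproducibility
N        = 1000;                                     % total test images
auditIdx = datasample(1:N, 100, 'Replace', false);   % 100 images to re-check by hand

% Outcome of the manual audit (1 = annotation error, 0 = annotation OK).
% These are random placeholders; substitute your real audit results.
labelWrong = rand(100,1) < 0.05;    % hypothetical ~5% class-label errors
bboxWrong  = rand(100,1) < 0.10;    % hypothetical ~10% bounding-box errors

% Bootstrap the error rates and take percentile confidence intervals.
bLabel  = bootstrp(2000, @mean, labelWrong);
bBbox   = bootstrp(2000, @mean, bboxWrong);
ciLabel = prctile(bLabel, [2.5 97.5]);
ciBbox  = prctile(bBbox,  [2.5 97.5]);

fprintf('Estimated label error rate: %.3f (95%% CI [%.3f, %.3f])\n', ...
        mean(labelWrong), ciLabel(1), ciLabel(2));
fprintf('Estimated bbox  error rate: %.3f (95%% CI [%.3f, %.3f])\n', ...
        mean(bboxWrong), ciBbox(1), ciBbox(2));
```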
When reporting performance, adjust your metrics to account for the estimated annotation noise and include confidence intervals to reflect the uncertainty due to these errors. You can apply noise-robust algorithms to mitigate the impact of annotation noise on performance evaluation.
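One simple way to fold the estimated label noise into the reported numbers is sketched below. It assumes a binary task with symmetric label noise that is independent of the detector's own mistakes, in which case accObserved ≈ accTrue*(1-e) + (1-accTrue)*e, so accTrue can be recovered as (accObserved - e)/(1 - 2*e). All numeric values are placeholders.

```matlab
% Sketch: report a noise-adjusted accuracy alongside the raw figure.
accObserved = 0.90;          % accuracy measured against the noisy labels (example value)
eLabel      = 0.05;          % label error rate estimated from the audit (example value)
ciLabel     = [0.02 0.09];   % its bootstrap confidence interval (example values)

% Correction for symmetric, detector-independent label noise (binary case).
adjust = @(acc, e) (acc - e) ./ (1 - 2*e);

accAdj   = adjust(accObserved, eLabel);
accAdjCI = sort(adjust(accObserved, ciLabel));   % propagate the CI endpoints

fprintf('Accuracy vs. noisy labels: %.3f\n', accObserved);
fprintf('Noise-adjusted accuracy  : %.3f (range %.3f-%.3f over the label-noise CI)\n', ...
        accAdj, accAdjCI(1), accAdjCI(2));
```

Reporting both the raw and the adjusted figures, together with the audited error-rate confidence interval, makes the effect of annotation noise on the headline metric explicit.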
Please refer to the following documentation for more information:
- Randomly sample from data, with or without replacement: https://www.mathworks.com/help/stats/datasample.html
- Bootstrap sampling: https://www.mathworks.com/help/stats/bootstrp.html
- Label images for computer vision applications: https://www.mathworks.com/help/vision/ref/imagelabeler-app.html
I hope this helps.