Classification Learner App k-Nearest Neighbor (k-NN)

2 views (last 30 days)
Leen Al Homoud on 9 Oct 2020
Edited: Gaurav Garg on 12 Oct 2020
I was using the Classification Learner App to test k-Nearest Neighbor classifiers. My results showed that accuracy is highest when the number of neighbors is 1 and keeps decreasing as the number of neighbors is increased. These results don't make sense to me and don't match the k-NN code I wrote myself. Has anyone faced this problem? What was the reason?

Answers (1)

Gaurav Garg on 12 Oct 2020
Edited: Gaurav Garg on 12 Oct 2020
Hi,
The accuracy of any ML model (including k-NN) is determined by multiple factors, one of which is the dataset and its features.
It is possible that your dataset is well suited to k-NN with k = 1 while giving lower accuracy for other numbers of neighbors. So, I would recommend testing your dataset with other models (such as SVM or regression) as well and seeing how they behave.
Moreover, this is not necessarily a problem with the model or the dataset. Some models (with some hyperparameters) overfit the data, some underfit it, and some prove to be the right choice; that choice has to be analyzed. In particular, a 1-nearest-neighbor classifier perfectly reproduces the labels of points it has already seen, so a very small k can look deceptively good unless accuracy is measured on held-out data.
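The overfitting point above can be illustrated with a small sketch. This is plain Python (not the Classification Learner App, whose default is cross-validated accuracy), and the tiny 1-D dataset and helper names are made up for illustration: with k = 1, resubstitution accuracy is always perfect because every point is its own nearest neighbor, while leave-one-out accuracy tells a different story.

```python
from collections import Counter

def knn_predict(train, labels, x, k, exclude=None):
    """Classify x by majority vote among its k nearest training points.
    `exclude` leaves one index out, for honest (leave-one-out) evaluation."""
    dists = [(abs(x - xi), i) for i, xi in enumerate(train) if i != exclude]
    dists.sort()
    votes = Counter(labels[i] for _, i in dists[:k])
    return votes.most_common(1)[0][0]

# Tiny illustrative 1-D dataset; the point at 1.5 sits in the "wrong" cluster,
# acting as label noise.
X = [1.0, 1.2, 1.4, 1.5, 3.0, 3.2, 3.4, 3.6]
y = ['a', 'a', 'a', 'b', 'b', 'b', 'b', 'b']

for k in (1, 3):
    # Resubstitution: evaluate on the same points the model "memorized".
    resub = sum(knn_predict(X, y, xi, k) == yi
                for xi, yi in zip(X, y)) / len(X)
    # Leave-one-out: each point is classified without using itself.
    loo = sum(knn_predict(X, y, xi, k, exclude=i) == yi
              for i, (xi, yi) in enumerate(zip(X, y))) / len(X)
    print(f"k={k}: resubstitution={resub:.2f}, leave-one-out={loo:.2f}")
```

Here k = 1 scores a perfect 1.00 on resubstitution but only 0.75 under leave-one-out, while k = 3 smooths over the noisy point. If the App's cross-validated accuracy still favors k = 1, the data may genuinely have very clean, fine-grained class boundaries, which is the dataset-dependence point above.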

Categories

Find more on Statistics and Machine Learning Toolbox in Help Center and File Exchange

Products

Release: R2020a
