Principal component analysis and feature reduction

12 views (last 30 days)
Diver on 18 Oct 2015
Edited: the cyclist on 18 Oct 2015
Hi; I have a matrix composed of 35 features, and I need to reduce those features because I think many of the variables are dependent. I understood that PCA could help me do that, so using MATLAB I calculated:
[coeff,score,latent] = pca(list_of_features)
I notice " coeff" contains matrix which I understood (correct me if I'm wrong) have column with high importance on the left, and second column with less importance and so on. However, it's not clear for me which column on " coeff" relate to which column on my original " list_of_features" so that I could know which variable is more important.
Thank you

Accepted Answer

the cyclist on 18 Oct 2015
Edited: the cyclist on 18 Oct 2015
It is true that the first column of coeff is the first principal component (PC), and is "most important" in the sense that it captures the greatest possible portion of the variation.
I think your best bet is to really dig into the documentation for pca, and the examples.
In the first example,
coeff =
-0.0678 -0.6460 0.5673 0.5062
-0.6785 -0.0200 -0.5440 0.4933
0.0290 0.7553 0.4036 0.5156
0.7309 -0.1085 -0.4684 0.4844
What that means is that the first PC is calculated as -0.0678 times your 1st variable, -0.6785 times your second variable, etc. [There are some nuances with respect to normalization and de-meaning of your data. Read the documentation!]
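If it helps, here is a minimal sketch (the matrix X below is just a made-up stand-in for your list_of_features) showing that each column of score is the de-meaned data multiplied by the corresponding column of coeff, so row i of coeff holds the weights of your i-th original variable in each PC:
% Minimal sketch -- X is a made-up stand-in for your list_of_features
rng(0);                                 % reproducible random data
X = randn(100,4);                       % 100 observations, 4 variables
[coeff,score,latent] = pca(X);          % default pca centers (de-means) the data
Xc  = bsxfun(@minus, X, mean(X));       % de-meaned data
pc1 = Xc * coeff(:,1);                  % first PC as a weighted sum of the variables
max(abs(pc1 - score(:,1)))              % essentially zero, confirming the relationship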
The second column of coeff gives PC 2, and so on.
It may be that you will get a high degree of dimension reduction, with a very small number of PCs capturing the vast majority of the variation. You can check this with the output explained, which reports the percentage of variation captured by each PC.
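As a rough sketch of that check (again with X standing in for your data, and 95% as an arbitrary threshold):
[coeff,score,latent,tsquared,explained] = pca(X);   % 'explained' is % of variance per PC
cumVar = cumsum(explained);                         % cumulative % of variance explained
nKeep  = find(cumVar >= 95, 1);                     % smallest number of PCs reaching 95%
reducedData = score(:,1:nKeep);                     % data expressed in the kept PCs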

More Answers (0)
