How can I extract features from a set of matrices and vectors to be used in machine learning?
1 view (last 30 days)
Hello Friends.
I have a task where I need to train a machine learning model to predict a set of outputs from multiple inputs. My inputs are 1000 iterations of a set of 3x1 vectors, a set of 3x3 covariance matrices, and a set of scalars, while my output is just a set of scalars. I cannot use the Regression Learner app because it requires all inputs to have the same dimensions. Any idea how to unify them?
0 comments
Answers (1)
Mahesh Taparia
13 May 2020
Hi
You can vectorize each 3x3 matrix into a 9x1 vector, then append it to the rest of the features to form a d-dimensional input vector (i.e., dx1) per sample. If the dimension is too high, you can use Principal Component Analysis (PCA) to reduce it. You can then load this dataset into the Regression Learner app and train your model.
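A minimal sketch of this approach in MATLAB, assuming the 1000 iterations are stored as a 3x1000 array of vectors, a 3x3x1000 array of covariance matrices, and a 1000x1 vector of scalars (the randn data below just stands in for the real inputs; pca requires the Statistics and Machine Learning Toolbox):

% Example data layout (random values stand in for the real inputs)
N = 1000;
V = randn(3, N);      % one 3x1 vector per iteration, stored as columns
C = randn(3, 3, N);   % one 3x3 covariance matrix per iteration
s = randn(N, 1);      % one scalar input per iteration
y = randn(N, 1);      % one scalar output (response) per iteration

% Flatten everything into a single N-by-13 feature matrix:
% 3 (vector) + 9 (vectorized matrix) + 1 (scalar) = 13 features per sample
X = [V', reshape(C, 9, N)', s];

% Optional: if the dimension is too high, reduce it with PCA,
% keeping enough components to explain 95% of the variance
[~, score, ~, ~, explained] = pca(X);
numComp = find(cumsum(explained) >= 95, 1);
Xreduced = score(:, 1:numComp);

% Put predictors and response in one table, then import it into
% the Regression Learner app from the workspace
tbl = array2table(Xreduced);
tbl.Response = y;
regressionLearner   % opens the app; select tbl and Response there

Since each covariance matrix is symmetric, the 9 vectorized entries contain 3 duplicated pairs; keeping only the upper triangle (6 values per matrix) is a slightly more compact alternative.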
0 comments