Optimization and Machine Learning

Greg on 10 Dec 2014
Commented: Greg on 10 Dec 2014
If you're able to train a machine learning classification model (for example, with a boosted ensemble learning technique for regression or classification), is it then possible to optimize the predicted response around that model?
For instance, say I create a classification model from 1000 examples (rows) and 70 features (columns) to predict a binary classification response. It's simple to then manually create a hypothetical 1001st example and predict the class to which it will belong.
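For example, something like this (just a sketch; the ensemble method and variable names are placeholders, with X as the 1000-by-70 predictor matrix and Y the binary response):
ens = fitensemble(X, Y, 'AdaBoostM1', 100, 'Tree');   % boosted tree ensemble
xNew = X(1,:);                                        % hypothetical 1001st example
xNew(5) = 2.7;                                        % change whichever feature values you like
[predictedClass, score] = predict(ens, xNew);         % class label and per-class scores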
I would like to be able to define and fix some of those 70 features (let's say 5) while allowing the others to vary. Is there a way to do this, and then let an optimization algorithm optimize the remaining 65 features, so that I get the combination of feature values that maximizes the likelihood of achieving a given classification?
On the surface, it seems like the Optimization Toolbox would provide this functionality, but I don't know if it's possible to define a machine learning model within the Optimization Toolbox.
Thanks.

Answers (1)

Sean de Wolski on 10 Dec 2014
Sequential feature selection is what it sounds like you're looking for.
doc sequentialfs
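A minimal usage sketch (assuming a numeric binary response Y, predictors X, and a misclassification-count criterion; the learner choice and fold count are arbitrary):
fun = @(XT, yT, Xt, yt) sum(yt ~= predict(fitensemble(XT, yT, 'AdaBoostM1', 50, 'Tree'), Xt));  % misclassifications on the test fold
[inmodel, history] = sequentialfs(fun, X, Y, 'cv', 5);   % logical mask of the selected features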
1 Comment
Greg on 10 Dec 2014
I don't think so. Sequential feature selection looks to be a way to minimize the number of variables required to achieve optimal predictive capability of a model. I'm looking for a way to substitute values for features back into a model to generate the optimal combination of feature values that yields a target response.
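Roughly what I'm after is something like this sketch (the solver, bounds, and class-score column are assumptions; ga from the Global Optimization Toolbox is used because tree-ensemble scores are piecewise constant, which makes gradient-based solvers a poor fit):
function xBest = maximizeTargetScore(ens, fixedIdx, fixedVals, lb, ub)
% Hold the features in fixedIdx at fixedVals and search over the remaining
% features so the trained model's score for the target class is maximized.
% lb/ub bound the free features.
nFeatures = 70;
freeIdx = setdiff(1:nFeatures, fixedIdx);
    function f = negScore(xFree)
        x = zeros(1, nFeatures);
        x(fixedIdx) = fixedVals;
        x(freeIdx)  = xFree;
        [~, score] = predict(ens, x);
        f = -score(2);   % assumes column 2 is the target class; check ens.ClassNames
    end
xFree = ga(@negScore, numel(freeIdx), [], [], [], [], lb, ub);
xBest = zeros(1, nFeatures);
xBest(fixedIdx) = fixedVals;
xBest(freeIdx)  = xFree;
end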
