
Using fgoalattain with machine learning objects

Paul on 2 October 2014
I'm having trouble optimizing a design, mostly because I'm not sure of the best way to go about it.
I started by running a simulation DOE in external simulation software, which gave me a large set of input and output data. My DOE has three inputs (design parameters), and ultimately I'm trying to optimize three separate outputs that characterize the design's performance. I originally tried to do this the easy way in Minitab, but its response surface regression tools were too limited, and my optimizations weren't even close to accurate unless I was right up against the DOE boundaries. So I turned to MATLAB.
Now, because I have three inputs (and I know that they all interact with each other and that they all have nonlinear relationships), simple curve fitting wasn't going to work. So after pursuing some different options, I finally settled on using MATLAB's machine learning tools to generate a predictive model for each output. My first question is whether this was a smart thing to do. I'm not too familiar with machine learning, and even less familiar with MATLAB's inner workings for it. That being said, I have ended up with three predictive models that do a fairly good job of representing my system, so I wrote an objective function for them:
function F = ObjectiveFunction(x)
% Load whichever trained regression objects were saved to the workspace file
load machine_learning_regression_workspace.mat
if exist('Delay_rens','var') == 1
    % Regression ensembles (boosted or bagged trees): predict takes rows of inputs
    F(1) = predict(Delay_rens,x);
    F(2) = predict(SSlope_rens,x);
    F(3) = predict(ESlope_rens,x);
elseif exist('DelayNet','var') == 1
    % Neural networks: sim expects inputs as columns, so transpose in and out
    F(1) = (sim(DelayNet,x'))';
    F(2) = (sim(SSlopeNet,x'))';
    F(3) = (sim(ESlopeNet,x'))';
end
end
As you can see, I have it set up to handle different kinds of machine learning objects depending on what is in the workspace file (i.e. boosted regression trees, bagged regression trees, or neural networks). This works pretty well: if I send it hypothetical inputs, I get outputs that agree with my simulation (as they should, since I have verified the models' accuracy via error-characterization methods).
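For reference, the models were built roughly along these lines (the learner choices and settings shown here are placeholders, not necessarily my exact configuration; X is the n-by-3 matrix of DOE design points and the y vectors are the measured outputs):
% X: n-by-3 DOE design matrix; yDelay, ySSlope, yESlope: n-by-1 response vectors
Delay_rens  = fitensemble(X, yDelay,  'LSBoost', 200, 'Tree');                       % boosted regression trees
SSlope_rens = fitensemble(X, ySSlope, 'Bag',     200, 'Tree', 'Type', 'regression'); % bagged regression trees
ESlope_rens = fitensemble(X, yESlope, 'LSBoost', 200, 'Tree');
% Alternatively, a small feed-forward network per output (Neural Network Toolbox)
DelayNet = fitnet(10);                    % one hidden layer with 10 neurons
DelayNet = train(DelayNet, X', yDelay');  % networks expect observations in columns
save machine_learning_regression_workspace.mat Delay_rens SSlope_rens ESlope_rens DelayNet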
My problem now is solving this objective function to find an optimal set of input parameters. I have been trying fgoalattain, passing in my starting point, goal, weights, constraints, etc., but every optimization ends the same way: the norm of the search step hits 0 after the first or second iteration and everything stops, spitting out an x vector identical to my start point. I have determined that this is because MATLAB is having trouble automatically calculating the derivative of the objective function, but I'm not sure how to fix that, since I can't really derive a representative gradient by hand. So I'm wondering whether it's reasonable, or even possible, to use fgoalattain in this manner with machine learning objects. If not, what are my other options?
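For concreteness, the call I'm making is structured roughly like this (the starting point, goals, weights, and bounds shown are placeholder numbers, not my actual values):
x0     = [0.5 0.5 0.5];   % starting design point (placeholder)
goal   = [10 1 1];        % target values for the three outputs (placeholder)
weight = abs(goal);       % relative importance of each goal
lb     = [0 0 0];         % lower bounds of the DOE region (placeholder)
ub     = [1 1 1];         % upper bounds of the DOE region (placeholder)
opts   = optimoptions('fgoalattain','Display','iter');
[x,fval,attainfactor,exitflag] = fgoalattain(@ObjectiveFunction, x0, goal, weight, ...
    [], [], [], [], lb, ub, [], opts);
With 'Display' set to 'iter', the step norm and attainment factor are printed at each iteration, which is where I can see the search stalling.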
Sorry for the long write-up; thanks in advance for any help.

Answers (0)
