fminsearch and simulated annealing with penalties

16 views (last 30 days)
Rose16 on 9 Feb 2020
Commented: John D'Errico on 9 Feb 2020
Hello everybody,
I am dealing with the following problem:
I need to minimize the mismatch between the solution of a system of ordinary differential equations and a target solution, in order to find the optimal parameters for my system. So, I give an initial x0 (containing my parameters), then I write an objective function in which I solve the system (with my own scheme) using those parameters and compare the solution of the system to the one I would like to get. To do this I use simulated annealing or fminsearch, and it works.
The point is that I need to impose some relations on the parameters x0; for example, I would like x0(1) > x0(2), etc.
I am trying to write some penalties in my objective function, but it does not work. How should I do this?
Thank you in advance
2 Comments
Rose16 on 9 Feb 2020
Actually, this is what I do, and it does not work:
function cost = objFcn(r, ExpData)
%% relations to impose
den = (r(1)*r(2))/(r(1) + r(3));
r(7) = r(6)/(constant*den*den*exp(-2));   % note: 'constant' must be defined in scope (e.g. passed in)
r(9) = r(11)/(r(10) - r(15));
r(8) = den/2;
% I should also impose some inequalities... but for now it is fine this way
f = myscheme(r);
cost = norm(f - ExpData);
end
John D'Errico on 9 Feb 2020
You don't say why it does not work; however, I can make at least some conjecture.
It seems you have a few equality constraints among the variables, and you wish to accomplish them by eliminating a few of the variables.
Thus, r(7) can be written as a function of variables {1,2,3,6}.
r(9) can be re-written as a function of variables {10,11,15}.
r(8) is a function of variables {1,2,3}.
So let me look more closely at what you are doing. First, you are trying to solve a 15-parameter optimization using fminsearch. This exceeds the limits of what I would ever use fminsearch to solve by nearly 2 to 1. Fifteen or more unknowns is a laughable attempt using fminsearch. Use a tool that can handle the problem. And we don't even know whether you have more than 15 unknowns in the problem.
But, next, while you claim that you are doing the same thing as what I suggested in my answer, ARE YOU KIDDING? NO! Not in the slightest are you doing the same thing.
Note that whatever values for r(7), r(8), r(9) are chosen by the optimizer, they are completely disregarded by you. They are never used in any way. This means you are searching over a 15-parameter search space using fminsearch, but then ignoring 3 of the (at least) 15 parameters.
This will cause problems in the optimization, very possibly causing fminsearch to get lost, since it can freely vary those 3 additional ignored parameters with no gain in the objective, but at no cost either. Essentially, you have now created a singular problem, wherein you can vary those 3 extra parameters freely. There are now infinitely many solutions. fminsearch will not handle that well at all.
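To make that concrete, here is a rough sketch of a reduced search in which the optimizer never sees r(7), r(8), r(9) as independent unknowns (untested; it reuses myscheme, constant and ExpData from your code, and the helper name objFcnReduced is just illustrative):
% Sketch only: search the 12 free parameters; rebuild the 3 dependent ones
% inside the objective, using the same relations as in your code above.
freeIdx = setdiff(1:15, [7 8 9]);      % parameters the optimizer actually controls
p0 = r0(freeIdx);                      % r0 is your full 15-element initial guess
pBest = fminsearch(@(p) objFcnReduced(p, freeIdx, ExpData, constant), p0);

function cost = objFcnReduced(p, freeIdx, ExpData, constant)
    r(freeIdx) = p;                    % scatter the free parameters into r
    den = (r(1)*r(2))/(r(1) + r(3));
    r(7) = r(6)/(constant*den*den*exp(-2));
    r(9) = r(11)/(r(10) - r(15));
    r(8) = den/2;
    cost = norm(myscheme(r) - ExpData);
end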
So first, use a better tool than fminsearch. I would highly recommend fmincon. The small cost of the Optimization Toolbox is well worth the time you have probably already wasted trying to kludge fminsearch into solving the problem. A virtue of using fmincon here is that it will allow you to pose those inequality constraints easily. And you can pose the equality constraints properly, in the form of actual constraints.
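A hedged sketch of that with fmincon (the linear inequality encodes r(1) >= r(2) from your question, the bounds are left empty, and the helper name myConstraints is just illustrative):
% Sketch only: r(1) >= r(2) as a linear inequality, and the three relations
% from your code as nonlinear equality constraints.
r0 = ones(15,1);                         % replace with your real initial guess
A = zeros(1,15); A(1) = -1; A(2) = 1;    % -r(1) + r(2) <= 0, i.e. r(1) >= r(2)
b = 0;
obj = @(r) norm(myscheme(r) - ExpData);
rBest = fmincon(obj, r0, A, b, [], [], [], [], @(r) myConstraints(r, constant));

function [c, ceq] = myConstraints(r, constant)
    c = [];                              % no extra nonlinear inequalities here
    den = (r(1)*r(2))/(r(1) + r(3));
    ceq = [r(7) - r(6)/(constant*den*den*exp(-2));
           r(9) - r(11)/(r(10) - r(15));
           r(8) - den/2];
end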
Remember that your time has some value.


Answers (3)

Jeff Miller on 9 Feb 2020
It depends a little on exactly what relations you want to impose, but often you can do this by mapping MATLAB's unconstrained real-valued parameters to the constrained parameters that you really want. Do something like this inside the objective function:
function err = myObjFn(matlabx0)
myx0(1) = matlabx0(1)^2 + matlabx0(2); % ensures myx0(1) >= myx0(2), strict whenever matlabx0(1) ~= 0
myx0(2) = matlabx0(2);
% Now proceed with your calculations using myx0, which satisfies your constraints
end
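For example, you might then drive it with fminsearch and recover the constrained parameters afterwards (a minimal sketch; the starting point here is an arbitrary assumption):
% Sketch only: optimize in the unconstrained "matlab" space, then apply the
% same mapping to the result to get parameters satisfying myx0(1) >= myx0(2).
start = [0.5, 1.0];                          % arbitrary unconstrained starting guess
best = fminsearch(@myObjFn, start);
myx0Best = [best(1)^2 + best(2), best(2)];   % same mapping as inside myObjFn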

Star Strider on 9 Feb 2020
The simulannealbnd function only permits range bounds on the parameters. The ga function permits linear and nonlinear constraints as well. See Minimize a Nonsmooth Function with Linear Constraints and others to understand how to do that. I almost always use ga because it allows such constraints, so I have no recent experience with simulannealbnd. It appears that the fitness functions require the same syntax.
See: Is it possible to use Simulated Annealing or Genetic Algorithm for parameter estimation of a parametric System of ODEs? for an example of using ga to estimate the parameters of a system of ordinary differential equations.
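For instance, a minimal sketch of imposing x(1) >= x(2) as a linear constraint in ga (the quadratic fitness function here is only a placeholder for the actual ODE fit):
% Sketch only: A*x <= b with A = [-1 1], b = 0 encodes x(1) >= x(2).
fitness = @(x) (x(1) - 3)^2 + (x(2) - 1)^2;   % placeholder fitness function
A = [-1 1];                                   % -x(1) + x(2) <= 0
b = 0;
nvars = 2;
xBest = ga(fitness, nvars, A, b);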

John D'Errico on 9 Feb 2020
Edited: John D'Errico on 9 Feb 2020
You do not want to solve it using penalties. That tends to cause problems, making the solution more difficult. At best, the optimization will probably take more function evaluations than necessary. It can still work, but if a better solution exists, you want to use it...
The idea is to perform a transformation inside your objective function. For example, to ensure that your parameters are decreasing, so that x(1) >= x(2) >= x(3) >= ...
function result = objfun(x)
xtrans = x(1) - cumsum(x(2:end).^2);
% now compute your objective function, as a function of the vector xtrans.
% (do stuff here, returning result as a function of xtrans)
end
As you should see, the vector xtrans will always obey the requirement of being decreasing. When the optimizer is done, perform the same transformation on the vector it returns, to give you the unknowns as you wish, thus a decreasing sequence.
The only time that will result in two consecutive elements being EXACTLY equal is when the optimizer chooses one of the variables in x to be identically zero. That will essentially never happen, but if you truly need a strict inequality, then you might do something like...
dxmin = 1.e-15;
xtrans = x(1) - cumsum(x(2:end).^2 + dxmin);
That will force a tiny increment always in the variables. You can choose dxmin to be whatever you wish as realistic for your problem. Just don't make it too tiny.
There are many other ways you can use simple internal transformations to achieve your goals.
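For concreteness, a toy sketch of the whole pattern (the least-squares target below is only a placeholder, not your ODE problem):
% Sketch only: optimize in the unconstrained space, then apply the SAME
% transformation to the returned vector to recover the decreasing parameters.
target = [5 3 2 1];                               % toy decreasing target
toDecreasing = @(x) x(1) - cumsum(x(2:end).^2);   % the transform described above
obj = @(x) norm(toDecreasing(x) - target);
x0 = ones(1, numel(target) + 1);                  % one extra variable feeds the transform
xhat = fminsearch(obj, x0);
rbest = toDecreasing(xhat);                       % guaranteed non-increasing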
