Optimization routine for a function that is not smooth and also has a lot of local minima

xueqi on 18 Feb 2013
Hi, can anyone give me some suggestions about an optimization routine for a function that is not smooth and also has a lot of local minima?
1 Comment
José-Luis on 18 Feb 2013
Edited: 18 Feb 2013
There are many many optimization routines. None is guaranteed to give you the correct result, given the characteristics of your function, short of exhaustive testing.


Answers (5)

Alan Weiss on 19 Feb 2013
If you want to minimize a nonsmooth function, the general recommendation is to try patternsearch. If there are many local minima, the general recommendation is to use a variety of start points. If you have finite bounds on all components, lb and ub, then you can obtain random start points as follows:
x0 = lb + rand(size(lb)).*(ub - lb);
For more methods of choosing start points, see this section.
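For illustration only (a rough sketch of that recommendation, not code from this answer; it assumes you have an objective function handle fun and finite bound vectors lb and ub), a simple multi-start loop with patternsearch could look like this:
nStarts = 20;                                  % number of random restarts (arbitrary choice)
bestFval = Inf; bestX = [];
for k = 1:nStarts
    x0 = lb + rand(size(lb)).*(ub - lb);       % random start point within the bounds
    [x, fval] = patternsearch(fun, x0, [], [], [], [], lb, ub);
    if fval < bestFval
        bestFval = fval; bestX = x;            % keep the best local solution found so far
    end
end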
Good luck,
Alan Weiss
MATLAB mathematical toolbox documentation
4 Comments
xueqi on 20 Feb 2013
Thanks! So you think the problem is not coming from the function being non-smooth? I will try to add some constraints soon and see how it goes. But please stay with me on this post. I will tell you what happens and may still need your advice.
xueqi on 1 Mar 2013
Yes. I do find that the value of the function changes dramatically with respect to one variable. Near one particular value (the value that I want to estimate) the function is concave, but as it moves away from that value the function goes to Inf quickly. I think that is why fmincon can't work correctly. But I can't really add a proper constraint to this variable... What should I do?



Thorsten on 18 Feb 2013
http://en.wikipedia.org/wiki/Simulated_annealing

Mohsen Davarynejad on 18 Feb 2013
All the algorithms suitable for black-box optimization can be used. These algorithms, including Genetic algorithms (GA) and Particle swarm optimization (PSO), do not make any assumptions regarding the problem at hand, and thus they neither require the function to be convex, nor do they require the availability of the gradient of the function.
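As a minimal sketch of such a black-box call (my own illustration; fun, nvars, and the bounds are placeholders, not values from this thread), a bound-constrained ga run could look like this:
nvars = 8;                                     % number of unknowns (assumed)
lb = zeros(1, nvars); ub = ones(1, nvars);     % hypothetical finite bounds
[x, fval] = ga(fun, nvars, [], [], [], [], lb, ub);   % derivative-free, no convexity required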
5 Comments
xueqi on 20 Feb 2013
I have 8 independent variables to estimate, and this is the code I use for fmincon / ga:
A = [1 1 1 0 0 0 0 0];        % linear inequality: x1 + x2 + x3 <= 1 (was commented out, so fmincon errored)
b = 1;
lb = [0.1; 0.1; 0.1; 0.0001; 1.1; 0.001; 1.1; 0.001];
ub = [1; 1; 1; 1; Inf; 1; Inf; 1];
options = optimset('Algorithm','interior-point','Display','iter','MaxIter',50,'TolFun',1e-1);
x0 = [0.2, 0.2, 0.2, 0.01, 100, 0.1, 100, 0.1];
[x,fval,exitflag,output,lambda,grad] = fmincon(@beta,x0,A,b,[],[],lb,ub,[],options)
% ga expects gaoptimset options rather than optimset:
% x = ga(@beta,8,A,b,[],[],lb,ub,[],gaoptimset('Display','iter'));
xueqi on 20 Feb 2013
It is just that ga seems to take forever to get a result... My function beta takes nearly 1 minute to evaluate at a single point, and when minimizing beta it takes several days for even around 10 iterations. I am not sure whether this is normal...



Juan Camilo Medina on 2 Mar 2013
Edited: 2 Mar 2013
Based on my experience, the best way is to use simulated annealing, which is less heuristic than GA and is powered by a Markov chain. fmincon() or any other gradient-based algorithm won't work well, since the function is non-smooth.
[x,fval,exitFlag,output] = simulannealbnd(ObjectiveFunction,X0)
you can also try
x = fminsearch(fun,x0)
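A bound-constrained sketch of the simulated-annealing route (my addition; it assumes an objective handle fun and finite lb, ub are already defined) could be:
x0 = lb + rand(size(lb)).*(ub - lb);           % random feasible starting point
opts = saoptimset('Display', 'iter');          % show progress at each iteration
[x, fval, exitFlag] = simulannealbnd(fun, x0, lb, ub, opts);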

Matt J on 3 Mar 2013
Edited: 3 Mar 2013
I mentioned this to you in another post, but I'll document it here as well. I think the best way to deal with an ill-behaved objective function is to replace it with a better one. I.e., question your initial reasoning behind this function and see if there are alternative formulations that you haven't considered.
If nothing else, it is suspicious that you think you absolutely must accept the non-smoothness of the function. If you've thought about the problem formulation carefully enough, there are almost always smooth approximations that you can find. The local minima are another matter.
If you can't see an alternative yourself, it may be time to walk us through the development of the function you're thinking of so we can see where you might have taken a wrong turn.
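As a generic illustration of what a smooth surrogate can look like (my own example, not something from this thread): a nonsmooth term like abs(t) in an objective can often be replaced by a differentiable approximation,
mu = 1e-6;                          % small smoothing parameter (hypothetical value)
smoothAbs = @(t) sqrt(t.^2 + mu);   % differentiable everywhere; approaches |t| as mu -> 0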
41 Comments
Matt J on 10 Mar 2013
Edited: 10 Mar 2013
One idea would be, for large alpha and beta, to write the log-likelihood in terms of Stirling's approximation of Beta(alpha,beta):
llh(x, alpha, beta) = alpha*log(x) + beta*log(1-x) - 0.5*log(2*pi) + ...
    (alpha+beta-0.5)*log(alpha+beta) - ...
    (alpha-0.5)*log(alpha) - ...
    (beta-0.5)*log(beta)
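A quick sanity check of that Stirling term (my addition, comparing against MATLAB's built-in betaln for one example pair of large parameters) could be:
alpha = 200; beta = 150;                       % example large parameters (arbitrary)
logB_stirling = 0.5*log(2*pi) + (alpha-0.5)*log(alpha) + (beta-0.5)*log(beta) ...
                - (alpha+beta-0.5)*log(alpha+beta);
logB_exact = betaln(alpha, beta);              % exact log(Beta(alpha,beta)) without overflow
fprintf('Stirling: %.6f   exact: %.6f\n', logB_stirling, logB_exact)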
xueqi on 12 Mar 2013
There is something wrong with the maxmin function. Maybe it is the cause of all the problems...

