Unconstrained Optimization with Additional Parameters

1 view (last 30 days)
Javer on 23 Aug 2011
Hello
I have a problem that is very similar to the documented unconstrained optimization example that uses fminunc with additional parameters (the bowlpeakfun example):
My problem is that I want to use the large-scale algorithm, where the derivatives of the objective function are supplied. So I rewrite my objective function as
function [y, grad] = bowlpeakfun(x, a, b, c)
y = (x(1)-a).*exp(-((x(1)-a).^2+(x(2)-b).^2))+((x(1)-a).^2+(x(2)-b).^2)/c;
if nargout >1
grad = gradient of y;
end
And then set up the anonymous function and the optimization as
a = 2; b = 3; c = 10;
f = @(x)bowlpeakfun(x,a,b,c)
x0 = [-.5; 0];
options = optimset('GradObj','on');
[x, fval] = fminunc(f,x0,options)
I get the error 'Failure in initial user-supplied objective function evaluation. FMINUNC cannot continue', which I think is somehow related to the fact that the anonymous function doesn't see that there is a gradient associated with bowlpeakfun. Redefining the anonymous function as
[f,grad] = @(x)bowlpeakfun(x,a,b,c)
does not work either. Any help much appreciated.
Thanks

Answers (6)

Alan Weiss on 24 Aug 2011
I assume that your line
grad = gradient of y;
is not intended to be taken literally, but you are just saving the space that would be taken by writing the entire gradient.
Have you tried to evaluate
[fval gradfval] = f(x0)
to see why MATLAB is throwing an error?
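For what it is worth, here is a minimal sketch of what a gradient-enabled bowlpeakfun and the corresponding call could look like. The partial derivatives below are my own derivation from the formula in the question, so check them before relying on them:
function [y, grad] = bowlpeakfun(x, a, b, c)
% Objective from the question, with an analytic gradient.
% u and v are local shorthands for the shifted coordinates.
u  = x(1) - a;
v  = x(2) - b;
r2 = u.^2 + v.^2;
y  = u.*exp(-r2) + r2/c;
if nargout > 1
    grad = [exp(-r2).*(1 - 2*u.^2) + 2*u/c;   % dy/dx(1)
            -2*u.*v.*exp(-r2)      + 2*v/c];  % dy/dx(2)
end
end
Called through an anonymous handle, which does forward the second output:
a = 2; b = 3; c = 10;
f = @(x) bowlpeakfun(x, a, b, c);
x0 = [-.5; 0];
options = optimset('GradObj','on');
[x, fval] = fminunc(f, x0, options)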

Javer on 24 Aug 2011
Hi Alan
Thanks for the help. Yes,
grad = gradient of y;
is not to be taken literally. It is just the gradient of the function. To clarify, my function is
function [y, grady] = myfun(x, params)
y = f(x,params) % x is the variable, params are the parameters
if nargout >1
grady = f'(x,params); % f' gradient of y wrt x
end
If I set fminunc to use the medium-scale algorithm and then set up the anonymous function and the optimization as
params = 'given values'
f = @(x) myfun(x,params)
x0 = ['any initial values'];
options = optimset('LargeScale','off');
[xnew, fval] = fminunc(f,x0,options)
There is no problem and I get my solution; the optimiser works out the derivatives numerically. Now, what I want is to use my own analytical derivatives with the large-scale algorithm. So, if I write
params = 'given values'
[f gradf] = @(x) myfun(x,params)
MATLAB doesn't like this anonymous function and throws 'Error: Only functions can return multiple values'. If I now set up the problem as follows (hoping MATLAB will pick up my supplied analytical derivatives)
params = 'given values'
f = @(x) myfun(x,params)
x0 = ['any initial values'];
options = optimset('GradObj','on');
[xnew, fval] = fminunc(f,x0,options)
MATLAB throws another error
??? Error using ==> uminus Too many output arguments.... Caused by: Failure in initial user-supplied objective function evaluation. FMINUNC cannot continue.
So it seems what I am having trouble with is setting up the problem correctly.
Thanks
2 Comments
Walter Roberson on 24 Aug 2011
grady = f'(x,params);
is not a valid line of code. The "'" character cannot occur in that syntax.
Javer on 25 Aug 2011
Sorry, with the ' in f'(x,params) I only meant that myfun returns both the function value and the derivative if requested, whichever they might be.



Walter Roberson on 24 Aug 2011
Use a subfunction instead of an anonymous function to pass the additional parameter, and pass the handle to the subfunction to fminunc.
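A minimal sketch of that idea, assuming the gradient-enabled bowlpeakfun from the question; the wrapper name runbowlpeak is mine:
function [x, fval] = runbowlpeak(x0, a, b, c, options)
% a, b and c are visible inside the nested objective through scoping,
% so no anonymous wrapper is needed and both outputs pass straight
% through to fminunc.
[x, fval] = fminunc(@nestedbowl, x0, options);
    function [y, grad] = nestedbowl(x)
        [y, grad] = bowlpeakfun(x, a, b, c);
    end
end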

Javer on 25 Aug 2011
Thanks Walter
It seems I am not having much luck with that approach either. My function is as created above:
function [y, grady] = myfun(x, params)
y = f(x,params) % x is the variable, params are the parameters
if nargout >1
grady = f'(x,params); % f' gradient of y wrt x
end
And now I create the subfunction you alluded to as
function [x,fval,exitflag,output] = runnested(x0,params,options)
%
[x,fval,exitflag,output] = fminunc(@nestedfun,x0,options);
%
function [y grady] = nestedfun(x)
[y grady] = myfun(x,params);
end
end
And finally, the optimization
params = ['any desired values'];
options = optimset('Display','iter','MaxIter',100,'GradObj','on');
[x,fval,exitflag,output] = runnested(x0,params,options);
And MATLAB throws the following
??? Error using ==> uminus Too many output arguments. ... Caused by: Failure in initial user-supplied objective function evaluation. FMINUNC cannot continue.
Not quite sure what else I could try next.
Thanks

Steve Grikschat on 6 Sep 2011
We've yet to see your code for the analytical gradient calculation (hopefully we won't need to ;) ). I suspect that the error might be there.
Have you tried calling the function with two outputs as Alan suggested? i.e.
params = ...
f = @(x) myfun(x,params);
[fval0,grad0] = myfun(x0);
What do you get there?

Javer on 6 Sep 2011
I am not quite sure I am following you both now. If I follow your suggestion
params = ...
f = @(x) myfun(x,params);
[fval0,grad0] = myfun(x0);
MATLAB doesn't like it and says that input argument "params" is undefined.
f(x0)
works OK but only returns the objective function (not the derivative).
[fval0,grad0] = f(x0,params);
fails with 'Too many output arguments'. And
[fval0,grad0] = myfun(x0,params);
works OK and returns the function value and its derivatives. But I am left with an analytical derivative that I can't use with the large-scale algorithm.
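One thing worth checking, assuming f = @(x) myfun(x,params) as above: an anonymous handle passes a request for two outputs straight through to the wrapped function, so the probe to try is two outputs with a single input:
f = @(x) myfun(x, params);   % params is baked in when the handle is created
[fval0, grad0] = f(x0)       % nargout is forwarded to myfun, so grad0 should be filled in
If that call works, the same handle with 'GradObj','on' should also let fminunc use the analytical gradient.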
1 Comment
Steve Grikschat on 7 Sep 2011
Let's go back to the nested function example you wrote a few posts back. From the error it appears to me that the problem lies inside the function with a computation involving unary minus (uminus) or negation.
Try using the debugger to step into the first evaluation of your objective and gradient function from within fminunc to see where the error lies.
If you're not familiar, see the help here:
http://www.mathworks.com/help/techdoc/matlab_env/brqxeeu-175.html
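A minimal sketch of that workflow using the standard debugger commands (variable names follow the runnested example above):
dbstop if error                                   % break on the line that raises the error
[x,fval,exitflag,output] = runnested(x0,params,options);
% ...inspect x, y and grady at the breakpoint, then:
dbquit                                            % leave debug mode
dbclear all                                       % remove the breakpoint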
