I seem to have been able to fool lsqnonlin into doing what I want by using the attached classdef. However, I still wonder, nervously, why the Optimization Toolbox is trying to prevent me from doing this.
A=rand(3); A(:,end)=1;
opts=optimoptions('lsqnonlin','Algorithm','trust-region-reflective',...
    'OptimalityTolerance',1e-20,'FunctionTolerance',1e-30,'StepTolerance',1e-20,...
    'SpecifyObjectiveGradient',true,'JacobianMultiplyFcn',@jmfunc);
[x,res]=lsqnonlin(@(x)resFcn(x,A), rand(3,1), [],[],opts)
Local minimum possible.
lsqnonlin stopped because the final change in the sum of squares relative to
its initial value is less than the value of the function tolerance.
x = 3×1
-0.0000
0.0000
4.0000
res = 1.5777e-30
function [r,Jinfo]=resFcn(x,A)
    % Residual r(x) = [A*x; x.^2/2] - target, whose Jacobian is J = [A; diag(x)]
    r = [A*x; x(:).^2/2] - [4;4;4;0;0;8];
    % Pack everything jmfunc will need into the wrapper object, which
    % stands in for the Jacobian as the second output.
    s.A = A;
    s.x = x;
    Jinfo = JMInfo(s);
end
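The JMInfo classdef itself is in the attachment, so it is not reproduced here; as a rough, hypothetical sketch, it may be enough for such a wrapper to carry the payload struct and overload `size` so that lsqnonlin's consistency checks see the dimensions of the true Jacobian (this is an assumption about which queries the solver actually makes):

```matlab
% Hypothetical sketch only -- the real attached classdef may differ.
classdef JMInfo
    properties
        s   % payload struct with fields A and x, read back in jmfunc
    end
    methods
        function obj = JMInfo(s)
            obj.s = s;
        end
        function varargout = size(obj, varargin)
            % Report the size of the true Jacobian J = [A; diag(x)]
            m = size(obj.s.A, 1) + numel(obj.s.x);
            n = numel(obj.s.x);
            [varargout{1:nargout}] = size(zeros(m, n), varargin{:});
        end
    end
end
```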
function W=jmfunc(Jinfo,Y,flag)
    % Jacobian multiply function for J = [A; diag(x)]:
    %   flag == 0:  W = J.'*(J*Y)
    %   flag  > 0:  W = J*Y
    %   flag  < 0:  W = J.'*Y
    A = Jinfo.s.A;
    x = Jinfo.s.x;
    switch sign(flag)
        case 0
            W = A.'*(A*Y) + Y.*x.^2;   % J.'*J = A.'*A + diag(x).^2
        case 1
            W = [A*Y; x.*Y];
        case -1
            P = Y(1:end/2);            % rows matching the A block
            Q = Y(end/2+1:end);        % rows matching the diag(x) block
            W = A.'*P + x.*Q;
    end
end
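For what it's worth, the three branches can be sanity-checked against the explicit Jacobian J = [A; diag(x)] before handing them to the solver (this assumes resFcn and jmfunc above are on the path; all three norms should come out at roundoff level):

```matlab
A = rand(3); A(:,end) = 1;
x = rand(3,1);
J = [A; diag(x)];                       % explicit Jacobian of r(x)
[~, Jinfo] = resFcn(x, A);
Y1 = rand(3,1);                         % test vector for J*Y and J.'*J*Y
Y2 = rand(6,1);                         % test vector for J.'*Y
norm(jmfunc(Jinfo, Y1,  1) - J*Y1)
norm(jmfunc(Jinfo, Y2, -1) - J.'*Y2)
norm(jmfunc(Jinfo, Y1,  0) - J.'*(J*Y1))
```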