lsqnonlin optimization: large condition number of Jacobian matrix at all iterations, but full rank

I use lsqnonlin to solve a non-linear data-fitting problem (fitting parameters of a partial differential equation). The minimization problem is
f = ||g^sim(params) - g^exp ||^2
Currently, I optimize 18 parameters, and the vectors g^sim and g^exp each have 1470 entries, i.e. a collection of vectors with 147 entries at 10 different times.
The exact parameters (to be re-identified) are given by
0
0.245923
0.125367
0.37129
0.604339
0.729706
0.242791
0.488714
0.84713
0.0215718
0.00518026
0
0.00484491
0.0188392
0.0413179
0.0717639
0.109767
0.154998
Here is my code:
opts = optimoptions('lsqnonlin', ...
'StepTolerance', 1e-9, ...
'FunctionTolerance', 1e-9, ...
'OptimalityTolerance', 1e-9, ...
'MaxIterations', 250,...
'SpecifyObjectiveGradient', true, ...
'CheckGradients', false);
sol = lsqnonlin(@(params)objFun(params, g_exp), E0, lb, ub, opts);
function [f,J] = objFun(params, g_exp)
g_sim = ...; %call pde solver
J = ...; %call pde solver
f = g_sim - g_exp;
%scaling of f and J is explained later...
end
Running lsqnonlin with a given start vector, it returned exitflag = 3 after 25 iterations and 26 function evaluations; the sum of squares is 4.3470e-16 and the first-order optimality measure is 3.2449e-08. Also, the reference parameters from above are re-identified up to the 6th digit after the decimal point, indicating a perfect fit.
However, what I observed is that the Jacobian matrix
J = d g^sim(params) / d params
has, at all 25 iterations,
cond(J) ≈ 1e11
rank(J) = 18
This is inconvenient since I would like to compute some quality measures such as the correlation matrix, which does not really make sense for such high condition numbers (the product J'*J consequently has an even larger condition number, roughly cond(J)^2).
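For reference, a minimal sketch of how such a correlation matrix is typically formed from the final Jacobian (this is the standard least-squares covariance estimate; resnorm stands for the final sum of squares and is a placeholder name, not my exact code):
sigma2 = resnorm/(numel(f) - numel(params));  % estimated residual variance
C = sigma2*inv(J.'*J);                        % approximate parameter covariance
d = sqrt(diag(C));
R = C./(d*d.');                               % correlation matrix; relies on J'*J being well conditioned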
Also, the optimization fails for some other start vectors, which indicates that there might be a problem with my Jacobian or the parameters.
Based on my knowledge, such high condition numbers can usually be traced back to bad scaling. Currently, I scale like this:
w=1/abs(max(g_exp(g_exp~=0)));
W=w*ones(length(g_exp),1);
%scale residual and Jacobian (r = g_sim - g_exp is the raw residual)
f = W(:).*r(:);
J = J.*W(:);
I also tried different scaling approaches that are very common in my field, so I think the issue is not related to the scaling of the optimization problem.
As you can see, rank(J) = 18 at all iterations, which (if I am not mistaken) indicates that the parameters are not linearly dependent on each other.
Having said all that, what might be the reasons for such high condition numbers, and what could I try to reduce them? Is my data vector g_exp perhaps not appropriate?
I am also wondering why the optimization works very well under these circumstances.

Accepted Answer

Matt J on 2 Jun 2023
Edited: Matt J on 2 Jun 2023
Your scaling isn't really doing anything meaningful, since all residuals are weighted equally. Really, you are just multiplying your Jacobian by a scalar, which cannot change its cond() number, e.g.,
J=rand(3,2);
cond(J)
ans = 8.0366
cond(rand*J)
ans = 8.0366
Also, another scaling you need to consider, maybe even more important than the scaling of the residuals, is the units of your parameters. This would introduce weights on the columns of your Jacobian, not just the rows.
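As a quick diagnostic (a sketch, not using your actual data), you can normalize the columns of J and see how much the condition number drops; dividing column j by its norm is equivalent to rescaling parameter j by its own typical magnitude:
colnorm = vecnorm(J);   % norm of each Jacobian column (sensitivity scale of each parameter)
Jn = J./colnorm;        % Jacobian with every column rescaled to unit norm
cond(J)
cond(Jn)                % if this is much smaller, the parameter units/scales dominate the conditioning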
Another possibility is simply that your problem is over-parametrized, creating a continuum of non-unique solutions. That's not always a big deal, but may account for why the optimization seems to "fail" (in your words) at different initial points. You haven't described what you're interpreting as a failure, but I'm assuming it means you get unexpected results for the parameters.
As you can see, rank(J)=18 at all iterations, which (if I am not mistaken) indicates that the parameters are not linearly dependent on each other.
Right, but that's often not helpful. You are using the rank() command's default tolerance setting, which may or may not be appropriate for your problem. It's very easy to construct matrices which pass rank()'s default criteria, but shouldn't be considered full rank, e.g.,
A=diag([1e-14,1]);
cond(A)
ans = 1.0000e+14
rank(A)
ans = 2
rank(A,1e-10)
ans = 1
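A more informative check than rank() alone is to look at the singular value spectrum directly, e.g.,
s = svd(J);                 % singular values, largest first
semilogy(s,'o-'), grid on   % a sharp drop marks nearly dependent directions
nnz(s/s(1) > 1e-10)         % "effective" rank with an explicit tolerance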
39 Comments
SA-W on 7 Jun 2023
What I wanted to say is "to end up with a SET OF linear systems" in order to successively solve the PDE with a Newton-Raphson scheme.
Anyway, do you have a reference where the procedure of implementing a pde as nonlinear constraint is described? Maybe your own work in case you have done something similar already.
This approach does not seem to be very common in the literature of my field.
Matt J on 7 Jun 2023
Edited: Matt J on 7 Jun 2023
No, I don't have a reference, but I think it should be obvious. A PDE, like any other equation, is of the form ceq(x)=0 which is the form that the Optimization Toolbox solvers require for nonlinear equality constraints.
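A minimal sketch of the structure I mean (obsOperator and pdeResidual are placeholders for your observation operator and discretized PDE residual, not existing functions):
obj = @(x) sum((obsOperator(x) - g_exp).^2);   % data misfit; x stacks the parameters and the PDE unknowns
nonlcon = @(x) deal([], pdeResidual(x));       % c = [] (no inequalities), ceq = discretized PDE residual
opts = optimoptions('fmincon','Algorithm','interior-point');
x = fmincon(obj, x0, [],[],[],[], lb, ub, nonlcon, opts);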


More Answers (1)

John D'Errico on 2 Jun 2023
Edited: John D'Errico on 2 Jun 2023
@SA-W - Full rank does NOT mean the parameters are far from being linearly dependent. With a condition number that high, it often means they are VERY nearly dependent, just not exactly so. However, poor choices of units can also cause high condition numbers, and Matt has attempted to tell you exactly that. Listen to what Matt is telling you.
Consider these two matrices:
format long g
small = 1e-11;
A = [1 1+small;1 1]
A = 2×2
     1     1.00000000001
     1     1
B = [1 small/4;1 -small/4]
B = 2×2
     1     2.5e-12
     1    -2.5e-12
cond(A)
ans =
400003843933.334
cond(B)
ans =
400000000000
Both matrices have almost exactly the same (and very large) condition number. However, the A matrix cannot be simply repaired, because its columns are virtually linearly dependent. The B matrix has a problem essentially because of a poor choice of units: the two columns of B are very different in magnitude and are, in fact, orthogonal to each other. Nevertheless, the linear algebra will have difficulties with both cases, because the condition number of B is as large as that of A.
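To make that concrete (a small sketch continuing the example above): rescaling the columns to unit norm completely repairs B, but cannot repair A.
cond(A./vecnorm(A))   % still ≈ 4e11: the columns really are nearly dependent
cond(B./vecnorm(B))   % 1: B's only problem was the choice of units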
