Relation or Pattern between curves
I have the following curves, and I am trying to find a relation between them, or a statistical factor I can use to predict the other curves from just one of them:
I would appreciate any help, Thanks!
3 Comments
Star Strider
20 May 2023
The ‘Duplicate’ flag is likely not appropriate here. This is the original post. There are (at least) two other duplicate posts related to it that I saw.
Image Analyst
20 May 2023
But if the question was posted multiple times, perhaps with slight modifications each time, the question becomes which one to answer. Presumably the latest, most recent one is the final/best version and the one to answer, rather than the first/oldest one.
Accepted Answer
Star Strider
17 May 2023
The plot appears to be incomplete.
What do the curves look like between 0 (or whatever the minimum independent value is) and 150? What are the independent variable values?
The parts of the curves displayed appear to be inverted Logistic function curves, so one option might be to estimate the parameters of each of them and then compare those parameters.
That objective function could be —
lgfcn = @(b,x) b(1) - b(2)./(1 + exp(-b(3).*(x-b(4))));
.
23 Comments
John D'Errico
17 May 2023
@Star Strider - good idea, but wrong model. The logistic you suggest will be symmetrical around the midpoint. So the left and right tails will look like each other in that model. But the curves shown are strongly skewed. So that cannot be a viable model for this process.
Star Strider
18 May 2023
This gives reasonably decent fits to the data, considering that we have no information on the process that created it. (A mathematical description of that process would be the best model for it.) This uses a slightly changed version of the modified logistic function that I originally proposed.
The function fits are plotted in the same colours as the data. It might be best to plot each data set and its fit together in a subplot or tiledlayout array. I plotted them on the same axes here since that’s how they were originally presented.
Try this —
LD = load('Data.mat')
LD = struct with fields:
Inter_cubic: [5×249 double]
Data = LD.Inter_cubic;
x = 1:size(Data,2);
lgfcn = @(b,x) b(1) .* -b(2)./(1 + exp(-b(3).*((x-max(x)).*b(4)-b(5))));
PopSz = 100;
Parms = 5;
for k = 1:size(Data,1)
optsAns = optimoptions('ga', 'PopulationSize',PopSz, 'InitialPopulationMatrix',randi(1E+4,PopSz,Parms).*1E-3.*[Data(1,1) -1 -1 0.01 -1], 'MaxGenerations',5E3, 'FunctionTolerance',1E-10); % Options Structure For 'Answers' Problems
B(:,k) = ga(@(b) norm(Data(k,:)-lgfcn(b,x)), Parms, [],[],[],[],[],[],[],[],optsAns);
end
Optimization terminated: average change in the fitness value less than options.FunctionTolerance.
Optimization terminated: average change in the fitness value less than options.FunctionTolerance.
Optimization terminated: average change in the fitness value less than options.FunctionTolerance.
Optimization terminated: average change in the fitness value less than options.FunctionTolerance.
Optimization terminated: average change in the fitness value less than options.FunctionTolerance.
figure
hold on
for k = 1:size(Data,1)
hp1 = plot(x, Data(k,:), 'd', 'MarkerSize',2, 'DisplayName',"Inter\_cubic Row "+k);
hp2 = plot(x, lgfcn(B(:,k),x), 'DisplayName',"Function Fit "+k);
hp2.Color = hp1.Color;
end
hold off
grid
legend('Location','best', 'NumColumns',size(Data,1))
Parameters = array2table(B, 'VariableNames',compose('Inter_cubic Row %d',1:size(Data,1)), 'RowNames',compose('b(%d)',1:size(B,1)))
Parameters = 5×5 table
Inter_cubic Row 1 Inter_cubic Row 2 Inter_cubic Row 3 Inter_cubic Row 4 Inter_cubic Row 5
_________________ _________________ _________________ _________________ _________________
b(1) 8.0385 9.4816 3.8418 63.688 6.8332
b(2) -1.819 -2.116 -5.281 -0.321 -2.953
b(3) -0.791 -0.806 -2.233 -2.8395 -4.7949
b(4) 0.09692 0.09419 0.06454 0.06038 0.04561
b(5) -3.7631 -3.4574 -2.163 -1.9266 -1.095
The fitted lines aren’t perfect; however, they are quite close.
.
Star Strider
18 May 2023
I was experimenting with the function. The first parameter was added in an earlier version, and I forgot to simplify it when I changed the code for ‘lgfcn’.
Simplifying it —
LD = load('Data.mat')
LD = struct with fields:
Inter_cubic: [5×249 double]
Data = LD.Inter_cubic;
x = 1:size(Data,2);
lgfcn = @(b,x) b(1)./(1 + exp(-b(2).*((x-max(x)).*b(3)-b(4))));
PopSz = 100;
Parms = 4;
for k = 1:size(Data,1)
optsAns = optimoptions('ga', 'PopulationSize',PopSz, 'InitialPopulationMatrix',randi(1E+4,PopSz,Parms).*1E-3.*[Data(1,1) -1 0.01 -1], 'MaxGenerations',5E3, 'FunctionTolerance',1E-10); % Options Structure For 'Answers' Problems
B(:,k) = ga(@(b) norm(Data(k,:)-lgfcn(b,x)), Parms, [],[],[],[],[],[],[],[],optsAns);
end
Optimization terminated: average change in the fitness value less than options.FunctionTolerance.
Optimization terminated: average change in the fitness value less than options.FunctionTolerance.
Optimization terminated: average change in the fitness value less than options.FunctionTolerance.
Optimization terminated: average change in the fitness value less than options.FunctionTolerance.
Optimization terminated: average change in the fitness value less than options.FunctionTolerance.
figure
hold on
for k = 1:size(Data,1)
hp1 = plot(x, Data(k,:), 'd', 'MarkerSize',2, 'DisplayName',"Inter\_cubic Row "+k);
hp2 = plot(x, lgfcn(B(:,k),x), 'DisplayName',"Function Fit "+k);
hp2.Color = hp1.Color;
end
hold off
grid
legend('Location','northoutside', 'NumColumns',size(Data,1))
Parameters = array2table(B, 'VariableNames',compose('Inter_cubic Row %d',1:size(Data,1)), 'RowNames',compose('b(%d)',1:size(B,1)))
Parameters = 4×5 table
Inter_cubic Row 1 Inter_cubic Row 2 Inter_cubic Row 3 Inter_cubic Row 4 Inter_cubic Row 5
_________________ _________________ _________________ _________________ _________________
b(1) 14.236 20.072 20.15 20.369 20.375
b(2) -0.824 -1.6437 -1.5384 -1.9802 -2.289
b(3) 0.08379 0.04973 0.090338 0.0763 0.0942
b(4) -2.7504 -1.8677 -2.939 -2.3413 -2.4704
.
Star Strider
19 May 2023
load('Data.mat')
Data = Inter_cubic;
x = 1:size(Data,2);
objfcn = @(b,x) b(1) + b(2).*tanh(b(3).*(x+b(4)));
for k = 1:size(Data,1)
B0 = [Data(k,1); 10; 0.1; -200];
mdl = fitnlm(x,Data(k,:), objfcn, B0)
B(:,k) = mdl.Coefficients.Estimate;
R2(k) = mdl.Rsquared.Adjusted;
end
mdl =
Nonlinear regression model:
y ~ b1 + b2*tanh(b3*(x + b4))
Estimated Coefficients:
Estimate SE tStat pValue
________ _________ _______ ___________
b1 2.0155 0.7002 2.8785 0.0043485
b2 -12.956 0.71557 -18.106 4.399e-47
b3 0.018988 0.0006045 31.411 6.9664e-88
b4 -236.73 2.8622 -82.709 5.1885e-181
Number of observations: 249, Error degrees of freedom: 245
Root Mean Squared Error: 0.381
R-Squared: 0.992, Adjusted R-Squared 0.992
F-statistic vs. constant model: 1e+04, p-value = 4.3e-256
mdl =
Nonlinear regression model:
y ~ b1 + b2*tanh(b3*(x + b4))
Estimated Coefficients:
Estimate SE tStat pValue
________ __________ _______ ___________
b1 7.892 0.19599 40.267 5.1454e-110
b2 -12.314 0.20445 -60.232 7.2457e-149
b3 0.029224 0.00060136 48.597 8.5211e-128
b4 -219.01 0.69285 -316.1 6.1264e-322
Number of observations: 249, Error degrees of freedom: 245
Root Mean Squared Error: 0.385
R-Squared: 0.996, Adjusted R-Squared 0.996
F-statistic vs. constant model: 1.92e+04, p-value = 2e-290
mdl =
Nonlinear regression model:
y ~ b1 + b2*tanh(b3*(x + b4))
Estimated Coefficients:
Estimate SE tStat pValue
________ __________ _______ ___________
b1 9.9895 0.038228 261.31 9.6776e-302
b2 -10.287 0.039946 -257.53 3.388e-300
b3 0.059716 0.00062939 94.88 3.5387e-195
b4 -216.29 0.11342 -1906.9 0
Number of observations: 249, Error degrees of freedom: 245
Root Mean Squared Error: 0.199
R-Squared: 0.999, Adjusted R-Squared 0.999
F-statistic vs. constant model: 7.36e+04, p-value = 0
mdl =
Nonlinear regression model:
y ~ b1 + b2*tanh(b3*(x + b4))
Estimated Coefficients:
Estimate SE tStat pValue
________ __________ _______ ___________
b1 10.258 0.023407 438.26 0
b2 -10.028 0.02426 -413.36 0
b3 0.079354 0.00063438 125.09 5.4588e-224
b4 -218.66 0.061486 -3556.3 0
Number of observations: 249, Error degrees of freedom: 245
Root Mean Squared Error: 0.142
R-Squared: 0.999, Adjusted R-Squared 0.999
F-statistic vs. constant model: 1.39e+05, p-value = 0
mdl =
Nonlinear regression model:
y ~ b1 + b2*tanh(b3*(x + b4))
Estimated Coefficients:
Estimate SE tStat pValue
________ __________ _______ ___________
b1 10.56 0.015929 662.98 0
b2 -9.7661 0.016335 -597.86 0
b3 0.11607 0.00079811 145.43 8.3072e-240
b4 -222.21 0.035038 -6342 0
Number of observations: 249, Error degrees of freedom: 245
Root Mean Squared Error: 0.108
R-Squared: 1, Adjusted R-Squared 1
F-statistic vs. constant model: 2.19e+05, p-value = 0
Parameters = array2table(B, 'VariableNames',compose('Data Row %d',1:size(Data,1)), 'RowNames',compose('b(%d)',1:size(B,1)))
Parameters = 4×5 table
Data Row 1 Data Row 2 Data Row 3 Data Row 4 Data Row 5
__________ __________ __________ __________ __________
b(1) 2.0155 7.892 9.9895 10.258 10.56
b(2) -12.956 -12.314 -10.287 -10.028 -9.7661
b(3) 0.018988 0.029224 0.059716 0.079354 0.11607
b(4) -236.73 -219.01 -216.29 -218.66 -222.21
figure
tiledlayout(size(Data,1),1)
for k = 1:size(Data,1)
nexttile
plot(x, Data(k,:), 'd', 'MarkerSize',1)
hold on
plot(x,objfcn(B(:,k),x), '-r')
hold off
grid
ylim([0 30])
ylabel("Row "+k)
text(215,25,sprintf('R^2 = %.2f',R2(k)), 'FontSize',9)
end
The tanh function is a ratio of two exponential functions, so it lends itself to this sort of function fit. The first parameter shifts it in the ‘y’ direction, the second parameter scales its amplitude, the third parameter scales its slope, and the fourth parameter shifts its location in the ‘x’ direction.
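A minimal sketch (not part of the original answer) to visualise those four roles, using the same objfcn form with arbitrary illustrative parameter values —
objfcn = @(b,x) b(1) + b(2).*tanh(b(3).*(x+b(4)));   % same form as the fits above
x = 0:250;
figure
hold on
plot(x, objfcn([10 -10 0.05 -125], x), 'DisplayName','baseline')
plot(x, objfcn([15 -10 0.05 -125], x), 'DisplayName','b(1): y-shift')
plot(x, objfcn([10 -20 0.05 -125], x), 'DisplayName','b(2): amplitude')
plot(x, objfcn([10 -10 0.15 -125], x), 'DisplayName','b(3): slope')
plot(x, objfcn([10 -10 0.05 -175], x), 'DisplayName','b(4): x-shift')
hold off
grid
legend('Location','best')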
.
Star Strider
19 May 2023
I initially just used randn. When I found values that provided the best fit, I used approximations of those for the initial values. I suspected ga would find them quickly, and in this instance, it did. The only specific requirement was to set the scale for the fourth parameter to be negative, and larger than the other values, and that only because the magnitude of ‘x’ was much greater than the magnitudes of the other parameter values. Dividing ‘x’ by 100 and changing the sign of ‘b(4)’ in ‘objfcn’ could make even that unnecessary. It comes up with a reasonably accurate estimate for ‘B0’ in every instance, which fitnlm then refines.
Try this —
load('Data.mat')
Data = Inter_cubic;
x = 1:size(Data,2);
objfcn = @(b,x) b(1) + b(2).*tanh(b(3).*(x+b(4)));
PopSz = 100;
Parms = 4;
for k = 1:size(Data,1)
optsAns = optimoptions('ga', 'PopulationSize',PopSz, 'InitialPopulationMatrix',randi(1E+4,PopSz,Parms).*[ones(1,3)*1E-4 -0.1], 'MaxGenerations',5E3, 'FunctionTolerance',1E-10); % Options Structure For 'Answers' Problems
B0 = ga(@(b) norm(Data(k,:)-objfcn(b,x)), Parms, [],[],[],[],[],[],[],[],optsAns)
mdl = fitnlm(x,Data(k,:), objfcn, B0)
B(:,k) = mdl.Coefficients.Estimate;
R2(k) = mdl.Rsquared.Adjusted;
end
Optimization terminated: average change in the fitness value less than options.FunctionTolerance.
B0 = 1×4
8.5122 5.6045 -0.1005 -209.5831
mdl =
Nonlinear regression model:
y ~ b1 + b2*tanh(b3*(x + b4))
Estimated Coefficients:
Estimate SE tStat pValue
_________ __________ _______ ___________
b1 2.0156 0.70014 2.8788 0.0043446
b2 12.956 0.71551 18.107 4.3494e-47
b3 -0.018988 0.00060451 -31.411 6.9762e-88
b4 -236.73 2.862 -82.714 5.1157e-181
Number of observations: 249, Error degrees of freedom: 245
Root Mean Squared Error: 0.381
R-Squared: 0.992, Adjusted R-Squared 0.992
F-statistic vs. constant model: 1e+04, p-value = 4.3e-256
Optimization terminated: average change in the fitness value less than options.FunctionTolerance.
B0 = 1×4
10.0702 9.8639 -0.0432 -212.1799
mdl =
Nonlinear regression model:
y ~ b1 + b2*tanh(b3*(x + b4))
Estimated Coefficients:
Estimate SE tStat pValue
_________ __________ _______ ___________
b1 7.892 0.196 40.266 5.1596e-110
b2 12.314 0.20445 60.231 7.2637e-149
b3 -0.029224 0.00060135 -48.597 8.5125e-128
b4 -219.01 0.69286 -316.1 6.1264e-322
Number of observations: 249, Error degrees of freedom: 245
Root Mean Squared Error: 0.385
R-Squared: 0.996, Adjusted R-Squared 0.996
F-statistic vs. constant model: 1.92e+04, p-value = 2e-290
Optimization terminated: average change in the fitness value less than options.FunctionTolerance.
B0 = 1×4
10.4556 -9.8869 0.0612 -215.3375
mdl =
Nonlinear regression model:
y ~ b1 + b2*tanh(b3*(x + b4))
Estimated Coefficients:
Estimate SE tStat pValue
________ __________ _______ __________
b1 9.9895 0.038229 261.31 9.702e-302
b2 -10.287 0.039946 -257.53 3.397e-300
b3 0.059716 0.00062938 94.881 3.532e-195
b4 -216.29 0.11343 -1906.9 0
Number of observations: 249, Error degrees of freedom: 245
Root Mean Squared Error: 0.199
R-Squared: 0.999, Adjusted R-Squared 0.999
F-statistic vs. constant model: 7.36e+04, p-value = 0
Optimization terminated: average change in the fitness value less than options.FunctionTolerance.
B0 = 1×4
10.9978 9.2900 -0.1028 -217.2195
mdl =
Nonlinear regression model:
y ~ b1 + b2*tanh(b3*(x + b4))
Estimated Coefficients:
Estimate SE tStat pValue
_________ __________ _______ ___________
b1 10.258 0.023407 438.26 0
b2 10.028 0.02426 413.36 0
b3 -0.079354 0.00063438 -125.09 5.4558e-224
b4 -218.66 0.061486 -3556.3 0
Number of observations: 249, Error degrees of freedom: 245
Root Mean Squared Error: 0.142
R-Squared: 0.999, Adjusted R-Squared 0.999
F-statistic vs. constant model: 1.39e+05, p-value = 0
Optimization terminated: average change in the fitness value less than options.FunctionTolerance.
B0 = 1×4
10.3432 10.0783 -0.0899 -222.7092
mdl =
Nonlinear regression model:
y ~ b1 + b2*tanh(b3*(x + b4))
Estimated Coefficients:
Estimate SE tStat pValue
________ __________ _______ __________
b1 10.56 0.015929 662.98 0
b2 9.7661 0.016335 597.86 0
b3 -0.11607 0.00079812 -145.43 8.331e-240
b4 -222.21 0.035038 -6342 0
Number of observations: 249, Error degrees of freedom: 245
Root Mean Squared Error: 0.108
R-Squared: 1, Adjusted R-Squared 1
F-statistic vs. constant model: 2.19e+05, p-value = 0
Parameters = array2table(B, 'VariableNames',compose('Data Row %d',1:size(Data,1)), 'RowNames',compose('b(%d)',1:size(B,1)))
Parameters = 4×5 table
Data Row 1 Data Row 2 Data Row 3 Data Row 4 Data Row 5
__________ __________ __________ __________ __________
b(1) 2.0156 7.892 9.9895 10.258 10.56
b(2) 12.956 12.314 -10.287 10.028 9.7661
b(3) -0.018988 -0.029224 0.059716 -0.079354 -0.11607
b(4) -236.73 -219.01 -216.29 -218.66 -222.21
figure
tiledlayout(size(Data,1),1)
for k = 1:size(Data,1)
nexttile
plot(x, Data(k,:), 'd', 'MarkerSize',1)
hold on
plot(x,objfcn(B(:,k),x), '-r')
hold off
grid
ylim([0 30])
ylabel("Row "+k)
text(215,25,sprintf('R^2 = %.2f',R2(k)), 'FontSize',9)
end
.
Hidd_1
21 May 2023
Could there be other functions that fit the same data with the same or better accuracy?
Star Strider
21 May 2023
There could be. I initially used the logistic function, then found that tanh provided an even better and more robust fit.
You are certainly free to experiment with them, however with an adjusted R² of close to 1, I doubt other functions can improve on tanh, at least with these data. Other functions may be more appropriate to different data. My code should be robust to other functions as coded in ‘objfcn’, although it may require changing some of the parameters, such as ‘Parms’.
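As one hypothetical illustration (not from the original thread), an asymmetric alternative such as a Gompertz-type curve could be dropped into the same framework simply by redefining ‘objfcn’; the rest of the ga/fitnlm code above would run unchanged —
objfcn = @(b,x) b(1) + b(2).*exp(-exp(-b(3).*(x+b(4))));   % Gompertz-type (asymmetric) alternative
Parms  = 4;                                                % unchanged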
Star Strider
22 May 2023
The ‘b(3)’ parameter (that controls the slope) varies between about -0.01 and -0.11, so I would be reluctant to eliminate it.
The ‘B0’ result is quite similar in every call, so perhaps only one ga call (before the loop, using ‘x’ and the first row of the matrix) would be necessary, using that ‘B0’ for all the fitnlm calls.
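A sketch of that simplification, assuming ‘Data’, ‘x’, ‘objfcn’, ‘Parms’, and ‘optsAns’ are defined as in the code above —
B0 = ga(@(b) norm(Data(1,:)-objfcn(b,x)), Parms, [],[],[],[],[],[],[],[],optsAns);   % single ga call on the first row
for k = 1:size(Data,1)
    mdl    = fitnlm(x, Data(k,:), objfcn, B0);   % fitnlm refines the shared ga estimate
    B(:,k) = mdl.Coefficients.Estimate;
    R2(k)  = mdl.Rsquared.Adjusted;
end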
Hidd_1
22 May 2023
From the results of the fitting, do you think that the system can be described mathematically?
Star Strider
22 May 2023
From just the fit, no. The fit is empirical.
Describing it mathematically would require knowing the process that created it.
Hidd_1
22 May 2023
In case I wanted to do curve fitting between the curves, is it easier to interpolate between the b's, or should I interpolate between the data sets?
Star Strider
22 May 2023
It would be best to interpolate between the data sets. They all have the same number of points.
One possible way of interpolating them would be to initially plot them as a surface to understand how they relate to each other —
load(websave('Data','https://www.mathworks.com/matlabcentral/answers/uploaded_files/1387204/Data.mat'))
Data = Inter_cubic;
x = 1:size(Data,2);
r = (1:size(Data,1));
xr = repmat(x,1,size(Data,1)).';
Datar = reshape(Data.',[],1);
rr = repmat(r,1,size(Data,2));
DataF = scatteredInterpolant(xr(:), rr(:), Datar);
rq = 3.5*ones(size(x));
NewData = DataF(x(:), rq(:));
figure
surfc(x, r, Data)
hold on
plot3(x, rq, NewData, '-r', 'LineWidth',2)
hold off
colormap(turbo)
view(40,30)
xlabel('X')
ylabel('Row')
figure
plot(x, NewData)
hold on
plot(x, mean(Data([3 4],:)))
hold off
There are then a number of ways to interpolate this, my favourite being scatteredInterpolant. I am at something of a loss to explain why it is oscillating like that, unless the data oscillate more than I thought. It normally produces a smooth interpolation. The mean of the neighbouring data does not oscillate.
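Since the rows form a regular grid (row index by ‘x’), a simple cross-check (not in the original comment) is to interpolate along the row dimension with interp1, which avoids the scattered triangulation entirely —
r        = 1:size(Data,1);                        % row index grid, as above
NewData1 = interp1(r, Data, 3.5);                 % linear interpolation in the row direction, 1×249
figure
plot(x, NewData1, x, mean(Data([3 4],:)), '--')
legend('interp1 at row 3.5', 'mean of rows 3 and 4', 'Location','best')
With linear interpolation the row-3.5 result coincides with the mean of rows 3 and 4, so it cannot oscillate the way the scattered interpolation does.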
.
Hidd_1
22 May 2023
In case I have another variable that is related to this process, would that help to get at what's behind the b's? I mean, would I be able to get a clearer picture about what the b's reflect physically?
Star Strider
23 May 2023
To fully understand the parameters (b’s), it would be necessary to know what process created the data in the first place. Then, they would likely have some physical meaning (and the objective function would likely reflect that process, not simply reflect the curve fitting).
In my ‘objfcn’, the parameters translate and scale the tanh function, so that ‘b(1)’ translates (shifts) it with respect to its dependent variable (‘y’) value, ‘b(2)’ scales its amplitude with respect to the dependent variable value, ‘b(3)’ scales the slope of the tanh function, and ‘b(4)’ translates (shifts) the tanh function with respect to the independent variable ‘x’ value. They have no physical significance beyond that, and they all relate only to the tanh function, not to the process that created the data.
To fully understand what the ‘b’ parameters mean physically (with respect to the data), it would be necessary to understand how they relate to the process that created the data. Since I do not know what that is, I cannot comment on it.
Hidd_1
23 May 2023
Would it be possible to do nonlinear fitting over 3 or more data sets (variables)?
I mean, if I defined a function objfcn = @(b,x,z), would fitnlm work?
Star Strider
23 May 2023
It depends on what ‘z’ is, what your objective function is, and what it does with ‘z’.
If ‘z’ is an extra parameter, and is already defined in the workspace, and ‘objfcn’ was defined as:
objfcn = @(b,x,z) ... ;
it would be presented to fitnlm as:
@(b,x)objfcn(b,x,z)
So in that instance, ‘z’ would not be a parameter to be optimised, or a specific independent variable.
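A minimal sketch of that wrapping (the value of ‘z’ and the exact objective function are placeholders, not from the thread) —
z      = 0.5;                                             % hypothetical fixed extra input, already in the workspace
objfcn = @(b,x,z) b(1) + b(2).*tanh(b(3).*(x+b(4))).*z;   % hypothetical three-input objective
B0     = [10 -10 0.05 -200];
mdl    = fitnlm(x, Data(1,:), @(b,x) objfcn(b,x,z), B0);  % 'z' is captured by the wrapper, not optimised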
Hidd_1
10 June 2023
Do you know how to perform a validation test on this non-linear regression model?
Star Strider
11 June 2023
Hidd_1
11 June 2023
Edited: Hidd_1, 11 June 2023
How can we validate a non-linear regression model when it only fits the measured data but may not accurately represent the underlying mathematical description of the physical process?
For instance: should a cross-correlation analysis be performed to validate a non-linear regression model on additional measurements?
Star Strider
11 June 2023
‘How can we validate a non-linear regression model when it only fits the measured data but may not accurately represent the underlying mathematical description of the physical process?’
Without knowing the process that created the data (and I do not know what it is here) and creating a mathematical model of it, the only way to model the data is empirically, that is, by fitting the data to a function that simply looks like it would work. There is no way to ‘validate’ it in the sense of determining how well the estimated parameters describe the underlying process if the underlying process is not actually known.
‘Should a cross-correlation analysis be performed to validate a non-linear regression model on additional measurements?’
I would simply fit the new measurements to the existing empirical model and see how well the empirical model fits the data. I am not certain there is any other way to determine that.
If the process that created the data was known, then it might be possible to see how well the estimated parameters of the empirical model modeled the actual mathematical model of the process (that is, compared to its parameters). Without having the mathematical model of the process that created the data, we are restricted to using an empirical model. There is simply no other option.
In the absence of a mathematical model of the process that created the data, other empirical models can certainly be hypothesized and tested. For example, if a different input produced a different output, procedures described in the System Identification Toolbox could be used to derive different models, or perhaps one model that could fit the outputs from other inputs, provided that both the inputs and corresponding outputs were available. I am not certain what benefit a cross-correlation analysis would have when it is possible to fit the data; however, I may not understand the sort of analysis you want to do.
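A sketch of that check on hypothetical new measurements (the vector ‘NewMeas’ is illustrative only; ‘objfcn’, ‘B’, and ‘x’ are from the fits above) —
yhat  = objfcn(B(:,1), x);                        % prediction from the existing row-1 parameters
SSres = sum((NewMeas - yhat).^2);                 % NewMeas: hypothetical 1×249 new measurement
SStot = sum((NewMeas - mean(NewMeas)).^2);
R2new = 1 - SSres/SStot                           % how well the existing fit explains the new data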
More Answers (1)
Image Analyst
17 May 2023
You forgot to attach your data! It should be possible to fit any particular curve to some kind of sigmoid curve with fitnlm, if you have the model for it. I'm attaching some demos using various models so you can see how it's done.
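A minimal sigmoid fit along those lines (a sketch with synthetic placeholder data, not one of the attached demos) —
x = (0:248)';
y = 20./(1 + exp(0.05*(x - 150))) + 0.3*randn(size(x));      % synthetic, sigmoid-shaped data
modelfun = @(b,x) b(1)./(1 + exp(b(2).*(x - b(3)))) + b(4);  % model function must take (b,x)
beta0 = [max(y), 0.05, median(x), min(y)];                   % rough starting guesses
mdl = fitnlm(x, y, modelfun, beta0)
plot(x, y, '.', x, predict(mdl, x), 'r-')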
6 Comments
Hidd_1
18 May 2023
Edited: Hidd_1, 18 May 2023
Here is the data:
for the X-Axis:
x = 0:1:248;
I enclosed the data for the Y-axis. (Each row is a curve; there are 5 rows = 5 curves.)
I don't have a mathematical model for the data. I've tried curve fitting but then had difficulty finding the relation between the curves.
I have tried the following code for a sigmoid function, but I got an error:
modelfun = @(x)(1/(1+exp(-x)));
beta0 = Data(1,1);
mdl = fitnlm(Data(1,:),modelfun,beta0)
I would appreciate any feedback. Thanks a lot!
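For reference (not part of the original comment): the error most likely arises because the model function takes only ‘x’ and fitnlm is called without a response vector; fitnlm expects modelfun(b,x) plus predictor and response data. A hedged sketch of a corrected call, with an assumed three-parameter sigmoid form —
x = (0:248)';
y = Data(1,:)';                                         % first curve as the response
modelfun = @(b,x) b(1)./(1 + exp(-b(2).*(x - b(3))));   % assumed scaled/shifted sigmoid form
beta0 = [max(y), -0.05, 150];                           % rough starting guesses
mdl = fitnlm(x, y, modelfun, beta0)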
Alex Sha
18 May 2023
Try the fitting function below:
For the first row data:
Sum Squared Error (SSE): 0.10950384442613
Root of Mean Square Error (RMSE): 0.0209708005475935
Correlation Coef. (R): 0.999987584910591
R-Square: 0.999975169975316
Parameter Best Estimate
--------- -------------
p1 -0.0249084190435688
p2 -0.0763243107482052
p3 0.142316610539568
p4 6.04837338439208
p5 29.8257426089756
p6 15.1516810377477
p7 4.69340638351262
p8 18.1352678248419
p9 -34.4425824172933
p10 -20.5645162561211
For the second row data:
Sum Squared Error (SSE): 0.490607218014365
Root of Mean Square Error (RMSE): 0.0443881753680807
Correlation Coef. (R): 0.999971420379912
R-Square: 0.999942841576619
Parameter Best Estimate
--------- -------------
p1 0.0929778051317028
p2 -0.0761628900340691
p3 -0.178882792529087
p4 -8.72339392625596
p5 2.37142383251581
p6 8.93100805985586
p7 -18.9633995662387
p8 11.9590466793433
p9 39.6871533513552
p10 9.02043998188305
For the third row data:
Sum Squared Error (SSE): 0.112293028574181
Root of Mean Square Error (RMSE): 0.0212361959486675
Correlation Coef. (R): 0.999993593414753
R-Square: 0.999987186870549
Parameter Best Estimate
--------- -------------
p1 -0.122327538545226
p2 0.365168326229641
p3 -0.413118508834921
p4 15.4162121444737
p5 -0.230218900430568
p6 4.24989701962221
p7 25.9289257568436
p8 -61.9155528043671
p9 91.7605255786753
p10 0.644726102395323
For the fourth row data:
Sum Squared Error (SSE): 0.130148263082422
Root of Mean Square Error (RMSE): 0.0228622787026929
Correlation Coef. (R): 0.999992227866027
R-Square: 0.999984455792459
Parameter Best Estimate
--------- -------------
p1 -0.620509642952949
p2 -0.831361297046103
p3 -0.158884676824082
p4 2.7433055327425
p5 0.22148229931868
p6 16.8000287134801
p7 138.373410204698
p8 152.0209560771
p9 34.3737366450204
p10 0.545395269938482
For the fifth row data:
Sum Squared Error (SSE): 0.0393241104660582
Root of Mean Square Error (RMSE): 0.0125669469037695
Correlation Coef. (R): 0.999997424563047
R-Square: 0.999994849132727
Parameter Best Estimate
--------- -------------
p1 -0.599320668258529
p2 -0.44341678007072
p3 -0.279977917807812
p4 3.70706464742905
p5 -7.18206037822055
p6 23.1054679279924
p7 127.864208391053
p8 100.743337230697
p9 62.8901303777835
p10 0.679481485801022
Alex Sha
19 May 2023
It is like neural-network fitting, using a sigmoid transfer function in the hidden layer and a linear transfer function in the output layer, like the picture below:
The corresponding function will then be:
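Judging from the ten parameter estimates above and the code in the comment below, the function appears to be a sum of three sigmoid units plus a constant:
y ~ p4/(1 + exp(-p1*x + p7)) + p5/(1 + exp(-p2*x + p8)) + p6/(1 + exp(-p3*x + p9)) + p10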
Hidd_1
19 May 2023
Edited: Hidd_1, 19 May 2023
Thanks a lot, Alex!
I've tried to implement your algorithm using your objective function, but I got bad results:
clc
clear all
close all
DF = load('Inter_cubic.mat');
Data = DF.Inter_cubic(:,1:230);
x = (1:size(Data,2))';
% Define the fitting function
Sig = @(p,x) p(4)./(1 + exp(-p(1).*x + p(7))) + p(5)./(1 + exp(-p(2).*x + p(8))) + p(6)./(1 + exp(-p(3).*x + p(9))) + p(10);
% Initial guess of coefficients
beta0 = ones(1,10);
for k = 1:size(Data,1)
% Put the data for this curve into a table
y = Data(k,:)';
tbl = table(x,y);
% Fit the model
mdl{k} = fitnlm(tbl,Sig,beta0);
% Plot the fit against the data
figure
hold on
plot(x,Data(k,:),'o')
plot(x,predict(mdl{k},x))
end
I couldn't achieve your results; can you please check what went wrong?
Alex Sha
20 May 2023
GA only has so-called global optimization capability in theory; in practice it is often far from that. Even with MATLAB's Global Optimization Toolbox, the results are often unsatisfactory.
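One common alternative (not from the thread, and requiring the Optimization Toolbox) is to run a local least-squares solver from many random starts; a sketch using lsqcurvefit with the ‘Sig’ model, ‘x’, and one response ‘y’ from the previous comment —
bestres = Inf;
opts = optimoptions('lsqcurvefit', 'Display','off');
for s = 1:50                                                 % 50 random restarts
    p0 = randn(1,10) .* [0.1 0.1 0.1 10 10 10 10 10 10 10];  % rough parameter scales (assumed)
    [p, res] = lsqcurvefit(Sig, p0, x, y, [], [], opts);
    if res < bestres
        bestres = res;                                       % keep the lowest residual norm
        pbest   = p;
    end
end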