How to fit data to a model without using the Statistics Toolbox?
For my data (a 30x2 table) and the model y = a1 + a2 * x^4 + a3 * x^5, I want to find the coefficients of the model. I keep being directed to linear regression and data-fitting functions that require the Statistics Toolbox, which I don't have. How can I do this without it?
1 Comment
Mathieu NOE
7 Nov 2022
Hello
For a polynomial fit you don't need any specific toolbox.
Have a look at polyfit and polyval.
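For example, a minimal sketch of that approach (the data and the degree here are made up for illustration, not taken from the question):
x = linspace(-1,1,30)';                      % illustrative x values
y = 2 - x + 0.5*x.^3 + 0.05*randn(size(x));  % illustrative noisy y values
p  = polyfit(x,y,3);                         % coefficients, highest power first
xi = linspace(min(x),max(x));                % fine grid for plotting
yi = polyval(p,xi);                          % evaluate the fitted polynomial
plot(x,y,'o',xi,yi,'-')
legend('data','polyfit, degree 3')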
3 Answers
Bruno Luong
14 Nov 2022
Edited: Bruno Luong, 14 Nov 2022
Polynomial fits can be done with plain linear algebra; no fminsearch is required. That said, the data clearly cannot be fitted well with the three-coefficient model as stated.
% Same data as the Excel file
data=[-0.0834 -0.2409
-0.0826 -0.2405
-0.0775 -0.236
-0.0719 -0.2275
-0.0658 -0.2264
-0.0521 -0.2174
-0.048 -0.2135
-0.04 -0.2127
-0.0279 -0.2064
-0.0266 -0.2062
-0.022 -0.2034
-0.0197 -0.1971
-0.0193 -0.197
-0.0186 -0.1925
-0.0119 -0.1882
-0.0112 -0.1876
-0.0096 -0.166
0.0034 -0.1624
0.0065 -0.1562
0.0108 -0.1551
0.0158 -0.1539
0.0181 -0.1447
0.0312 -0.1422
0.0323 -0.1364
0.036 -0.1214
0.0667 -0.0999
0.0679 -0.0842
0.0734 -0.061
0.0877 -0.0533
0.0884 -0.0503
];
x = data(:,1);
y = data(:,2);
% Design matrix containing only the model terms: 1, x^4, x^5
M = x.^[0,4,5];
% Least-squares solution for [a1; a2; a3]
P = M\y;
a3 = P(3);
a2 = P(2);
a1 = P(1);
% Coefficients in polyval order (highest power first): a3*x^5 + a2*x^4 + a1
P = [a3 a2 0 0 0 a1];
% For comparison, an unrestricted 5th-degree polynomial (6 coefficients)
P5 = polyfit(x,y,5);
xi = linspace(min(x),max(x));
ymodel = polyval(P,xi);
yP5 = polyval(P5,xi);
close all
h = plot(x,y,'og',xi,ymodel,'-b',xi,yP5,'-r');
legend(h,'data','model with three coefs','Polynomial 5th degree (6 coefs)','location','northwest')
0 Comments
Santosh Fatale
11 Nov 2022
Hi Decy,
If you have a polynomial model and want to find its coefficients from the data, you can use the "polyfit" function as follows:
p = polyfit(x, y, n);
Here the inputs 'x' and 'y' are your data points, and the input 'n' is the highest degree of the polynomial. The coefficients of the polynomial are returned in 'p', in descending order of the power of the independent variable.
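For example, a minimal sketch (assuming 'x' and 'y' already hold the data from the question, with a degree of 5 chosen purely for illustration):
p  = polyfit(x,y,5);            % p(1) multiplies x^5, ..., p(6) is the constant term
xi = linspace(min(x),max(x));   % fine grid over the data range
yi = polyval(p,xi);             % evaluate the fitted polynomial on the grid
plot(x,y,'o',xi,yi,'-')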
For more information, see the documentation for the "polyfit" function.
2 Comments
the cyclist
11 Nov 2022
Regarding this solution, I don't believe it will work for @Decy's case. They are not trying to fit the typical
y = a0 + a1*x + a2*x^2 + a3*x^3 + a4*x^4 + a5*x^5
They are trying to fit
y = a1 + a2 * x^4 + a3 * x^5
Note that they do not have linear, quadratic, or cubic terms.
I don't believe polyfit can do this fit.
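One workaround, as a minimal sketch along the lines of the backslash approach in Bruno Luong's answer above (assuming 'x' and 'y' are column vectors holding the data):
M = [ones(size(x)), x.^4, x.^5];   % columns for the model terms 1, x^4, x^5 only
a = M\y;                           % least-squares estimates [a1; a2; a3]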
Mathieu NOE
14 Nov 2022
Hello @Decy
It seems the chosen model is not really a good match for the data (or vice versa).
Here is a demo using fminsearch with your specific polynomial model:
% load data
data = csvread('data.csv');
x=data(:,1);
y=data(:,2);
%polynomial fit : y = a1 + a2 * x^4 + a3 * x^5
f = @(a1,a2,a3,x) a1 + a2 * x.^4 + a3 * x.^5;
obj_fun = @(params) norm(f(params(1),params(2),params(3),x)-y);
sol = fminsearch(obj_fun, [mean(y),1,1]);
a1_sol = sol(1);
a2_sol = sol(2);
a3_sol = sol(3);
x_fit = linspace(min(x),max(x),100);
y_fit = f(a1_sol, a2_sol, a3_sol, x_fit);
Rsquared = my_Rsquared_coeff(y,f(a1_sol, a2_sol, a3_sol, x)); % coefficient of determination (R^2)
figure(3);plot(x,y,'o',x_fit,y_fit,'-')
title(['Data fit - R squared = ' num2str(Rsquared)]);
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function Rsquared = my_Rsquared_coeff(data,data_fit)
% Coefficient of determination (R^2) of the fit
% The total sum of squares
sum_of_squares = sum((data-mean(data)).^2);
% The residual sum of squares
sum_of_squares_of_residuals = sum((data-data_fit).^2);
% R^2 = 1 - (residual sum of squares) / (total sum of squares)
Rsquared = 1 - sum_of_squares_of_residuals/sum_of_squares;
end
1 Comment
Mathieu NOE
14 Nov 2022
A simple demo to show that the model you want to use would be appropriate if the data had an S-shaped curve, like this:
% load data
data = csvread('data.csv');
x=data(:,1);
% y=data(:,2);
% dummy y data
a1 = -0.1743;
a2 = 300;
a3 = 2e4;
y= a1 + a2 * x.^4 + a3 * x.^5 + 0.01*randn(size(x));
%polynomial fit : y = a1 + a2 * x^4 + a3 * x^5
f = @(a1,a2,a3,x) a1 + a2 * x.^4 + a3 * x.^5;
obj_fun = @(params) norm(f(params(1),params(2),params(3),x)-y);
sol = fminsearch(obj_fun, [mean(y),0,0]);
a1_sol = sol(1);
a2_sol = sol(2);
a3_sol = sol(3);
x_fit = linspace(min(x),max(x),100);
y_fit = f(a1_sol, a2_sol, a3_sol, x_fit);
Rsquared = my_Rsquared_coeff(y,f(a1_sol, a2_sol, a3_sol, x)); % coefficient of determination (R^2)
figure(3);plot(x,y,'o',x_fit,y_fit,'-')
title(['Data fit - R squared = ' num2str(Rsquared)]);
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function Rsquared = my_Rsquared_coeff(data,data_fit)
% Coefficient of determination (R^2) of the fit
% The total sum of squares
sum_of_squares = sum((data-mean(data)).^2);
% The residual sum of squares
sum_of_squares_of_residuals = sum((data-data_fit).^2);
% R^2 = 1 - (residual sum of squares) / (total sum of squares)
Rsquared = 1 - sum_of_squares_of_residuals/sum_of_squares;
end