Genetic algorithm combined with non-linear regression

4 views (last 30 days)
msh on 18 Apr 2015
Commented: dmr on 13 Aug 2020
Hi,
It was suggested to me, and I believe it is a good idea, that using a genetic algorithm to determine the initial guesses for a non-linear regression could avoid ill-conditioned starting values. However, I am really struggling to implement this idea.
So my purpose is to use the genetic algorithm to find good initial guesses for a non-linear regression (nlinfit).
The function that I use for the non-linear regression is the following:
function y = objective(ksiz,x)
% ksiz: column vector of coefficients, x: design matrix (one row per observation)
y = exp(x*ksiz);
end
where ksiz are the coefficients to be found by the non-linear regression. I call the regression as follows:
ksi1 = nlinfit(X(1:T-1,:), ex1', @objective, ksi10);
where ksi1 is the vector of fitted (optimal) coefficients and ksi10 the corresponding initial guesses.
X is the matrix of explanatory variables, and ex1 contains the data for the dependent variable. The matrix X is, for example, defined as:
X = [ones(T-1,1) log(kh(1:T-1,1)) log(zh(1:T-1,1)) log(A(1:T-1,1))]
where T is the length of the sample, and kh, zh, A are my data. As you can see, I also include a constant term, captured by the vector of ones. I attach the sample file with those data.
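A minimal sketch of the idea would be something like the following (assuming the Global Optimization Toolbox is available; the bounds lb and ub are placeholders I would still need to choose):
% Use ga to minimise the sum of squared residuals of y = exp(X*ksiz),
% then hand the ga solution to nlinfit as the initial guess.
Xr = X(1:T-1,:);                       % regressors
y = ex1';                              % dependent variable (column vector)
sse = @(k) sum((y - exp(Xr*k')).^2);   % ga passes k as a row vector
nvars = size(Xr,2);                    % one coefficient per regressor
lb = -5*ones(1,nvars);                 % placeholder lower bounds
ub = 5*ones(1,nvars);                  % placeholder upper bounds
ksi10 = ga(sse, nvars, [],[],[],[], lb, ub)';  % column vector of guesses
ksi1 = nlinfit(Xr, y, @objective, ksi10);      % polish the fit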
UPDATE
My algorithm essentially tries to find a fixed point of some parameters that I use to approximate an unknown function through a simple polynomial. The fixed-point iteration involves a loop in which the data are generated, and a regression is used to minimize the errors between the approximated function and the true one. The data sample used in the regression is determined endogenously through my model's equations.
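In outline, each pass of the loop below does the following: (1) simulate the sample (kh, zh, theta, phi) from the current polynomial coefficients be1..be5; (2) build the 'true' expectations ex1..ex5 from that sample; (3) re-estimate ksi1..ksi5 by non-linear regression; (4) apply the homotopy (damped) update be = update*ksi + (1-update)*be and check convergence.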
I start with a guess for the parameters:
%% Guess of the coefficients to be used in the polynomial
be1=[1.9642; 0.5119; 0.0326; -2.5694];
be2=[1.8016; 1.3169; 0.0873; -2.6706];
be3=[1.9436; 0.5082; 0.0302; -2.5742];
be4=[0.6876; 0.5589; 0.0330; -2.6824];
be5=[2.0826; 0.5509; 0.0469; -2.5404];
%% My guess for the initial values to be used in the non-linear regression
ksi1=be1;
ksi2=be2;
ksi3=be3;
ksi4=be4;
ksi5=be5;
In the "sample.mat" file I include the initialization for the variables used in the polynomial.
In the loop below, anything that does not change within the loop is a parameter that has already been assigned a value.
%%% Loop to find the fixed point through homotopy
while dif > crit
for t=1:T
%%basis points of the polynomial of Degree 1
X(t,:) = Ord_Polynomial_N([log(kh(t,1)) log(zh(t,1)) log((A(t,1))) ], D);
%%the function that I approximate
psi1(t)=exp((X(t,:)*be1));
ce(t)=psi1(t)^(1/(1-gamma));
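% note: eps here is a model parameter, not MATLAB's built-in machine epsilon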
if eps == 1
v(t)=((1-beta)^(1-beta))*(beta*ce(t))^(beta);
u(t)=1-beta;
else
v(t)=(1+beta^(eps)*ce(t)^(eps-1))^(1/(eps-1));
u(t)=v(t)^(1-eps);
end
% approximate some other functions
psi2(t)=exp(X(t,:)*be2);
psi3(t)=exp(X(t,:)*be3);
psi4(t)=exp(X(t,:)*be4);
psi5(t)=exp(X(t,:)*be5);
%%%Generate the sample
kh(t+1)= psi2(t)/psi3(t);
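% the next line clamps kh(t+1) to [khlow, khup] via indicator arithmetic (zh is clamped the same way below)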
kh(t+1)= kh(t+1)*(kh(t+1)>khlow)*(kh(t+1)<khup)+khlow*(kh(t+1)<khlow)+khup*(kh(t+1)>khup);
theta(t+1) = 1/(1+kh(t+1));
phi(t+1) = psi4(t)/psi5(t);
zh(t+1) = (phi(t+1)/(1-phi(t+1)))*(1-theta(t+1));
zh(t+1) = zh(t+1)*(zh(t+1)>zhlow)*(zh(t+1)<zhup)+zhlow*(zh(t+1)<zhlow)+zhup*(zh(t+1)>zhup);
end
%%%Using the sample before, generate the 'true' function
for t=1:T
Rk(t) = 1+A(t)*alpha*kh(t)^(alpha-1)-delta1;
Rh(t) = 1+A(t)*(1-alpha)*kh(t)^(alpha)-delta1+eta(t);
R(t) = 1 + A(t)*ai(t)*(1-alpha)*(q)^(alpha)-delta2;
Rx(t) = (Rk(t)*(1-theta(t)) + Rh(t)*theta(t))*(1-phi(t))+ phi(t)*R(t);
end
for t=1:T-1
%%This was approximated by psi1
ex1(t)=v(t+1)^(1-gamma)*Rx(t+1)^(1-gamma);
%%This was approximated by psi2
ex2(t)=v(t+1)^(1-gamma)*Rx(t+1)^(-gamma)*Rk(t+1)*(kh(t+1)) ;
% This was approximated by psi3
ex3(t)=v(t+1)^(1-gamma)*Rx(t+1)^(-gamma)*Rh(t+1);
% This was approximated by psi4
ex4(t)=v(t+1)^(1-gamma)*Rx(t+1)^(-gamma)*Rk(t+1)*phi(t+1);
% This was approximated by psi5
ex5(t)=v(t+1)^(1-gamma)*Rx(t+1)^(-gamma)*R(t+1);
end
W_new=[kh zh];
%% Convergence criterion
dif = mean(mean(abs(1-W_new./W_old)));
disp(['Difference: ' num2str(dif)])
%%%Update of the coefficients through regressions
opts = statset('MaxIter',60000);
ksi1 = nlinfit(X(1:T-1,:), ex1', @objective, ksi1, opts);
ksi2 = nlinfit(X(1:T-1,:), ex2', @objective, ksi2, opts);
ksi3 = nlinfit(X(1:T-1,:), ex3', @objective, ksi3, opts);
ksi4 = nlinfit(X(1:T-1,:), ex4', @objective, ksi4, opts);
ksi5 = nlinfit(X(1:T-1,:), ex5', @objective, ksi5, opts);
%% Homotopy: damped update, a convex combination of the new estimates and the old coefficients
be1 = update*ksi1 + (1-update)*be1;
be2 = update*ksi2 + (1-update)*be2;
be3 = update*ksi3 + (1-update)*be3;
be4 = update*ksi4 + (1-update)*be4;
be5 = update*ksi5 + (1-update)*be5;
W_old=W_new;
end
I would really appreciate your assistance on how to use the genetic algorithm here. I am confused by the MathWorks documentation, as my problem seems much simpler than the available examples I found.
1 Comment
dmr on 13 Aug 2020
Can I ask how you found your sets of beta? Is it via parameter estimation like maximum likelihood? I want to use a genetic algorithm too, and I have my parameters estimated by MLE, but of course I only got one set (unless you add other sets from the iterations).
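One possible pattern (just a sketch; theta_mle and negloglik stand in for your own MLE estimate and objective function) is to jitter the single MLE estimate into a full starting population via ga's InitialPopulationMatrix option:
p = numel(theta_mle);                    % number of parameters
n0 = 50;                                 % population size (arbitrary choice)
P0 = repmat(theta_mle(:)', n0, 1) .* (1 + 0.1*randn(n0, p));  % 10% jitter
opts = optimoptions('ga', 'InitialPopulationMatrix', P0, 'PopulationSize', n0);
theta_ga = ga(@(th) negloglik(th), p, [],[],[],[],[],[],[], opts);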

Answers (0)
