Monte Carlo for OLS histogram
5 views (last 30 days)
Dear Matlab, I am running Monte Carlo simulations to illustrate properties of the least squares estimator beta. I am using the following code:
% sample size n
n = 100;
% number of samples m
m = 1000;
for i = 1:n
    x = 4*randn(n,1);
    e = randn(n,1);
    y = alpha + beta*x + e;
    X = [ones(n,1) x];
    beta_hatvec = (inv((X'*X)))*X'*y;
    beta_hat(1) = beta_hatvec(2);
end
% compare to normal distribution
histfit(beta_hat);
mean(beta_hat<0.95)
However, after running it I just obtain a blue bar at x=0, and the peak of the red line is also at x=0, while I would like to obtain a histogram similar to the attached figure (centered closer to x=1), to show that the distribution is not normal and there is a bias in my estimation. Could anyone explain what I am doing wrong?
Answers (1)
Brendan Hamm on 20 September 2017
You only ever assign to the 1st element of beta_hat. I think you need to recheck your code.
1. If you are intending to draw m samples of the beta coefficient, then you need to run the simulation m times, not n times.
2. You probably mean to store to the ith element of beta_hat:
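Putting points 1 and 2 together, a corrected loop might look like the sketch below. (The values alpha = 0 and beta = 1 are assumptions here, since the original post never shows them; your attached figure centered near x=1 suggests beta = 1.)

alpha = 0; beta = 1;          % assumed true coefficients, not from the original code
beta_hat = zeros(m,1);        % preallocate the vector of estimates
for i = 1:m                   % run the simulation m times, not n
    x = 4*randn(n,1);
    e = randn(n,1);
    y = alpha + beta*x + e;
    X = [ones(n,1) x];
    beta_hatvec = (X'*X)\(X'*y);
    beta_hat(i) = beta_hatvec(2);   % store the ith slope estimate
end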
3. You should not use the inv() function to compute the coefficients; it is slower and less numerically accurate than solving the system directly. You can accomplish the same with:
beta_hatvec = (X'*X)\(X'*y) % Backslash (mldivide) solves the linear system without explicitly forming the inverse.
3b. You could also obtain the coefficients using polyfit (or fitlm from the Statistics and Machine Learning Toolbox) and avoid writing the low-level regression solution yourself.
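For example, inside the loop, polyfit fits a first-degree polynomial and returns the slope as its first output element:

p = polyfit(x, y, 1);     % p(1) is the slope, p(2) the intercept
beta_hat(i) = p(1);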
4. I don't think mean(beta_hat<0.95) does what you expect. You are first creating a logical vector and then taking the average of that, which merely tells you the fraction of estimates that are less than 0.95. Maybe this is what you are looking for, but there are other ways of calculating it which are clearer for someone reading your code.
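For instance, if the fraction of estimates below 0.95 really is what you want, writing it as a count divided by the number of samples makes that intent explicit to a reader:

nnz(beta_hat < 0.95) / m   % fraction of slope estimates below 0.95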