Monte Carlo simulation of a linear regression model with a lagged dependent variable
I have a linear regression model with a lagged dependent variable: y_t = beta_0 + beta_1 * y_{t-1} + u_t. The initial starting point is y_0 = 2, and the true coefficients are beta_0 = 2 and beta_1 = 1. How can I perform a Monte Carlo simulation that estimates the bias of the OLS coefficients?
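A minimal sketch of one way to set this up in MATLAB. The error distribution u_t ~ N(0,1), the sample size T = 100 per replication, and the number of replications R = 1000 are assumptions, since the question does not specify them:

rng(1);                         % fix seed for reproducibility
beta0 = 2;  beta1 = 1;          % true coefficients (given in the question)
y0    = 2;                      % initial value (given in the question)
T     = 100;                    % sample size per replication (assumed)
R     = 1000;                   % number of Monte Carlo replications (assumed)

bhat = zeros(R, 2);             % OLS estimates [beta0_hat, beta1_hat] per replication
for r = 1:R
    % simulate one sample path y_1,...,y_T from the true model
    u = randn(T, 1);            % assumed error distribution: u_t ~ N(0,1)
    y = zeros(T, 1);
    y(1) = beta0 + beta1 * y0 + u(1);
    for t = 2:T
        y(t) = beta0 + beta1 * y(t-1) + u(t);
    end

    % OLS regression of y_t on a constant and y_{t-1}
    X = [ones(T, 1), [y0; y(1:end-1)]];
    bhat(r, :) = (X \ y)';
end

% Monte Carlo estimate of the bias: average estimate minus true value
bias = mean(bhat, 1) - [beta0, beta1];
fprintf('Bias of beta_0: %.4f, bias of beta_1: %.4f\n', bias(1), bias(2));

Note that with beta_1 = 1 the process is a random walk with drift (nonstationary), so the usual small-sample bias approximations for stationary AR(1) models do not apply; the sketch above simply reports the average difference between the OLS estimates and the true coefficient values.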