Dear Statalisters,
I have a question about the interpretation of the result of a Monte Carlo simulation.
I performed the following Monte Carlo study:
capture program drop mcprog
program mcprog
    clear
    drawnorm alpha_i, n(300)               // individual effect for 300 units
    expand 10                              // 10 time periods per unit
    drawnorm nu_it e_it, n(3000)           // idiosyncratic components
    g x_it = nu_it + alpha_i               // regressor correlated with alpha_i
    drop nu_it
    g y_it = 2 + alpha_i + 3*x_it + e_it   // true slope is 3
    regress y_it x_it                      // pooled OLS, ignoring alpha_i
end
simulate _b _se, reps(100): mcprog
sum
    Variable |       Obs        Mean    Std. Dev.        Min        Max
-------------+------------------------------------------------------------
     _b_x_it |       100     3.49566     .0248297    3.44117   3.562293
     _b_cons |       100    2.001484     .0318429   1.931775    2.08348
    _se_x_it |       100    .0158684     .0002836   .0152547   .0164563
    _se_cons |       100    .0223772     .0003237   .0216304    .023222
I am using the pooled OLS estimator on panel data precisely to show that it is an inappropriate procedure here, so I expect the estimates not to be consistent. I know that the OLS exogeneity assumption is violated, because the regressor x_it is correlated with alpha_i, and that as a consequence the standard deviation of _b_x_it across the simulated samples and the average of the estimated standard errors _se_x_it should differ. But what exactly causes them to differ? And in particular, why is the standard deviation of _b_x_it larger than the average estimated standard error?
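To make the comparison concrete, here is a minimal sketch of the check I have in mind (the seed value 12345 is my own arbitrary choice and is not part of the program above). It first computes, from the dataset that -simulate- leaves in memory, the Monte Carlo standard deviation of _b_x_it and the average of _se_x_it, and then runs one extra replication of the same data-generating process to look at the correlation between x_it and alpha_i and at the implied probability limit of the pooled-OLS slope, 3 + Cov(x_it, alpha_i)/Var(x_it) = 3 + 1/2 = 3.5.

* computed from the dataset left in memory by -simulate- above
quietly sum _b_x_it
scalar mc_sd = r(sd)             // Monte Carlo std. dev. of the slope estimates
quietly sum _se_x_it
scalar avg_se = r(mean)          // average of the conventional OLS std. errors
di "MC sd of _b_x_it = " mc_sd "    average _se_x_it = " avg_se

* one extra replication of the same DGP (seed fixed only so the numbers are reproducible)
clear
set seed 12345
drawnorm alpha_i, n(300)
expand 10
drawnorm nu_it e_it, n(3000)
g x_it = nu_it + alpha_i
g y_it = 2 + alpha_i + 3*x_it + e_it
correlate x_it alpha_i           // about 0.71 = 1/sqrt(2): x_it and alpha_i are correlated
regress y_it x_it                // slope close to 3.5 = 3 + Cov(x,alpha)/Var(x), not 3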
Thank you in advance
M. B.