Hi Stata Experts:
I am using xtabond2 to estimate the impact of an interaction term on an outcome variable. The interaction term is based on two categorical variables.
i.A#i.B is my main variable of interest. I did not add i.B separately to the model, because the results omit B as well (see the full-factorial sketch after the code below).
My code is
Code:
xtabond2 lfstfy L.lfstfy i.A i.A#i.B i.housect i.Rstate i.year, ///
    gmmstyle(lfstfy i.A i.A#i.B, laglimits(2 3) collapse) ///
    ivstyle(i.Rstate i.year, equation(level)) ///
    noleveleq robust orthogonal
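For reference, here is roughly what the full-factorial version I decided against would look like (only a sketch; ## expands to i.A i.B i.A#i.B, and the ib1. base levels are arbitrary choices for illustration, not something I have run):
Code:
* Sketch of the full-factorial alternative (not the model I estimated):
* A##B expands to i.A i.B i.A#i.B, and ib1. makes category 1 the base
* level of each factor; these base choices are purely illustrative.
xtabond2 lfstfy L.lfstfy ib1.A##ib1.B i.housect i.Rstate i.year, ///
    gmmstyle(lfstfy i.A i.A#i.B, laglimits(2 3) collapse) ///
    ivstyle(i.Rstate i.year, equation(level)) ///
    noleveleq robust orthogonal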
The results show that some cells of the interaction term were omitted; I assume this is because of collinearity. Since the interaction term is my variable of interest, can someone tell me how to recover the omitted categories? The output is below (with a cross-tabulation check sketched after it).
Code:
------------------------------------------------------------------------------
             |               Robust
      lfstfy |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
      lfstfy |
         L1. |   .0452652   .0227559     1.99   0.047     .0006645     .089866
             |
           A |
          1  |          0  (empty)
          2  |   -.003393   .0722695    -0.05   0.963    -.1450385    .1382526
          3  |  -.1802281   .0732506    -2.46   0.014    -.3237966   -.0366597
          4  |  -.0337546   .0654204    -0.52   0.606    -.1619763    .0944671
          5  |   .0199551   .0949678     0.21   0.834    -.1661784    .2060885
          6  |  -.0485592   .0858893    -0.57   0.572    -.2168992    .1197807
          7  |  -.0333336   .0667647    -0.50   0.618    -.1641901    .0975228
          8  |  -.0004754     .10503    -0.00   0.996    -.2063304    .2053796
          9  |  -.0693442   .0917054    -0.76   0.450    -.2490834    .1103951
             |
         A#B |
        1 0  |          0  (empty)
        1 1  |    .200792   .3310918     0.61   0.544    -.4481361    .8497201
        1 2  |  -.4242789   .2050095    -2.07   0.038    -.8260902   -.0224676
        2 0  |          0  (empty)
        2 1  |   .3346134   .3740984     0.89   0.371     -.398606    1.067833
        2 2  |  -.5909363   .2094511    -2.82   0.005    -1.001453   -.1804196
        3 0  |          0  (empty)
        3 1  |   .7852543   .3483723     2.25   0.024     .1024572    1.468051
        3 2  |  -.2557902   .1799161    -1.42   0.155    -.6084192    .0968388
        4 0  |          0  (empty)
        4 1  |   .3312339   .3220198     1.03   0.304    -.2999133    .9623811
        4 2  |  -.4006439    .196792    -2.04   0.042    -.7863491   -.0149387
        5 0  |          0  (empty)
        5 1  |          0  (omitted)
        5 2  |          0  (omitted)
        6 0  |          0  (empty)
        6 1  |   .3071944   .4038064     0.76   0.447    -.4842516    1.098641
        6 2  |  -.4528794   .2312391    -1.96   0.050    -.9060997    .0003409
        7 0  |          0  (empty)
        7 1  |    .101094   .3735869     0.27   0.787    -.6311228    .8333108
        7 2  |  -.3408805   .2023023    -1.69   0.092    -.7373858    .0556247
        8 0  |          0  (empty)
        8 1  |   .7406616   .4301529     1.72   0.085    -.1024225    1.583746
        8 2  |  -.8448848   .3371103    -2.51   0.012    -1.505609   -.1841608
        9 0  |          0  (empty)
        9 1  |   .6990125    .370923     1.88   0.059    -.0279832    1.426008
        9 2  |  -.3290207    .215716    -1.53   0.127    -.7518164    .0937749
------------------------------------------------------------------------------
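In case it helps, this is the kind of check I would run right after the command to see what is behind the "(empty)" and "(omitted)" rows (a sketch; it assumes xtabond2 leaves the usual e(sample) marker for the estimation sample):
Code:
* Cross-tabulate the two factors within the estimation sample.
* A#B cells with zero observations explain the "(empty)" rows, while
* populated cells that are still dropped point to collinearity instead.
tab A B if e(sample)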
Thank you!
Connie