Dear Statalist,
I have the following problem. I have written a command that uses ml to estimate a model, but I compute my cluster/robust standard errors with a self-written Mata routine that takes e(V) and e(b) as input (i.e., the analytical gradients are written in Mata).
Everything works fine, but with this way of proceeding I am having trouble parsing the e(V) and e(b) objects when there are omitted variables because of collinearity.
Hence, I would like to be able to:
- First, extract from the e(V) and e(b) objects only the entries associated with the regressors that were not omitted.
- Second, using those objects, compute the robust/cluster variance-covariance matrix.
- Finally, put the values back into the original e(V) output, keeping its original structure.
Below is a sketch that illustrates what I want, doing it mostly manually.
First, I run a regression with a repeated variable, so that one copy is omitted. I have noticed that the omitted term carries an o. prefix that looks like an equation name in the row and column stripes of the matrices. Unfortunately, it does not behave like one: I cannot extract it with matrix omitted = e(b)[1,"o.:"] because I get the error "equation o. not found".
Code:
clear all
sysuse auto , replace
version 12
tempvar constant
gen `constant' = 1
reg price mpg mpg `constant' , nocons
matrix V = e(V)
mat li V

symmetric V[3,3]
                           o.
                  mpg         mpg    __000000
     mpg    2817.1347
   o.mpg            0           0
__000000   -59997.356           0   1370802.5
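As an aside, I suspect the omitted columns can be flagged programmatically even though the "o.:" equation syntax fails. Here is an untested sketch using the _ms_omit_info utility, which, if I read its help file correctly, returns an indicator vector r(omit) marking the o. terms:
Code:
*sketch (untested): flag which columns of e(b) are omitted terms
tempname bb omit
matrix `bb' = e(b)
_ms_omit_info `bb'
matrix `omit' = r(omit)     // 1 x k indicator: 1 = omitted, 0 = kept
di as txt "number of omitted terms: " r(k_omit)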
Building on that, what I would like to do automatically is to keep only the parameters associated with variables that were not omitted. For now, here it is done manually:
Code:
**Doing manually what I want to automate
mat V_non_omitted = (V[1,1],V[1,3]) \ (V[1,3],V[3,3])
mat colnames V_non_omitted = "mpg" "`constant'"
mat rownames V_non_omitted = "mpg" "`constant'"
mat li V_non_omitted
symmetric V_non_omitted[2,2]
                  mpg    __000000
     mpg    2817.1347
__000000   -59997.356   1370802.5
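If that works, I imagine the same selection could be done programmatically in Mata instead of hard-coding the indices. A rough, untested sketch follows; the names V_keep and b_keep are just illustrative, and selectindex() needs a reasonably recent Mata:
Code:
mata:
    omit = st_matrix("r(omit)")              // indicator from _ms_omit_info above
    keep = selectindex(omit :== 0)           // column positions of the kept terms
    V    = st_matrix("e(V)")
    b    = st_matrix("e(b)")
    st_matrix("V_keep", V[keep, keep])       // non-omitted block of e(V)
    st_matrix("b_keep", b[1, keep])          // non-omitted entries of e(b)
    cs = st_matrixcolstripe("e(V)")          // carry over equation/variable names
    st_matrixcolstripe("V_keep", cs[keep', .])
    st_matrixrowstripe("V_keep", cs[keep', .])
    st_matrixcolstripe("b_keep", cs[keep', .])
end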
Either way, such a matrix (together with the same selection from e(b)) is a fine input for my Mata procedure that computes the robust/cluster standard errors, so let's just assume that I obtain my robust matrix V_robust:
Code:
*This should emulate my procedure to compute robust/cluster standard errors
mat V_robust = V_non_omitted *1.5
mat li V_robust
*Robust Matrix
symmetric V_robust[2,2]
                  mpg    __000000
     mpg    4225.7021
__000000   -89996.033   2056203.8
Finally, I want to put the robust values back in the positions where they belong, so that when I invoke ml display everything shows nicely on screen. For now, the way I am doing it is to recover the column and row numbers with colnumb() and rownumb(), respectively.
Code:
*Put back the robust values where they belong
local row_names : rownames V_robust
local col_names : colnames V_robust
foreach i in `row_names' {
    foreach j in `col_names' {
        *recover the row number of variable `i'
        local row_var_i = rownumb(V, "`i'")
        *recover the column number of variable `j'
        local col_var_j = colnumb(V, "`j'")
        *replace the robust values in the original var.-cov. matrix
        matrix V[`row_var_i', `col_var_j'] = V_robust["`i'", "`j'"]
    }
}
mat li V

symmetric V[3,3]
                           o.
                  mpg         mpg    __000000
     mpg    4225.7021
   o.mpg            0           0
__000000   -89996.033           0   2056203.8
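In case it helps to see where I am heading, I imagine the write-back could also be done in one step from Mata, reusing the keep indices from the sketch above, so that the o. rows and columns stay at zero. Again untested, and I am only assuming that st_replacematrix() accepts a matrix like V (or even e(V) itself) here:
Code:
mata:
    Vfull = st_matrix("V")                     // full matrix, o. rows/cols included
    Vfull[keep, keep] = st_matrix("V_robust")  // overwrite only the kept block
    st_replacematrix("V", Vfull)               // values replaced, stripe names preserved
    // st_replacematrix("e(V)", Vfull)         // presumably the same idea for e(V) itself
end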
Thank you in advance,
Álvaro
Stata 16.1 MP
Win10/Linux Mint 19.1
https://alvarogutyerrez.github.io/