
  • Sureg estimation with robust standard errors

    Dear all,

    I am aware that there have been questions about SUREG regressions before, but mine is a bit different:

    I am trying to estimate a system of equations with constraints, using robust standard errors, as I have reason to believe that my errors are not iid.
    Since Stata's sureg command does not allow robust errors, I did some research and found the mysureg command, which can be accessed here:
    http://www.stata-press.com/data/ml4.html
    This command allows for robust errors. However, Stata seems to have trouble maximizing the likelihood: it runs through more than 100 iterations and keeps flagging the likelihood as not concave after 20 or so of them. When I estimate the same system via sureg, there is no problem.
    I find this a bit puzzling, as the maximization procedures of sureg and mysureg should not really differ: both estimate the same system of equations by ML.
    Does anyone have an idea why the one converges and the other does not?
    I also tried to set initial values for the mysureg likelihood by typing
    Code:
    ml model mysureg $equations, constraints($constr) vce(robust)
    ml init val
    where val is a vector of the parameter estimates for the system obtained by sureg.
    However, when I then type ml check, I get an r(2000) (no observations) error and cannot start the optimization at all.
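    For reference, val is built from sureg's saved results, roughly along the lines below; b0 is just a placeholder name, and whether the copy option is appropriate here (and whether sureg's e(b) even has the right number of elements for it) is my own guess rather than something taken from the ml4 materials:
    Code:
    * sketch only: b0 is a placeholder name and the copy option is a guess
    sureg $equations, constraints($constr)                         // two-step estimates of the same system
    matrix b0 = e(b)                                               // coefficient vector to use as starting values
    ml model mysureg $equations, constraints($constr) vce(robust)
    ml init b0, copy                                               // copy the values into the ml problem
    ml check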

    So I am left a bit confused as to whether mysureg is actually the command I want here. Does anyone have suggestions on what I should try next?

  • #2
    OK, so I have now found the sem command, which allows estimating equation systems with constraints and robust errors.
    I just tried to estimate
    Code:
    sem $eqlist, constraints($constr) vce(robust)
    and Stata reports that my matsize is too small (I have already set it to 11000).
    My model is quite big: ten equations, each with 42 regressors.
    But again, sureg estimates it without any problem, while sem runs into the matsize error, even though the two should be equivalent.
    Any suggestions?

    Best,
    Mike



    • #3
      Have you tried -suest- instead? It is broadly similar, with a few differences.
      The skeleton for -suest- is something like this:

      Code:
      reg Y1 X1 X2
      est sto model1
      reg Y2 X1 X2
      est sto model2
      suest model1 model2, robust
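      If it is clustering rather than just robust errors you are after, my understanding is that the last line can take a cluster variable instead; clustvar below is only a placeholder for whatever identifies your groups:
      Code:
      suest model1 model2, vce(cluster clustvar)   // clustvar is a placeholder for your grouping variable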



      • #4
        Hello
        Do you know if there is a command similar to suest that works for Cox regression?
        Thank you
        Noam



        • #5
          So is "mysureg" a legitimate Stata command? I googled and did not find any convincing evidence that it is. Does anyone have more information on this? I really want to use mysureg because it allows the estimation of clustered SEs within the context of sureg. Someone said that clustered SEs can also be estimated by suest, but it seems to me that suest does not estimate a SUR model: the coefficients from suest are the same as those I get from estimating each equation by separate OLS, and different from what I get from sureg.
          Last edited by shem shen; 08 Apr 2021, 10:17.



          • #6
            There are quite a few loose things said on this thread.

            1. -suest- does not estimate Seemingly Unrelated Regressions. It estimates system Ordinary Least Squares and provides the correct variance for that, but this is a different estimator from -sureg-.

            2. The user-contributed command -mysureg- and Stata's native -sem- are legitimate alternatives to -sureg-. Both use maximum likelihood estimation, which is again a different estimator from the default two-step Generalised Least Squares of -sureg-, but should be equivalent to the iterated -sureg, isure-. And yes, maximum likelihood estimators are a bit finicky and occasionally fail. A minimal -sem- sketch is below.
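            A sketch of what the -sem- version of such a system looks like, using the auto data purely as an example. The cov() terms are what free the cross-equation error covariances so that -sem- mimics -sureg-; I am writing this from memory of the seemingly-unrelated-regression example in the [SEM] manual, so check it against the documentation before relying on it:
            Code:
            sysuse auto, clear
            * free the error covariances across equations so the model is SUR rather than equation-by-equation OLS
            sem (price <- foreign weight length) (mpg <- foreign weight) (displacement <- foreign weight), ///
                cov(e.price*e.mpg e.price*e.displacement e.mpg*e.displacement) vce(robust)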



            • #7
              Originally posted by Joro Kolev View Post
              Thank you Joro! Just a quick follow-up question: I noticed that the results from sureg, isure are a bit different from those of sureg (see the example below, where the coefficients and SEs differ for the first equation "price foreign weight length" but not for the next two equations). Is there any statistical reason to prefer the GLS estimator to MLE under certain circumstances?

              Code:
              sysuse auto
              sureg (price foreign weight length) (mpg foreign weight) (displ foreign weight)
              sureg (price foreign weight length) (mpg foreign weight) (displ foreign weight), isure
              
              
              . sureg (price foreign weight length) (mpg foreign weight) (displ foreign weight)
              
              Seemingly unrelated regression
              --------------------------------------------------------------------------
              Equation             Obs   Parms        RMSE    "R-sq"       chi2        P
              --------------------------------------------------------------------------
              price                 74       3    1967.769    0.5488      89.74   0.0000
              mpg                   74       2    3.337283    0.6627     145.39   0.0000
              displacement          74       2    39.60002    0.8115     318.62   0.0000
              --------------------------------------------------------------------------
              
              ------------------------------------------------------------------------------
                           |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
              -------------+----------------------------------------------------------------
              price        |
                   foreign |    3575.26   621.7961     5.75   0.000     2356.562    4793.958
                    weight |   5.691462   .9205043     6.18   0.000     3.887307    7.495618
                    length |  -88.27114    31.4167    -2.81   0.005    -149.8467   -26.69554
                     _cons |   4506.212   3588.044     1.26   0.209    -2526.225    11538.65
              -------------+----------------------------------------------------------------
              mpg          |
                   foreign |  -1.650029   1.053958    -1.57   0.117    -3.715748    .4156902
                    weight |  -.0065879   .0006241   -10.56   0.000     -.007811   -.0053647
                     _cons |    41.6797   2.121197    19.65   0.000     37.52223    45.83717
              -------------+----------------------------------------------------------------
              displacement |
                   foreign |   -25.6127   12.50621    -2.05   0.041    -50.12441   -1.100984
                    weight |   .0967549   .0074051    13.07   0.000     .0822411    .1112686
                     _cons |  -87.23548   25.17001    -3.47   0.001    -136.5678   -37.90317
              ------------------------------------------------------------------------------
              
              . 
              . sureg (price foreign weight length) (mpg foreign weight) (displ foreign weight), isure
              
              Iteration 1:   tolerance =  .06856944
              Iteration 2:   tolerance =  .00286513
              Iteration 3:   tolerance =  .00011183
              Iteration 4:   tolerance =  4.353e-06
              Iteration 5:   tolerance =  1.694e-07
              
              Seemingly unrelated regression, iterated 
              --------------------------------------------------------------------------
              Equation             Obs   Parms        RMSE    "R-sq"       chi2        P
              --------------------------------------------------------------------------
              price                 74       3    1967.779    0.5488      89.70   0.0000
              mpg                   74       2    3.337283    0.6627     145.39   0.0000
              displacement          74       2    39.60002    0.8115     318.62   0.0000
              --------------------------------------------------------------------------
              
              ------------------------------------------------------------------------------
                           |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
              -------------+----------------------------------------------------------------
              price        |
                   foreign |   3575.348   621.8388     5.75   0.000     2356.566     4794.13
                    weight |   5.688091   .9204615     6.18   0.000      3.88402    7.492163
                    length |  -88.14562   31.41455    -2.81   0.005     -149.717   -26.57423
                     _cons |   4492.775   3587.858     1.25   0.210    -2539.298    11524.85
              -------------+----------------------------------------------------------------
              mpg          |
                   foreign |  -1.650029   1.053958    -1.57   0.117    -3.715748    .4156902
                    weight |  -.0065879   .0006241   -10.56   0.000     -.007811   -.0053647
                     _cons |    41.6797   2.121197    19.65   0.000     37.52223    45.83717
              -------------+----------------------------------------------------------------
              displacement |
                   foreign |   -25.6127   12.50621    -2.05   0.041    -50.12441   -1.100984
                    weight |   .0967549   .0074051    13.07   0.000     .0822411    .1112686
                     _cons |  -87.23548   25.17001    -3.47   0.001    -136.5678   -37.90317
              ------------------------------------------------------------------------------



              • #8
                For systems where all regressors are exogenous, -sureg- (two-step GLS) and -sureg, isure- (GLS iterated to convergence, which should theoretically be equivalent to maximum likelihood estimation) provide consistent estimates under the same set of assumptions. There is no reason to prefer two-step GLS over maximum likelihood (or, for that matter, maximum likelihood over two-step GLS): they are asymptotically equivalent. Yet they are not numerically identical.
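                To spell out where the numerical difference comes from (the notation here is mine, not from anything above): both estimators take the feasible GLS form
                $$\hat\beta = \bigl(X'(\hat\Sigma^{-1}\otimes I_N)X\bigr)^{-1} X'(\hat\Sigma^{-1}\otimes I_N)\,y,$$
                where X stacks the per-equation regressors block-diagonally, \hat\Sigma is the estimated cross-equation error covariance matrix, and N is the number of observations. The two-step version estimates \hat\Sigma once from the equation-by-equation OLS residuals; -isure- re-estimates \hat\Sigma from the current FGLS residuals and recomputes \hat\beta until nothing changes, and that fixed point is what coincides with the Gaussian MLE.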





                • #9
                  Originally posted by Joro Kolev View Post

                  Thank you Joro!

