  • Compare sum of coefficients across different regression subsamples

    Dear Statalisters,
    I have looked through previous posts on -suest- and on testing whether two coefficients differ between subsamples, and I have come across Stata code for F-tests, chi2 tests, and Z-tests in those posts.
    However, I was unable to find anything specific to my issue (though it could be that I have overlooked or missed a post related to my query).

    I am using panel data. Let's say DV is my dependent variable, IV1 and IV2 are my independent variables, and I have a number of control variables CV1-CV3. I want to split the sample based on the variable S, so my regressions look as follows:

    xtreg DV IV1 IV2 CV1 CV2 CV3 i.fyear if S==0, fe robust …. (1)
    xtreg DV IV1 IV2 CV1 CV2 CV3 i.fyear if S==1, fe robust ….. (2)

    I would like to test whether the sum of coefficients b1+b2 (b_IV1 plus b_IV2) in Eq. (1) is significantly different from the sum of coefficients b1+b2 (b_IV1 plus b_IV2) in Eq. (2).

    Hoping someone will be able to guide me on this.
    Thank you,

  • #2
    Well, having read previous posts on -suest-, you are presumably aware that you can't do this because -suest- does not support -xtreg-.

    However, you can get the comparison you are looking for as follows. I will assume that IV1, IV2 and CV1 are continuous variables, and that CV2 and CV3 are discrete, to illustrate the approach. If that is not the case, you will need to modify the code accordingly.

    Code:
    xtreg DV i.S##(c.(IV1 IV2 CV1) i.(CV2 CV3)), fe vce(cluster panelvar)
    lincom 1.S#c.IV1 + 1.S#c.IV2
    The -xtreg- command above uses an interaction between S and all of the predictors of the model, thus completely emulating two separate subset regressions. Since the 1.S#whatever terms represent differences between the S = 0 and S = 1 coefficients, the -lincom- command calculates the difference between the S = 0 and S = 1 values of _b[IV1] +_b[IV2].

    If you are not familiar with the i. and c. prefixes and the ## operator, read -help fvvarlist- so you will learn about one of Stata's very best features!
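
    If, in addition to the difference, you also want the within-group sums themselves, they can be read off the same interacted model. A minimal sketch, assuming S is coded 0/1 and using the coefficient names that the ## notation above produces (IV1, IV2, etc. are the placeholders from #1):

    Code:
    * sum of the IV1 and IV2 coefficients in the S = 0 group
    lincom IV1 + IV2
    * sum of the IV1 and IV2 coefficients in the S = 1 group (main effects plus interactions)
    lincom IV1 + 1.S#c.IV1 + IV2 + 1.S#c.IV2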



    • #3
      Learning how to do this manually, the way Clyde shows, is better. By the way, in econometrics this is called a Chow test for structural stability of the coefficients.

      But in the case of fixed-effects regression, we can trick -regress- into estimating it, so in fact here we can use -suest- (which is not always possible, which is why it is good to know the approach Clyde shows).

      Here is how; we use the (as far as I can see, undocumented) -absorb()- option of -regress-:

      Code:
      . sysuse auto, clear
      (1978 Automobile Data)
      
      . regress price mpg headroom if foreign, absorb(rep)
      
      Linear regression, absorbing indicators         Number of obs     =         21
                                                      F(2, 16)          =       4.75
                                                      Prob > F          =     0.0240
                                                      R-squared         =     0.4071
                                                      Adj R-squared     =     0.2589
                                                      Root MSE          =       1912
      
      ------------------------------------------------------------------------------
             price |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
      -------------+----------------------------------------------------------------
               mpg |  -210.8535   70.52512    -2.99   0.009    -360.3601   -61.34696
          headroom |  -66.90347   974.0123    -0.07   0.946    -2131.717     1997.91
             _cons |   11573.76   2762.907     4.19   0.001     5716.661    17430.86
      ------------------------------------------------------------------------------
      
      . est sto foreign
      
      . regress price mpg headroom if !foreign, absorb(rep)
      
      Linear regression, absorbing indicators         Number of obs     =         48
                                                      F(2, 41)          =       8.90
                                                      Prob > F          =     0.0006
                                                      R-squared         =     0.3306
                                                      Adj R-squared     =     0.2326
                                                      Root MSE          =     2793.5
      
      ------------------------------------------------------------------------------
             price |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
      -------------+----------------------------------------------------------------
               mpg |  -473.0364   114.3748    -4.14   0.000    -704.0209   -242.0519
          headroom |   -535.417   536.7575    -1.00   0.324    -1619.421    548.5868
             _cons |   17129.81   3365.443     5.09   0.000     10333.16    23926.46
      ------------------------------------------------------------------------------
      
      . suest foreign .
      
      Simultaneous results for foreign, .
      
                                                      Number of obs     =         69
      
      -------------------------------------------------------------------------------
                    |               Robust
                    |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
      --------------+----------------------------------------------------------------
      foreign_mean  |
                mpg |  -210.8535   80.27714    -2.63   0.009    -368.1939   -53.51323
           headroom |  -66.90347   567.3178    -0.12   0.906    -1178.826    1045.019
              _cons |   11573.76   2523.328     4.59   0.000      6628.13     16519.4
      --------------+----------------------------------------------------------------
      foreign_lnvar |
              _cons |   15.11185   .2678884    56.41   0.000      14.5868     15.6369
      --------------+----------------------------------------------------------------
      _LAST_mean    |
                mpg |  -473.0364   155.7149    -3.04   0.002     -778.232   -167.8408
           headroom |   -535.417   421.3216    -1.27   0.204    -1361.192    290.3583
              _cons |   17129.81   4012.619     4.27   0.000     9265.223     24994.4
      --------------+----------------------------------------------------------------
      _LAST_lnvar   |
              _cons |    15.8701   .2955829    53.69   0.000     15.29077    16.44943
      -------------------------------------------------------------------------------
      
      . test [foreign_mean]mpg + [foreign_mean]headroom = [_LAST_mean]mpg + [_LAST_mean]headroom
      
       ( 1)  [foreign_mean]mpg + [foreign_mean]headroom - [_LAST_mean]mpg - [_LAST_mean]headroom = 0
      
                 chi2(  1) =    0.89
               Prob > chi2 =    0.3456
      
      .
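
      Translated back to the setup in #1, the same trick would look roughly like the sketch below. All variable names (DV, IV1, IV2, CV1-CV3, S, fyear, panelvar) are the original poster's placeholders; the fixed effects are absorbed with -absorb()- so that -suest- can combine the two models, and clustering is requested through -suest-'s own -vce()- option. The equation names follow the same pattern as in the output above.

      Code:
      * fit each subsample with -regress, absorb()- instead of -xtreg, fe-
      regress DV IV1 IV2 CV1 CV2 CV3 i.fyear if S==0, absorb(panelvar)
      est sto s0
      regress DV IV1 IV2 CV1 CV2 CV3 i.fyear if S==1, absorb(panelvar)
      * combine the stored estimates and the active ones, clustering on the panel id
      suest s0 ., vce(cluster panelvar)
      * test whether b_IV1 + b_IV2 differs between the two subsamples
      test [s0_mean]IV1 + [s0_mean]IV2 = [_LAST_mean]IV1 + [_LAST_mean]IV2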



      • #4
        Thank you so much Clyde for your prompt reply. Found it extremely helpful. Much appreciated.



        • #5
          Thank you Joro for showing the detailed method. Highly appreciated.



          • #6
            Hi Joro,
            A follow-up question on your method: could you kindly tell me how, using your approach, I would find the significance level of the term [foreign_mean]mpg + [foreign_mean]headroom, in other words just the significance level of the sum of coefficients b1+b2 (not compared to another group/sub-sample)?

            Thank you.



            • #7
              Hi Mabel, you do everything the same, but at the end you do not use -test-; you use -lincom-, like Clyde did.

              And by the way, if you only want to check the sum of two coefficients within one equation, you do not need -suest- at all: -lincom- after estimating the equation of interest does the trick.

              Here:

              Code:
              . sysuse auto
              (1978 Automobile Data)
              
              . qui regress price mpg headroom if foreign, absorb(rep)
              
              . est sto foreign
              
              . qui regress price mpg headroom if !foreign, absorb(rep)
              
              . suest foreign .
              
              Simultaneous results for foreign, .
              
                                                              Number of obs     =         69
              
              -------------------------------------------------------------------------------
                            |               Robust
                            |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
              --------------+----------------------------------------------------------------
              foreign_mean  |
                        mpg |  -210.8535   80.27714    -2.63   0.009    -368.1939   -53.51323
                   headroom |  -66.90347   567.3178    -0.12   0.906    -1178.826    1045.019
                      _cons |   11573.76   2523.328     4.59   0.000      6628.13     16519.4
              --------------+----------------------------------------------------------------
              foreign_lnvar |
                      _cons |   15.11185   .2678884    56.41   0.000      14.5868     15.6369
              --------------+----------------------------------------------------------------
              _LAST_mean    |
                        mpg |  -473.0364   155.7149    -3.04   0.002     -778.232   -167.8408
                   headroom |   -535.417   421.3216    -1.27   0.204    -1361.192    290.3583
                      _cons |   17129.81   4012.619     4.27   0.000     9265.223     24994.4
              --------------+----------------------------------------------------------------
              _LAST_lnvar   |
                      _cons |    15.8701   .2955829    53.69   0.000     15.29077    16.44943
              -------------------------------------------------------------------------------
              
              . lincom [foreign_mean]mpg + [foreign_mean]head
              
               ( 1)  [foreign_mean]mpg + [foreign_mean]headroom = 0
              
              ------------------------------------------------------------------------------
                           |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
              -------------+----------------------------------------------------------------
                       (1) |   -277.757   570.3211    -0.49   0.626    -1395.566    840.0517
              ------------------------------------------------------------------------------
              If you want to check a sum across equations, then you do need -suest-:

              Code:
              . lincom [foreign_mean]mpg + [_LAST_mean]mpg
              
               ( 1)  [foreign_mean]mpg + [_LAST_mean]mpg = 0
              
              ------------------------------------------------------------------------------
                           |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
              -------------+----------------------------------------------------------------
                       (1) |  -683.8899   175.1901    -3.90   0.000    -1027.256   -340.5238
              ------------------------------------------------------------------------------
              Note that the first -lincom- above (the within-equation sum for the foreign group) is equivalent to the following, which does not involve -suest-:

              Code:
              . qui regress price mpg headroom if foreign, absorb(rep) robust
              
              . lincom mpg + headroom
              
               ( 1)  mpg + headroom = 0
              
              ------------------------------------------------------------------------------
                     price |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
              -------------+----------------------------------------------------------------
                       (1) |   -277.757   637.9888    -0.44   0.669    -1630.233    1074.719
              ------------------------------------------------------------------------------
              Well, the point estimate is the same but the standard errors are not identical, although they should be close. Probably there is some fiddle with the degrees of freedom (-suest- reports large-sample z statistics, while -regress- applies a small-sample adjustment and reports t statistics).
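
              For the original question in #1 (the difference of b1+b2 between the two groups), a -lincom- version of the -test- shown in #3 would, as a sketch run after the -suest foreign .- call above, be:

              Code:
              * difference of the within-equation sums across the two groups
              lincom [foreign_mean]mpg + [foreign_mean]headroom - [_LAST_mean]mpg - [_LAST_mean]headroom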



              • #8
                Thank you so much Joro for getting back so fast. I will actually need to do both: within one equation as well as between the sub-groups. I will also need to report all the coefficients of the independent variables with their significance levels, and the sum of b1 and b2 both within one equation and across the sub-groups.
                It is very helpful.



                • #9
                  I'll take the liberty of going on a tangent here. I nearly always use -lincom- instead of -test-. The reason is that all you get from -test- is a test statistic and p-value which, as I have belabored endlessly on this Forum, I think are of very little use. -lincom- provides you with an estimate of the difference (or ratio, if you use -eform-) along with its confidence interval, which I think is far more useful in almost any situation. And if you really want the test statistic and p-value anyway, they're also there in -lincom-'s output. -lincom- gives you everything that you can find in -test- and more; -test- gives only the least informative part of -lincom-'s results.



                  • #10
                    I of course agree with everything you say, Clyde. For a single restriction -lincom- supersedes -test-.

                    But for multiple restrictions we need to revert to -test-, as -lincom- cannot handle multiple restrictions.

                    Code:
                    . lincom ([foreign_mean]mpg + [foreign_mean]head) ([foreign_mean]mpg + [_LAST_mean]mpg)
                    unknown function ()
                    r(133);
                    
                    
                    . test ([foreign_mean]mpg=[foreign_mean]head) ([foreign_mean]mpg=[_LAST_mean]mpg)
                    
                     ( 1)  [foreign_mean]mpg - [foreign_mean]headroom = 0
                     ( 2)  [foreign_mean]mpg - [_LAST_mean]mpg = 0
                    
                               chi2(  2) =    2.38
                             Prob > chi2 =    0.3048



                    • #11
                      At the risk of annoying people with self-promotion: xlincom from SSC may be of help here. -test- does joint significance tests when you give it multiple expressions, but -xlincom- can get coefficients, standard errors, etc. for multiple linear combinations. You can still do a joint significance test afterwards.
                      Code:
                      . sysuse auto
                      (1978 Automobile Data)
                      
                      . qui regress price mpg headroom if foreign, absorb(rep)
                      
                      . est sto foreign
                      
                      . qui regress price mpg headroom if !foreign, absorb(rep)
                      
                      . qui suest foreign .
                      
                      .
                      . xlincom (foreign_mean:mpg - foreign_mean:headroom) (foreign_mean:mpg - _LAST_mean:mpg), post
                      
                              lc_1:  foreign_mean:mpg - foreign_mean:headroom = 0
                              lc_2:  foreign_mean:mpg - _LAST_mean:mpg = 0
                      
                      ------------------------------------------------------------------------------
                                   |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                      -------------+----------------------------------------------------------------
                              lc_1 |  -143.9501   575.6055    -0.25   0.803    -1272.116    984.2161
                              lc_2 |   262.1829   175.1901     1.50   0.135    -81.18333    605.5491
                      ------------------------------------------------------------------------------
                      
                      . test (lc_1) (lc_2)
                      
                       ( 1)  lc_1 = 0
                       ( 2)  lc_2 = 0
                      
                                 chi2(  2) =    2.38
                               Prob > chi2 =    0.3048



                      • #12
                        Joro Kolev Actually, when I said "nearly always," the exception I had in mind is precisely the one you mentioned: a joint test of multiple hypotheses.

                        Wouter Wakker Thank you for letting everyone know about that. It looks very useful.



                        • #13
                          Thank you Clyde Schechter, Joro Kolev & Wouter Wakker for sharing your insights and knowledge; I have learnt many new things.
                          Cheers



                          • #14
                            Hi Clyde Schechter and Joro Kolev, a very insightful thread!
                            I would like to get your opinion on the following: when comparing three coefficients, I use
                            Code:
                            test b1=b2=b3
                            (1) b1-b2=0
                            (2) b1-b3=0
                            This does not seem to be correct, as I am interested in testing whether all three coefficients are equal.

                            I also tried
                            Code:
                            test (b1=b2=b3) (b2=b3)
                            but would get
                            Code:
                            (1) b1-b2=0
                            (2) b1-b3=0
                            (3) b2-b3=0
                            constraint 1 dropped
                            Is there something I am missing? Truly appreciated!
                            Last edited by Dana Lindsay; 24 Oct 2020, 08:36.



                            • #15
                              What you are missing is that if b1 = b2 and b1 = b3, then it automatically follows that b2 = b3. So b1 = b2, b1 = b3, and b2 = b3 are not three independent hypotheses: any one of them is an immediate consequence of the other two. Consequently any attempt to test all three will, for reasons of linear algebra, be first reduced to a test of two of them.
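
                              As a concrete illustration (a sketch using the auto data rather than your model, since b1, b2, and b3 are placeholders), a test that three coefficients are all equal is a two-degree-of-freedom test, and Stata drops whichever pairwise restriction is redundant:

                              Code:
                              sysuse auto, clear
                              regress price mpg headroom weight
                              * any two of the pairwise equalities imply the third, so this is a 2-df joint test
                              test (mpg = headroom) (mpg = weight)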
