  • Testing Equality of Coefficients in Panel Regressions with Fixed Effects

    Dear All,
    Here is a sample data set to illustrate my problem:

    Code:
    clear
    input float(DEP_VAR KEY_INDEP_VAR INDEP_VAR2 CLASSIFIER) int YEAR long ID float(INDEP_VAR3 INDEP_VAR4)
       .4196444  .4024687        . 0 2000  2    .1149007  6.520957
       .4196444  .4009704        . 0 2001  2   .24896222  6.663493
       .4196444  .3998423        . 0 2002  2   .26072007  6.900324
       .4196444  .4140014        . 0 2003  2   .26668724  7.151483
       .4196444 .41139024        . 0 2004  2   .25815514  7.537394
       .4196444  .4105872        . 0 2005  2    .1658034  7.910522
       .4196444  .4080161 8.696139 0 2006  2   .27452102  8.568722
       .4196444  .4055055 8.823968 0 2007  2    .2495299  8.568722
       .4196444  .4028676 8.910974 0 2008  2   .27516872  8.568722
       .4196444  .4002361 8.995829 0 2009  2   .23221815  8.568722
       .4196444  .3967899 9.092073 0 2010  2   .21986148  8.568722
       .4196444  .3946235 9.177827 0 2011  2   .17028862  8.568722
       .4196444  .3935488 9.246742 0 2012  2   .20220695  8.568722
       .4196444   .391612 9.314875 0 2013  2    .1909197  8.568722
       .4196444  .3902445 9.380201 0 2014  2   .17756027  8.568722
       .4196444  .3889025 9.442431 0 2015  2   .15742895  8.568722
       .4196444 .38874185 9.502946 0 2016  2   .15947066  8.568722
       .4196444 .38785505 9.564058 0 2017  2   .16643506  8.568722
       .4196444  .3870677 9.624699 0 2018  2   .17196298  8.568722
       .4196444  .3866235 9.678946 0 2019  2    .1627214  8.568722
       .4196444  .3867989  9.69989 0 2020  2   .15263812  8.568722
     .001782375  .4024687        . 1 2000  3    .4352566  4.117358
      .07506486  .4009704        . 1 2001  3   .27886382  3.289281
      .08015507  .3998423        . 1 2002  3    .3831359  3.460993
      .11080937  .4140014        . 1 2003  3   .27745432  3.358309
      .12318628 .41139024        . . 2004  3    .2706878  3.253812
      .05495676  .4105872        . 0 2005  3   .17578977  2.961079
      .11250882  .4080161 8.696139 0 2006  3   .11942988  3.264652
        .185817  .4055055 8.823968 . 2007  3   .09715997 3.1463895
      .26649624  .4028676 8.910974 . 2008  3    .0983249 3.2026124
      .29137003  .4002361 8.995829 0 2009  3   .08459466 3.5195744
       .2339619  .3967899 9.092073 1 2010  3           0  3.330574
       .2133918  .3946235 9.177827 . 2011  3           0    3.4401
      .25490505  .3935488 9.246742 1 2012  3           0   3.43285
       .3379982   .391612 9.314875 0 2013  3     .090852 3.6888964
       .4196444  .3902445 9.380201 . 2014  3   .08868325  3.998591
       .4196444  .3889025 9.442431 0 2015  3   .10033272 4.1175566
     .030795567 .38874185 9.502946 0 2016  3           0  3.472645
      .05519127 .38785505 9.564058 . 2017  3           0 3.7213614
              .  .3870677 9.624699 . 2018  3           0 3.9333274
              .  .3866235 9.678946 0 2019  3  .003942816  5.369102
     .003141012  .3867989  9.69989 0 2020  3  .012372442  5.479082
      .08817343  .4024687        . 0 2000  4    .4714862  5.483004
      .08232908  .4009704        . 0 2001  4    .3274185  5.448893
      .09160918  .3998423        . 0 2002  4    .3392883  5.393401
      .05861457  .4140014        . 0 2003  4    .3676925  5.371376
      .10387107 .41139024        . 0 2004  4    .3055782  5.387999
      .14490883  .4105872        . 0 2005  4    .3012273  5.316881
      .11353124  .4080161 8.696139 1 2006  4    .2390373   5.27695
      .05594482  .4055055 8.823968 0 2007  4   .20271237  5.295784
      .05106938  .4028676 8.910974 0 2008  4   .08163238  5.316136
      .02873014  .4002361 8.995829 0 2009  4   .09768975  5.264062
       .0416321  .3967899 9.092073 0 2010  4    .0921433  5.281218
     .012291734  .3946235 9.177827 0 2011  4    .0394261  5.299751
      .01151117  .3935488 9.246742 0 2012  4    .0227324  5.355769
     .012942152   .391612 9.314875 0 2013  4   .05889014  5.279907
     .011257377  .3902445 9.380201 1 2014  4   .09329288  5.394837
      .14022571  .3889025 9.442431 0 2015  4   .07126821    5.8229
      .16539016 .38874185 9.502946 1 2016  4   .23762584  5.897147
      .14289762 .38785505 9.564058 0 2017  4   .25934657  6.103859
      .15344945  .3870677 9.624699 0 2018  4   .24311064   6.11808
      .14759725  .3866235 9.678946 0 2019  4   .19974297  6.076487
     .005506694  .3867989  9.69989 . 2020  4   .19937025  5.935676
       .4189282  .4024687        . 0 2000  5   .27946457  6.126015
      .37712145  .4009704        . 1 2001  5     .302858  6.233906
       .4114297  .3998423        . 1 2002  5   .36280185  6.296311
       .4196444  .4140014        . 0 2003  5    .3777492  6.172634
       .3711165 .41139024        . 0 2004  5    .3274431  6.185753
       .4196444  .4105872        . . 2005  5    .3055141  5.757123
       .4196444  .4080161 8.696139 1 2006  5    .4111101  5.924014
       .4196444  .4055055 8.823968 0 2007  5     .427788  6.638553
       .4196444  .4028676 8.910974 0 2008  5    .6071155  6.766109
       .4196444  .4002361 8.995829 0 2009  5    .5500529  6.971066
       .4196444  .3967899 9.092073 0 2010  5    .4761737  7.170271
       .4196444  .3946235 9.177827 0 2011  5    .2524227  7.187074
       .4196444  .3935488 9.246742 0 2012  5   .13400272  7.313036
       .4196444   .391612 9.314875 0 2013  5    .3108177  7.406482
       .4196444  .3902445 9.380201 0 2014  5   .29745048  7.546591
       .4196444  .3889025 9.442431 1 2015  5    .4017045  7.579047
       .4196444 .38874185 9.502946 0 2016  5    .3191169  7.547992
       .4196444 .38785505 9.564058 0 2017  5    .3220162   7.61064
       .4196444  .3870677 9.624699 0 2018  5   .26413673  7.585197
       .4196444  .3866235 9.678946 0 2019  5   .29524708   7.72386
       .4196444  .3867989  9.69989 0 2020  5    .3047621  7.768736
      .12951735  .4024687        . 1 2000  6     .418003 4.7449183
       .3599949  .4009704        . . 2001  6     .376466  4.891608
      .22343902  .3998423        . 1 2002  6    .4193791  4.934823
        .191076  .4140014        . . 2003  6   .54715073  4.905567
       .2988399 .41139024        . 0 2004  6    .5833143   4.84162
      .20355476  .4105872        . 0 2005  6   .58209556 4.6623416
      .10293768  .4080161 8.696139 . 2006  6    .5992895  4.636973
     .029037526  .4055055 8.823968 0 2007  6    .6091003   4.61447
     .034348644  .4028676 8.910974 0 2008  6    .6160207  4.494298
      .05626638  .4002361 8.995829 . 2009  6    .4965535  3.974428
      .05887687  .3967899 9.092073 0 2010  6   .14456286 3.9208505
      .05325947  .3946235 9.177827 0 2011  6 .0020694498   4.05321
     .023846464  .3935488 9.246742 0 2012  6     .520685  4.837115
     .027485143   .391612 9.314875 0 2013  6    .3078271 4.6877933
      .02552763  .3902445 9.380201 0 2014  6    .2202885 4.6981616
     .032119755  .3889025 9.442431 0 2015  6           0 4.3039594
      .02902087 .38874185 9.502946 0 2016  6   .01071296 4.3052044
     .001782375 .38785505 9.564058 0 2017  6  .014784257  4.220066
     .001782375  .3870677 9.624699 0 2018  6           0 4.0271215
     .001782375  .3866235 9.678946 0 2019  6           0 4.3393836
     .003908983  .3867989  9.69989 0 2020  6           0  4.022595
     .025522973  .4024687        . . 2000  7   .25962606  3.552472
       .0340437  .4009704        . 0 2001  7    .6901785 2.5649605
     .005055512  .3998423        . 0 2002  7    .6901785  2.143764
     .001782375  .4140014        . . 2003  7   .06947316  1.988455
      .04273269 .41139024        . . 2004  7  .033040736  2.395206
      .03188438  .4105872        . 1 2005  7           0  2.418965
     .001782375  .4080161 8.696139 0 2006  7           0 2.2867687
              .  .4055055 8.823968 0 2007  7           0 2.3821938
              .  .4028676 8.910974 . 2008  7           0  2.448133
              .  .4002361 8.995829 0 2009  7           0    2.4539
    .0019835185  .3967899 9.092073 0 2010  7           0  2.470286
     .003724012  .3946235 9.177827 0 2011  7           0 2.5072925
     .017431254  .3935488 9.246742 0 2012  7           0  4.590325
     .015452233   .391612 9.314875 0 2013  7           0   4.63968
     .012894478  .3902445 9.380201 0 2014  7           0 4.6879807
      .08606582  .3889025 9.442431 . 2015  7  .032310825  6.315142
      .07120861 .38874185 9.502946 0 2016  7   .06169887  7.073259
      .06557298 .38785505 9.564058 0 2017  7   .11665864    7.4032
     .066191815  .3870677 9.624699 0 2018  7   .13404545  7.327958
     .070996754  .3866235 9.678946 0 2019  7    .2105256  7.455929
      .08445104  .3867989  9.69989 0 2020  7    .3286501  7.583769
       .4196444  .4024687        . 0 2000  8    .3849556  6.334149
       .4196444  .4009704        . 0 2001  8    .3726082  6.192103
       .4196444  .3998423        . 0 2002  8    .3943456  6.207791
       .4196444  .4140014        . 0 2003  8   .40386915  6.243865
       .4196444 .41139024        . 0 2004  8    .3925646   6.39773
       .4196444  .4105872        . 0 2005  8    .3369155  6.406634
       .4196444  .4080161 8.696139 1 2006  8   .31233305  6.559548
       .3657962  .4055055 8.823968 0 2007  8   .28584546  6.683999
       .3834459  .4028676 8.910974 . 2008  8    .2901247  6.793157
      .28356972  .4002361 8.995829 . 2009  8    .3298718  7.002202
        .416855  .3967899 9.092073 . 2010  8    .3807886  7.327249
       .4196444  .3946235 9.177827 1 2011  8    .3985596  7.535055
       .4196444  .3935488 9.246742 . 2012  8    .4194649  7.656405
       .4080241   .391612 9.314875 . 2013  8     .403813  7.718023
      .40852195  .3902445 9.380201 . 2014  8    .4055497  7.772045
       .3363777  .3889025 9.442431 . 2015  8    .4460748  7.933904
       .3549092 .38874185 9.502946 1 2016  8    .4425395  8.043776
       .3335692 .38785505 9.564058 1 2017  8    .4367075 8.3348675
       .3202611  .3870677 9.624699 1 2018  8     .376037  8.375969
       .3028027  .3866235 9.678946 1 2019  8    .3951805 8.3754835
      .26789102  .3867989  9.69989 1 2020  8    .3580452   8.49336
      .17573717  .4024687        . 0 2000  9   .42445725   3.74948
      .18327375  .4009704        . 0 2001  9    .3716353  3.570936
      .13279758  .3998423        . 0 2002  9    .2183188  4.363394
      .10132074  .4140014        . . 2003  9     .391231 4.6513433
      .12927654 .41139024        . 1 2004  9    .2310229 4.6460023
      .11958592  .4105872        . 0 2005  9    .3238026   3.77982
       .1890452  .4080161 8.696139 . 2006  9    .0302392  3.442298
      .10846652  .4055055 8.823968 . 2007  9   .02312642 3.6674495
      .28170723  .4028676 8.910974 . 2008  9  .030944005  3.388571
       .3072265  .4002361 8.995829 . 2009  9   .06189213  3.525388
       .3236015  .3967899 9.092073 . 2010  9   .06974505  3.471983
      .28974172  .3946235 9.177827 . 2011  9  .070614636  3.632316
       .3024914  .3935488 9.246742 0 2012  9   .10134297 3.6503286
       .2761881   .391612 9.314875 0 2013  9   .03239963  5.134496
      .17067836  .3902445 9.380201 . 2014  9  .067145936   4.86458
       .2777974  .3889025 9.442431 0 2015  9    .3614368  6.680942
       .3698736 .38874185 9.502946 0 2016  9    .3145411  6.552244
       .4196444 .38785505 9.564058 0 2017  9    .2627067  6.168047
       .3769039  .3870677 9.624699 0 2018  9   .09294917  6.165583
       .3556239  .3866235 9.678946 0 2019  9   .05870803  6.423991
      .05760797  .3867989  9.69989 0 2020  9  .005036425   6.29673
       .4196444  .4024687        . 1 2000 10    .3803921  5.775288
       .4196444  .4009704        . 1 2001 10    .4027527  5.689853
       .4196444  .3998423        . 0 2002 10    .3976256  5.752839
       .4196444  .4140014        . 0 2003 10   .39306965  5.685172
       .4196444 .41139024        . 0 2004 10    .2848452   5.62852
       .4196444  .4105872        . 0 2005 10   .25784853  5.403455
       .4196444  .4080161 8.696139 0 2006 10   .27958253  5.349886
       .4196444  .4055055 8.823968 0 2007 10    .3784288  5.553452
       .4196444  .4028676 8.910974 0 2008 10    .3871575  5.734617
       .4196444  .4002361 8.995829 0 2009 10    .2340799  6.028867
       .4196444  .3967899 9.092073 0 2010 10   .16234614  6.091482
       .4196444  .3946235 9.177827 0 2011 10     .236366  6.320826
       .4196444  .3935488 9.246742 0 2012 10   .09839168  6.466332
       .4196444   .391612 9.314875 0 2013 10   .12476677   6.46118
       .4196444  .3902445 9.380201 0 2014 10   .07277682  6.439147
       .4196444  .3889025 9.442431 0 2015 10   .06257949  6.514164
       .3885663 .38874185 9.502946 0 2016 10           0  6.865293
       .3431408 .38785505 9.564058 0 2017 10           0  6.720148
      .20304494  .3870677 9.624699 0 2018 10 .0001718153  6.741126
       .4196444  .3866235 9.678946 0 2019 10   .23337205  7.344313
       .4196444  .3867989  9.69989 0 2020 10     .331839  7.534153
      .16257946  .4024687        . . 2000 11   .18689553  5.832678
      .14591911  .4009704        . 1 2001 11   .16285786   5.78719
      .12345189  .3998423        . 1 2002 11    .2114108  5.887971
      .09972193  .4140014        . 1 2003 11   .28243545  6.051654
      .06350448 .41139024        . 1 2004 11   .38774875  6.350501
      .05629202  .4105872        . 1 2005 11    .4127877  6.528805
      .04747233  .4080161 8.696139 1 2006 11    .4567783  6.765631
      .03846465  .4055055 8.823968 1 2007 11    .3720535  7.053907
      .03138966  .4028676 8.910974 1 2008 11   .41565245  7.327031
     .034022275  .4002361 8.995829 1 2009 11    .3437329  7.377018
      .03839217  .3967899 9.092073 1 2010 11    .3125515  7.545497
     .031392172  .3946235 9.177827 1 2011 11    .3208515   7.79481
     .025620727  .3935488 9.246742 1 2012 11    .3523318  7.741124
     .025113605   .391612 9.314875 1 2013 11   .27342692   7.82038
      .02584227  .3902445 9.380201 1 2014 11   .29526785  7.798272
      .02262329  .3889025 9.442431 1 2015 11    .3656212  7.777338
      .02813913 .38874185 9.502946 1 2016 11   .38316625  7.802012
     .035111096 .38785505 9.564058 1 2017 11   .34956875  8.007193
      .03139753  .3870677 9.624699 . 2018 11    .3118744  7.930224
      .04463003  .3866235 9.678946 . 2019 11   .28602308  7.868807
      .04558297  .3867989  9.69989 1 2020 11   .23891816   7.91596
      .04633692  .4024687        . 0 2000 12    .4220351  3.624404
       .4196444  .4009704        . 0 2001 12    .4500419  4.428351
       .4196444  .3998423        . 0 2002 12    .4607552 4.4297166
       .4196444  .4140014        . 0 2003 12    .4883931  4.643445
       .4196444 .41139024        . 0 2004 12    .3641308  4.948911
       .4196444  .4105872        . 0 2005 12   .19969048  4.821373
       .4196444  .4080161 8.696139 0 2006 12   .42810845  4.880197
       .4196444  .4055055 8.823968 0 2007 12   .50318766   5.02099
       .4196444  .4028676 8.910974 0 2008 12    .5110297  5.498688
       .4196444  .4002361 8.995829 0 2009 12    .3267261   5.54003
       .4196444  .3967899 9.092073 0 2010 12   .29564297  5.485782
       .4196444  .3946235 9.177827 0 2011 12    .3878596  5.658611
       .4196444  .3935488 9.246742 0 2012 12    .3464485  5.781709
       .4196444   .391612 9.314875 0 2013 12    .3117214  5.745099
       .4196444  .3902445 9.380201 0 2014 12   .43813875  5.887481
       .4196444  .3889025 9.442431 0 2015 12    .4008265  5.812134
       .4196444 .38874185 9.502946 0 2016 12    .3308759  5.597998
       .4196444 .38785505 9.564058 0 2017 12   .07524814  5.459126
       .4196444  .3870677 9.624699 0 2018 12   .04694226  5.630682
       .4196444  .3866235 9.678946 0 2019 12  .032955352  5.646551
       .4196444  .3867989  9.69989 0 2020 12   .05136236  5.921523
       .4196444  .4024687        . 1 2000 13     .437444  7.103093
       .4017219  .4009704        . . 2001 13    .3397913  6.770033
      .37165365  .3998423        . 1 2002 13   .29514235  6.741026
      .33229885  .4140014        . . 2003 13   .39552975  7.059959
       .3770875 .41139024        . 0 2004 13   .41941425  7.055874
       .3796169  .4105872        . 0 2005 13    .3695195  7.030344
       .3615903  .4080161 8.696139 0 2006 13    .4149515  7.150591
       .3163007  .4055055 8.823968 . 2007 13    .3704904  7.147031
      .26934403  .4028676 8.910974 0 2008 13    .3799973  7.340549
       .2842429  .4002361 8.995829 . 2009 13    .4167657  7.594751
      .25327224  .3967899 9.092073 . 2010 13   .51765305  7.823554
       .2013406  .3946235 9.177827 . 2011 13     .488953   7.89587
        .245086  .3935488 9.246742 0 2012 13    .4536096  7.885529
       .2275681   .391612 9.314875 . 2013 13    .3689959    7.8635
       .2326928  .3902445 9.380201 . 2014 13    .4180498  7.902614
       .2022766  .3889025 9.442431 0 2015 13    .3983751  7.693952
       .2486449 .38874185 9.502946 0 2016 13     .434764  7.817435
      .19909233 .38785505 9.564058 0 2017 13    .3335799  8.194481
      .16636637  .3870677 9.624699 0 2018 13    .4775562  8.475845
    end
    My regression for the subsamples (CLASSIFIER is the basis of the subsampling):

    Code:
    . reghdfe DEP_VAR KEY_INDEP_VAR INDEP_VAR2 INDEP_VAR3 INDEP_VAR4 if CLASSIFIER==1,absorb (ID YEAR) cluster(ID YEAR
    > )
    (dropped 6 singleton observations)
    (MWFE estimator converged in 6 iterations)
    note: KEY_INDEP_VAR is probably collinear with the fixed effects (all partialled-out values are close to zero; tol
    >  = 1.0e-09)
    note: INDEP_VAR2 is probably collinear with the fixed effects (all partialled-out values are close to zero; tol = 
    > 1.0e-09)
    Warning: VCV matrix was non-positive semi-definite; adjustment from Cameron, Gelbach & Miller applied.
    
    HDFE Linear regression                            Number of obs   =         21
    Absorbing 2 HDFE groups                           F(   2,      4) =       1.54
    Statistics robust to heteroskedasticity           Prob > F        =     0.3198
                                                      R-squared       =     0.9732
                                                      Adj R-squared   =     0.8929
    Number of clusters (ID)      =          5         Within R-sq.    =     0.1309
    Number of clusters (YEAR)    =          9         Root MSE        =     0.0531
    
                                     (Std. err. adjusted for 5 clusters in ID YEAR)
    -------------------------------------------------------------------------------
                  |               Robust
          DEP_VAR | Coefficient  std. err.      t    P>|t|     [95% conf. interval]
    --------------+----------------------------------------------------------------
    KEY_INDEP_VAR |          0  (omitted)
       INDEP_VAR2 |          0  (omitted)
       INDEP_VAR3 |  -.2292339   .1771522    -1.29   0.265    -.7210871    .2626193
       INDEP_VAR4 |  -.0185086   .0620067    -0.30   0.780    -.1906667    .1536494
            _cons |   .3745461   .4009051     0.93   0.403    -.7385448    1.487637
    -------------------------------------------------------------------------------
    
    Absorbed degrees of freedom:
    -----------------------------------------------------+
     Absorbed FE | Categories  - Redundant  = Num. Coefs |
    -------------+---------------------------------------|
              ID |         5           5           0    *|
            YEAR |         9           9           0    *|
    -----------------------------------------------------+
    * = FE nested within cluster; treated as redundant for DoF computation

    Code:
    reghdfe DEP_VAR KEY_INDEP_VAR INDEP_VAR2 INDEP_VAR3 INDEP_VAR4 if CLASSIFIER==0,absorb (ID YEAR) cluster(ID YEAR
    > )
    (dropped 1 singleton observations)
    (MWFE estimator converged in 7 iterations)
    note: KEY_INDEP_VAR is probably collinear with the fixed effects (all partialled-out values are close to zero; tol
    >  = 1.0e-09)
    note: INDEP_VAR2 is probably collinear with the fixed effects (all partialled-out values are close to zero; tol = 
    > 1.0e-09)
    Warning: VCV matrix was non-positive semi-definite; adjustment from Cameron, Gelbach & Miller applied.
    
    HDFE Linear regression                            Number of obs   =        114
    Absorbing 2 HDFE groups                           F(   2,      9) =       3.58
    Statistics robust to heteroskedasticity           Prob > F        =     0.0719
                                                      R-squared       =     0.9126
                                                      Adj R-squared   =     0.8864
    Number of clusters (ID)      =         10         Within R-sq.    =     0.0799
    Number of clusters (YEAR)    =         15         Root MSE        =     0.0586
    
                                    (Std. err. adjusted for 10 clusters in ID YEAR)
    -------------------------------------------------------------------------------
                  |               Robust
          DEP_VAR | Coefficient  std. err.      t    P>|t|     [95% conf. interval]
    --------------+----------------------------------------------------------------
    KEY_INDEP_VAR |          0  (omitted)
       INDEP_VAR2 |          0  (omitted)
       INDEP_VAR3 |    .118895   .0813685     1.46   0.178    -.0651734    .3029634
       INDEP_VAR4 |   .0050003   .0101176     0.49   0.633    -.0178873    .0278878
            _cons |   .2150775   .0517016     4.16   0.002     .0981203    .3320346
    -------------------------------------------------------------------------------
    
    Absorbed degrees of freedom:
    -----------------------------------------------------+
     Absorbed FE | Categories  - Redundant  = Num. Coefs |
    -------------+---------------------------------------|
              ID |        10          10           0    *|
            YEAR |        15          15           0    *|
    -----------------------------------------------------+
    * = FE nested within cluster; treated as redundant for DoF computation


    How do I test the equality of the coefficients? I tried:

    Code:
     reghdfe DEP_VAR i.CLASSIFIER##c.KEY_INDEP_VAR i.CLASSIFIER##c.INDEP_VAR2 i.CLASSIFIER##c.INDEP_VAR3 i.CLASSIFIER##c.INDEP_VAR4 , absorb(ID YEAR) cluster(ID YEAR)
    (MWFE estimator converged in 7 iterations)
    note: KEY_INDEP_VAR is probably collinear with the fixed effects (all partialled-out values are close to zero; tol
    >  = 1.0e-09)
    note: INDEP_VAR2 is probably collinear with the fixed effects (all partialled-out values are close to zero; tol = 
    > 1.0e-09)
    Warning: VCV matrix was non-positive semi-definite; adjustment from Cameron, Gelbach & Miller applied.
    warning: missing F statistic; dropped variables due to collinearity or too few clusters
    
    HDFE Linear regression                            Number of obs   =        142
    Absorbing 2 HDFE groups                           F(   7,     11) =          .
    Statistics robust to heteroskedasticity           Prob > F        =          .
                                                      R-squared       =     0.9243
                                                      Adj R-squared   =     0.9011
    Number of clusters (ID)      =         12         Within R-sq.    =     0.1006
    Number of clusters (YEAR)    =         15         Root MSE        =     0.0550
    
                                                 (Std. err. adjusted for 12 clusters in ID YEAR)
    --------------------------------------------------------------------------------------------
                               |               Robust
                       DEP_VAR | Coefficient  std. err.      t    P>|t|     [95% conf. interval]
    ---------------------------+----------------------------------------------------------------
                  1.CLASSIFIER |  -1.745365   6.060437    -0.29   0.779     -15.0843    11.59357
                 KEY_INDEP_VAR |          0  (omitted)
                               |
    CLASSIFIER#c.KEY_INDEP_VAR |
                            1  |   3.499729   9.788516     0.36   0.727    -18.04465    25.04411
                               |
                    INDEP_VAR2 |          0  (omitted)
                               |
       CLASSIFIER#c.INDEP_VAR2 |
                            1  |    .053511   .2416144     0.22   0.829    -.4782787    .5853006
                               |
                    INDEP_VAR3 |   .1223725   .0815676     1.50   0.162    -.0571565    .3019016
                               |
       CLASSIFIER#c.INDEP_VAR3 |
                            1  |  -.1000779   .1821596    -0.55   0.594    -.5010084    .3008526
                               |
                    INDEP_VAR4 |   .0043943   .0104287     0.42   0.682    -.0185591    .0273477
                               |
       CLASSIFIER#c.INDEP_VAR4 |
                            1  |  -.0177442   .0209805    -0.85   0.416    -.0639219    .0284336
                               |
                         _cons |    .200541   .0561075     3.57   0.004     .0770493    .3240328
    --------------------------------------------------------------------------------------------
    
    Absorbed degrees of freedom:
    -----------------------------------------------------+
     Absorbed FE | Categories  - Redundant  = Num. Coefs |
    -------------+---------------------------------------|
              ID |        12          12           0    *|
            YEAR |        15          15           0    *|
    -----------------------------------------------------+
    * = FE nested within cluster; treated as redundant for DoF computation
    
    The above is based on Clyde Schechter's excellent reply in https://www.statalist.org/forums/forum/general-stata-discussion/general/1480186-equality-of-coefficients-test-with-reghdfe-subsample-analysis-or-interaction-terms?p=1480267#post1480267. I tried a few more similar posts, all in vain, and my results based on the interaction are not even close to the subsample results. So what did I do wrong? Can someone support me here?

  • #2
    reghdfe is from https://github.com/sergiocorreia/reghdfe/, as you are asked to explain in FAQ Advice #12. You need to interact the fixed effects as well. See the linked thread for an illustration:

    https://www.statalist.org/forums/for...erent-outcomes
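
    A minimal sketch of what interacting the fixed effects could look like, using the variable names from #1 (an illustration of the idea, not tested on your data):

    ```stata
    * Sketch only: interact both sets of fixed effects with CLASSIFIER,
    * so each group gets its own ID and YEAR effects.
    reghdfe DEP_VAR i.CLASSIFIER##c.(KEY_INDEP_VAR INDEP_VAR2 INDEP_VAR3 INDEP_VAR4), ///
        absorb(i.CLASSIFIER#i.ID i.CLASSIFIER#i.YEAR) cluster(ID YEAR)
    ```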



    • #3
      Your key independent variable appears to vary only across time, so putting in time dummies eliminates it (along with INDEP_VAR2). You need to decide whether this is really the analysis you want to do before testing equality of coefficients that are not identified by your estimation method. The variable CLASSIFIER varies across i and t, which is why the interaction terms don't drop out. So you can test whether the interaction is zero.

      Allowing CLASSIFIER to interact with every fixed effect is a tricky business. My guess is that you really just want to estimate one model with fixed effects and include the interaction terms. But you still won't identify the coefficient on KEY_INDEP_VAR.



      • #4
        Dear Andrew Musau, thank you for helping me with an excellent reference to your illustration focusing on the fixed effects. I tried to follow your post https://www.statalist.org/forums/forum/general-stata-discussion/general/1529316-equality-of-regression-coefficients-when-estimating-regressions-with-same-covariates-and-different-outcomes?p=1529393#post1529393. On the above data I ran the following code:


        Code:
        reghdfe DEP_VAR i.CLASSIFIER##c.KEY_INDEP_VAR i.CLASSIFIER##c.INDEP_VAR2 i.CLASSIFIER##c.INDEP_VAR3 i.CLASSIFIER
        > ##c.INDEP_VAR4 , absorb(i.CLASSIFIER##ID i.CLASSIFIER##YEAR) cluster(ID YEAR)
        (dropped 7 singleton observations)
        (MWFE estimator converged in 9 iterations)
        note: 1bn.CLASSIFIER is probably collinear with the fixed effects (all partialled-out values are close to zero; to
        > l = 1.0e-09)
        note: KEY_INDEP_VAR is probably collinear with the fixed effects (all partialled-out values are close to zero; tol
        >  = 1.0e-09)
        note: 1bn.CLASSIFIER#c.KEY_INDEP_VAR is probably collinear with the fixed effects (all partialled-out values are c
        > lose to zero; tol = 1.0e-09)
        note: INDEP_VAR2 is probably collinear with the fixed effects (all partialled-out values are close to zero; tol = 
        > 1.0e-09)
        note: 1bn.CLASSIFIER#c.INDEP_VAR2 is probably collinear with the fixed effects (all partialled-out values are clos
        > e to zero; tol = 1.0e-09)
        Warning: VCV matrix was non-positive semi-definite; adjustment from Cameron, Gelbach & Miller applied.
        
        HDFE Linear regression                            Number of obs   =        135
        Absorbing 2 HDFE groups                           F(   4,     11) =       6.07
        Statistics robust to heteroskedasticity           Prob > F        =     0.0079
                                                          R-squared       =     0.9237
                                                          Adj R-squared   =     0.8889
        Number of clusters (ID)      =         12         Within R-sq.    =     0.0824
        Number of clusters (YEAR)    =         15         Root MSE        =     0.0583
        
                                                     (Std. err. adjusted for 12 clusters in ID YEAR)
        --------------------------------------------------------------------------------------------
                                   |               Robust
                           DEP_VAR | Coefficient  std. err.      t    P>|t|     [95% conf. interval]
        ---------------------------+----------------------------------------------------------------
                      1.CLASSIFIER |          0  (omitted)
                     KEY_INDEP_VAR |          0  (omitted)
                                   |
        CLASSIFIER#c.KEY_INDEP_VAR |
                                1  |          0  (omitted)
                                   |
                        INDEP_VAR2 |          0  (omitted)
                                   |
           CLASSIFIER#c.INDEP_VAR2 |
                                1  |          0  (omitted)
                                   |
                        INDEP_VAR3 |    .118895   .0811288     1.47   0.171    -.0596682    .2974583
                                   |
           CLASSIFIER#c.INDEP_VAR3 |
                                1  |  -.3481289   .1975629    -1.76   0.106    -.7829618     .086704
                                   |
                        INDEP_VAR4 |   .0050003   .0100878     0.50   0.630    -.0172028    .0272033
                                   |
           CLASSIFIER#c.INDEP_VAR4 |
                                1  |  -.0235089   .0572486    -0.41   0.689    -.1495122    .1024944
                                   |
                             _cons |   .2398837   .0709301     3.38   0.006     .0837675    .3959998
        --------------------------------------------------------------------------------------------
        
        Absorbed degrees of freedom:
        -----------------------------------------------------------+
               Absorbed FE | Categories  - Redundant  = Num. Coefs |
        -------------------+---------------------------------------|
             CLASSIFIER#ID |        15          15           0    *|
           CLASSIFIER#YEAR |        24          24           0    *|
        -----------------------------------------------------------+
        * = FE nested within cluster; treated as redundant for DoF computation
        For testing the equality of the coefficients I ran the following code (I want to test the interaction of CLASSIFIER with INDEP_VAR3):

        Code:
         test 1.CLASSIFIER#INDEP_VAR3 = 0.CLASSIFIER#INDEP_VAR3
        
         ( 1)  - 0b.CLASSIFIER#co.INDEP_VAR3 + 1.CLASSIFIER#c.INDEP_VAR3 = 0
        
               F(  1,    11) =    3.11
                    Prob > F =    0.1058
        My doubts:
        1) Is my code for the fixed effects correct? (It seems so.)
        2) Is this the right way to test the equality of the coefficients on INDEP_VAR3 across the two groups of CLASSIFIER (0 and 1)?
        3) Professor Andrew, can I test the equality of the coefficients manually in the above example?
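
        One hedged note on the manual route: because 0.CLASSIFIER is the omitted base level, the interaction coefficient itself is the difference between the two groups' INDEP_VAR3 slopes, so after the fitted model above the same comparison can be written directly (a sketch):

        ```stata
        * The base level 0.CLASSIFIER is omitted, so the interaction term
        * already equals the group difference in the INDEP_VAR3 slope.
        test 1.CLASSIFIER#c.INDEP_VAR3 = 0    // same F test as above
        lincom 1.CLASSIFIER#c.INDEP_VAR3      // the difference itself, with its CI
        ```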



        • #5
          Dear Jeff Wooldridge, your observation is correct and, coming from you, this perceptive insight is hardly surprising.

          Your key independent variable appears to vary only across time, and so putting in time dummies eliminates it (along with INDEP_VAR2).
          Yes, my example is a toy one; the key independent variable is akin to a macro-level variable.
          You need to decide if this is really the analysis you want to do before testing equality of coefficients that are not identified by your estimation method. The variable CLASSIFIER varies across i and t, which is why the interaction terms don't drop out. So, you can test whether the interaction is zero.
          You are absolutely right, as CLASSIFIER is a firm-level variable which varies across both firms and time.

          Allowing CLASSIFIER to interact with every fixed effect is a tricky business. My guess is that you really just want to estimate one model with fixed effects and include the interaction terms. But you still won't identify the coefficient on KEY_INDEP_VAR.
          You are right in your observation on KEY_INDEP_VAR, but in my original data set I will be able to estimate its coefficient. However, if I can find an approach to test the equality of coefficients across two groups with firm and year fixed effects, then I can use the same approach to test the equality of my main variable of interest across the two groups. It would be great if you could have a look at my code and results in the post above.
          Thanks and Regards



          • #6
            A piece of advice: don't use a toy example if it doesn't illustrate what you want to do. My suggestion is to include the interactions as you did above, but don't absorb separately by CLASSIFIER and ID. What if, say, I want to interact KEY_INDEP_VAR with INDEP_VAR3? Should I then interact INDEP_VAR3 with all the IDs? I'd say not. Think of there being a single source of heterogeneity for each unit, and then decide on the functional form for the observed explanatory variables.
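
            If I read this advice correctly, it points back toward a single set of fixed effects with the CLASSIFIER interactions kept only among the covariates, roughly as follows (a sketch of one reading of the suggestion, not a definitive interpretation):

            ```stata
            * Sketch: one set of ID and YEAR effects; CLASSIFIER enters only
            * through the interactions in the covariate list.
            reghdfe DEP_VAR i.CLASSIFIER##c.(KEY_INDEP_VAR INDEP_VAR2 INDEP_VAR3 INDEP_VAR4), ///
                absorb(ID YEAR) cluster(ID YEAR)
            ```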



            • #7
              Dear Professor Jeff Wooldridge, I am thankful for your advice. However, regarding "but don't absorb separately by CLASSIFIER and ID", I didn't understand what I must do; my apologies for that. If you have time, could you help by editing the code so that I will understand it?

              Here is what I have done:
              Code:
               
               reghdfe DEP_VAR i.CLASSIFIER##c.KEY_INDEP_VAR i.CLASSIFIER##c.INDEP_VAR2 i.CLASSIFIER##c.INDEP_VAR3 i.CLASSIFIER##c.INDEP_VAR4 , absorb(i.CLASSIFIER##ID i.CLASSIFIER##YEAR) cluster(ID YEAR)
              Do you mean a single interaction:
              Code:
               absorb(i.CLASSIFIER#ID i.CLASSIFIER#YEAR)
              But then the coefficients will be different from the subsample results for replication purposes, won't they?

