
  • Reghdfe & Testing equality of coefficients from two separate regressions

    Hi all

    I am studying a model of individual productivity spillovers in football. I have a panel data set of individual player performance attributes and team attributes across seasons.
    I use a two-way fixed-effects model, controlling for individual fixed effects and team-by-season fixed effects. The dependent variable is a performance index for player i, and the independent variable of interest is a metric of the average productivity of his teammates (let's call it avgprod).

    To estimate the model, I have been using the reghdfe command.
    Currently, I am trying to do some robustness checks by running the same regression with a different metric of average teammate productivity (let's call it avgprod2).
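
    For concreteness, the baseline call looks roughly like this (the variable and identifier names are placeholders):

      reghdfe perf avgprod, absorb(playerid teamid#seasonid) vce(cluster playerid)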

    Is there a viable way to test whether the coefficients on avgprod and avgprod2 from two separate reghdfe regressions are statistically different? I'm afraid suest doesn't work with reghdfe.

    Thank you

    Best
    Sam

  • #2
    Hi Sam,

    If this were just a matter of comparing two groups, I would suggest estimating a nested model and then using -test-.

    However, in your case, I'm not sure. Supporting -suest- is not straightforward when writing a package (that's why areg, ivreg2, etc. don't allow it, AFAIK).
    Have you thought about doing the stacking by hand (see example 3 in the PDF documentation for suest)?

    Just append the data set, create a category variable for the specification, estimate a nested model with vce(cluster ...) on that, and run -test- afterwards; a rough sketch follows.
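
    In code, the stacking idea might look roughly like this (an untested sketch that ignores the fixed effects for the moment; perf is the outcome, avgprod and avgprod2 are the two regressors being compared, and obsid tags the original observation):

      gen obsid = _n              // tag each original observation before stacking
      gen spec  = 0               // specification indicator
      gen prod  = avgprod         // spec 0 uses the first productivity metric
      tempfile spec0
      save `spec0'
      replace spec = 1
      replace prod = avgprod2     // spec 1 uses the alternative metric
      append using `spec0'
      regress perf c.prod##i.spec, vce(cluster obsid)   // nested model, one slope per spec
      test 1.spec#c.prod                                // H0: the two slopes are equal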

    • #3
      Hi Sergio

      I'm not exactly sure what you meant by "comparing two groups".
      I'm essentially comparing the coefficients of two different specifications of the independent variable of interest.

      It looks like example 3 of suest is exactly what I'm trying to do. But is there anything in particular to take note of, given that I'm using reghdfe instead?

      Thank you.

      Best
      Sam

      • #4
        Hi Sam,

        What I meant is that I can think of two main uses for -suest-. One is comparing the same model across different samples (e.g., do men and women have the same betas?). That can be done within -reghdfe- by estimating a broader, encompassing model and then applying -test- to a subset of parameters, as in the sketch below. That is not what you want, though. Instead, what you want is to compare two different models (two different LHS, from what I see).
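
        For that two-groups case, the encompassing model would look something like this (a minimal sketch with made-up names, where female is a 0/1 group indicator):

          reghdfe y c.x c.x#i.female, absorb(id year) vce(cluster id)
          test 1.female#c.x     // H0: both groups have the same slope on x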

        Now, I have thought about adding support for -suest- to reghdfe, but it's not trivial, so for the moment it doesn't work.

        What can you do? It's not easy at all, but one idea is to "stack" the two different specifications one on top of the other (with append), but with the same variable names, e.g., instead of y1 and y2, name them both y. You will also need to create a variable indicating the specification (0 for the first spec, 1 for the second) and one indicating the original observation.

        Then, run a regression that encompasses both specifications, adding interactions (# or ##) to your RHS and to your absorbed variables so that the two specs are independent (except for the variance of the errors). Run this regression while clustering on the original observations (so that obs #1 of the first spec is clustered together with obs #1 of the second spec). Finally, run -test-. A possible implementation is sketched below.
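
        On a stacked data set built as in the sketch under #2 above (with prod, spec, and obsid), that could look something like the following (again an untested sketch; playerid, teamid, and seasonid are placeholder identifiers):

          reghdfe perf c.prod c.prod#i.spec, absorb(spec#playerid spec#teamid#seasonid) ///
              vce(cluster obsid)
          test 1.spec#c.prod    // H0: the coefficients on avgprod and avgprod2 are equal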

        I haven't tried this procedure myself, and there may be some issues with it (it assumes that both specs share the same residual variance), but it's my best bet. Others with more experience with -suest- can probably give you better advice. If you decide to run this and it works, do let me know!

        Best,
        Sergio

        • #5
          Hi Sergio

          I'm actually comparing two different RHS, not LHS. But I guess your points are still valid?

          It does sound a little complex to me; I'll need to think about it and perhaps post the detailed estimating equation here.

          Regards
          Sam
