
  • Comparing two interaction terms using "test"

    Hello,

    I have two interaction terms that I would like to compare statistically. Say I want to test whether some variable X1 diminishes the relationship between X2 and Y to a greater extent than it diminishes the relationship between X3 and Y. X1, X2, and X3 are all standardized.

    Can I use the test command to compare two moderation terms as I do below (test _b[c.x1_std#c.x2_std]=_b[c.x1_std#c.x3_std])?

    Thank you.

    Code:
    . webuse regress, clear
    
    . egen x1_std = std(x1)
    
    . egen x2_std = std(x2)
    
    . egen x3_std = std(x3)
    
    . reg y c.x1_std##c.x2_std c.x1_std##c.x3_std, noomitted
    note: x1_std omitted because of collinearity
    
          Source |       SS           df       MS      Number of obs   =       148
    -------------+----------------------------------   F(5, 142)       =     70.48
           Model |  3483.26677         5  696.653355   Prob > F        =    0.0000
        Residual |  1403.65215       142  9.88487427   R-squared       =    0.7128
    -------------+----------------------------------   Adj R-squared   =    0.7027
           Total |  4886.91892       147  33.2443464   Root MSE        =     3.144
    
    ------------------------------------------------------------------------------
               y |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
    -------------+----------------------------------------------------------------
          x1_std |   .6408871   .4571217     1.40   0.163     -.262756     1.54453
          x2_std |   1.956235   .4744294     4.12   0.000     1.018377    2.894092
                 |
        c.x1_std#|
        c.x2_std |  -.5591356   .5141205    -1.09   0.279    -1.575455    .4571836
                 |
          x3_std |  -5.296306   .4173887   -12.69   0.000    -6.121404   -4.471207
                 |
        c.x1_std#|
        c.x3_std |  -1.161696   .3973641    -2.92   0.004    -1.947209   -.3761817
                 |
           _cons |   20.02874   .3886223    51.54   0.000     19.26051    20.79698
    ------------------------------------------------------------------------------
    
    . test _b[c.x1_std#c.x2_std]=_b[c.x1_std#c.x3_std]
    
     ( 1)  c.x1_std#c.x2_std - c.x1_std#c.x3_std = 0
    
           F(  1,   142) =    0.53
                Prob > F =    0.4658


  • #2
    Well, this is just as legitimate with the interaction variables as it would be with any other variables.

    Frankly, the notion that you can infer that "this has a bigger effect than that" based on "the coefficient of this is bigger than the coefficient of that" is, in my view, false. At best, it is true under very stringent conditions about the distributions of the variables involved, conditions that are not satisfied simply because the variables have been standardized. Moreover, even when those conditions are met and the statement is true, when you start to think about actionable conclusions, you usually realize that this information is not actionable and is, at most, of theoretical interest.

    So, I wouldn't do this anyway. But if you want to do it, don't let the fact that the terms are interactions deter you: that makes no difference at all.
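
    Purely as a sketch, assuming the regression from #1 has just been run: the same comparison can also be done with lincom, which reports the estimated difference between the two interaction coefficients together with its confidence interval, rather than only an F statistic.

    Code:
    lincom _b[c.x1_std#c.x2_std] - _b[c.x1_std#c.x3_std]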



    • #3
      Dear Clyde,

      Thank you for your quick reply and for sharing your concerns.

      What also interests me (assuming the conditions you mentioned are satisfied) is whether I can simply compare the beta coefficients of the moderation terms using the test or lincom commands, as I did above, or whether I should somehow include the main effects (i.e., x2_std and x3_std) in the calculation.

      Regards,
      Marco



      • #4
        It depends on what hypothesis you are testing. If your hypothesis is about comparing x2's moderation of x1's effect on y with x3's moderation of x1's effect on y, then only the interaction terms are relevant.

        If, however, your hypothesis is about the overall effect of x2 on y (both directly and by virtue of its moderation of x1's effect) compared to that of x3, then you need to incorporate x2 and x3 directly as well. But to do that you will need a sharp and specific hypothesis that conditions on particular values of x1, because, remember, if x2 modifies x1's effect on y, then it is also the case that x1 modifies x2's effect on y.
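
        As a sketch of that second kind of hypothesis, assuming the model from #1 is the active estimation and fixing x1_std at 1 (one standard deviation above the mean, a value chosen here purely for illustration), one could compare the total effects of x2_std and x3_std at that value of x1_std:

        Code:
        * total effect of x2_std minus total effect of x3_std on y,
        * evaluated at x1_std = 1 (illustrative value, not a recommendation)
        lincom (_b[x2_std] + 1*_b[c.x1_std#c.x2_std]) - (_b[x3_std] + 1*_b[c.x1_std#c.x3_std])

        A different choice of x1_std will, in general, give a different answer, which is exactly why the hypothesis has to condition on particular values of x1.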



        • #5
          Dear Clyde Schechter, thanks for these insightful comments.
          Which very stringent conditions would allow one to safely compare the interaction terms of the variables under investigation?



          • #6
            The frequency distributions need to be the same.
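
            As an informal sketch of how one might look at this, assuming the standardized variables from #1: compare summary statistics and overlaid kernel density estimates of x2_std and x3_std. This is only a descriptive check, not a formal test.

            Code:
            * descriptive comparison of the distributions of x2_std and x3_std
            summarize x2_std x3_std, detail
            twoway (kdensity x2_std) (kdensity x3_std), legend(label(1 "x2_std") label(2 "x3_std"))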

