
  • #16
    Sorry, statistics is difficult for me to understand. I get the message that the predictive margins are not estimable because pcomm*lcomm can take any possible combination at different predicted values of y. If that is the case, what does the coefficient of -0.25 for c.pcomm#c.lcomm#c.ucomm mean?



    • #17
      It is not something that can be explained in simple terms. It is the third mixed partial derivative of y with respect to pcomm, lcomm, and ucomm. It is, in this respect, quite different from a two-way interaction of continuous variables, or a three-way interaction of dichotomous variables. It is also difficult to understand graphically because it requires emulating four dimensions. You can get at it somewhat graphically by creating a graph with a series of small multiples. For example, using auto.dta, we can do something like this:

      Code:
      clear*
      sysuse auto
      
      regress price c.mpg##c.headroom##c.gear_ratio
      
      local headroom_values 2(.5)4.5
      local gear_ratio_values 2.25(0.25)3.75
      local mpg_values 15(10)35
      
      local repeats
      foreach h of numlist `headroom_values' {
          margins, at(gear_ratio = (`gear_ratio_values') mpg = (`mpg_values') headroom = `h')
          marginsplot, xdimension(mpg) name(h`=10*`h'', replace) title("headroom = `h'")
          local repeats `repeats' h`=10*`h''
      }
      graph combine `repeats', nocopies altshrink ycommon
      (You can also just do a single -marginsplot- after -margins, at(gear_ratio = (`gear_ratio_values') mpg = (`mpg_values') headroom = (`headroom_values'))-, but the resulting figure contains a huge legend that crushes the graph to almost invisibility, and if you specify -legend(off)- you get a graph with lots of overlapping and crossing lines that is hard to read at best, and, on top of that, there is nothing that helps you identify which line corresponds to what values.)
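      Concretely, the single-call alternative described in that parenthetical would look something like the following sketch, reusing the regression and value lists from the block above; as noted, the result is hard to read:

      Code:
      * Single -margins-/-marginsplot- call over all three variables at once.
      * legend(off) avoids the huge legend but leaves many unidentified,
      * overlapping lines, which is why the small-multiples loop is preferable.
      margins, at(gear_ratio = (2.25(0.25)3.75) mpg = (15(10)35) headroom = (2(.5)4.5))
      marginsplot, xdimension(mpg) legend(off)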

      I think it is a mistake to focus on interpreting individual terms in interaction models. Even in a simple interaction model of two dichotomous variables, where such an interpretation can be made in a fairly simple sentence, it is commonly misunderstood. And when you start interacting continuous variables, or going to higher order interactions, or, as in your case, both, it becomes pretty much a lost cause. It is better to focus on understanding the model as a whole and not try to put the effects of individual terms of the model into words. The only simple sentence you can say about a model as complicated as yours is that pcomm, lcomm, and ucomm are all continuous variables and the marginal effect of any of them depends on the concurrent values of the other two.
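      Put symbolically, for a linear model the coefficient on c.pcomm#c.lcomm#c.ucomm is the third mixed partial derivative mentioned above:

      Code:
      \beta_{pcomm \times lcomm \times ucomm}
          = \frac{\partial^3\, E[y]}{\partial\, pcomm \;\partial\, lcomm \;\partial\, ucomm}

      So one way to read the posted estimate is that each one-unit increase in ucomm changes the pcomm-by-lcomm interaction coefficient by -0.25 (and symmetrically for the other two variables), which is why no single simple sentence captures it.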



      • #18
        It is better to focus on understanding the model as a whole and not try to put the effects of individual terms of the model into words
        The three-way interaction term has been bothering me for quite some time. Thanks for the advice; at least now I can stop searching and focus more on making sense of the overall model.

        The only simple sentence you can say about a model as complicated as yours is that pcomm, lcomm, and ucomm are all continuous variables and the marginal effect of any of them depends on the concurrent values of the other two.
        To see the marginal effects of all three variables in one graph, I ran the following code:

        Code:
        margins, at(lcomm=(0 1 2) pcomm=(0 1 2 3) ucomm=(0 1 2)) asbalanced plot
        [Attachment: Graph9.png]



        I just need to make the graph more presentable, so that I can get an idea of the marginal effect of each of them at concurrent values of the other two. Isn't that correct?
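        One way to make it more presentable might be to adapt the small-multiples approach from #17, fixing lcomm at one value per panel (a sketch; it assumes the same regression model and variable names as in your posted code):

        Code:
        * One panel per value of lcomm, combined at the end (sketch).
        local repeats
        foreach l of numlist 0 1 2 {
            margins, at(pcomm = (0 1 2 3) ucomm = (0 1 2) lcomm = `l') asbalanced
            marginsplot, xdimension(pcomm) name(l`l', replace) title("lcomm = `l'")
            local repeats `repeats' l`l'
        }
        graph combine `repeats', altshrink ycommon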



        • #19
          I don't think the graph you show came from the code you show in #18 because the legend of the graph shows no indication that values of lcomm have been taken into account, whereas the code clearly calls for that.

          I agree that looking at the marginal effect of each variable conditional on values of the other two would be a sensible way to look at this:

          Code:
          local vbles lcomm pcomm ucomm
          
          foreach v of local vbles {
              local other_vbles: subinstr local vbles "`v'" ""
              local ats
              foreach o of local other_vbles {
                  local ats `ats' `o' = (0 1 2 3)
              }
              margins, dydx(`v') at(`ats')
              marginsplot, name(`v'_mfx, replace)
          }



          • #20
            Hi Prof Clyde,

            I ran the code you gave, but strangely I don't get to see the results. Am I doing it correctly or not? Here is what I entered:

            Code:
             local vbles lcomm pcomm ucomm
            
            . foreach v of local vbles {
              2. 
            .     local other_vbles: subinstr local vbles "`v'" ""
              3. 
            .     local ats
              4. 
            .     foreach o of local other_vbles {
              5. 
            .         local ats `ats' `o' = (0 1 2 3)
              6. 
            .     }
              7. margins, dydx(`v') at(`ats')
              8. marginsplot, name(`v'_mfx, name)
            The code just keeps echoing in the Stata Results window and no marginal effects results appear. Maybe I don't understand what you mean. I would appreciate it if you could give a sample from auto.dta as in #17. Thank you
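            For what it's worth, the loop from #19 applied to auto.dta from #17 might look like the following sketch. It should be run from a do-file (or the Do-file Editor) so that the foreach braces are processed together rather than pasted line by line into the Command window; the at() values are illustrative points chosen to sit inside each variable's range in auto.dta:

            Code:
            clear*
            sysuse auto
            regress price c.mpg##c.headroom##c.gear_ratio

            * Illustrative at() values for each variable.
            local mpg_at 15 25 35
            local headroom_at 2 3 4
            local gear_ratio_at 2.5 3 3.5

            local vbles mpg headroom gear_ratio
            foreach v of local vbles {
                local other_vbles: subinstr local vbles "`v'" ""
                local ats
                foreach o of local other_vbles {
                    local ats `ats' `o' = (``o'_at')
                }
                margins, dydx(`v') at(`ats')
                marginsplot, name(`v'_mfx, replace)
            }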



            • #21
              Prof Clyde Schechter,

              I think I've done it correctly this time. I still cannot interpret what all of this is pointing me to, but I am very happy it works.

              Does anything strike your eye? It looks like interaction happens in the last graph only.

              By the way, what is the tail (sometimes it is short and sometimes it is long)? What does a short/long tail mean?

              [Attachment: Graph3.png]



              • #22
                Well, as this is not my field of study, I am not well positioned to judge how much of an interaction is large enough to matter for practical purposes: you must decide that yourself, or in consultation with others in your discipline.

                That said, there is visible interaction when pcomm = 0, .5 and 1. For pcomm = 1.5 and 2 it is not really visible (at least not to my eye). For pcomm = 2.5 interaction is just perceptible, and at pcomm = 3, we see interaction again. Interaction is present whenever the lines on the graph are not parallel. Again, you must be the judge of whether the interaction is large enough to pay attention to.



                • #23
                  Again, you must be the judge of whether the interaction is large enough to pay attention to.
                  Duly noted.

                  That said, there is visible interaction when pcomm = 0, .5 and 1. For pcomm = 1.5 and 2 it is not really visible (at least not to my eye). For pcomm = 2.5 interaction is just perceptible, and at pcomm = 3, we see interaction again. Interaction is present whenever the lines on the graph are not parallel.
                  If you had not mentioned it, I would have thought that interaction happens only when the lines cross each other.

                  You have helped me a lot in understanding my research results. Thank you so much for that.

