  • OLS: Normalizing -> Centering -> Interpreting results

    Hello Dear Community Members,

    Please help me with the following question:

    1. I have a direct-impact model and estimate it using multiple regression analysis (I am aware of the assumptions).
    2. The IVs have terrible distributions, so I normalize them by taking the natural log (I intentionally want to retain the outliers).
    3. The model also includes interactions (three 2-way, one 3-way), and as a result there is a multicollinearity issue, which I address by "centering" the variables involved in the interactions (recomputing the interactions with the centered variables, of course).

    Please tell me the correct way to interpret the unstandardized coefficients. I know I can add the -beta- option, but I am not sure it is correct to use those betas in interpretation (since the IVs were normalized and centered).
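    For what it's worth, the preprocessing in points 2 and 3 can be sketched as follows (a minimal Python sketch with simulated data and made-up variable names; the actual analysis is of course in Stata):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical right-skewed IVs standing in for the real variables.
popul = rng.lognormal(mean=10.0, sigma=1.5, size=500)
income = rng.lognormal(mean=8.0, sigma=1.0, size=500)

# Step 2: normalize by taking the natural log. Outliers are retained,
# just pulled in toward the rest of the distribution.
ln_popul = np.log(popul)
ln_income = np.log(income)

# Step 3: center the logged variables, then rebuild the interaction
# from the *centered* versions.
ln_popul_c = ln_popul - ln_popul.mean()
ln_income_c = ln_income - ln_income.mean()
interaction = ln_popul_c * ln_income_c

# The centered variables now have (numerically) zero mean.
print(ln_popul_c.mean(), ln_income_c.mean())
```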

    Thank you in advance,
    Anton

    Results table attached.
    Last edited by Anton Ivanov; 20 Oct 2014, 14:49.

  • #2
    Here is one discussion:

    http://www3.nd.edu/~rwilliam/stats2/l53.pdf

    In general, a key reason for centering is so that a value of zero has a meaningful interpretation, e.g. a person with a score of zero has an average score.
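    That interpretation of the intercept is easy to see with a small simulation (a Python sketch; the data and coefficients are made up):

```python
import numpy as np

rng = np.random.default_rng(1)
score = rng.normal(loc=50.0, scale=10.0, size=200)   # hypothetical test score
y = 2.0 + 0.5 * score + rng.normal(size=200)

# Raw regression: the intercept is the predicted y at score = 0,
# a value far outside the observed data.
slope_raw, icept_raw = np.polyfit(score, y, 1)

# Centered regression: the intercept is the predicted y for a person
# with an *average* score, and it equals the sample mean of y.
score_c = score - score.mean()
slope_c, icept_c = np.polyfit(score_c, y, 1)

print(slope_raw, slope_c)   # slope is unchanged by centering
print(icept_c, y.mean())    # centered intercept equals the mean of y
```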
    -------------------------------------------
    Richard Williams, Notre Dame Dept of Sociology
    Stata Version: 17.0 MP (2 processor)

    EMAIL: [email protected]
    WWW: https://www3.nd.edu/~rwilliam



    • #3
      Originally posted by Richard Williams View Post
      Here is one discussion:

      http://www3.nd.edu/~rwilliam/stats2/l53.pdf

      In general, a key reason for centering is so that a value of zero has a meaningful interpretation, e.g. a person with a score of zero has an average score.
      Thank you for the reply, Mr. Williams. And thank you for a good read.

      Given that I normalized the variables, what is the proper way of interpreting the "logged" coefficients? Should I convert them back "out of the natural log"?



      • #4
        As far as I understand it, I can say the following, for example, about ln_popul_c variable in my results: for a 10% increase in ln_popul_c, the difference in the expected mean pr score will be 3.91*ln(1.1)=.73

        Correct?



        • #5
          Originally posted by Anton Ivanov View Post
          As far as I understand it, I can say the following, for example, about ln_popul_c variable in my results: for a 10% increase in ln_popul_c, the difference in the expected mean pr score will be 3.91*ln(1.1)=.73
          The wording and logic are correct. But by my calculation 3.9*ln(1.1) = 0.37, not 0.73.
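          The arithmetic is quick to verify (Python here, but any calculator will do):

```python
import math

b = 3.91                 # coefficient on ln_popul_c from the posted table

# A 10% increase in population raises ln(population) by ln(1.1),
# so the implied change in the expected outcome is b * ln(1.1).
effect = b * math.log(1.1)
print(round(effect, 2))  # 0.37, not 0.73
```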



          • #6
            Originally posted by Clyde Schechter View Post

            The wording and logic are correct. But by my calculation 3.9*ln(1.1) = 0.37, not 0.73.
            My bad, Mr. Schechter; it's a typo. And thank you for the reply.



            • #7
              You may have missed the small note in Richard's write-up pointing out that centering does not fix collinearity. I believe Belsley wrote on this, probably in Belsley, Kuh, and Welsch (1980), Regression Diagnostics. Centering changes the collinearity diagnostic numbers (something to do with removing collinearity with the constant, I think), but doesn't really do anything to improve your estimates. Centering does change the parameter on the main effect, since you're changing the value at which the interacting variable is zero. However, the total effect of the main and interaction terms (and, I think, the statistical significance of the interaction) are not changed.

              This is something you can try out easily with simulated data.
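              For example, a simulation along these lines (Python rather than Stata; the variable names and coefficients are made up) shows that centering leaves the interaction coefficient and the overall fit untouched:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1000
x = rng.normal(5.0, 2.0, n)
z = rng.normal(3.0, 1.0, n)
y = 1.0 + 0.5 * x + 0.8 * z + 0.3 * x * z + rng.normal(size=n)

def ols(*cols):
    """OLS via least squares; returns coefficients and residual SS."""
    X = np.column_stack([np.ones(n), *cols])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return beta, resid @ resid

b_raw, rss_raw = ols(x, z, x * z)

xc, zc = x - x.mean(), z - z.mean()
b_cent, rss_cent = ols(xc, zc, xc * zc)

# Interaction coefficient and fit are identical across the two models;
# the "main effect" coefficients differ, because centering changes the
# point at which the other variable equals zero.
print(np.isclose(b_raw[3], b_cent[3]))   # interaction: unchanged
print(np.isclose(rss_raw, rss_cent))     # residual SS: unchanged
print(np.isclose(b_raw[1], b_cent[1]))   # main effect on x: changes
```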



              • #8
                Originally posted by Phil Bromiley View Post
                You may have missed the small note in Richard's write-up pointing out that centering does not fix collinearity. I believe Belsley wrote on this, probably in Belsley, Kuh, and Welsch (1980), Regression Diagnostics. Centering changes the collinearity diagnostic numbers (something to do with removing collinearity with the constant, I think), but doesn't really do anything to improve your estimates. Centering does change the parameter on the main effect, since you're changing the value at which the interacting variable is zero. However, the total effect of the main and interaction terms (and, I think, the statistical significance of the interaction) are not changed.

                This is something you can try out easily with simulated data.
                Thank you for the comment, Phil.

                Actually, in my analysis I followed Aiken and West (Aiken, L. S., & West, S. G. (1991). Multiple regression: Testing and interpreting interactions. Sage), where they specifically suggest centering as a way to decrease the correlation between the variables when interactions are included.
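                The mechanism Aiken and West describe is easy to reproduce (a Python sketch with simulated, independent variables; the names are made up):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1000
# Variables with nonzero means, as in most real data: the raw product
# x*z then inherits those means and correlates strongly with x itself.
x = rng.normal(10.0, 2.0, n)
z = rng.normal(8.0, 1.5, n)

r_raw = np.corrcoef(x, x * z)[0, 1]

# After centering, the product is built from mean-zero pieces, and its
# correlation with x (here, with independent x and z) is near zero.
xc, zc = x - x.mean(), z - z.mean()
r_cent = np.corrcoef(xc, xc * zc)[0, 1]

print(r_raw)    # high
print(r_cent)   # near zero
```

                Note this lowers the correlation between a variable and its own interaction term, which is what Aiken and West address; as Phil says above, the fitted model itself is an equivalent reparameterization.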

                Please see the attached correlations for the centered and uncentered variables. One is still high, but I'll have to look into that, because that is strange.

                Last edited by Anton Ivanov; 22 Oct 2014, 16:57.
