  • #16
    Let's be clear: standardizing variables will have no impact on any properly done interpretations of any kind of regression model. Standardizing is just a combination of centering and rescaling. So standardizing variables should be done when, and only when, the standardized variable is easier to understand than the unstandardized one. This occurs primarily, if not exclusively, with interval-level measurements that have only arbitrary units. When a measurement has meaningful units of its own, standardization just obfuscates the findings. In the end, of course, it is possible to de-obfuscate the findings that come with the use of standardized variables by inverting the standardization process on the results (if you have access to the data so that you can find out the appropriate means and standard deviations to use). And anyone who takes the trouble to do this will reach the same conclusions that would be reached with analysis of unstandardized variables. But this approach is antithetical to the basic premise of research, wherein we are trying to clarify things and explain them in terms that are easily comprehensible to our intended audience, not to put obstacles in their path.
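The point about inverting the standardization can be sketched numerically. Below is a minimal Python/NumPy illustration with simple OLS (not a Stata command; the data and variable names are made up for the example): the fit is identical whether or not the variables are z-scored, and the standardized slope converts back to the original-units slope via the standard deviations.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.normal(50.0, 10.0, n)            # predictor in meaningful units
y = 3.0 * x + rng.normal(0.0, 5.0, n)    # outcome

def fit(x, y):
    """Simple OLS with intercept; returns (slope, R-squared)."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return beta[1], 1.0 - resid.var() / y.var()

b, r2 = fit(x, y)                        # unstandardized fit

zx = (x - x.mean()) / x.std()            # z-scores
zy = (y - y.mean()) / y.std()
b_std, r2_std = fit(zx, zy)              # standardized fit

# The fit is identical, and the standardized slope inverts back
# to the original units: b = b_std * sd(y) / sd(x).
print(np.isclose(r2, r2_std))                      # True
print(np.isclose(b, b_std * y.std() / x.std()))    # True
```

De-standardizing the slope this way requires knowing the two standard deviations, which is exactly why access to the data (or to the reported means and SDs) is needed to undo the obfuscation.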
    Last edited by Clyde Schechter; 16 Nov 2017, 13:12.

    • #17
Originally posted by Clyde Schechter
      Dear Clyde,

Many thanks again! Sorry for my late response to your answer. I fully understand your point. My primary reason for standardizing is simply to make the coefficients comparable (and to avoid some coefficients of extreme magnitude).

I know that standardization does not change the statistical properties of a model. However, I have run into two puzzling situations:

1. I ran three random-effects regressions: one with the original variables, one with standardized variables, and one with log-transformed variables. The regression with original variables and the one with standardized variables give the same statistical results (e.g., R-squared and p-values), but the one with log-transformed variables gives different results.

2. I ran two System GMM regressions, other things being equal: one with the original variables and one with standardized variables. The two regressions give different statistical results (e.g., p-values).

I am not sure of the reason. In theory, I think all the regressions should give the same statistical results (apart from the coefficients), since the model specifications are the same.
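For what it is worth, the first observation is exactly what theory predicts: standardizing is an affine (linear) transformation, so fit statistics are unchanged, while taking logs is not linear and yields a genuinely different model. A minimal sketch with plain OLS in Python/NumPy (not Stata's random-effects or GMM estimators; the data are simulated for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = rng.uniform(1.0, 100.0, n)                # strictly positive so logs exist
y = 2.0 * x + 50.0 + rng.normal(0.0, 10.0, n)

def r_squared(x, y):
    """R-squared from simple OLS with intercept."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

r2_raw = r_squared(x, y)
r2_std = r_squared((x - x.mean()) / x.std(),
                   (y - y.mean()) / y.std())
r2_log = r_squared(np.log(x), np.log(y))

print(r2_raw, r2_std)   # identical: standardizing is a linear rescaling
print(r2_log)           # different: the log-log fit is a different model
```

The same logic does not carry over automatically to GMM: nonlinear features of the estimator (e.g., instrument weighting) need not be invariant to rescaling, which may explain the second observation, though that is beyond this simple sketch.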

      Thank you very much again.

      • #18
Your expectations are wrong. There is no reason to think that a regression with logarithmically transformed variables will produce results similar to those of a linear (standardized or not) regression. The log transformation makes it a completely different model.

        I cannot comment on GMM regressions--I don't know anything about them.

        • #19
Originally posted by Clyde Schechter
          Thank you so much again!
