  • Lower R-squared when adding regressors

    Hello dear readers,
    I am working on a regression analysis of panel data with a time-series dimension.
    I have noticed that, when I add the last control variable to my model (a dummy variable), the R-squared drops by about 3% compared with the model without the dummy.

    Is it actually possible that this dummy, whose coefficient is fairly large and is (or seems to be) significant at the 0.1% level, is invalidating my results? I was under the impression that adding regressors always raises R-squared, even when they add no real explanatory power to the regression.

    I checked for correlation between the regressors (fairly low) and then for collinearity - the tests are negative.
    Any idea why this might happen? Thank you very much!
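
    For illustration (the variable names y, x1-x3, and d are hypothetical), here is a minimal sketch of how one could check whether the drop in R-squared reflects a change in the estimation sample rather than the dummy itself:

    * On a fixed estimation sample, adding a regressor can never lower R-squared,
    * so compare sample sizes, then re-fit the smaller model on the larger model's sample.
    regress y x1 x2 x3
    display "N = " e(N) "   R2 = " e(r2)
    regress y x1 x2 x3 d
    display "N = " e(N) "   R2 = " e(r2)
    * e(sample) now marks the observations used by the model that includes d
    regress y x1 x2 x3 if e(sample)
    display "N = " e(N) "   R2 = " e(r2)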

  • #2
    Ambra:
    it's difficult (for me, at any rate) to reply helpfully without seeing what you typed and what Stata gave you back (as per the FAQ).
    For instance, I fail to see what you mean by the collinearity test being negative.
    Kind regards,
    Carlo
    (Stata 18.0 SE)



    • #3
      I agree with Carlo. Also, did you lose any cases when you added the dummy because of missing data?
      -------------------------------------------
      Richard Williams, Notre Dame Dept of Sociology
      Stata Version: 17.0 MP (2 processor)

      EMAIL: [email protected]
      WWW: https://www3.nd.edu/~rwilliam
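
      For what it's worth, a quick sketch of how this could be checked (the dummy name d and the other variable names are hypothetical):

      * how many observations would the dummy drop from the estimation sample?
      count if missing(d)
      misstable summarize y x1 x2 x3 d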



      • #4
        Dear both,
        thank you for your answers; unfortunately, I am not allowed to post any output (even intermediate) from the analysis I am performing - I understand that this is a constraint.
        My apologies for the confusion: by 'the results of the collinearity test are negative' I mean that I cannot reject the null hypothesis of no multicollinearity.
        As for Richard's comment, I had not checked that point, but I will - thank you for the suggestion.



        • #5
          Ambra:
          how did you test multicollinearity?
          Kind regards,
          Carlo
          (Stata 18.0 SE)



          • #6
            Carlo,

            After pre-checking the pairwise correlation coefficients with -pwcorr-, I ran -vif, uncentered- after the regressions and, additionally, -fgtest- (a user-written command for the Farrar-Glauber multicollinearity test) - although the latter mostly confirmed the results of the VIF test.

            I must say that I did indeed detect one collinear variable, which I excluded from the analysis. Also, as Richard suggested, there were some missing values that prevented the dummy from varying within the regression sample. The results make much more sense now.
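
            For reference, a rough sketch of that sequence (the variable names are hypothetical; in current Stata the VIF check is the postestimation command -estat vif-):

            * pairwise correlations among the candidate regressors
            pwcorr y x1 x2 x3 d, sig
            * pooled OLS, then uncentered variance inflation factors
            regress y x1 x2 x3 d
            estat vif, uncentered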



            • #7
              Ambra:
              -estat vif- is not supported after -xtreg-, which would be the first choice when dealing with a panel dataset and a continuous dependent variable.
              As far as I can tell from the description of your approach, it seems that you used -regress- instead, which rarely outperforms -xtreg- for panel data regression.
              Again, even though your dataset is subject to a confidentiality agreement, posting what you typed and what Stata gave you back (renaming the original variables if needed) is always the best way to get helpful and efficient replies.
              Kind regards,
              Carlo
              (Stata 18.0 SE)
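
              For illustration, a minimal sketch of the -xtset-/-xtreg- workflow mentioned above (the panel identifiers id and year and the variable names are hypothetical):

              * declare the panel structure, then fit a fixed-effects model
              * (note: a time-invariant dummy would be absorbed by the fixed effects)
              xtset id year
              xtreg y x1 x2 x3 d, fe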
