  • Multicollinearity check after -xtreg-

    Hi everybody.

    I have a question about fixed-effects regression models. I am running a two-way fixed-effects model on an unbalanced panel of 56 companies observed over 6 years. The analysis aims to identify the relationship between capital structure and profitability (return on equity). The dependent variable is ROE, the main explanatory variable is TDA (total debt over the book value of assets), and I use two control variables: SIZE (the natural logarithm of sales) and GROWTH (the annual growth rate of sales).

    Therefore, my question is: how can I check for multicollinearity after -xtreg-? Previous posts on this topic suggest that the easiest way is to run a pooled OLS regression including all dummies (time and entity):

    Code:
    reg ROE TDA GROWTH SIZE i.year i.Companyname, robust

    and then use VIF measures to identify collinearity. Nevertheless, I have found that fixed-effects models are bound to produce very large VIFs, so these values could be misleading.
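    For readers outside Stata, the mechanics of this check can be sketched in Python with simulated data. Everything below is hypothetical (names, sample sizes, parameter values), chosen only to mimic a panel where firm size varies a lot across firms but very little within a firm; it illustrates why entity dummies mechanically inflate the VIF of such regressors.

```python
import numpy as np

rng = np.random.default_rng(0)

def vif(X, j):
    """VIF of column j of X: 1 / (1 - R^2) from regressing that column
    on the remaining columns plus an intercept."""
    y = X[:, j]
    Z = np.column_stack([np.ones(len(X)), np.delete(X, j, axis=1)])
    beta = np.linalg.lstsq(Z, y, rcond=None)[0]
    resid = y - Z @ beta
    r2 = 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
    return 1.0 / (1.0 - r2)

# Hypothetical panel: 56 firms over 6 years; SIZE is persistent within firms
n_firms, n_years = 56, 6
firm = np.repeat(np.arange(n_firms), n_years)
firm_size = rng.normal(10, 1, n_firms)                       # firm-level size
SIZE = firm_size[firm] + rng.normal(0, 0.05, n_firms * n_years)
GROWTH = rng.normal(0.05, 0.1, n_firms * n_years)

# Entity dummies, dropping the first firm as the base category
D = (firm[:, None] == np.arange(1, n_firms)).astype(float)

X_pooled = np.column_stack([SIZE, GROWTH])       # no dummies
X_fe = np.column_stack([SIZE, GROWTH, D])        # with firm dummies

print(vif(X_pooled, 0))  # near 1: SIZE is not collinear with GROWTH
print(vif(X_fe, 0))      # very large: the dummies absorb almost all of SIZE
```

    A slowly-changing regressor is almost a linear combination of the firm dummies, so its VIF in the dummy-augmented regression is large by construction, not because of any problem with the other regressors.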

    I have attached a PDF file with all relevant results to provide further details. As you can see, the variable labelled "SIZE" shows a very high VIF, as does "TDA". Do I therefore have to consider the whole model wrong?

    Thank you all in advance for your answers.

  • #2
    Dear Antonio,

    Why are you worried about collinearity? Anyway, a model won't be "wrong" just because of collinearity.

    All the best,

    Joao



    • #3
      Mr Silva, thank you very much for your answer.

      I am a bit concerned because collinearity affects not only a control variable (SIZE) but also the explanatory variable (TDA), and, at least from a theoretical point of view, this may undermine the whole inference process. I have also tried standardizing the regressors and deleting all possible outliers, but nothing seems to work. Could this situation be caused by the specification of the regression or by the methodology used (i.e., fixed effects)?
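      As a side note on the standardizing attempt: a VIF is computed from the R-squared of an auxiliary regression (with an intercept), and that R-squared is unchanged by centering or rescaling the regressors, so standardizing cannot reduce collinearity. A minimal Python check with hypothetical data:

```python
import numpy as np

rng = np.random.default_rng(1)

def vif(X, j):
    """VIF of column j: 1 / (1 - R^2) from an auxiliary regression of
    column j on the other columns plus an intercept."""
    y = X[:, j]
    Z = np.column_stack([np.ones(len(X)), np.delete(X, j, axis=1)])
    beta = np.linalg.lstsq(Z, y, rcond=None)[0]
    resid = y - Z @ beta
    r2 = 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
    return 1.0 / (1.0 - r2)

# Two strongly correlated (hypothetical) regressors
x1 = rng.normal(size=300)
x2 = x1 + rng.normal(scale=0.1, size=300)
X = np.column_stack([x1, x2])

# Standardize each column (mean 0, standard deviation 1)
Xs = (X - X.mean(axis=0)) / X.std(axis=0)

print(vif(X, 0), vif(Xs, 0))  # identical: standardizing does not change VIF
```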



      • #4
        Dear Antonio,

        Collinearity does not cause bias, so you do not have to worry about that. However, deleting some observations may cause bias and inconsistency, so I would advise against doing that. Please check a good textbook to review the consequences of collinearity; I suggest you read Chapter 23 of Arthur Goldberger's "A Course in Econometrics".

        All the best

        Joao
        PS: You may also want to read this thread: http://www.statalist.org/forums/foru...ity-panel-data
        Last edited by Joao Santos Silva; 07 Aug 2015, 09:18. Reason: Added PS.



        • #5
          Suppose you have two independent variables, x1 and x2, that are highly collinear in the sense that the R2 of a regression of x1 on x2 is practically (though not exactly) equal to one.
          Suppose you regress y on x1 alone and get a relatively high R2 and a p-value for the coefficient on x1 of zero to the third decimal.
          You now regress y on x2 alone and the same thing happens.
          Now you regress y on x1 and x2 together, and the p-values of the coefficients on both variables are very high (the standard errors are large).
          What does this mean? That once you account for x1, x2 adds very little information, and vice versa.
          There is no issue of bias arising here.
          You would then want to decide whether to keep x1, x2, or both.
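          The scenario described above can be reproduced numerically. The following Python sketch uses hypothetical data and classical standard errors computed by hand; it shows the standard errors exploding once two near-duplicate regressors enter the model together:

```python
import numpy as np

rng = np.random.default_rng(2)

def ols_se(y, X):
    """OLS slope estimates and classical standard errors
    (an intercept is added automatically; its stats are dropped)."""
    Z = np.column_stack([np.ones(len(y)), X])
    XtX_inv = np.linalg.inv(Z.T @ Z)
    beta = XtX_inv @ Z.T @ y
    resid = y - Z @ beta
    s2 = resid @ resid / (len(y) - Z.shape[1])
    se = np.sqrt(s2 * np.diag(XtX_inv))
    return beta[1:], se[1:]

n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)     # x1 and x2 nearly identical
y = x1 + rng.normal(scale=0.5, size=n)

b_single, se_single = ols_se(y, x1[:, None])               # y on x1 alone
b_joint, se_joint = ols_se(y, np.column_stack([x1, x2]))   # y on x1 and x2

print(se_single[0])  # small: x1 alone is precisely estimated
print(se_joint)      # both much larger: x1 and x2 carry the same information
```

          Neither estimate in the joint regression is biased; the data simply cannot tell the two variables apart.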



          • #6
            Thank you both for your helpful answers.

            I have just one more question, since I am still a bit confused. With severe collinearity, standard errors are supposed to be inflated, as are the p-values of the coefficients, as Mr de Souza's example shows. However, my data show a completely different situation: the VIFs are extremely large, but the standard errors are reasonable relative to the magnitude of the coefficients. Is this enough to conclude that collinearity may be safely ignored in my case?
            Last edited by Antonio Sicari; 07 Aug 2015, 10:18.



            • #7
              Dear Antonio,

              Collinearity does not make the standard errors invalid; standard errors may be large to reflect the fact that some coefficients are difficult to estimate (as in the example Eric gave), but that is how it should be. In your case the coefficients are estimated with reasonable precision and therefore I would not worry at all about this. Even if you had a problem with collinearity, there would be little or nothing that you could do to solve it.
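              The connection between VIFs and standard errors can be made explicit. With an intercept, the textbook identity is Var(b_j) = s2 * VIF_j / SST_j, so the VIF is only one of several factors: a large sample or a small residual variance can still leave the estimates precise even when the VIF is large. A small Python check of the identity, on hypothetical data:

```python
import numpy as np

rng = np.random.default_rng(3)

n = 500
x1 = rng.normal(size=n)
x2 = 0.95 * x1 + rng.normal(scale=0.3, size=n)   # correlated regressors
y = 1.0 * x1 + 0.5 * x2 + rng.normal(scale=0.2, size=n)

# Full regression: standard error of the x1 coefficient the usual way
Z = np.column_stack([np.ones(n), x1, x2])
XtX_inv = np.linalg.inv(Z.T @ Z)
beta = XtX_inv @ Z.T @ y
resid = y - Z @ beta
s2 = resid @ resid / (n - 3)
se1 = np.sqrt(s2 * XtX_inv[1, 1])

# VIF of x1 from the auxiliary regression of x1 on x2
Zaux = np.column_stack([np.ones(n), x2])
b_aux = np.linalg.lstsq(Zaux, x1, rcond=None)[0]
r_aux = x1 - Zaux @ b_aux
r2_aux = 1 - r_aux @ r_aux / ((x1 - x1.mean()) @ (x1 - x1.mean()))
vif1 = 1 / (1 - r2_aux)

# The identity: Var(b1) = s2 * VIF1 / SST1
sst1 = (x1 - x1.mean()) @ (x1 - x1.mean())
se1_from_vif = np.sqrt(s2 * vif1 / sst1)

print(se1, se1_from_vif)  # the two values agree
```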

              All the best,

              Joao



              • #8
                Antonio,
                If you have access to Wooldridge, Introductory Econometrics, 5th ed., please read what he has to say about VIFs on page 94 (international edition).



                • #9
                  I have carefully checked everything you suggested in your posts, and I have reached the following conclusion: I should not worry too much about collinearity between regressors unless I am specifically interested in distinguishing the individual impact of each explanatory variable on the dependent variable, in which case it may be tough to separate their effects. Could you please confirm this?

                  I would like to thank you both for your answers and helpful suggestions,

                  Kind regards,

                  Antonio



                  • #10
                    Yes, I think that is about right :-)

                    Joao
