
  • Issues with Dynamic Panel Data with Fixed Effects

    Good evening all,

    I am trying to determine the effect of financialisation on inequality. I have panel data for 6 countries covering a period of 25 years. My dependent variable is the Gini coefficient; my independent variables, which serve as proxies for financialisation, are the volume of stocks traded as a percentage of GDP and the level of corporate debt as a percentage of GDP. Following several articles I have read on similar subjects, I have also included a lag of the dependent variable as well as several control variables: the female labour participation rate, GDP growth, the unemployment rate, union density, social expenditure as a percentage of GDP, and trade as a percentage of GDP.

    I ran a regression using panel-corrected standard errors; however, it has been brought to my attention that dynamic panel models with fixed effects produce biased results. My supervisor recommended using GMM or Hausman-Taylor estimators instead. However, from researching online I have found that GMM does not work with an N < T panel, and I have no time-invariant variables that would call for Hausman-Taylor, so neither seems possible (as far as I understand).
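
    For reference, the model I estimated looks roughly like this (the variable names below are simplified placeholders for the actual data):

        xtset country year
        * panel-corrected standard errors, with one lag of the dependent variable
        xtpcse gini L.gini stocks_traded corp_debt flpr gdp_growth unemp union_density soc_exp trade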

    Taking this into account, is it possible to use either of those two estimators? If it is, how can I reliably test which of my variables can be treated as endogenous? If it is not, what other type of regression could I run instead?

    Thank you for your time!

  • #2
    GMM and Hausman-Taylor tackle different issues. If you are not interested in the coefficients of time-invariant variables, then Hausman-Taylor does not help here.

    If your T is large, then you may not have to worry about the dynamic panel data bias resulting from the inclusion of the lagged dependent variable. This bias vanishes with increasing T.
    https://twitter.com/Kripfganz

    • #3
      Sebastian,

      Thank you for your prompt response. Is T = 25 considered large when N = 6? I cannot really increase the number of years because of limited data availability.

      • #4
        With such a small N, there is not much else you can do besides a conventional fixed-effects regression with xtreg.
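
        As a rough illustration (placeholder variable names; the data are assumed to be declared as a panel with xtset):

            xtset country year
            * within (fixed-effects) estimator with a lagged dependent variable
            xtreg gini L.gini stocks_traded corp_debt flpr gdp_growth unemp union_density soc_exp trade, fe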
        https://twitter.com/Kripfganz

        • #5
          Dear Professor Kripfganz, why is it the case that with a small N you cannot compute a dynamic panel model?

          • #6
            The weighting matrix used for GMM estimation in an overidentified model (when you have more instruments than regressors) is clustered at the group level. For this weighting matrix to be estimated precisely, you need a large number of groups (large N). The same applies to the calculation of standard errors.

            You can still estimate a panel model with small N if you do not use an optimal weighting matrix (i.e., use a one-step instead of a two-step GMM estimator). For efficiency and correct coverage of your confidence intervals, this may require that a homoskedasticity assumption holds. (For some estimators, including the so-called system GMM estimator, no optimal one-step weighting matrix exists. Statistical inference based on small N would then be unreliable in any case.)
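
            As a sketch, a one-step difference GMM (Arellano-Bond) estimator could be obtained along the following lines (placeholder variable names; xtabond includes one lag of the dependent variable and uses the one-step estimator by default):

                * regressors here are treated as strictly exogenous; add the twostep
                * option only when N is large enough to estimate the optimal weighting matrix
                xtabond gini stocks_traded corp_debt flpr gdp_growth unemp union_density soc_exp trade, lags(1)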

            One way to circumvent those problems is to estimate the group-specific effects explicitly by including dummy variables for each group. This is something you can only reliably do when N is small (and T relatively large). You can then simply use the regress or ivregress commands to estimate your model, effectively ignoring the panel structure. When T is large, the bias from including a lagged dependent variable becomes negligible.
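
            For instance, the dummy-variable (LSDV) version of the model could be sketched as follows (placeholder variable names; i.country creates one dummy per group, and the L. operator requires the data to be xtset):

                regress gini L.gini stocks_traded corp_debt flpr gdp_growth unemp union_density soc_exp trade i.country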
            https://twitter.com/Kripfganz
