
  • Sebastian Kripfganz
    replied
    I cannot replicate this error message on my computer. Would it be possible for you to send me your data set by e-mail and the exact command lines you typed, so that I can investigate the problem?



  • Israel Garcia
    replied
    Hi Sebastian,
    Thanks for the reply; I will try to figure it out.
    By the way, after estimating the model I get an error when using
    Code:
    estat serial
    xtdpbc_score(): 3260 nonclass found where class required
    <istmt>: - function returned error

    I have updated the command again, but the problem persists. Do you know what I could be doing wrong?
    Thanks



  • Sebastian Kripfganz
    replied
    I am afraid I do not see a simple way of computing marginal effects for the long-run effects. You can use the nlcom command to compute the long-run effects, but you cannot combine it with margins. The only way I see is to work out the expression for the desired marginal effect yourself and to then calculate it with the nlcom command.
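    As a rough, untested sketch for the model in your post (assuming dummy and dummy2 both have base level 0, so that the relevant coefficient names are 1.dummy2 and 1.dummy#1.dummy2), something along these lines might work:
    Code:
    * long-run marginal effect of dummy2 at dummy = 0
    nlcom _b[1.dummy2] / (1 - _b[L.y])
    
    * long-run marginal effect of dummy2 at dummy = 1 (adds the interaction coefficient)
    nlcom (_b[1.dummy2] + _b[1.dummy#1.dummy2]) / (1 - _b[L.y])
    The delta-method standard errors reported by nlcom then provide inference for these long-run marginal effects.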



  • Israel Garcia
    replied
    Hi Sebastian,
    Thanks a lot for the update; the estat serial feature is very useful!
    I have a question regarding long-run effects, interactions and marginal effects with your command xtdpdbc.
    I run the following regression:
    Code:
    xtdpdbc y x i.dummy##i.dummy2
    If I want to get different marginal effects in the short run I could do
    Code:
    margins r.dummy, dydx(dummy2)
    or
    Code:
    margins, dydx(dummy2) at(dummy=(0 1))
    Is there anything similar to get the same marginal effects for the long run?
    Thanks.



  • Sebastian Kripfganz
    replied
    With the usual thanks to Kit Baum, the xtdpdbc command is now also available from SSC, in case you cannot install it directly from my website:
    Code:
    ssc install xtdpdbc



  • Sebastian Kripfganz
    replied
    Yesterday's version unfortunately contained a bug which could lead to unexpected behavior of the command under older Stata versions (Stata 15 or earlier). This has now been fixed. Please update the command again.
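    For example, reinstalling from my website overwrites the old version:
    Code:
    net install xtdpdbc, from(http://www.kripfganz.de/stata/) replace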

    In addition, estat overid will now show an error message if it is run after the just-identified fixed-effects estimator. estat hausman will now show a note under some circumstances if it is likely that a degrees-of-freedom adjustment is necessary. (A word of caution: The reverse is not true. If the command does not display the warning note, you cannot infer that the degrees of freedom are correct. Always check yourself!)

    A situation where this can happen is when the model is estimated with time effects, because the additional level moment conditions for the time effects are asymptotically redundant, even though some of them might not be numerically redundant in finite samples when the panel data set is unbalanced. Due to the asymptotic equivalence of the overidentification test and the generalized Hausman test, the correct degrees of freedom can normally be inferred from the estat overid output. They can then be manually supplied to estat hausman with the df() option. While estat overid adjusts the degrees of freedom automatically, this is a more complex issue for estat hausman and is therefore not (currently) implemented.

    As a general rule, when contrasting the random-effects versus the fixed-effects estimator, the degrees of freedom equal the number of exogenous regressors, excluding the lagged dependent variables and excluding deterministic terms such as time dummies. If the random-effects assumption is only applied to a subset of the regressors in a hybrid model, the degrees of freedom are reduced accordingly.
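    As an illustration (a condensed sketch of the workflow, based on the webuse abdata example shown elsewhere in this thread): with regressors L.n, w, and k plus time dummies, the RE-versus-FE contrast involves two exogenous regressors (w and k), so the degrees of freedom are 2.
    Code:
    webuse abdata
    xtdpdbc n w k, fe teffects vce(robust)
    estimates store fe
    xtdpdbc n w k, re teffects vce(robust)
    estat overid                        // reports chi2(2), consistent with df = 2
    estat hausman fe (L.n w k), df(2)   // supply the degrees of freedom manually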



  • Sebastian Kripfganz
    replied
    A significant update to version 1.2.0 is available for the xtdpdbc command:
    Code:
    net install xtdpdbc, from(http://www.kripfganz.de/stata/) replace
    With the re option, you can now compute a random-effects (RE) version of the bias-corrected estimator. This is a GMM (not GLS) estimator, which adds level moment conditions for the exogenous regressors to the moment conditions already utilized by the fixed-effects (FE) estimator. Consequently, the RE estimator is overidentified. This has the benefit that a Sargan-Hansen overidentification test provides a simple way of testing RE versus FE. This test is now implemented in the estat overid postestimation command. Alternatively, a generalized Hausman test can be used for the same purpose with the new estat hausman command.

    The command also allows you to estimate hybrid models between the RE and FE models, in which the RE assumption is only applied to a subset of regressors. This can be achieved with the new option hybrid(). Furthermore, the Arellano-Bond test for serial correlation in the first-differenced residuals is now implemented in the estat serial postestimation command.
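    Schematically, a hybrid specification might look like the following (an untested sketch; please see the help file for the exact hybrid() syntax, where it is assumed here that hybrid() takes the subset of regressors to which the RE assumption is applied):
    Code:
    webuse abdata
    * apply the RE assumption only to k, keeping w under the FE assumption
    xtdpdbc n w k, teffects vce(robust) hybrid(k)
    estat overid     // tests the added level moment conditions
    estat serial     // Arellano-Bond test for serial correlation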

    For full details, please see the help files:
    Code:
    help xtdpdbc
    help xtdpdbc postestimation
    Here is a simple example:
    Code:
    . webuse abdata
    
    . xtdpdbc n w k, fe teffects vce(robust)
    
    Bias-corrected estimation
    Iteration 0:   f(b) =  .00223714  
    Iteration 1:   f(b) =  3.069e-06  
    Iteration 2:   f(b) =  7.574e-10  
    Iteration 3:   f(b) =  5.466e-17  
    
    Group variable: id                           Number of obs         =       891
    Time variable: year                          Number of groups      =       140
    
    Fixed-effects model                          Obs per group:    min =         6
                                                                   avg =  6.364286
                                                                   max =         8
    
                                         (Std. err. adjusted for clustering on id)
    ------------------------------------------------------------------------------
                 |               Robust
               n | Coefficient  std. err.      z    P>|z|     [95% conf. interval]
    -------------+----------------------------------------------------------------
               n |
             L1. |   .7618648   .1021882     7.46   0.000     .5615796      .96215
                 |
               w |  -.4193358   .1337239    -3.14   0.002    -.6814298   -.1572417
               k |   .2283234   .0532835     4.29   0.000     .1238897    .3327571
                 |
            year |
           1978  |  -.0198984    .014369    -1.38   0.166     -.048061    .0082643
           1979  |  -.0304133   .0168824    -1.80   0.072    -.0635021    .0026756
           1980  |  -.0629852   .0182608    -3.45   0.001    -.0987757   -.0271947
           1981  |  -.1126957   .0211605    -5.33   0.000    -.1541695   -.0712219
           1982  |  -.0816646   .0184412    -4.43   0.000    -.1178087   -.0455205
           1983  |  -.0401727   .0201478    -1.99   0.046    -.0796616   -.0006838
           1984  |   -.007744   .0261004    -0.30   0.767       -.0589    .0434119
                 |
           _cons |   1.684649   .4919518     3.42   0.001     .7204408    2.648856
    ------------------------------------------------------------------------------
    
    . estimates store fe
    
    . xtdpdbc n w k, re teffects vce(robust)
    
    Bias-corrected estimation
    
    Step 1:
    Iteration 0:   f(b) =  .10203684  
    Iteration 1:   f(b) =  .00604729  
    Iteration 2:   f(b) =  .00567393  
    Iteration 3:   f(b) =  .00567273  
    Iteration 4:   f(b) =  .00567272  
    
    Step 2:
    Iteration 0:   f(b) =  .14020188  
    Iteration 1:   f(b) =  .08998905  
    Iteration 2:   f(b) =  .08993042  
    Iteration 3:   f(b) =  .08993037  
    
    Group variable: id                           Number of obs         =       891
    Time variable: year                          Number of groups      =       140
    
    Random-effects model                         Obs per group:    min =         6
                                                                   avg =  6.364286
                                                                   max =         8
    
                                         (Std. err. adjusted for clustering on id)
    ------------------------------------------------------------------------------
                 |               Robust
               n | Coefficient  std. err.      z    P>|z|     [95% conf. interval]
    -------------+----------------------------------------------------------------
               n |
             L1. |   .9566098   .0248605    38.48   0.000     .9078841    1.005336
                 |
               w |  -.0559192   .0174443    -3.21   0.001    -.0901093    -.021729
               k |   .0415304    .020966     1.98   0.048     .0004378     .082623
                 |
            year |
           1978  |   .0096357   .0141993     0.68   0.497    -.0181943    .0374657
           1979  |   .0063768   .0152289     0.42   0.675    -.0234712    .0362249
           1980  |  -.0357028   .0169424    -2.11   0.035    -.0689093   -.0024962
           1981  |  -.1129358   .0185769    -6.08   0.000    -.1493458   -.0765257
           1982  |  -.1018467   .0147469    -6.91   0.000    -.1307501   -.0729433
           1983  |   -.039552   .0190684    -2.07   0.038    -.0769254   -.0021786
           1984  |  -.0447143   .0180495    -2.48   0.013    -.0800908   -.0093379
                 |
           _cons |    .237997   .0784342     3.03   0.002     .0842687    .3917253
    ------------------------------------------------------------------------------
    
    . estat overid
    
    Hansen test of the overidentifying restrictions
    note: degrees of freedom adjusted for time effects in unbalanced panels
    
    H0: overidentifying restrictions are valid             chi2(2)     =   12.5903
                                                           Prob > chi2 =    0.0018
    
    . estat hausman fe (L.n w k), df(2)
    
    Generalized Hausman test                               chi2(2)     =   19.9951
    H0: coefficients do not systematically differ          Prob > chi2 =    0.0000
    
    . estat serial, ar(1/3)
    
    Arellano-Bond test for autocorrelation of the first-differenced residuals
    H0: no autocorrelation of order 1:     z =   -2.3646   Prob > |z|  =    0.0180
    H0: no autocorrelation of order 2:     z =   -0.9081   Prob > |z|  =    0.3638
    H0: no autocorrelation of order 3:     z =    0.8619   Prob > |z|  =    0.3887
    Both the Hansen and the Hausman test suggest that the RE assumption is not justified. The Arellano-Bond serial correlation test does not indicate any problem. (If there is no serial correlation in levels, we expect first-order serial correlation in first differences but no higher-order serial correlation.)



  • Joseph L. Staats
    replied
    Thanks so much. Your answers to my questions, as usual, are very helpful and allow me to go forward with my projects with more knowledge and confidence.



  • Sebastian Kripfganz
    replied
    1. The stability condition for the dynamic process is that the sum of the coefficients of the lagged dependent variables should be less than 1. This is satisfied in your case. (There is no difference in this regard between xtdpdbc and xtdpdgmm.)

    2. Yes, that is what your results suggest. There is a contemporaneous short-run effect of x1 which is counteracted by a similar delayed short-run effect with the opposite sign. They cancel each other out in the long run.
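    If you want to check both points formally, a rough (untested) sketch using the coefficient names from your two-lag specification would be:
    Code:
    * stability condition: sum of the coefficients of the lagged dependent variable
    lincom _b[L.y] + _b[L2.y]
    
    * long-run effect of x1 with its delta-method standard error
    nlcom (_b[x1] + _b[L.x1]) / (1 - _b[L.y] - _b[L2.y])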



  • Joseph L. Staats
    replied
    Hello Sebastian,

    I have some additional questions about the use of xtdpdbc.

    1. When I use two lagged dependent variables in one of my projects, the coefficient of the first lag is greater than 1.00. Is this a problem, as I believe it would be if I were using xtdpdgmm? If I use only one lag, the coefficient is well below 1.00. Does that suggest that I should use only the first lag? Below are the results I get, first using only one lag and then two lags.

    Code:
    .  xtdpdbc y l(0/1).(x1 x2 x3 x4 x5 x6 x7 ) if l2.y~=.,  fe vce(robust)   lags(1) teffects
    
    Bias-corrected estimation
    Iteration 0:   f(b) =  .00080012  
    Iteration 1:   f(b) =  4.311e-06  
    Iteration 2:   f(b) =  6.914e-10  
    Iteration 3:   f(b) =  2.391e-16  
    
    Group variable: ccode                        Number of obs         =       970
    Time variable: year                          Number of groups      =        67
    
                                                 Obs per group:    min =         5
                                                                   avg =  14.47761
                                                                   max =        19
    
                                      (Std. err. adjusted for clustering on ccode)
    ------------------------------------------------------------------------------
                 |               Robust
               y | Coefficient  std. err.      z    P>|z|     [95% conf. interval]
    -------------+----------------------------------------------------------------
               y |
             L1. |   .9505701    .032202    29.52   0.000     .8874553    1.013685
                 |
              x1 |
             --. |   .0338362   .0119252     2.84   0.005     .0104632    .0572092
             L1. |  -.0314495   .0113423    -2.77   0.006    -.0536799   -.0092191
                 |
              x2 |
             --. |  -.0080593   .0107437    -0.75   0.453    -.0291165    .0129979
             L1. |   .0077397   .0089561     0.86   0.387    -.0098139    .0252934
                 |
              x3 |
             --. |   .0095416   .0031686     3.01   0.003     .0033314    .0157519
             L1. |  -.0025377   .0022687    -1.12   0.263    -.0069842    .0019088
                 |
              x4 |
             --. |  -.0388867   .0131287    -2.96   0.003    -.0646184    -.013155
             L1. |   .0260135   .0127738     2.04   0.042     .0009773    .0510498
                 |
              x5 |
             --. |  -.0077469   .0040453    -1.92   0.055    -.0156755    .0001817
             L1. |   .0006256   .0027068     0.23   0.817    -.0046797    .0059309
                 |
              x6 |
             --. |   .0041924   .0150888     0.28   0.781     -.025381    .0337659
             L1. |  -.0073131   .0154181    -0.47   0.635     -.037532    .0229058
                 |
              x7 |
             --. |  -.0016741   .0025556    -0.66   0.512     -.006683    .0033348
             L1. |   .0040669   .0030768     1.32   0.186    -.0019636    .0100973
                 |
            year |
           2001  |   .0000659   .0155769     0.00   0.997    -.0304642     .030596
           2002  |   .0261921    .016956     1.54   0.122     -.007041    .0594253
           2003  |   .0267463   .0191526     1.40   0.163    -.0107922    .0642848
           2004  |   .0032066   .0168039     0.19   0.849    -.0297284    .0361416
           2005  |  -.0096127   .0176495    -0.54   0.586    -.0442051    .0249797
           2006  |  -.0034658   .0165419    -0.21   0.834    -.0358874    .0289558
           2007  |   .0105695   .0185388     0.57   0.569    -.0257658    .0469048
           2008  |   .0033926   .0184238     0.18   0.854    -.0327174    .0395025
           2009  |   .0058206   .0181105     0.32   0.748    -.0296753    .0413166
           2010  |   .0232615   .0180021     1.29   0.196    -.0120219    .0585449
           2011  |  -.0081063   .0219142    -0.37   0.711    -.0510572    .0348447
           2012  |  -.0076749   .0186275    -0.41   0.680    -.0441842    .0288343
           2013  |  -.0162198   .0196031    -0.83   0.408    -.0546413    .0222016
           2014  |  -.0000364   .0185574    -0.00   0.998    -.0364082    .0363354
           2015  |   .0054735   .0213165     0.26   0.797    -.0363061    .0472531
           2016  |  -.0096166   .0187961    -0.51   0.609    -.0464562    .0272231
           2017  |  -.0080846   .0166891    -0.48   0.628    -.0407945    .0246254
           2018  |   .0112082   .0158973     0.71   0.481    -.0199498    .0423663
                 |
           _cons |   .0335686   .0215907     1.55   0.120    -.0087484    .0758855
    ------------------------------------------------------------------------------
    
    .  xtdpdbc y l(0/1).(x1 x2 x3 x4 x5 x6 x7 ) if l2.y~=.,  fe vce(robust)   lags(2) teffects
    
    Bias-corrected estimation
    Iteration 0:   f(b) =  .00088917  
    Iteration 1:   f(b) =  3.530e-06  
    Iteration 2:   f(b) =  6.444e-09  
    Iteration 3:   f(b) =  3.986e-14  
    
    Group variable: ccode                        Number of obs         =       970
    Time variable: year                          Number of groups      =        67
    
                                                 Obs per group:    min =         5
                                                                   avg =  14.47761
                                                                   max =        19
    
                                      (Std. err. adjusted for clustering on ccode)
    ------------------------------------------------------------------------------
                 |               Robust
               y | Coefficient  std. err.      z    P>|z|     [95% conf. interval]
    -------------+----------------------------------------------------------------
               y |
             L1. |   1.079941   .0932062    11.59   0.000     .8972602    1.262622
             L2. |  -.1456414   .0836234    -1.74   0.082    -.3095402    .0182574
                 |
              x1 |
             --. |   .0320216   .0110966     2.89   0.004     .0102727    .0537705
             L1. |  -.0286191   .0104707    -2.73   0.006    -.0491414   -.0080969
                 |
              x2 |
             --. |   -.008389   .0107657    -0.78   0.436    -.0294894    .0127114
             L1. |   .0072774   .0091959     0.79   0.429    -.0107462    .0253011
                 |
              x3 |
             --. |   .0089407   .0029207     3.06   0.002     .0032162    .0146653
             L1. |  -.0033645   .0024326    -1.38   0.167    -.0081322    .0014032
                 |
              x4 |
             --. |  -.0340336   .0114255    -2.98   0.003    -.0564272     -.01164
             L1. |   .0218265   .0107334     2.03   0.042     .0007895    .0428635
                 |
              x5 |
             --. |  -.0076639   .0040157    -1.91   0.056    -.0155345    .0002066
             L1. |   .0015686   .0024271     0.65   0.518    -.0031883    .0063255
                 |
              x6 |
             --. |   .0032186   .0129516     0.25   0.804     -.022166    .0286032
             L1. |  -.0054722   .0131681    -0.42   0.678    -.0312811    .0203367
                 |
              x7 |
             --. |   .0000721   .0023317     0.03   0.975     -.004498    .0046422
             L1. |    .002005   .0022469     0.89   0.372    -.0023988    .0064088
                 |
            year |
           2001  |  -.0019082   .0170997    -0.11   0.911    -.0354231    .0316066
           2002  |   .0238397   .0169462     1.41   0.159    -.0093743    .0570537
           2003  |   .0222379   .0199436     1.12   0.265    -.0168507    .0613266
           2004  |   .0014419   .0170003     0.08   0.932    -.0318781    .0347619
           2005  |  -.0093809   .0171457    -0.55   0.584    -.0429858    .0242239
           2006  |  -.0018028   .0158745    -0.11   0.910    -.0329161    .0293106
           2007  |   .0114593   .0180345     0.64   0.525    -.0238878    .0468063
           2008  |   .0022069   .0183826     0.12   0.904    -.0338223    .0382361
           2009  |   .0035213   .0174704     0.20   0.840      -.03072    .0377627
           2010  |    .022896   .0177357     1.29   0.197    -.0118653    .0576572
           2011  |  -.0093328   .0220228    -0.42   0.672    -.0524966    .0338311
           2012  |  -.0064383   .0175773    -0.37   0.714    -.0408892    .0280127
           2013  |  -.0144945   .0191387    -0.76   0.449    -.0520057    .0230167
           2014  |   .0028155    .018004     0.16   0.876    -.0324716    .0381026
           2015  |   .0046752   .0206528     0.23   0.821    -.0358035    .0451539
           2016  |  -.0102328   .0179129    -0.57   0.568    -.0453415    .0248758
           2017  |  -.0059206   .0167213    -0.35   0.723    -.0386939    .0268526
           2018  |   .0131007   .0157716     0.83   0.406    -.0178111    .0440125
                 |
           _cons |   .0433748   .0208475     2.08   0.037     .0025145    .0842351
    ------------------------------------------------------------------------------
    
    .
    2. As seen in the results above, the coefficient of x1 (my main variable of interest) for the current period is positive and statistically significant, but not much greater in magnitude than the coefficient of the lag of x1, which is negatively signed. When I calculate the long-run effects of x1 (in the manner discussed in earlier posts), the long-run coefficient is not statistically significant. Do these results suggest that x1 has a short-run effect on y, but that this effect does not increase over time (in other words, no long-run effect)?

    Thanks.



  • Joseph L. Staats
    replied
    Thanks again.

    Your comments help me understand at a deeper level what Arjun Wilkins was trying to tell me about choosing realistic changes in the IVs rather than worrying about what happens with a one-unit change (or some other change) that results in the DV increasing above its maximum value. And I just remembered an article I read a few days ago (Mummolo, Jonathan, and Erik Peterson. 2018. Improving the Interpretation of Fixed Effects Regression Results. Political Science Research and Methods 6(4): 829-835) that argues researchers doing fixed-effects regressions should (but typically don't) use within-unit variation (rather than overall variation) when calculating substantive effects. When I do that, one-standard-deviation increases in my IVs result in LREs of approximately .735, .288, and .564, respectively, which are entirely plausible.
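    (In case it is useful to others, something along these lines should give the within-unit standard deviation and the corresponding LRE; a rough sketch with illustrative variable names, assuming the xtdpdbc results are the active estimates and that two lags of the DV and one lag of the IV are included:)
    Code:
    quietly xtsum x1
    local sdw = r(sd_w)    // within-unit standard deviation of x1
    nlcom (`sdw' * (_b[x1] + _b[L.x1])) / (1 - _b[L.y] - _b[L2.y])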

    Without the combined help I received from you and Arjun Wilkins, I likely would not have reached the current state of my knowledge on the subject, so thank you very much.



  • Sebastian Kripfganz
    replied
    Apologies; I did not read your message carefully enough. I still believe that you would need to consider a model that takes the nature of your dependent variable into account (e.g., a fractional response model) if you want to calculate meaningful long-run effects. Alternatively, you might have to consider a smaller effect size. Think about what would be an economically reasonable magnitude for a change in X.



  • Joseph L. Staats
    replied
    Thanks so much for your informative insights. One thing I didn't understand, however, was what you said in the first sentence of the last paragraph: "Calculating these long-run effects for binary dependent variables is problematic in a linear probability model." To clarify, all of my variables, including the DV, are continuous, not binary, and have values ranging between 0 and 1 (in their original form they have other values, but I normalized them to the 0-1 range). Knowing that, would your answer be any different?



  • Sebastian Kripfganz
    replied
    I think it is a matter of perspective: Are we considering
    1. the long-run effect of a permanent change in X itself [without assuming any underlying process for X], or
    2. the long-run effect of a permanent change in a determinant/disturbance e to the process of X [assuming an underlying AR(1) process for X in your case]?
    In 1., we consider the effect of changing X by 1 unit and then keeping it constant afterwards. The long-run effect of that change of X on y will be β/(1-α1-α2).

    In 2., we are taking into account that X itself will adjust over time to a permanent 1-unit level shift. The long-run effect of that change of e on X will be 1/(1-ρ), and the long-run effect on y will be the product of the two, i.e. β/[(1-ρ)(1-α1-α2)].
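    As a rough, untested sketch of how both quantities could be computed with nlcom (variable names are illustrative: y with two lags, x1 with one lag; β is taken as the sum of the contemporaneous and lagged coefficients of x1; the auxiliary regression is just one simple way to obtain an estimate of ρ; mymodel is a hypothetical name for the stored results, assuming your xtdpdbc results are the active estimates):
    Code:
    estimates store mymodel
    quietly xtreg x1 L.x1, fe          // auxiliary AR(1) regression for x1
    local rho = _b[L.x1]
    estimates restore mymodel
    
    * 1. long-run effect of a permanent 1-unit change in x1 itself
    nlcom (_b[x1] + _b[L.x1]) / (1 - _b[L.y] - _b[L2.y])
    
    * 2. long-run effect of a permanent 1-unit shift in the disturbance of x1's process
    nlcom (_b[x1] + _b[L.x1]) / ((1 - `rho') * (1 - _b[L.y] - _b[L2.y]))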

    Calculating these long-run effects for binary dependent variables is problematic in a linear probability model. As you noticed, there is no upper bound for the effect, and it may thus exceed the range of plausible values for y. I am afraid the only solution I see is to consider a nonlinear binary response model, with all the complications that this brings in the case of dynamic panel models.



  • Joseph L. Staats
    replied
    Hello again, Sebastian,

    I have another question that has arisen in connection with calculating long-run effects (LRE) when using xtdpdbc.

    I mentioned in an earlier message the following article: Wilkins, Arjun S. 2018. To Lag or Not to Lag?: Re-Evaluating the Use of Lagged Dependent Variables in Regression Analysis. Political Science Research and Methods 6(2): 393-411.
    The author suggests at p. 404 that the denominator of the equation for calculating the LRE (with two lags of the DV and one lag of the IV) should look like this: (1-ρ)(1-α1-α2), where ρ is the coefficient obtained from regressing the IV on its own lag. The denominator used by every other article I could find on LRE, and the one you suggested in one of your messages to me, does not include 1-ρ. When I follow Wilkins' recommendation, I get an implausibly high LRE for each of my IVs of interest. Here are the results I get for the three IVs of interest, using the equation you suggested (not including 1-ρ) and what Wilkins suggests (including 1-ρ):

    x1 ρ=.973, LRE without 1-ρ=.256, LRE with 1-ρ=9.459

    x2 ρ=.937, LRE without 1-ρ=.330, LRE with 1-ρ=5.240

    x3 ρ=.953, LRE without 1-ρ=.273, LRE with 1-ρ=5.799

    What is particularly strange about the results when I include 1-ρ is that the DV and the IVs are all continuous within the range of 0-1. Thus, for x1, a one-unit change in x1 results in an LRE that is over nine times the maximum value of the DV.

    I wrote to Arjun Wilkins about my results, and he suggested I calculate the LRE using something less than a one-unit change in the IV, such as a standard-deviation change. That didn't really help, because the LREs would be 2.204, .865, and 1.693, respectively, for each of the IVs. Even if I reduce the change in the IVs to something as low as .1, I still get LREs that seem much greater than expected by theory. He also suggested I check for a unit-root problem, but that wasn't an issue.

    Do you have any thoughts on this? In particular, is it appropriate to include 1-ρ when calculating LRE? And if it is, do you have any ideas about the results I am getting?

