
  • #16
    Originally posted by Richard Williams View Post
    You also have zip and zinb. Also see http://statisticalhorizons.com/zero-inflated-models. I would not use plain old poisson though.

    I hesitate to advise you too much just because I have never worked with models quite like this. Perhaps others will chime in.

    I don't really understand your dependent variable, but I wonder how horrible it would be to recode it into 0/not 0 and just use logit. Less than 3% of the records would get recoded if you did that.

    Dear Prof. Williams,

    Thank you for your advice! I am a bit worried about using logit/probit as I would have to delete some information (even this 3% could be important). But I will try it just to find out what is happening.

    As for your question regarding my dependent variable, it is the number of media articles about the financial performance of firm i published on trading day t. Similarly, the main independent variable is the number of corporate disclosures about the company's financial performance issued on trading day t. I mention these definitions to make the point that a trading day with, e.g., 15 media articles should not - in my opinion - be treated as if there were only 1 media article about the firm that day (which is what we would have to assume in a logit model).

    Thank you for providing this link about zero-inflated models! This is very interesting. The disagreement between William Greene and Paul Allison beneath the article is even more interesting!!! :-)



    • #17
      Dear Nikos,

      I guess I know what is going on here: you have perfect predictors in your model and in that case the maximum likelihood estimator does not exist. This problem is well known in binary models but most people do not realize that the same can happen in models such as Poisson, NegBin, and Tobit. Please try the following:

      a) generate all your dummies using the -xi- command with the -noomit- option;

      b) estimate the Poisson model using the -ppml- command rather than Stata's -poisson- command (you will need to install -ppml-; type "findit ppml")

      You should see that -ppml- drops some regressors and observations, just like -logit- would do. You can read more about this problem in this paper.
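
      In rough Stata terms, steps (a) and (b) would look something like this (y, x1, x2, group, and id are placeholder names, not variables from this thread):
      Code:
      * install the user-written -ppml- command once, e.g. via -findit ppml-
      * (a) create the full set of dummies, keeping every category
      xi i.group, prefix(_Ig) noomit
      * (b) Poisson pseudo-maximum likelihood; -ppml- checks whether the estimates
      *     exist and drops the offending regressors and observations
      ppml y x1 x2 _Ig*, cluster(id)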

      All the best,

      Joao



      • #18
        Nikos:
        as a sidelight, given such an impressive frequency of zeros, have you considered a hurdle model regression, as explained in the Long and Freese (2014) textbook quoted in your post #14?
        Kind regards,
        Carlo
        (StataNow 18.5)



        • #19
          Originally posted by Carlo Lazzaro View Post
          Nikos:
          as a sidelight, given such an impressive frequency of zeros, have you considered a hurdle model regression, as explained in the Long and Freese (2014) textbook quoted in your post #14?

          Dear members,

          Before responding to Joao's post: Carlo, could you please explain why you think that the hurdle model (or even the zero-inflated count model) would be suitable in my case? I have read about these models and realized that the hurdle model assumes that moving from 0 to 1 in the count outcome is more difficult than subsequent increases. This makes sense in the example used by Long and Freese (2014). However, in my research (see the definition of the dep. variable in #16) I am not sure about the suitability of the model. What do you think? As for the zero-inflated model, can I really assume that there are "two latent/unobserved groups"? That is, companies in the "always 0" group have outcome 0 with probability 1, and companies in the "not always 0" group have a nonzero probability of a positive count. Again, I am not sure this applies to my case.

          Thank you so much for taking the time to help me.

          Best,
          Nikos



          • #20
            Nikos:
            my previous aside was also based on http://www.stata.com/bookstore/micro...ata/index.html, pages 583-589. In their example, the authors consider a hurdle model regression with a logit in the first step and a zero-truncated binomial regression in the second. I do hope that this reference can be helpful for your research project, too.
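
            Purely as a hedged sketch of that two-part structure (placeholder variable names; -tnbreg-, Stata's zero-truncated negative binomial command, stands in for whatever second-stage model the textbook uses, so this is not their exact code):
            Code:
            * part 1: model whether there is any count at all
            generate any_y = (y > 0) if !missing(y)
            logit any_y x1 x2, vce(cluster id)
            * part 2: model the positive counts with a zero-truncated negative binomial
            tnbreg y x1 x2 if y > 0, vce(cluster id)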
            Kind regards,
            Carlo
            (StataNow 18.5)



            • #21
              Originally posted by Joao Santos Silva View Post
              Dear Nikos,

              I guess I know what is going on here: you have perfect predictors in your model and in that case the maximum likelihood estimator does not exist. This problem is well known in binary models but most people do not realize that the same can happen in models such as Poisson, NegBin, and Tobit. Please try the following:

              a) generate all your dummies using the -xi- command with the -noomit- option;

              b) estimate the Poisson model using the -ppml- command rather than Stata's -poisson- command (you will need to install -ppml-; type "findit ppml")

              You should see that -ppml- drops some regressors and observations, just like -logit- would do. You can read more about this problem in this paper.

              All the best,

              Joao

              Dear Joao,

              many thanks for your post.

              I have installed the ppml command. You have done an amazing job there; this command should be included in econometrics books. Also, your publications were very helpful!

              By the way, as I am interested in the negative binomial regression as well (I want to run both Poisson and neg. binomial and argue that the conclusions are approximately the same), do you think there is something I should do differently? Would you also prefer xtpoisson and xtnbreg, as Richard Williams suggested above?

              Thank you in advance.

              Best,
              Nikos



              • #22
                Dear Nikos,

                I am glad I could help. The negbin model will suffer from the exact same problem as the Poisson. However, I think you will be able to estimate it by doing the following:

                a) estimate the model using ppml

                b) estimate the negbin using the sub-sample used by ppml. That is, immediately after the ppml estimation type: nbreg y x if e(sample)

                Let me know whether or not this works, OK?

                About xtpoisson and xtnbreg: I believe that xtnbreg does not estimate a fixed-effects model in the usual sense, so it won't do what you want. xtpoisson may or may not work; it depends on which variables are causing the trouble. In any case, if it works, the estimates will be exactly the same as those you get by running ppml with the right set of dummies. Also, always keep in mind that the dummies should be created before running the regression (i.e., do not use ppml with the xi prefix) and with the noomit option; this is the safest way to avoid the problems you were having.
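
                Spelled out as a hedged sketch (placeholder names again, with the _Ig* dummies created beforehand by -xi- with -noomit-):
                Code:
                * Poisson PML first; this establishes the usable estimation sample
                ppml y x1 x2 _Ig*, cluster(id)
                * negbin restricted to exactly the sample that -ppml- used
                nbreg y x1 x2 _Ig* if e(sample), cluster(id)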

                All the best,

                Joao



                • #23
                  Dear Joao / members,

                  Many thanks for offering this solution. I will try this and will inform you about the outcome.

                  I have a more general question. Given that I have TOO many zeros in my dependent variable (see #14), should I instead use a zero-inflated model? The following command gives strong evidence that I should use a zero-inflated model (I don't report the output, as I guess it is not necessary):
                  Code:
                  countfit y $xlist, nbreg zinb nograph noestimates
                  I also find that zip is preferable to Poisson. However, my greatest concern is that the zero-inflated model assumes that there are two latent/unobserved groups. A company/individual in the "always 0" group has outcome 0 with probability 1, whereas an individual in the "not always 0" group might have outcome 0, but there is a nonzero probability that the individual has a positive count. I am not sure whether I can assume something like that. Once again, my y is the number of media articles about firm i on trading day t. I examine very large companies. So, if I used the zero-inflated model, would I have to assume that some companies cannot have positive media coverage? In other words, why would I assume that journalists cannot write a media article about a firm? That does not seem right in my case.

                  All views are highly appreciated. Thank you.

                  PS. Surprisingly, xtpoisson and xtnbreg run without a problem and provide similar results.

                  Best,
                  Nikos



                  • #24
                    Dear Nikos,

                    You have TOO many zeros relative to what? Even a Poisson distribution can have 99% zeros (or more!); for example, a Poisson with mean 0.01 puts probability exp(-0.01) ≈ 0.99 on zero. Given the nature of the dependent variable you are considering, I cannot see any justification for a ZI model. A hurdle model is potentially more plausible, but even that is not clearly preferable to what you are doing.

                    Anyway, the first thing you need to consider is whether you are trying to model the conditional distribution or just the conditional mean. Your posts suggest that you essentially want to estimate the conditional mean, and if that is the case then Poisson and/or NegBin should do a reasonable job. Also, if you only care about the conditional mean, the results provided by the countfit command are not relevant.

                    All the best,

                    Joao
                    PS: Can you please post the xtpoisson and xtnbreg results?



                    • #25
                      Dear Joao,

                      I appreciate your help. I will not consider the zero-inflated or the hurdle model any more. When I use xtpoisson and xtnbreg, I use -margins, dydx(varname)- to interpret the coefficients. However, Stata returns exactly the coefficients βj from the regressions. I think this is normal, also based on the discussion by Cameron and Trivedi. Does this mean that we can interpret a coefficient of, e.g., 1.05 in the following way: "an increase in x1 by 1 unit is expected to increase y by 1.05 units"? Does that make sense?

                      This is the output when using xtpoisson and xtnbreg:

                      Code:
                      . * Poisson Firm Fixed Effects
                      . xtpoisson ma_date $xlist i.week_day i.month i.year, robust i(id) fe
                      
                      Iteration 0:   log pseudolikelihood = -101052.87  
                      Iteration 1:   log pseudolikelihood = -87083.903  
                      Iteration 2:   log pseudolikelihood = -82655.717  
                      Iteration 3:   log pseudolikelihood = -82517.558  
                      Iteration 4:   log pseudolikelihood = -82516.934  
                      Iteration 5:   log pseudolikelihood = -82516.934  
                      
                      Conditional fixed-effects Poisson regression    Number of obs      =    260051
                      Group variable: id                              Number of groups   =        99
                      
                                                                      Obs per group: min =        69
                                                                                     avg =    2626.8
                                                                                     max =      3127
                      
                                                                      Wald chi2(45)      =  17407.46
                      Log pseudolikelihood  = -82516.934              Prob > chi2        =    0.0000
                      
                                                               (Std. Err. adjusted for clustering on id)
                      ----------------------------------------------------------------------------------
                                       |               Robust
                               ma_date |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                      -----------------+----------------------------------------------------------------
                               pr_date |   1.056312   .1020971    10.35   0.000     .8562058    1.256419
                          lag1_pr_date |   .8367576   .0570333    14.67   0.000     .7249744    .9485408
                          lag2_pr_date |   .2192144   .0528778     4.15   0.000     .1155758     .322853
                          lag3_pr_date |   .1165821   .0591882     1.97   0.049     .0005753    .2325889
                          lag4_pr_date |   .1561734   .0661425     2.36   0.018     .0265365    .2858103
                          lag5_pr_date |    .186936   .0877355     2.13   0.033     .0149775    .3588944
                          lag6_pr_date |  -.0350678   .0556113    -0.63   0.528    -.1440639    .0739283
                          lag7_pr_date |   .0507892   .0876225     0.58   0.562    -.1209477    .2225261
                         lead1_pr_date |   .0420125   .0748847     0.56   0.575    -.1047588    .1887838
                         NANAL_ALLFPIs |   .0163195    .005728     2.85   0.004     .0050928    .0275462
                               lnvolat |  -.0900008   .0937361    -0.96   0.337    -.2737202    .0937187
                                   lmc |   .3667891   .0489359     7.50   0.000     .2708766    .4627016
                                 b2m_w |   .3980362   .2240414     1.78   0.076    -.0410768    .8371492
                                 lev_w |  -.2853996   .4435487    -0.64   0.520    -1.154739    .5839399
                                turn_w |   .1804657   .0693862     2.60   0.009     .0444712    .3164602
                            car_3day_w |   11.15319   1.792703     6.22   0.000     7.639557    14.66683
                                 1.BAD |   -.016793   .0326912    -0.51   0.607    -.0808666    .0472806
                                       |
                      BAD#c.car_3day_w |
                                    1  |  -24.57739   3.842341    -6.40   0.000    -32.10824   -17.04654
                                       |
                              week_day |
                                    2  |   .2410475   .0825997     2.92   0.004      .079155      .40294
                                    3  |   .3278385   .0775476     4.23   0.000     .1758481     .479829
                                    4  |   .4370395   .0755743     5.78   0.000     .2889166    .5851624
                                    5  |   .1442329   .0795514     1.81   0.070     -.011685    .3001509
                                       |
                                 month |
                                    2  |  -.1305463   .1097388    -1.19   0.234    -.3456305    .0845379
                                    3  |  -.8403968   .1061567    -7.92   0.000     -1.04846   -.6323334
                                    4  |   .0738214    .057274     1.29   0.197    -.0384335    .1860763
                                    5  |  -.5291769   .0973513    -5.44   0.000     -.719982   -.3383718
                                    6  |  -.7973417   .1337523    -5.96   0.000    -1.059491   -.5351919
                                    7  |   .0722171   .0503759     1.43   0.152    -.0265178     .170952
                                    8  |  -.5804919   .1713501    -3.39   0.001    -.9163319    -.244652
                                    9  |  -.7501863   .1281567    -5.85   0.000    -1.001369   -.4990038
                                   10  |   .1304046   .0601616     2.17   0.030       .01249    .2483191
                                   11  |  -.4191068   .1461702    -2.87   0.004    -.7055952   -.1326185
                                   12  |  -.7557469   .1047387    -7.22   0.000     -.961031   -.5504628
                                       |
                                  year |
                                 2001  |   .1092901   .0963619     1.13   0.257    -.0795758     .298156
                                 2002  |   .4926691   .1065423     4.62   0.000     .2838501    .7014882
                                 2003  |   .2147073   .1338441     1.60   0.109    -.0476223     .477037
                                 2004  |   .3419805    .081392     4.20   0.000      .182455    .5015059
                                 2005  |   .2953855   .0925383     3.19   0.001     .1140139    .4767572
                                 2006  |   .0861302   .1079035     0.80   0.425    -.1253569    .2976172
                                 2007  |  -.1100854   .1284014    -0.86   0.391    -.3617475    .1415767
                                 2008  |   -.237696   .1040227    -2.29   0.022    -.4415767   -.0338154
                                 2009  |   .2863938   .1350267     2.12   0.034     .0217464    .5510413
                                 2010  |   .5648068   .1405783     4.02   0.000     .2892783    .8403352
                                 2011  |   .3107114   .1323874     2.35   0.019     .0512368    .5701859
                                 2012  |   .5910536   .1400699     4.22   0.000     .3165216    .8655856
                      ----------------------------------------------------------------------------------
                      
                      . estimates store m5, title(Poisson Firm FE)
                      
                      . 
                      . * Negative Binomial Firm Fixed Effects
                      . xtnbreg ma_date $xlist i.week_day i.month i.year, i(id) fe
                      
                      Iteration 0:   log likelihood = -263165.16  (not concave)
                      Iteration 1:   log likelihood = -177126.48  
                      Iteration 2:   log likelihood = -115718.15  
                      Iteration 3:   log likelihood = -80136.085  
                      Iteration 4:   log likelihood = -69581.856  
                      Iteration 5:   log likelihood = -68470.059  
                      Iteration 6:   log likelihood =  -68421.33  
                      Iteration 7:   log likelihood = -68420.726  
                      Iteration 8:   log likelihood = -68420.726  
                      
                      Conditional FE negative binomial regression     Number of obs      =    260051
                      Group variable: id                              Number of groups   =        99
                      
                                                                      Obs per group: min =        69
                                                                                     avg =    2626.8
                                                                                     max =      3127
                      
                                                                      Wald chi2(45)      =  27597.35
                      Log likelihood  = -68420.726                    Prob > chi2        =    0.0000
                      
                      ----------------------------------------------------------------------------------
                               ma_date |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                      -----------------+----------------------------------------------------------------
                               pr_date |   .9750439   .0107043    91.09   0.000     .9540638     .996024
                          lag1_pr_date |   .7911428   .0141136    56.06   0.000     .7634807    .8188049
                          lag2_pr_date |   .2084522   .0296711     7.03   0.000      .150298    .2666065
                          lag3_pr_date |   .0940436    .038124     2.47   0.014     .0193219    .1687654
                          lag4_pr_date |   .1722832   .0364253     4.73   0.000     .1008909    .2436754
                          lag5_pr_date |   .1777932   .0345202     5.15   0.000     .1101349    .2454514
                          lag6_pr_date |   .0396349   .0316721     1.25   0.211    -.0224414    .1017111
                          lag7_pr_date |   .0507825   .0279701     1.82   0.069    -.0040379    .1056028
                         lead1_pr_date |   .1248843   .0269555     4.63   0.000     .0720526    .1777161
                         NANAL_ALLFPIs |   .0256579    .001541    16.65   0.000     .0226377    .0286781
                               lnvolat |   .0887594   .0284979     3.11   0.002     .0329046    .1446142
                                   lmc |   .5441136   .0134318    40.51   0.000     .5177878    .5704395
                                 b2m_w |   .8756595     .05495    15.94   0.000     .7679595    .9833595
                                 lev_w |   .8999794   .0708558    12.70   0.000     .7611047    1.038854
                                turn_w |   .3881163   .0204996    18.93   0.000     .3479378    .4282947
                            car_3day_w |   10.38695   .3467879    29.95   0.000     9.707258    11.06664
                                 1.BAD |   .0037368   .0197204     0.19   0.850    -.0349145    .0423881
                                       |
                      BAD#c.car_3day_w |
                                    1  |  -22.35153   .5645498   -39.59   0.000    -23.45802   -21.24503
                                       |
                              week_day |
                                    2  |   .1115337   .0256293     4.35   0.000     .0613013    .1617661
                                    3  |   .1971611   .0254537     7.75   0.000     .1472727    .2470494
                                    4  |   .3312595   .0251349    13.18   0.000     .2819959     .380523
                                    5  |   .1204554   .0262742     4.58   0.000     .0689589    .1719519
                                       |
                                 month |
                                    2  |  -.0435043   .0338344    -1.29   0.199    -.1098185    .0228099
                                    3  |  -.7328828   .0399897   -18.33   0.000    -.8112612   -.6545044
                                    4  |   .1303644   .0307768     4.24   0.000      .070043    .1906859
                                    5  |  -.4302141   .0360992   -11.92   0.000    -.5009672    -.359461
                                    6  |  -.7271781   .0406061   -17.91   0.000    -.8067646   -.6475915
                                    7  |   .1375008   .0304275     4.52   0.000      .077864    .1971376
                                    8  |  -.4997111   .0363316   -13.75   0.000    -.5709198   -.4285025
                                    9  |  -.7360071   .0409031   -17.99   0.000    -.8161758   -.6558384
                                   10  |   .1707386   .0305796     5.58   0.000     .1108037    .2306735
                                   11  |  -.3577954   .0369474    -9.68   0.000    -.4302109   -.2853799
                                   12  |  -.6813658    .041533   -16.41   0.000    -.7627689   -.5999627
                                       |
                                  year |
                                 2001  |   .1151298   .0523966     2.20   0.028     .0124343    .2178253
                                 2002  |   .4909092   .0502716     9.77   0.000     .3923787    .5894397
                                 2003  |    .208019   .0549307     3.79   0.000     .1003568    .3156812
                                 2004  |   .1761451   .0576787     3.05   0.002     .0630969    .2891932
                                 2005  |   .1992749   .0578246     3.45   0.001     .0859407     .312609
                                 2006  |   .0121404   .0577493     0.21   0.833    -.1010462     .125327
                                 2007  |  -.2414569   .0591814    -4.08   0.000    -.3574504   -.1254634
                                 2008  |  -.4246219   .0530758    -8.00   0.000    -.5286487   -.3205952
                                 2009  |  -.0950437   .0529491    -1.80   0.073     -.198822    .0087346
                                 2010  |   .2387406   .0573166     4.17   0.000     .1264021     .351079
                                 2011  |   .0081684   .0578043     0.14   0.888     -.105126    .1214627
                                 2012  |   .2297088   .0580392     3.96   0.000     .1159541    .3434635
                                       |
                                 _cons |  -9.404747   .1698649   -55.37   0.000    -9.737676   -9.071818
                      ----------------------------------------------------------------------------------
                      
                      . estimates store m6, title(Negative Binomial Firm FE)
                      
                      . 
                      end of do-file



                      However, when I run xtpoisson and xtnbreg as a robustness test on a subsample (where I examine only media articles published in a particular printed newspaper in the post-2008 period), xtnbreg gives me the output below and never converges. Honestly, I don't know what to do. This robustness test is important to show that the coefficient of pr_date becomes insignificant, which is what xtpoisson finds. But xtnbreg does not work. :-(


                      Code:
                      . drop if year<2008
                      (292200 observations deleted)
                      
                      . * Poisson Firm Fixed Effects
                      
                      . 
                      . xtpoisson ma_date $xlist i.week_day i.month i.year, robust i(id) fe
                      note: 26 groups (28737 obs) dropped because of all zero outcomes
                      
                      Iteration 0:   log pseudolikelihood = -5419.9178  
                      Iteration 1:   log pseudolikelihood = -5352.8208  
                      Iteration 2:   log pseudolikelihood = -5021.3086  
                      Iteration 3:   log pseudolikelihood = -5013.6769  
                      Iteration 4:   log pseudolikelihood = -5013.6263  
                      Iteration 5:   log pseudolikelihood = -5013.6263  
                      
                      Conditional fixed-effects Poisson regression    Number of obs      =     89000
                      Group variable: id                              Number of groups   =        73
                      
                                                                      Obs per group: min =       439
                                                                                     avg =    1219.2
                                                                                     max =      1258
                      
                                                                      Wald chi2(37)      =   2826.72
                      Log pseudolikelihood  = -5013.6263              Prob > chi2        =    0.0000
                      
                                                               (Std. Err. adjusted for clustering on id)
                      ----------------------------------------------------------------------------------
                                       |               Robust
                               ma_date |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                      -----------------+----------------------------------------------------------------
                               pr_date |    .093194   .1553837     0.60   0.549    -.2113524    .3977404
                          lag1_pr_date |   .9815185   .1497868     6.55   0.000     .6879417    1.275095
                          lag2_pr_date |   .3356425   .1224965     2.74   0.006     .0955537    .5757313
                          lag3_pr_date |  -.0906612   .1061216    -0.85   0.393    -.2986556    .1173332
                          lag4_pr_date |    .053018   .1949144     0.27   0.786    -.3290072    .4350433
                          lag5_pr_date |   .0137616   .1151077     0.12   0.905    -.2118453    .2393685
                          lag6_pr_date |   .1018643   .1040224     0.98   0.327    -.1020158    .3057445
                          lag7_pr_date |  -.0849451   .0987516    -0.86   0.390    -.2784948    .1086046
                         lead1_pr_date |  -.0264634   .1275668    -0.21   0.836    -.2764898    .2235631
                         NANAL_ALLFPIs |   .0220978    .009123     2.42   0.015      .004217    .0399786
                               lnvolat |   .2291454   .1767311     1.30   0.195    -.1172412    .5755321
                                   lmc |   .0664723   .1507772     0.44   0.659    -.2290457    .3619902
                                 b2m_w |  -.8084562   .7679916    -1.05   0.292    -2.313692    .6967797
                                 lev_w |  -2.622166   1.073318    -2.44   0.015     -4.72583   -.5185012
                                turn_w |    .060839   .1232731     0.49   0.622    -.1807718    .3024499
                            car_3day_w |   6.616523   2.171951     3.05   0.002     2.359577    10.87347
                                 1.BAD |   -.098034   .0582825    -1.68   0.093    -.2122655    .0161975
                                       |
                      BAD#c.car_3day_w |
                                    1  |  -16.47603   4.145026    -3.97   0.000    -24.60013   -8.351931
                                       |
                              week_day |
                                    2  |   .1928296   .1130696     1.71   0.088    -.0287827     .414442
                                    3  |    .640457   .1185151     5.40   0.000     .4081716    .8727424
                                    4  |   .4640189    .122343     3.79   0.000     .2242311    .7038067
                                    5  |   .6131821   .1452962     4.22   0.000     .3284068    .8979575
                                       |
                                 month |
                                    2  |  -.1380094   .1116018    -1.24   0.216    -.3567448     .080726
                                    3  |  -.5268631   .1555619    -3.39   0.001    -.8317589   -.2219674
                                    4  |   .0856872   .1244977     0.69   0.491    -.1583238    .3296983
                                    5  |  -.3102728   .1703977    -1.82   0.069    -.6442462    .0237005
                                    6  |   -.402485   .1878153    -2.14   0.032    -.7705962   -.0343738
                                    7  |    .006307    .114564     0.06   0.956    -.2182343    .2308483
                                    8  |  -.5543087   .1759488    -3.15   0.002    -.8991619   -.2094554
                                    9  |  -.7989323   .1299903    -6.15   0.000    -1.053709   -.5441559
                                   10  |   .1272104   .1078368     1.18   0.238    -.0841458    .3385666
                                   11  |  -.3861857   .1733296    -2.23   0.026    -.7259055   -.0464659
                                   12  |  -.6391425   .1780054    -3.59   0.000    -.9880267   -.2902583
                                       |
                                  year |
                                 2009  |   .1588914   .2135131     0.74   0.457    -.2595866    .5773693
                                 2010  |   .5539266   .2119586     2.61   0.009     .1384955    .9693578
                                 2011  |   .0734823   .2355466     0.31   0.755    -.3881806    .5351452
                                 2012  |   .0314301   .2495375     0.13   0.900    -.4576544    .5205146
                      ----------------------------------------------------------------------------------
                      
                      . 
                      . 
                      . 
                      . * Negative Binomial Firm Fixed Effects
                      
                      . 
                      . xtnbreg ma_date $xlist i.week_day i.month i.year, i(id) fe
                      note: 26 groups (28737 obs) dropped because of all zero outcomes
                      
                      Iteration 0:   log likelihood = -16794.144  
                      Iteration 1:   log likelihood = -12935.766  (not concave)
                      Iteration 2:   log likelihood =  -11903.22  (not concave)
                      Iteration 3:   log likelihood = -11028.118  (not concave)
                      Iteration 4:   log likelihood =  -10527.56  (not concave)
                      Iteration 5:   log likelihood = -10300.586  (not concave)
                      Iteration 6:   log likelihood = -9243.2891  (not concave)
                      Iteration 7:   log likelihood = -9215.7188  (not concave)
                      Iteration 8:   log likelihood =      -9047  (not concave)
                      Iteration 9:   log likelihood = -9013.4688  (not concave)
                      Iteration 10:  log likelihood = -9011.3438  (not concave)
                      Iteration 11:  log likelihood =  -8833.125  (not concave)
                      Iteration 12:  log likelihood = -8084.8105  (not concave)
                      Iteration 13:  log likelihood = -7850.5635  (not concave)
                      Iteration 14:  log likelihood = -7512.3389  (not concave)
                      Iteration 15:  log likelihood = -7360.2217  (not concave)
                      Iteration 16:  log likelihood = -8503.4316  (not concave)
                      Iteration 17:  log likelihood = -8030.8164  (not concave)
                      Iteration 18:  log likelihood = -7902.5913  (not concave)
                      Iteration 19:  log likelihood = -7233.3516  (not concave)
                      Iteration 20:  log likelihood = -8974.7122  (not concave)
                      Iteration 21:  log likelihood = -8076.4451  (not concave)
                      Iteration 22:  log likelihood = -8595.8733  (not concave)
                      Iteration 23:  log likelihood = -8392.7896  (not concave)
                      Iteration 24:  log likelihood = -8324.5596  (not concave)
                      Iteration 25:  log likelihood = -8324.5596  (not concave)
                      Iteration 26:  log likelihood = -8324.5596  (not concave)
                      Iteration 27:  log likelihood = -8324.5596  (not concave)
                      Iteration 28:  log likelihood = -8324.5596  (not concave)
                      Iteration 29:  log likelihood = -8324.5596  (not concave)
                      
                      * ...and it keeps iterating like this, trying to maximize the log likelihood without ever converging

                      Best,
                      Nikos



                      • #26
                        Dear Nikos,

                        I think your interpretation of the results is not correct, but before going into that it would be important to make sure you are getting the right results. From what you posted, I suspect that -xtpoisson- and -xtnbreg- are not giving you what you want; can you please also post the -ppml- results?

                        Cheers,

                        Joao



                        • #27
                          Dear Joao,

                          I hope I understood how to use -xi- correctly. This is the output:

                          Code:
                          . xi I.week_day, prefix(_Iw) noomit
                          
                          . xi I.month, prefix(_Im) noomit
                          
                          . xi I.year, prefix(_Iy) noomit
                          
                          . ppml ma_date $xlist _I* dff_*, cluster(id)
                          note: checking the existence of the estimates
                          WARNING: NANAL_ALLFPIs has very large values, consider rescaling  or recentering
                          note: starting ppml estimation
                          note: _Iwweek_day_5 omitted because of collinearity
                          note: _Immonth_12 omitted because of collinearity
                          note: _Iyyear_2012 omitted because of collinearity
                          note: dff_4830 omitted because of collinearity
                          
                          Iteration 1:   deviance =  213022.2
                          Iteration 2:   deviance =  146502.3
                          Iteration 3:   deviance =  131531.5
                          Iteration 4:   deviance =  128350.3
                          Iteration 5:   deviance =  127907.6
                          Iteration 6:   deviance =  127892.7
                          Iteration 7:   deviance =  127892.7
                          Iteration 8:   deviance =  127892.7
                          
                          Number of parameters: 79
                          Number of observations: 260051
                          Number of observations dropped: 0
                          Pseudo log-likelihood: -84570.774
                          R-squared: .12876849
                                                               (Std. Err. adjusted for 99 clusters in id)
                          -------------------------------------------------------------------------------
                                        |               Robust
                                ma_date |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                          --------------+----------------------------------------------------------------
                                pr_date |   1.052171   .0937032    11.23   0.000     .8685157    1.235826
                           lag1_pr_date |    .831192   .0544695    15.26   0.000     .7244337    .9379503
                           lag2_pr_date |   .2219545   .0526119     4.22   0.000      .118837     .325072
                           lag3_pr_date |   .1053613   .0565342     1.86   0.062    -.0054438    .2161663
                           lag4_pr_date |   .1760903    .068556     2.57   0.010      .041723    .3104576
                           lag5_pr_date |   .2145987   .0897551     2.39   0.017     .0386819    .3905156
                           lag6_pr_date |  -.0292285   .0555804    -0.53   0.599    -.1381641    .0797072
                           lag7_pr_date |   .0400491   .0832442     0.48   0.630    -.1231064    .2032047
                          lead1_pr_date |   .0683708   .0757644     0.90   0.367    -.0801246    .2168663
                          NANAL_ALLFPIs |   .0104664   .0040809     2.56   0.010      .002468    .0184649
                                lnvolat |  -.0104961   .1075254    -0.10   0.922     -.221242    .2002497
                                    lmc |   .7414319    .039859    18.60   0.000     .6633097     .819554
                                  b2m_w |   .8289121   .2013191     4.12   0.000     .4343339     1.22349
                                  lev_w |  -.5888024   .4966844    -1.19   0.236    -1.562286     .384681
                                 turn_w |   .2845151   .0810917     3.51   0.000     .1255783    .4434518
                             car_3day_w |   11.40727   1.508173     7.56   0.000     8.451301    14.36323
                                    BAD |  -.0178649   .0350596    -0.51   0.610    -.0865804    .0508507
                                 CARBAD |  -24.98132   3.373921    -7.40   0.000    -31.59409   -18.36856
                          _Iwweek_day_1 |  -.1641189   .0769339    -2.13   0.033    -.3149065   -.0133313
                          _Iwweek_day_2 |   .0692184     .08258     0.84   0.402    -.0926355    .2310723
                          _Iwweek_day_3 |   .1667098   .0792965     2.10   0.036     .0112915    .3221281
                          _Iwweek_day_4 |   .2860659    .061211     4.67   0.000     .1660946    .4060373
                          _Iwweek_day_5 |          0  (omitted)
                             _Immonth_1 |   .7629438   .1042771     7.32   0.000     .5585644    .9673231
                             _Immonth_2 |   .6285196   .1205681     5.21   0.000     .3922104    .8648287
                             _Immonth_3 |  -.0712191   .0644546    -1.10   0.269    -.1975478    .0551095
                             _Immonth_4 |   .8368928   .1277526     6.55   0.000     .5865024    1.087283
                             _Immonth_5 |   .2351686     .09122     2.58   0.010     .0563807    .4139565
                             _Immonth_6 |  -.0353584   .0973626    -0.36   0.716    -.2261856    .1554688
                             _Immonth_7 |   .8399003    .112776     7.45   0.000     .6188634    1.060937
                             _Immonth_8 |   .1497484   .1613117     0.93   0.353    -.1664167    .4659135
                             _Immonth_9 |   .0068434   .0984036     0.07   0.945    -.1860242    .1997109
                            _Immonth_10 |    .885785    .117634     7.53   0.000     .6552266    1.116343
                            _Immonth_11 |   .3304572    .128144     2.58   0.010     .0792996    .5816148
                            _Immonth_12 |          0  (omitted)
                           _Iyyear_2000 |  -.4009425   .1627046    -2.46   0.014    -.7198376   -.0820475
                           _Iyyear_2001 |  -.2523851   .1541247    -1.64   0.102    -.5544639    .0496937
                           _Iyyear_2002 |   .1936743     .16047     1.21   0.227    -.1208411    .5081898
                           _Iyyear_2003 |   -.025932   .1525042    -0.17   0.865    -.3248347    .2729706
                           _Iyyear_2004 |   .0393319   .1196533     0.33   0.742    -.1951842     .273848
                           _Iyyear_2005 |   -.025659   .1291513    -0.20   0.843     -.278791    .2274729
                           _Iyyear_2006 |  -.2690746   .1011645    -2.66   0.008    -.4673534   -.0707958
                           _Iyyear_2007 |  -.5360149   .0972853    -5.51   0.000    -.7266906   -.3453392
                           _Iyyear_2008 |  -.7519932   .1119646    -6.72   0.000    -.9714398   -.5325465
                           _Iyyear_2009 |   -.168719   .1199942    -1.41   0.160    -.4039032    .0664653
                           _Iyyear_2010 |   .0654915   .1042685     0.63   0.530     -.138871     .269854
                           _Iyyear_2011 |  -.2138834   .0651458    -3.28   0.001    -.3415668   -.0861999
                           _Iyyear_2012 |          0  (omitted)
                                dff_481 |   .6884734   .2087523     3.30   0.001     .2793265     1.09762
                                dff_482 |   1.246825   .2423402     5.14   0.000     .7718466    1.721803
                                dff_483 |   .8453428   .2155314     3.92   0.000      .422909    1.267777
                                dff_484 |   .8194704   .1790813     4.58   0.000     .4684774    1.170463
                                dff_485 |   .7856193   .1951966     4.02   0.000     .4030409    1.168198
                                dff_486 |   .4175662   .2486582     1.68   0.093    -.0697949    .9049273
                                dff_487 |   .4195727    .244095     1.72   0.086    -.0588446    .8979901
                                dff_488 |   .6323451   .1634474     3.87   0.000      .311994    .9526961
                                dff_489 |  -.6701857   .1986414    -3.37   0.001    -1.059516   -.2808556
                               dff_4810 |  -.3087553   .3851635    -0.80   0.423    -1.063662    .4461513
                               dff_4811 |   2.969264   .2131779    13.93   0.000     2.551443    3.387085
                               dff_4812 |    .431152   .3880328     1.11   0.267    -.3293783    1.191682
                               dff_4813 |  -.9811808   .2808127    -3.49   0.000    -1.531564    -.430798
                               dff_4814 |  -.8521513   .3707946    -2.30   0.022    -1.578895   -.1254072
                               dff_4815 |  -.1199974   .2673118    -0.45   0.654     -.643919    .4039241
                               dff_4816 |   .1511852   .2457659     0.62   0.538     -.330507    .6328774
                               dff_4817 |   .4677143   .1917512     2.44   0.015      .091889    .8435397
                               dff_4818 |   .8502777   .2266748     3.75   0.000     .4060032    1.294552
                               dff_4819 |   1.168738   .3709716     3.15   0.002     .4416471    1.895829
                               dff_4820 |   .2926636   .3439681     0.85   0.395    -.3815015    .9668287
                               dff_4821 |     .04455   .1979365     0.23   0.822    -.3433985    .4324984
                               dff_4822 |    .513263   .2040231     2.52   0.012      .113385     .913141
                               dff_4823 |    .655507   .2499663     2.62   0.009     .1655821    1.145432
                               dff_4824 |   .3471894   .2018535     1.72   0.085    -.0484362     .742815
                               dff_4825 |   1.436533   .2127431     6.75   0.000     1.019564    1.853502
                               dff_4826 |    1.35246   .1701663     7.95   0.000      1.01894     1.68598
                               dff_4827 |   .2629131   .2169667     1.21   0.226    -.1623338      .68816
                               dff_4828 |   .3728516   .3416901     1.09   0.275    -.2968487    1.042552
                               dff_4829 |  -.3755445   .2461308    -1.53   0.127     -.857952     .106863
                               dff_4830 |          0  (omitted)
                                  _cons |  -11.42847   .7652727   -14.93   0.000    -12.92837   -9.928558
                          -------------------------------------------------------------------------------
                          Number of regressors dropped to ensure that the estimates exist: 0
                          Option strict is off
                          
                          . nbreg ma_date $xlist _I* dff_* if e(sample), cluster(id)
                          note: _Iwweek_day_5 omitted because of collinearity
                          note: _Immonth_12 omitted because of collinearity
                          note: _Iyyear_2012 omitted because of collinearity
                          note: dff_4830 omitted because of collinearity
                          
                          Fitting Poisson model:
                          
                          Iteration 0:   log pseudolikelihood = -614820.73  
                          Iteration 1:   log pseudolikelihood = -600049.32  (backed up)
                          Iteration 2:   log pseudolikelihood = -497145.16  (backed up)
                          Iteration 3:   log pseudolikelihood = -412141.42  (backed up)
                          Iteration 4:   log pseudolikelihood = -318536.96  (backed up)
                          Iteration 5:   log pseudolikelihood = -223610.54  
                          Iteration 6:   log pseudolikelihood =  -94190.12  
                          Iteration 7:   log pseudolikelihood = -85489.665  
                          Iteration 8:   log pseudolikelihood = -84589.654  
                          Iteration 9:   log pseudolikelihood = -84570.824  
                          Iteration 10:  log pseudolikelihood = -84570.774  
                          Iteration 11:  log pseudolikelihood = -84570.774  
                          
                          Fitting constant-only model:
                          
                          Iteration 0:   log pseudolikelihood = -111830.61  (not concave)
                          Iteration 1:   log pseudolikelihood = -86960.548  
                          Iteration 2:   log pseudolikelihood = -86895.689  
                          Iteration 3:   log pseudolikelihood = -86895.686  
                          
                          Fitting full model:
                          
                          Iteration 0:   log pseudolikelihood = -80480.501  (not concave)
                          Iteration 1:   log pseudolikelihood =  -71859.26  
                          Iteration 2:   log pseudolikelihood = -69341.139  
                          Iteration 3:   log pseudolikelihood = -68719.115  
                          Iteration 4:   log pseudolikelihood = -68702.307  
                          Iteration 5:   log pseudolikelihood = -68702.281  
                          Iteration 6:   log pseudolikelihood = -68702.281  
                          
                          Negative binomial regression                      Number of obs   =     260051
                                                                            Wald chi2(67)   =          .
                          Dispersion           = mean                       Prob > chi2     =          .
                          Log pseudolikelihood = -68702.281                 Pseudo R2       =     0.2094
                          
                                                               (Std. Err. adjusted for 99 clusters in id)
                          -------------------------------------------------------------------------------
                                        |               Robust
                                ma_date |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                          --------------+----------------------------------------------------------------
                                pr_date |   2.247993   .1328436    16.92   0.000     1.987624    2.508362
                           lag1_pr_date |   1.444228   .1049681    13.76   0.000     1.238494    1.649962
                           lag2_pr_date |   .2211551   .0640249     3.45   0.001     .0956687    .3466415
                           lag3_pr_date |    .078516   .0701399     1.12   0.263    -.0589558    .2159878
                           lag4_pr_date |   .2084509   .1175018     1.77   0.076    -.0218484    .4387501
                           lag5_pr_date |   .2452226   .0667983     3.67   0.000     .1143002    .3761449
                           lag6_pr_date |   .0534334   .0452411     1.18   0.238    -.0352375    .1421043
                           lag7_pr_date |   .1240363   .0634079     1.96   0.050     -.000241    .2483135
                          lead1_pr_date |   .1473781    .049885     2.95   0.003     .0496054    .2451508
                          NANAL_ALLFPIs |    .015404   .0050685     3.04   0.002     .0054699     .025338
                                lnvolat |   .1290542   .0876203     1.47   0.141    -.0426785    .3007869
                                    lmc |   .8916833   .0456288    19.54   0.000     .8022525    .9811141
                                  b2m_w |   .9761746   .2225084     4.39   0.000     .5400662    1.412283
                                  lev_w |  -.1382236   .4507382    -0.31   0.759    -1.021654    .7452071
                                 turn_w |   .2546776    .062727     4.06   0.000     .1317349    .3776203
                             car_3day_w |   12.28865   1.264065     9.72   0.000     9.811128    14.76617
                                    BAD |  -.0329519   .0252579    -1.30   0.192    -.0824565    .0165528
                                 CARBAD |  -27.07786   2.426125   -11.16   0.000    -31.83298   -22.32274
                          _Iwweek_day_1 |   -.104886   .0532351    -1.97   0.049    -.2092249   -.0005471
                          _Iwweek_day_2 |   .0547453   .0548501     1.00   0.318    -.0527589    .1622494
                          _Iwweek_day_3 |   .0930527   .0538107     1.73   0.084    -.0124144    .1985197
                          _Iwweek_day_4 |   .1955793   .0434536     4.50   0.000     .1104117    .2807469
                          _Iwweek_day_5 |          0  (omitted)
                             _Immonth_1 |   .6726258   .1107412     6.07   0.000     .4555771    .8896746
                             _Immonth_2 |   .6457852   .1210236     5.34   0.000     .4085832    .8829871
                             _Immonth_3 |  -.0714396   .0788274    -0.91   0.365    -.2259383    .0830592
                             _Immonth_4 |   .8291621   .1063276     7.80   0.000     .6207638     1.03756
                             _Immonth_5 |   .2520949   .1068045     2.36   0.018     .0427619     .461428
                             _Immonth_6 |  -.1199021   .0952381    -1.26   0.208    -.3065654    .0667611
                             _Immonth_7 |   .8227358   .1022923     8.04   0.000     .6222466    1.023225
                             _Immonth_8 |    .261813    .102113     2.56   0.010     .0616752    .4619507
                             _Immonth_9 |  -.0801642   .1087655    -0.74   0.461    -.2933408    .1330123
                            _Immonth_10 |   .9066078   .1064598     8.52   0.000     .6979505    1.115265
                            _Immonth_11 |   .3615638   .1014902     3.56   0.000     .1626466     .560481
                            _Immonth_12 |          0  (omitted)
                           _Iyyear_2000 |  -.4904513   .1653367    -2.97   0.003    -.8145053   -.1663974
                           _Iyyear_2001 |  -.2452503   .1288313    -1.90   0.057    -.4977551    .0072545
                           _Iyyear_2002 |   .1952508   .1237554     1.58   0.115    -.0473054    .4378069
                           _Iyyear_2003 |   .0233378   .1229028     0.19   0.849    -.2175472    .2642229
                           _Iyyear_2004 |  -.2245407   .1272149    -1.77   0.078    -.4738773     .024796
                           _Iyyear_2005 |  -.2790996   .1300626    -2.15   0.032    -.5340175   -.0241816
                           _Iyyear_2006 |  -.5209815   .1157604    -4.50   0.000    -.7478677   -.2940952
                           _Iyyear_2007 |  -.8125904   .1159263    -7.01   0.000    -1.039802    -.585379
                           _Iyyear_2008 |   -.971347   .1236667    -7.85   0.000    -1.213729   -.7289647
                           _Iyyear_2009 |  -.3325684   .1228198    -2.71   0.007    -.5732908    -.091846
                           _Iyyear_2010 |  -.0640576    .088531    -0.72   0.469    -.2375752    .1094601
                           _Iyyear_2011 |  -.2122139   .0726764    -2.92   0.004    -.3546571   -.0697707
                           _Iyyear_2012 |          0  (omitted)
                                dff_481 |   .8171322   .1358796     6.01   0.000     .5508131    1.083451
                                dff_482 |   1.430244   .3111868     4.60   0.000     .8203296    2.040159
                                dff_483 |   .9758514   .1511631     6.46   0.000     .6795771    1.272126
                                dff_484 |   .7345527    .106276     6.91   0.000     .5262556    .9428499
                                dff_485 |    1.22941   .1818537     6.76   0.000     .8729834    1.585837
                                dff_486 |   .7718213   .2243806     3.44   0.001     .3320433    1.211599
                                dff_487 |   .5162925   .1888825     2.73   0.006     .1460897    .8864954
                                dff_488 |   .7295854   .0737029     9.90   0.000     .5851302    .8740405
                                dff_489 |  -.3033605   .1329005    -2.28   0.022    -.5638407   -.0428802
                               dff_4810 |  -.1291446   .3625409    -0.36   0.722    -.8397117    .5814226
                               dff_4811 |   3.266223    .167148    19.54   0.000     2.938619    3.593827
                               dff_4812 |   .3851765   .2509706     1.53   0.125    -.1067169    .8770698
                               dff_4813 |  -.6001461   .1832771    -3.27   0.001    -.9593626   -.2409296
                               dff_4814 |  -.4065606   .2632715    -1.54   0.123    -.9225632    .1094421
                               dff_4815 |  -.2687441   .2247391    -1.20   0.232    -.7092247    .1717365
                               dff_4816 |   .2166383   .1468606     1.48   0.140    -.0712032    .5044797
                               dff_4817 |   .3932401   .1295183     3.04   0.002     .1393888    .6470913
                               dff_4818 |   .9904419   .1990388     4.98   0.000     .6003331    1.380551
                               dff_4819 |   1.452219   .2514482     5.78   0.000       .95939    1.945049
                               dff_4820 |   .7469517   .2929905     2.55   0.011     .1727009    1.321203
                               dff_4821 |   .1882909    .151817     1.24   0.215     -.109265    .4858468
                               dff_4822 |    .440028   .1074251     4.10   0.000     .2294787    .6505774
                               dff_4823 |    .696302   .2668276     2.61   0.009     .1733295    1.219275
                               dff_4824 |   .5615763   .1235353     4.55   0.000     .3194516    .8037009
                               dff_4825 |   1.317961   .1337195     9.86   0.000     1.055875    1.580046
                               dff_4826 |   1.366076   .0926049    14.75   0.000     1.184574    1.547579
                               dff_4827 |   .2569888   .1594787     1.61   0.107    -.0555836    .5695612
                               dff_4828 |   .3208987   .2668197     1.20   0.229    -.2020584    .8438558
                               dff_4829 |  -.3415261   .2603351    -1.31   0.190    -.8517736    .1687214
                               dff_4830 |          0  (omitted)
                                  _cons |  -13.02091   .7611813   -17.11   0.000    -14.51279   -11.52902
                          --------------+----------------------------------------------------------------
                               /lnalpha |   1.212884   .1799361                      .8602152    1.565552
                          --------------+----------------------------------------------------------------
                                  alpha |   3.363168   .6051555                      2.363669    4.785315
                          -------------------------------------------------------------------------------
                          PS. How do we interpret the coefficients now? Also, do you have any idea how I should interpret the coefficients from xtpoisson and xtnbreg?

                          Best,
                          Nikos

                          Comment


                          • #28
                            Dear Nikos,

                            These are multiplicative models, so the interpretation is as in your post #2 (you have to do the exp transformation).
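
                             For example, assuming the output above comes from -nbreg-, a quick sketch along these lines turns a coefficient into a multiplicative effect (dff_481 is taken from your output):

                             Code:
                              
                              * factor change in the expected number of articles associated with dff_481
                              display exp(_b[dff_481])    // exp(0.817) is roughly 2.26, i.e. about a 126% increase
                              * replaying the model with the -irr- option reports incidence-rate ratios directly
                              nbreg, irr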

                             However, I am still worried about the fact that some of your coefficients are very large in absolute value. The -ppml- results show that you actually do not have perfect predictors, so that is not the cause of the problem. I would try rescaling the regressors that have large coefficients (car_3day_w and CARBAD), and also the dependent variable, to make sure the results are stable. Rescaling can also help with the convergence of -xtnbreg-, which is giving you problems.

                            All the best,

                            Joao

                            Comment


                            • #29
                              Originally posted by Joao Santos Silva View Post
                              Dear Nikos,

                              These are multiplicative models, so the interpretation is as in your post #2 (you have to do the exp transformation).

                               However, I am still worried about the fact that some of your coefficients are very large in absolute value. The -ppml- results show that you actually do not have perfect predictors, so that is not the cause of the problem. I would try rescaling the regressors that have large coefficients (car_3day_w and CARBAD), and also the dependent variable, to make sure the results are stable. Rescaling can also help with the convergence of -xtnbreg-, which is giving you problems.

                              All the best,

                              Joao

                              Dear Joao / members,

                              Thank you so much for all this help.

                               May I ask one last question? When you say 'rescaling', what exactly do you have in mind? In other words, how should I rescale these variables? As a matter of fact, -ppml- above gives the following warning:

                              Code:
                               
                               WARNING: NANAL_ALLFPIs has very large values, consider rescaling  or recentering
                              Code:
                               * This variable measures the number of analysts that follow the company (the number is the same for every trading day per company, but changes every calendar quarter, e.g. we could have 10 analysts for every day between Jan-Mar, then 12 analysts between Apr-Jun, etc.)
                              
                              . tab NANAL_ALLFPIs
                              
                                Number of |
                                 Analysts |      Freq.     Percent        Cum.
                              ------------+-----------------------------------
                                        1 |      1,369        0.53        0.53
                                        2 |      2,772        1.07        1.59
                                        3 |      3,341        1.28        2.88
                                        4 |      3,013        1.16        4.04
                                        5 |      4,990        1.92        5.95
                                        6 |      6,251        2.40        8.36
                                        7 |      6,899        2.65       11.01
                                        8 |     10,023        3.85       14.87
                                        9 |     10,939        4.21       19.07
                                       10 |     12,626        4.86       23.93
                                       11 |     12,009        4.62       28.55
                                       12 |     13,237        5.09       33.64
                                       13 |     11,895        4.57       38.21
                                       14 |     12,074        4.64       42.85
                                       15 |     11,879        4.57       47.42
                                       16 |     12,962        4.98       52.40
                                       17 |     13,544        5.21       57.61
                                       18 |     12,350        4.75       62.36
                                       19 |     14,710        5.66       68.02
                                       20 |     13,049        5.02       73.04
                                       21 |     11,862        4.56       77.60
                                       22 |      9,048        3.48       81.08
                                       23 |      8,402        3.23       84.31
                                       24 |      5,794        2.23       86.54
                                       25 |      4,661        1.79       88.33
                                       26 |      3,455        1.33       89.66
                                       27 |      3,661        1.41       91.06
                                       28 |      2,702        1.04       92.10
                                       29 |      2,733        1.05       93.15
                                       30 |      2,394        0.92       94.08
                                       31 |      2,970        1.14       95.22
                                       32 |      2,241        0.86       96.08
                                       33 |      2,026        0.78       96.86
                                       34 |      1,522        0.59       97.44
                                       35 |      1,631        0.63       98.07
                                       36 |        816        0.31       98.38
                                       37 |      1,084        0.42       98.80
                                       38 |        895        0.34       99.15
                                       39 |        633        0.24       99.39
                                       40 |        380        0.15       99.54
                                       41 |        252        0.10       99.63
                                       42 |        189        0.07       99.70
                                       43 |        125        0.05       99.75
                                       45 |          6        0.00       99.76
                                       46 |         63        0.02       99.78
                                       48 |         64        0.02       99.80
                                       50 |         64        0.02       99.83
                                       52 |        125        0.05       99.88
                                       53 |          6        0.00       99.88
                                       55 |        125        0.05       99.93
                                       56 |         63        0.02       99.95
                                       57 |         64        0.02       99.98
                                       59 |         63        0.02      100.00
                              ------------+-----------------------------------
                                    Total |    260,051      100.00

                               I am not sure why you also mentioned the dependent variable, car_3day_w, and CARBAD. Is it because of their large coefficients? In any case, I am still not sure what I should do: on the one hand, I could report the simple Poisson and negative binomial models (but their results are very different, as you have seen), and on the other hand I could report xtpoisson and xtnbreg.

                              Best,
                              Nikos

                              Comment


                              • #30
                                Rescaling would be something like

                                gen newinc = inc/1000

                                 So, income might be measured in thousands of dollars rather than dollars. Technically this shouldn't make any difference, but computers have only so much precision, so this sort of rescaling can help numerically. It may also make the results easier to read, as the effect of a single dollar can be quite small.
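
                                 For the analyst variable that triggered the -ppml- warning above, a minimal sketch might be (the divisor is only illustrative; the same idea applies to the regressors with large coefficients mentioned earlier):

                                 Code:
                                  
                                  * measure analyst following in tens of analysts rather than single analysts
                                  generate NANAL_10 = NANAL_ALLFPIs/10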

                                 Centering occurs if, say, you subtract the mean from each case. Then a score of zero is the score of an average person; before centering, zero may not even have been a possible value. Or, if the variable is years of education, you might subtract 12 from each case; then (in the US) a score of zero would correspond to a high school graduate. This may aid interpretation (because zero is a meaningful value) and may also aid computation.
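
                                 And a minimal sketch of centering the same variable at its sample mean (the centering point is up to you):

                                 Code:
                                  
                                  * after this, zero corresponds to a firm with average analyst following
                                  summarize NANAL_ALLFPIs, meanonly
                                  generate NANAL_c = NANAL_ALLFPIs - r(mean)
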
                                -------------------------------------------
                                Richard Williams, Notre Dame Dept of Sociology
                                Stata Version: 17.0 MP (2 processor)

                                EMAIL: [email protected]
                                WWW: https://www3.nd.edu/~rwilliam

                                Comment
