  • Insignificant constant in OLS regression

    Hi folks,

    Over the past several weeks I've been building a custom dataset. In short, I have geocoded sports capital grants awarded by the Irish government, and I have merged these data with small-area statistics from the 2011 census to capture the regional characteristics of the areas where these clubs are based.

    Much has been examined regarding political bias in the distribution of these grants, with regions in which the Ministers for Finance and Sport are based doing particularly well. I'm adding to the literature by looking at other potential actors who can perhaps manipulate the system, namely the heads of key sporting organisations within Ireland.

    I use two dependent variables. The first is the difference between the amount of money a club applied for and the grant it was awarded (as a percentage). The second is the amount a club received as a grant.

    For my political-bias measure I use the distance in km between a club and the hometown of a minister or head of a sporting organisation.

    All my variables are in logarithmic form. I also ran the augmented Dickey-Fuller test and found no evidence of any variable having a unit root. My ministerial variables are also in line with what past studies have found, which leads me to believe that my data are sound. One of my issues is that in some cases, when I reduce the sample to account for extremely large grants awarded and to focus on particular sports, my constant is insignificant.

    My Prob > F is significant, though, reading 0.000 in nearly all cases. I include dummies to account for differences across sports, along with year dummies.

    I'm wondering whether it is acceptable to present results in which the constant is insignificant and has a large standard error?

    Also, would anyone recommend other robustness tests to carry out, to ensure the results are robust?

    Kind regards,

    Sean

  • #2
    Sean:
    in general, the answer is: yes, it is.
    However, more positive and helpful comments may come if you post what you typed and what Stata gave you back within CODE delimiters (the FAQ explains why and how to do so). Thanks.
    Kind regards,
    Carlo
    (Stata 19.0)



    • #3
      Carlo,

      Here is a sample of the results:

      Code:
      . regress logdiff logincome logpop logpoorhealth logfin logsport logpop19 logurban logunemployed loghigerpro sport1 sport2 sport3 sport4 sport5 sport6 sport7 sport8 s
      > port9 sport10 sport11 sport12 sport13 sport14 sport15 sport16 sport17 sport18 sport19 sport20 sport21 sport22 sport23 sport24 sport25 sport26 sport27 sport28 sport2
      > 9 sport30 sport31 sport32 sport33 sport34 sport35 sport36 sport37 sport38 sport39 sport40 sport41 sport42 y2002 y2003 y2004 y2005 y2006 y2007 y2008 y2012 y2014 y201
      > 5 cou1 cou2 cou3 cou4 cou5 cou6 cou7 cou8 cou9 cou10 cou11 cou12 cou13 cou14 cou15 cou16 cou17 cou18 cou19 cou20 cou21 cou22 cou23 cou24 cou25 cou26 cou27 cou28 cou
      > 29 cou30 cou31 cou32 cou33 cou34
      note: sport40 omitted because of collinearity
      note: y2005 omitted because of collinearity
      note: cou24 omitted because of collinearity
      
            Source |       SS           df       MS      Number of obs   =     7,094
      -------------+----------------------------------   F(92, 7001)     =      9.73
             Model |  171.092775        92  1.85970408   Prob > F        =    0.0000
          Residual |  1338.53035     7,001  .191191309   R-squared       =    0.1133
      -------------+----------------------------------   Adj R-squared   =    0.1017
             Total |  1509.62313     7,093  .212832811   Root MSE        =    .43725
      
      -------------------------------------------------------------------------------
            logdiff |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
      --------------+----------------------------------------------------------------
          logincome |   .2335903   .1773109     1.32   0.188    -.1139928    .5811734
             logpop |   .0017059   .0340232     0.05   0.960    -.0649899    .0684016
      logpoorhealth |   .0005436   .0098318     0.06   0.956    -.0187296    .0198169
             logfin |    .043203   .0092826     4.65   0.000     .0250063    .0613998
           logsport |  -.0148294   .0074163    -2.00   0.046    -.0293675   -.0002912
           logpop19 |  -.0202466    .029678    -0.68   0.495    -.0784244    .0379313
           logurban |  -.0007655   .0060658    -0.13   0.900    -.0126563    .0111252
      logunemployed |   .0403585   .0272819     1.48   0.139    -.0131223    .0938393
        loghigerpro |   .0100792    .023012     0.44   0.661    -.0350312    .0551896
             sport1 |  -.3751083   .2535193    -1.48   0.139     -.872083    .1218664
             sport2 |  -.3036962   .2537511    -1.20   0.231    -.8011252    .1937328
             sport3 |  -.3271582   .2536711    -1.29   0.197    -.8244303    .1701139
             sport4 |  -.3428759   .2547548    -1.35   0.178    -.8422724    .1565206
             sport5 |  -.3048707    .255053    -1.20   0.232    -.8048517    .1951104
             sport6 |  -.2757654    .255376    -1.08   0.280    -.7763798     .224849
             sport7 |  -.1546973   .2555217    -0.61   0.545    -.6555972    .3462026
             sport8 |  -.2324005   .2593193    -0.90   0.370    -.7407449     .275944
             sport9 |  -.1799072   .2549327    -0.71   0.480    -.6796524     .319838
            sport10 |  -.2395806   .2605841    -0.92   0.358    -.7504043    .2712431
            sport11 |  -.2612106   .2605542    -1.00   0.316    -.7719757    .2495544
            sport12 |  -.1764761   .2613258    -0.68   0.500    -.6887538    .3358017
            sport13 |  -.1942926   .2612304    -0.74   0.457    -.7063833    .3177981
            sport14 |  -.1995733   .2586752    -0.77   0.440     -.706655    .3075084
            sport15 |  -.1863863   .2619979    -0.71   0.477    -.6999815    .3272089
            sport16 |   -.369424   .2708058    -1.36   0.173    -.9002854    .1614375
            sport17 |  -.2379923   .2630494    -0.90   0.366    -.7536487    .2776641
            sport18 |  -.3609022   .2793352    -1.29   0.196    -.9084839    .1866794
            sport19 |  -.7622248   .2620559    -2.91   0.004    -1.275934   -.2485158
            sport20 |  -.2185605   .2583158    -0.85   0.398    -.7249378    .2878168
            sport21 |  -.2170487   .2571192    -0.84   0.399    -.7210802    .2869827
            sport22 |  -.1533494   .2736979    -0.56   0.575    -.6898802    .3831814
            sport23 |  -.4865816   .2719001    -1.79   0.074    -1.019588    .0464249
            sport24 |  -.1880634   .2726216    -0.69   0.490    -.7224843    .3463574
            sport25 |   -.248781   .2599058    -0.96   0.339     -.758275    .2607131
            sport26 |  -.2904589   .2970304    -0.98   0.328    -.8727285    .2918107
            sport27 |  -.2256332   .2554483    -0.88   0.377    -.7263893    .2751229
            sport28 |   -.362275   .2892462    -1.25   0.210    -.9292851    .2047352
            sport29 |  -.1178993    .320612    -0.37   0.713    -.7463959    .5105973
            sport30 |  -.2433861   .3029986    -0.80   0.422    -.8373552     .350583
            sport31 |  -.2897319   .2613311    -1.11   0.268      -.80202    .2225562
            sport32 |  -.1987088   .2792903    -0.71   0.477    -.7462023    .3487848
            sport33 |  -.3758679   .3211223    -1.17   0.242    -1.005365    .2536291
            sport34 |  -.2782871   .2812229    -0.99   0.322    -.8295691    .2729949
            sport35 |  -.0861838   .3353091    -0.26   0.797    -.7434913    .5711236
            sport36 |  -.1993903   .3357845    -0.59   0.553    -.8576296    .4588489
            sport37 |  -.6740176   .4006551    -1.68   0.093    -1.459423    .1113877
            sport38 |  -.5800191   .3205699    -1.81   0.070    -1.208433    .0483951
            sport39 |  -.2705459   .3206734    -0.84   0.399     -.899163    .3580712
            sport40 |          0  (omitted)
            sport41 |  -.1245987    .257741    -0.48   0.629    -.6298492    .3806517
            sport42 |  -.2036286   .2736376    -0.74   0.457    -.7400413     .332784
              y2002 |   .0770034   .0320381     2.40   0.016     .0141991    .1398077
              y2003 |   .0829499   .0321112     2.58   0.010     .0200023    .1458976
              y2004 |   .0947318   .0259948     3.64   0.000     .0437742    .1456895
              y2005 |          0  (omitted)
              y2006 |   .0625482    .024396     2.56   0.010     .0147247    .1103717
              y2007 |  -.0586731   .0235644    -2.49   0.013    -.1048664   -.0124798
              y2008 |  -.1997982   .0259991    -7.68   0.000    -.2507643   -.1488321
              y2012 |   .1445908   .0356467     4.06   0.000     .0747125    .2144691
              y2014 |    .118906   .0307544     3.87   0.000     .0586181    .1791939
              y2015 |    .110505   .0306586     3.60   0.000     .0504049    .1706051
               cou1 |   .1075559   .0684263     1.57   0.116    -.0265803    .2416921
               cou2 |   .1605351   .0676327     2.37   0.018     .0279545    .2931157
               cou3 |   .2042512   .0718852     2.84   0.005     .0633346    .3451679
               cou4 |   .1974642   .0723047     2.73   0.006      .055725    .3392034
               cou5 |   .2049819   .0745635     2.75   0.006     .0588148     .351149
               cou6 |   .0896104   .0664659     1.35   0.178     -.040683    .2199038
               cou7 |   .1281585   .0651094     1.97   0.049     .0005243    .2557927
               cou8 |   .2542966   .0633949     4.01   0.000     .1300235    .3785698
               cou9 |  -.0675319   .0694468    -0.97   0.331    -.2036686    .0686048
              cou10 |   .0617494   .0612524     1.01   0.313    -.0583238    .1818225
              cou11 |   .1175448   .0656129     1.79   0.073    -.0110764    .2461659
              cou12 |   .1771076   .0673272     2.63   0.009     .0451258    .3090894
              cou13 |   .0690826   .0644251     1.07   0.284    -.0572102    .1953754
              cou14 |   .0570711   .0598989     0.95   0.341    -.0603488     .174491
              cou15 |   .1119796   .0643817     1.74   0.082    -.0142281    .2381873
              cou16 |   .0654615   .0621251     1.05   0.292    -.0563226    .1872456
              cou17 |  -.0133968   .0643106    -0.21   0.835     -.139465    .1126713
              cou18 |   .0900145   .0581556     1.55   0.122    -.0239881    .2040171
              cou19 |  -.0400859   .0629936    -0.64   0.525    -.1635725    .0834007
              cou20 |  -.0313334   .0687061    -0.46   0.648    -.1660182    .1033515
              cou21 |   .0653036   .0632106     1.03   0.302    -.0586083    .1892154
              cou22 |  -.0302411   .0674153    -0.45   0.654    -.1623955    .1019134
              cou23 |  -.0030682   .0625044    -0.05   0.961    -.1255957    .1194593
              cou24 |          0  (omitted)
              cou25 |   .0541775   .0664285     0.82   0.415    -.0760424    .1843975
              cou26 |   .0131074   .0687803     0.19   0.849    -.1217228    .1479375
              cou27 |   .0265172   .0604567     0.44   0.661    -.0919963    .1450307
              cou28 |  -.0057899   .0712568    -0.08   0.935    -.1454748    .1338949
              cou29 |  -.0415261   .0639048    -0.65   0.516    -.1667988    .0837465
              cou30 |   .0423652   .0679218     0.62   0.533    -.0907821    .1755125
              cou31 |  -.0744962    .067117    -1.11   0.267    -.2060659    .0570735
              cou32 |   .0328482   .0679167     0.48   0.629    -.1002891    .1659856
              cou33 |   .0710231   .0726733     0.98   0.328    -.0714386    .2134847
              cou34 |   -.006543   .0691871    -0.09   0.925    -.1421706    .1290846
              _cons |    2.02601   1.842979     1.10   0.272    -1.586787    5.638807
      -------------------------------------------------------------------------------



      • #4
        I believe the constant just "tells" us the predicted Y when all the covariates are set to zero (or to their means, if the covariates are centered). That said, the p-value for the constant "tells" us only that this prediction is not significantly different from zero. Theoretically, IMHO, an insignificant constant is not bound to harm a model.
        Best regards,

        Marcos
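
        To make this concrete, here is a minimal sketch on Stata's shipped auto dataset (not Sean's data) showing that the constant is just the fitted value at zero covariates, and that centering moves it to the fitted value at the sample means:

        Code:
        * Illustration on Stata's shipped auto data (not Sean's dataset):
        sysuse auto, clear
        regress price mpg weight
        * The prediction at mpg = 0 and weight = 0 reproduces _cons:
        margins, at(mpg = 0 weight = 0)
        * Centering the covariates moves _cons to the prediction at the means:
        quietly summarize mpg
        generate c_mpg = mpg - r(mean)
        quietly summarize weight
        generate c_weight = weight - r(mean)
        regress price c_mpg c_weight

        The slope coefficients are identical in the two fits; only the constant (and its standard error and p-value) changes, which is why its significance rarely matters on its own.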



        • #5
          Sean:
          I would take one step aside from the main issue:
          - your model is so full of predictors that I would find it really difficult to investigate the meaning of each one of them, and even harder to disseminate the results; I would strongly welcome a more parsimonious model (btw, despite the tons of predictors, the R-sq is quite low; while that may be frequent in your research field, it is a bit striking);
          - you have many dummies that you have presumably created by hand. Setting aside the implicit risk of errors, you can use your precious time more efficiently by simply relying upon the wonderful capabilities of -fvvarlist-, which also has a virtuous link with commands such as -margins- and -marginsplot-, both deserving a thorough look.
          Kind regards,
          Carlo
          (Stata 19.0)
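
          As a sketch of this suggestion (assuming the categorical variables sportcode, year and county that appear in the later posts), factor-variable notation builds and drops the dummies automatically:

          Code:
          * Factor-variable notation replaces the hand-made dummies; Stata
          * creates the indicators and picks the omitted (base) levels itself:
          regress logdiff logincome logpop logpoorhealth logfin logsport ///
              logpop19 logurban logunemployed loghigerpro ///
              i.sportcode i.year i.county
          * Joint tests and adjusted predictions then come almost for free:
          testparm i.county
          margins sportcode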



          • #6
            Carlo and other researchers, perhaps I could get your opinions on something. I've been playing around with my data recently and I've run into a bit of an issue.

            The first regression takes into account differences amongst sports and years, while the second also includes counties. I notice that including the county dummies lowers the F-value quite a bit. However, if I were to follow theory, I probably should include all three sets of dummies, as there are substantial differences across years, sports and regions.

            Not only this, but when I decompose my dataset to look at individual sports, I find statistically significant evidence of lobby-group bias.

            However, this only occurs when I include the county dummies. Also, the F-value is lower with them than without them.

            I'm wondering whether I should include them? I can justify them with theory, but they don't appear to improve the overall fit of the model greatly.

            Code:
            regress loggrant loghigerpro logunemployed logurban logfai logirfu loggaa logsport logfin logpop logincome logpoorhealth logpop19p  i.sportcode i.year
            
                  Source |       SS           df       MS      Number of obs   =     7,583
            -------------+----------------------------------   F(62, 7520)     =     51.99
                   Model |  3633.04627        62  58.5975205   Prob > F        =    0.0000
                Residual |  8475.34904     7,520   1.1270411   R-squared       =    0.3000
            -------------+----------------------------------   Adj R-squared   =    0.2943
                   Total |  12108.3953     7,582  1.59699226   Root MSE        =    1.0616
            
            -------------------------------------------------------------------------------
                 loggrant |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
            --------------+----------------------------------------------------------------
              loghigerpro |  -.0987982   .0493194    -2.00   0.045     -.195478   -.0021184
            logunemployed |  -.2126096   .0571403    -3.72   0.000    -.3246205   -.1005987
                 logurban |   .0390068   .0123137     3.17   0.002     .0148685     .063145
                   logfai |   .0512335   .0212281     2.41   0.016     .0096206    .0928465
                  logirfu |  -.0450759   .0178806    -2.52   0.012    -.0801269    -.010025
                   loggaa |   .0307933   .0212126     1.45   0.147    -.0107893    .0723759
                 logsport |   -.098738   .0174163    -5.67   0.000    -.1328788   -.0645972
                   logfin |  -.0618871     .01898    -3.26   0.001    -.0990932    -.024681
                   logpop |   .0685787    .018686     3.67   0.000     .0319489    .1052086
                logincome |    .756162   .1748462     4.32   0.000     .4134145    1.098909
            logpoorhealth |  -.0158159   .0329365    -0.48   0.631    -.0803806    .0487488
                logpop19p |   .0806889   .0645585     1.25   0.211    -.0458637    .2072415
                          |
                sportcode |
                       2  |  -.1298185     .03766    -3.45   0.001    -.2036426   -.0559944
                       3  |  -.1751473   .0367436    -4.77   0.000     -.247175   -.1031197
                       4  |   .0718141   .0654227     1.10   0.272    -.0564327     .200061
                       5  |  -.6381395    .072909    -8.75   0.000    -.7810614   -.4952175
                       6  |   -1.03961   .0753876   -13.79   0.000    -1.187391   -.8918295
                       7  |  -.8888204   .0820435   -10.83   0.000    -1.049649   -.7279923
                       8  |  -1.177836   .1342817    -8.77   0.000    -1.441066   -.9146066
                       9  |  -1.974569   .0708176   -27.88   0.000    -2.113392   -1.835747
                      10  |   -1.42275   .1456314    -9.77   0.000    -1.708229   -1.137272
                      11  |  -1.351686   .1414087    -9.56   0.000    -1.628887   -1.074486
                      12  |  -1.600075   .1537277   -10.41   0.000    -1.901425   -1.298726
                      13  |  -1.041823   .1538115    -6.77   0.000    -1.343337     -.74031
                      14  |  -2.498457   .1235386   -20.22   0.000    -2.740627   -2.256286
                      15  |  -1.875008   .1587789   -11.81   0.000    -2.186259   -1.563757
                      16  |  -.5415392   .2331021    -2.32   0.020    -.9984844    -.084594
                      17  |  -1.897852   .1703317   -11.14   0.000     -2.23175   -1.563955
                      18  |  -1.392869    .285732    -4.87   0.000    -1.952984   -.8327544
                      19  |   -.035337   .1555039    -0.23   0.820    -.3401681    .2694941
                      20  |  -1.514224   .1226583   -12.35   0.000    -1.754669    -1.27378
                      21  |   -1.28315   .1057248   -12.14   0.000      -1.4904     -1.0759
                      22  |  -1.274398   .2516227    -5.06   0.000    -1.767649   -.7811476
                      23  |  -2.069863   .2336495    -8.86   0.000    -2.527882   -1.611845
                      24  |  -1.358663   .2329792    -5.83   0.000    -1.815368    -.901959
                      25  |  -.4257372    .135568    -3.14   0.002    -.6914884   -.1599861
                      26  |  -1.523942   .3762479    -4.05   0.000    -2.261493   -.7863907
                      27  |  -1.093699   .0798383   -13.70   0.000    -1.250204   -.9371933
                      28  |  -2.295149   .3379156    -6.79   0.000    -2.957558    -1.63274
                      29  |  -2.053676   .4767216    -4.31   0.000    -2.988184   -1.119169
                      30  |  -.8778456   .3768132    -2.33   0.020    -1.616505   -.1391865
                      31  |  -1.143671   .1566912    -7.30   0.000    -1.450829   -.8365125
                      33  |  -1.344077    .284875    -4.72   0.000    -1.902512   -.7856427
                      34  |  -1.692757   .4775486    -3.54   0.000    -2.628886   -.7566286
                      35  |  -1.576368   .2516712    -6.26   0.000    -2.069714   -1.083022
                      36  |  -1.921748   .5331485    -3.60   0.000    -2.966868   -.8766279
                      37  |  -2.066121   .5332202    -3.87   0.000    -3.111382    -1.02086
                      38  |  -1.905228    .753015    -2.53   0.011    -3.381348   -.4291085
                      39  |   .0964159   .4766241     0.20   0.840    -.8379006    1.030732
                      40  |  -1.032804   .4760711    -2.17   0.030    -1.966036   -.0995716
                      42  |  -.7095284   .6138363    -1.16   0.248    -1.912819    .4937623
                      43  |  -1.563416   .1140864   -13.70   0.000    -1.787058   -1.339775
                      44  |  -2.786147   .2389724   -11.66   0.000      -3.2546   -2.317694
                          |
                     year |
                    2003  |  -.0602795   .0570716    -1.06   0.291    -.1721557    .0515967
                    2004  |  -.1141819   .0559517    -2.04   0.041     -.223863   -.0045009
                    2005  |   -.016431   .0607036    -0.27   0.787     -.135427     .102565
                    2006  |   .0153062   .0575133     0.27   0.790     -.097436    .1280483
                    2007  |  -.0510352   .0559599    -0.91   0.362    -.1607323    .0586619
                    2008  |  -.3981978   .0609832    -6.53   0.000    -.5177419   -.2786538
                    2012  |  -.3831891   .0577516    -6.64   0.000    -.4963983   -.2699798
                    2014  |   -.482317   .0538192    -8.96   0.000    -.5878177   -.3768163
                    2015  |  -.4767951   .0526757    -9.05   0.000    -.5800543    -.373536
                          |
                    _cons |   2.754542   1.824743     1.51   0.131    -.8224647    6.331548
            -------------------------------------------------------------------------------
            Code:
            regress loggrant loghigerpro logunemployed logurban logfai logirfu loggaa logsport logfin logpop logincome logpoorhealth logpop19p i.year i.sportcode i.county
            
                  Source |       SS           df       MS      Number of obs   =     7,583
            -------------+----------------------------------   F(95, 7487)     =     35.93
                   Model |  3791.68682        95  39.9124928   Prob > F        =    0.0000
                Residual |  8316.70849     7,487  1.11081989   R-squared       =    0.3131
            -------------+----------------------------------   Adj R-squared   =    0.3044
                   Total |  12108.3953     7,582  1.59699226   Root MSE        =     1.054
            
            -------------------------------------------------------------------------------
                 loggrant |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
            --------------+----------------------------------------------------------------
              loghigerpro |  -.0489985   .0523996    -0.94   0.350    -.1517164    .0537193
            logunemployed |  -.1090394   .0622674    -1.75   0.080     -.231101    .0130222
                 logurban |  -.0132887   .0145811    -0.91   0.362    -.0418717    .0152943
                   logfai |   .0604491   .0373234     1.62   0.105    -.0127152    .1336134
                  logirfu |  -.0540334   .0480774    -1.12   0.261    -.1482786    .0402118
                   loggaa |  -.0332126   .0231291    -1.44   0.151    -.0785522     .012127
                 logsport |  -.0818258   .0184433    -4.44   0.000    -.1179798   -.0456718
                   logfin |  -.1337954   .0224056    -5.97   0.000    -.1777166   -.0898742
                   logpop |   .1072683   .0209504     5.12   0.000     .0661995     .148337
                logincome |  -.2298102   .4080065    -0.56   0.573    -1.029617    .5699971
            logpoorhealth |  -.0036305   .0342179    -0.11   0.916    -.0707072    .0634462
                logpop19p |   .0983099    .070649     1.39   0.164     -.040182    .2368018
                          |
                     year |
                    2003  |  -.0418567   .0569132    -0.74   0.462    -.1534225    .0697091
                    2004  |  -.0324271   .0620569    -0.52   0.601     -.154076    .0892218
                    2005  |   .1151039   .0743678     1.55   0.122    -.0306778    .2608857
                    2006  |   .1140223   .0714007     1.60   0.110    -.0259432    .2539877
                    2007  |   .0546304   .0711782     0.77   0.443    -.0848989    .1941597
                    2008  |  -.2881585   .0741694    -3.89   0.000    -.4335513   -.1427657
                    2012  |   -.389526   .0582052    -6.69   0.000    -.5036245   -.2754274
                    2014  |  -.4494087   .0539482    -8.33   0.000    -.5551624    -.343655
                    2015  |  -.4336512   .0528971    -8.20   0.000    -.5373443   -.3299581
                          |
                sportcode |
                       2  |   -.169239   .0379053    -4.46   0.000    -.2435439    -.094934
                       3  |  -.1906378   .0369197    -5.16   0.000    -.2630108   -.1182649
                       4  |      .0467   .0652932     0.72   0.474     -.081293     .174693
                       5  |  -.6444995   .0727039    -8.86   0.000    -.7870195   -.5019795
                       6  |  -1.049163   .0758606   -13.83   0.000    -1.197871   -.9004549
                       7  |  -.9013549    .082002   -10.99   0.000    -1.062102   -.7406079
                       8  |  -1.177664   .1335729    -8.82   0.000    -1.439504   -.9158231
                       9  |  -1.988395   .0705994   -28.16   0.000     -2.12679   -1.850001
                      10  |  -1.474179   .1453434   -10.14   0.000    -1.759093   -1.189265
                      11  |   -1.38631   .1409164    -9.84   0.000    -1.662546   -1.110075
                      12  |  -1.596462   .1530341   -10.43   0.000    -1.896451   -1.296472
                      13  |  -1.054114   .1532257    -6.88   0.000    -1.354479   -.7537481
                      14  |  -2.531282   .1234557   -20.50   0.000     -2.77329   -2.289274
                      15  |  -1.894226    .158019   -11.99   0.000    -2.203988   -1.584465
                      16  |  -.4717261   .2319465    -2.03   0.042    -.9264065   -.0170457
                      17  |  -1.926445   .1695962   -11.36   0.000    -2.258901   -1.593989
                      18  |  -1.386934   .2840673    -4.88   0.000    -1.943786   -.8300825
                      19  |  -.0786224   .1551054    -0.51   0.612    -.3826726    .2254277
                      20  |  -1.542844   .1224877   -12.60   0.000    -1.782955   -1.302734
                      21  |  -1.267133   .1059974   -11.95   0.000    -1.474918   -1.059348
                      22  |  -1.245714   .2503206    -4.98   0.000    -1.736412   -.7550153
                      23  |  -2.094335   .2323287    -9.01   0.000    -2.549764   -1.638906
                      24  |  -1.367633   .2317864    -5.90   0.000    -1.821999    -.913266
                      25  |  -.4222761   .1350833    -3.13   0.002    -.6870773   -.1574749
                      26  |  -1.391187   .3745434    -3.71   0.000    -2.125397   -.6569763
                      27  |  -1.075647   .0798408   -13.47   0.000    -1.232157   -.9191366
                      28  |  -2.363229   .3368247    -7.02   0.000      -3.0235   -1.702958
                      29  |   -2.07293   .4743385    -4.37   0.000    -3.002767   -1.143094
                      30  |  -.9424945   .3747032    -2.52   0.012    -1.677018    -.207971
                      31  |  -1.207879   .1563894    -7.72   0.000    -1.514447   -.9013122
                      33  |  -1.409991   .2836258    -4.97   0.000    -1.965977   -.8540047
                      34  |   -1.79131   .4753001    -3.77   0.000    -2.723032   -.8595887
                      35  |  -1.521952   .2512427    -6.06   0.000    -2.014458   -1.029446
                      36  |  -1.961444   .5302006    -3.70   0.000    -3.000786   -.9221022
                      37  |  -1.990399   .5320878    -3.74   0.000    -3.033441   -.9473577
                      38  |  -1.832694    .748265    -2.45   0.014    -3.299504   -.3658844
                      39  |   .2147184   .4765325     0.45   0.652    -.7194193    1.148856
                      40  |  -1.077735   .4735988    -2.28   0.023    -2.006122   -.1493482
                      42  |  -.6149759   .6110236    -1.01   0.314    -1.812754     .582802
                      43  |  -1.563249   .1140136   -13.71   0.000    -1.786748    -1.33975
                      44  |  -2.793633   .2378974   -11.74   0.000    -3.259978   -2.327287
                          |
                   county |
                       2  |   .6360787   .2052065     3.10   0.002     .2338163    1.038341
                       3  |   .5975108   .1974272     3.03   0.002      .210498    .9845236
                       4  |   .4917607   .1809716     2.72   0.007     .1370056    .8465158
                       5  |   .5112731   .2018006     2.53   0.011     .1156872     .906859
                       6  |   .2892545   .1449712     2.00   0.046     .0050703    .5734388
                       7  |   .0712982   .1204469     0.59   0.554    -.1648115    .3074079
                       8  |  -.2873423   .1188832    -2.42   0.016    -.5203868   -.0542978
                       9  |  -.1108365    .134687    -0.82   0.411    -.3748609    .1531878
                      10  |   .1365111   .1242575     1.10   0.272    -.1070686    .3800908
                      11  |   .2855251   .1399108     2.04   0.041     .0112607    .5597895
                      12  |  -.1887262   .1209871    -1.56   0.119    -.4258948    .0484425
                      13  |   .2112139   .1225024     1.72   0.085    -.0289252     .451353
                      14  |   .2367532   .1121629     2.11   0.035     .0168824    .4566239
                      15  |   .2149528   .1368756     1.57   0.116    -.0533618    .4832674
                      16  |   .2182693   .1253407     1.74   0.082    -.0274337    .4639723
                      17  |   .8316241   .1520652     5.47   0.000     .5335335    1.129715
                      18  |   .2956575   .1287722     2.30   0.022     .0432279    .5480871
                      19  |   .1097823   .1267504     0.87   0.386    -.1386841    .3582487
                      20  |    .616297   .1609086     3.83   0.000     .3008708    .9317231
                      21  |   .2211162   .1369177     1.61   0.106     -.047281    .4895134
                      22  |   .2849739   .1380418     2.06   0.039     .0143732    .5555747
                      23  |   .3085458   .1480367     2.08   0.037     .0183522    .5987394
                      24  |    .659901   .1534404     4.30   0.000     .3591147    .9606873
                      25  |   .3441669   .1392843     2.47   0.013     .0711305    .6172032
                      26  |   .5742841   .1560863     3.68   0.000     .2683112     .880257
                      27  |   .2556778   .1197229     2.14   0.033     .0209874    .4903683
                      28  |   .0584679   .1421929     0.41   0.681    -.2202702     .337206
                      29  |   .1977397   .1265268     1.56   0.118    -.0502884    .4457678
                      30  |   .0862083   .1300676     0.66   0.507    -.1687608    .3411774
                      31  |   .2540976   .1394228     1.82   0.068    -.0192101    .5274054
                      32  |   .2681752   .1282007     2.09   0.036     .0168657    .5194846
                      33  |   .2440777   .1420397     1.72   0.086    -.0343599    .5225153
                      34  |   .0893366   .1329609     0.67   0.502    -.1713041    .3499773
                          |
                    _cons |   13.35235   4.121811     3.24   0.001      5.27244    21.43225
            -------------------------------------------------------------------------------
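
            One way to settle whether the county dummies belong (a sketch, reusing the specification from the posted output above) is a joint F-test of all their coefficients rather than the overall model F:

            Code:
            * Fit the larger model, then jointly test the county dummies:
            quietly regress loggrant loghigerpro logunemployed logurban ///
                logfai logirfu loggaa logsport logfin logpop logincome ///
                logpoorhealth logpop19p i.year i.sportcode i.county
            * F-test of H0: all county coefficients are jointly zero
            testparm i.county

            If the test rejects, the counties jointly add explanatory power even though the overall F statistic falls; that statistic is spread over more restrictions, so a drop in it does not by itself argue against the dummies.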



            • #7
              I support everything Carlo says on the intercept. In fact I would go further.

              Sometimes with variations on y = b_0 + b_1 x there are grounds for thinking that the limiting behaviour justifies forcing the model through the origin. In the simplest case there might be physical or other laws implying proportionality y = b x.

              When log y = b_0 + b_1 log x_1 + other stuff and so forth, the equivalent grounds for forcing would be that log y = 0 when all predictors are 0, and so that y = 1 when all the antilogged predictors are 1 and all the indicators are 0. I suspect that is a condition which is in no sense privileged; in fact I would be amazed if it had any substantive or special interpretation for you, particularly as in many cases the units of measurement are arbitrary (e.g. whether you measure urban as percent or per thousand; if that example is guessed wrongly, there will be others).

              In short, leave the intercept alone. It does no harm and indeed omitting it will complicate interpretation.

              On the model as a whole, I have to express a prejudice that it is over-complicated, but then I have not grappled with the data and am not familiar with the context. I would use Poisson regression here, not regression with a logged response. See Bill Gould's instant classic blog post on this.
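
              A sketch of that suggestion, assuming the unlogged grant amount is stored in a variable such as grant (a hypothetical name; substitute your own): Poisson regression with robust standard errors models the expected grant on the log scale directly and does not require the response to be a count.

              Code:
              * Poisson regression on the raw (unlogged) grant amount; the
              * variable name "grant" is assumed. vce(robust) keeps inference
              * valid even though grants are not counts:
              poisson grant loghigerpro logunemployed logurban logsport ///
                  logfin logpop logincome i.year i.sportcode i.county, ///
                  vce(robust)
              * Coefficients read like those from a logged-response OLS:
              * approximate semi-elasticities of the expected grant.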



              • #8
                Thank you for the reply Nick!



                • #9
                  Sean:
                  as usual, Nick gave precious advice.
                  From my side, I would not question the way you tackle the data analysis substantively, but I still express a strong concern about the number of predictors included in both of your regression models.
                  Isn't there a feasible way to keep things as simple as possible, though not simpler?
                  Kind regards,
                  Carlo
                  (Stata 19.0)
