  • High standard error and coefficient

    Hello users!
    I have a problem with my logistic regression: one of my variables (log income) is extremely significant, but its odds ratio is 39.5464, its coefficient is 3.66506, and its standard error is 8.691485. I do not know how to interpret this or whether it is correct. If it is not, what can I do?
    Thank you so much
    Luke Brown

  • #2
    While odds ratios of that magnitude are almost never seen in the real world, to say something more specific would require examining this result in the context of the entire logistic regression output and an example of your data. Please post back with those.

    To make the logistic regression output readable here, be sure to post it between code delimiters. (Read the Forum FAQ, with special attention to #12, if you are not familiar with code delimiters.)

    For the example data, please use the -dataex- command. If you are running version 15.1 or a fully updated version 14.2, it is already part of your official Stata installation. If not, run -ssc install dataex- to get it. Either way, run -help dataex- to read the simple instructions for using it. -dataex- will save you time; it is easier and quicker than typing out tables. It includes complete information about aspects of the data that are often critical to answering your question but cannot be seen from tabular displays or screenshots. It also makes it possible for those who want to help you to create a faithful representation of your example to try out their code, which in turn makes it more likely that their answer will actually work in your data.
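
    In commands, the -dataex- steps above amount to roughly the following sketch (by default, -dataex- with no variable list shows the first 100 observations of all variables; name specific variables if you want to limit the example):

    Code:
    * install -dataex- if it did not ship with your Stata installation
    ssc install dataex

    * read the brief usage instructions
    help dataex

    * produce a paste-ready example of the data in memory
    dataex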

    • #3
      Thank you, Clyde. I hope I am doing this correctly. As I mentioned before, I am using a logistic regression. The dependent variable is the dummy hequity (1 = having equity, 0 = not having equity).

      Code:
      * Example generated by -dataex-. To install: ssc install dataex
      clear
      input byte(hequity hhsex age educ race) float(logsaving logincome logincome2 crisis)
      0 2 71 10 1         0  9.485262  89.97019 1
      0 1 20  8 3         0 10.011355 100.22723 1
      1 1 30  9 1  8.006368 11.629843 135.25325 1
      1 2 33  8 3  5.298317 10.657982 113.59258 1
      1 1 41 13 1  8.935904   11.6477 135.66893 1
      0 2 25  8 1         0  9.628363  92.70537 1
      1 1 29 13 1  9.740969 11.063447 122.39986 1
      0 2 30  6 2         0  10.21615  104.3697 1
      1 1 21  9 1  9.159047  9.753526  95.13126 1
      0 2 48 11 1  6.802395 10.657982 113.59258 1
      1 1 62  8 5  7.313221  10.76833 115.95694 1
      0 2 83 10 1         0  9.405219  88.45815 1
      0 1 37  6 1         0  8.529751  72.75665 1
      1 1 45 12 3 10.434115 11.665245 136.07794 1
      1 1 68  8 1 10.106428  11.48466 131.89745 1
      1 1 45 12 5 11.418614  13.79864 190.40245 1
      0 1 22  8 1 2.3025851  10.13919 102.80315 1
      1 1 64 14 1 11.058795 12.997954 168.94682 1
      1 2 93  8 1         0 10.549973 111.30192 1
      1 1 61 14 1 12.206073 15.400842 237.18593 1
      0 2 55  8 1  7.377759 10.583874 112.01839 1
      1 2 85 14 1 13.495305   12.1298 147.13202 1
      0 2 30  8 2         0  10.17841     103.6 1
      1 1 48 12 2         0 11.930948 142.34752 1
      0 1 68  7 1 3.4011974 10.726975   115.068 1
      0 1 27  8 1         0 10.287608 105.83488 1
      0 2 61  5 3         0 10.011355 100.22723 1
      1 1 74 13 5 10.404263 16.196535 262.32776 1
      1 1 42 12 1  9.740969  11.39765 129.90642 1
      1 1 50  8 1  7.554859 11.452912  131.1692 1
      1 1 51  8 1         0  12.23352 149.65898 1
      1 1 42 12 1 10.596635 12.262647  150.3725 1
      0 2 57 13 1         0 11.047447  122.0461 1
      1 1 55 11 1  6.214608 11.629843 135.25325 1
      0 1 40  8 1         0 10.748954    115.54 1
      1 1 59  9 1 14.332517 15.369226  236.2131 1
      1 2 65  8 1 9.2103405  8.712072  75.90021 1
      1 1 58 12 1 12.483705 13.585997  184.5793 1
      1 1 48 13 5         0 11.620793 135.04283 1
      1 2 43  9 2  9.035987 10.386048    107.87 1
      1 1 58 12 1  8.517193 10.980756   120.577 1
      1 1 41 12 3  9.546813 12.135248 147.26425 1
      1 1 31  8 3  10.29553 11.420122 130.41919 1
      0 1 45  7 1         0  10.13919 102.80315 1
      0 1 29  9 2         0  9.964835  99.29794 1
      1 1 44 12 1 10.714417 12.067807 145.63197 1
      1 1 40 12 1  11.05089 13.200708  174.2587 1
      1 1 56 12 1         0 11.109967 123.43137 1
      1 1 55 13 1  12.00701 13.502337  182.3131 1
      0 1 42 12 1  5.010635 11.210772  125.6814 1
      1 1 69 13 1 13.122363 12.607287  158.9437 1
      0 2 30  9 2         0  9.485262  89.97019 1
      0 2 51  5 2         0  9.916045  98.32795 1
      1 2 59 12 1  5.298317 11.620793 135.04283 1
      0 2 32  1 3         0  9.222898  85.06184 1
      0 1 48  8 1         0 10.726975   115.068 1
      1 2 90 12 1         0  9.916045  98.32795 1
      1 1 67  9 1         0  11.27702  127.1712 1
      1 1 75  8 1  9.472705 15.223074 231.74197 1
      0 2 69 12 1  8.517193 11.602444  134.6167 1
      0 1 47  7 2         0 10.583874 112.01839 1
      1 1 35  8 1  9.472705 11.803115 139.31352 1
      0 1 66 13 1 12.100712 11.780125 138.77133 1
      0 1 30  8 1         0  9.692902  93.95234 1
      0 1 42 12 1         0  9.485262  89.97019 1
      1 1 77 14 1         0 14.673936  215.3244 1
      0 1 68 12 1         0 11.832968 140.01912 1
      0 1 41  8 3  6.802395  11.13982  124.0956 1
      1 1 61 12 1         0 11.673903    136.28 1
      1 1 62 14 5         0  15.05771  226.7346 1
      1 1 66 13 1         0 11.620793 135.04283 1
      1 1 77 12 1         0 11.564703 133.74236 1
      1 1 48  8 1  8.281724 11.420122 130.41919 1
      1 1 61  8 1 10.680516 12.318475 151.74483 1
      0 2 41  8 2 3.4011974  9.864752  97.31333 1
      1 1 32  8 1 2.3025851  10.32151 106.53358 1
      1 1 45 13 1  6.907755 11.707805  137.0727 1
      0 2 82  9 1         0 10.657982 113.59258 1
      1 1 40 14 3  11.91839  9.628363  92.70537 1
      0 1 44  8 1         0 10.681513 114.09472 1
      1 1 53  8 3         0  9.964835  99.29794 1
      0 2 46  4 2         0  9.916045  98.32795 1
      0 1 71 13 1  8.853665   11.2378 126.28816 1
      0 2 56  8 1         0 10.386048    107.87 1
      0 1 73 13 1         0 11.620793 135.04283 1
      1 2 62 13 2  8.853665 11.803115 139.31352 1
      1 1 71 12 1         0   10.3543 107.21152 1
      1 1 74 13 1  8.517193  13.02956 169.76944 1
      0 1 28 10 1   6.44572 10.748954    115.54 1
      1 1 71 13 1         0  15.20633  231.2325 1
      1 1 59 12 1 12.100712 12.193312 148.67685 1
      1 1 76 14 1         0         .         . 1
      0 2 68  9 1         0 10.011355 100.22723 1
      1 1 70 13 1         0 11.463607 131.41428 1
      0 2 35  9 2         0 10.657982 113.59258 1
      1 1 38 12 5 16.811293 14.607392  213.3759 1
      1 1 66 12 1         0  12.25785  150.2549 1
      1 1 61  8 1         0  12.05611 145.34982 1
      1 2 38 12 1 9.2103405  11.13982  124.0956 1
      1 1 62 12 1  5.940171 12.198427 148.80162 1
      end

      • #4
        Thank you for posting the data. You did not show the command you used. When I run
        Code:
        logistic hequity i.hhsex age educ i.race logsaving logincome logincome2 i.crisis
        in that data, I get mostly normal-looking results:

        Code:
        . logistic hequity i.hhsex age educ i.race logsaving logincome logincome2 i.crisis
        note: 5.race != 0 predicts success perfectly
              5.race dropped and 6 obs not used
        
        note: 1.crisis omitted because of collinearity
        
        Logistic regression                             Number of obs     =         93
                                                        LR chi2(8)        =      52.70
                                                        Prob > chi2       =     0.0000
        Log likelihood = -36.897151                     Pseudo R2         =     0.4166
        
        ------------------------------------------------------------------------------
             hequity | Odds Ratio   Std. Err.      z    P>|z|     [95% Conf. Interval]
        -------------+----------------------------------------------------------------
             2.hhsex |    .468794   .3588664    -0.99   0.322     .1045632     2.10177
                 age |    1.02441    .022423     1.10   0.271     .9813915    1.069315
                educ |   1.121859   .1679708     0.77   0.442     .8365499    1.504474
                     |
                race |
                  2  |   1.151848   1.199959     0.14   0.892     .1494998     8.87462
                  3  |   3.169356    3.26454     1.12   0.263     .4209257    23.86363
                  5  |          1  (empty)
                     |
           logsaving |   1.206748   .1021176     2.22   0.026     1.022318    1.424449
           logincome |   .0000113   .0000807    -1.60   0.109     9.97e-12    12.90007
          logincome2 |    1.77182   .5984883     1.69   0.090     .9139015    3.435104
            1.crisis |          1  (omitted)
               _cons |   1.02e+23   3.80e+24     1.42   0.156     1.70e-09    6.10e+54
        ------------------------------------------------------------------------------
        Note: _cons estimates baseline odds.
        Note: 0 failures and 2 successes completely determined.
        Here, the constant term is bizarre: it is astronomically large, and its confidence interval runs, for practical purposes, from zero to infinity.

        I notice that you have a variable logincome and also a variable logincome2. The latter appears to be the square of the former. They have a correlation coefficient of 0.996, so you have very intense multicollinearity between them, which can account for unstable and very imprecisely estimated results. Moreover, from a modeling perspective, it is rather unusual to include both the log of income and the square of the log of income, and I'm wondering what your motivation for doing that is. In any case, within this data, logincome2 adds hardly any new information, and its presence is destabilizing the model. I suggest you drop the logincome2 variable. That will leave you with stable results.
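
        In commands, the check and the refit amount to something like this (the second command is simply the model shown above with logincome2 dropped):

        Code:
        * check how strongly logincome and its square are correlated
        correlate logincome logincome2

        * refit the model without logincome2
        logistic hequity i.hhsex age educ i.race logsaving logincome i.crisis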


        • #5
          Thank you so much, that was really helpful. I dropped that variable as you suggested. But now another question arises: what was that bizarre value of my constant term telling me?

          • #6
            Well, at least in the example data you showed, the logistic regression leaving out logincome2 produces normal-looking output, and the constant term is no longer huge, nor does it have a huge standard error.

            In the model that included logincome2 and produced that bizarre result, I think the only conclusion you can draw is that the model is badly specified and needs to be changed.
