  • Trying to understand logit tables

    Hi all,

    I'm fairly new to this so please bear with me.

    I've run a logit model that includes the variables 'brexit' and 'political_knowledge'. The variables and coefficients themselves are not too important; what I'm trying to understand is why only half the values show up.

    I have pasted the output below (apologies, I haven't got my head around using dataex for logit tables), with a sketch of the commands I used after it. As you can see, only a value for Remain appears, and when I interacted brexit with political_knowledge, it still only shows the Remain values.

    Why is this?


    Code:
                     | Coefficient  Std. err.      z    P>|z|     [95% conf. interval]
        -------------+----------------------------------------------------------------
              brexit |
              Remain |  -1.143769   .4453552    -2.57   0.010     -2.01665   -.2708892
                     |
        political_knowledge##brexit |
           OK#Remain |   1.376699   .6378883     2.16   0.031     .1264606    2.626937
         Good#Remain |   .7214613   .5805425     1.24   0.214    -.4163811    1.859304
        Great#Remain |   1.188758   .5590191     2.13   0.033     .0931012    2.284416
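
    For context, here is a sketch of the sort of commands I ran (the outcome name vote is a placeholder; I haven't pasted my exact logit command):

    Code:
    * placeholder outcome 'vote'; both predictors entered with factor-variable notation
    logit vote i.brexit i.political_knowledge
    logit vote i.political_knowledge##i.brexit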

  • #2
    Presumably Remain is an indicator variable with values (0, 1). (Other names are binary, dichotomous, dummy, quantal, zero-one, Boolean.)

    Whenever such a variable is a predictor (covariate, independent variable, whatever), only one coefficient is shown: by default, the effect of that predictor being 1 rather than 0. (You can flip the default by changing the factor-variable notation in your unstated logit command.)

    Here's an example every Stata user can run.

    Code:
    . webuse nlswork, clear
    (National Longitudinal Survey of Young Women, 14-24 years old in 1968)
    
    
    . logit nev_mar i.collgrad i.c_city
    
    Iteration 0:   log likelihood = -15362.312  
    Iteration 1:   log likelihood = -15076.255  
    Iteration 2:   log likelihood = -15073.457  
    Iteration 3:   log likelihood = -15073.457  
    
    Logistic regression                                     Number of obs = 28,510
                                                            LR chi2(2)    = 577.71
                                                            Prob > chi2   = 0.0000
    Log likelihood = -15073.457                             Pseudo R2     = 0.0188
    
    ------------------------------------------------------------------------------
         nev_mar | Coefficient  Std. err.      z    P>|z|     [95% conf. interval]
    -------------+----------------------------------------------------------------
      1.collgrad |   .1196052   .0372292     3.21   0.001     .0466373    .1925731
        1.c_city |   .6840089   .0286618    23.86   0.000     .6278329    .7401849
           _cons |  -1.504111   .0201021   -74.82   0.000     -1.54351   -1.464711
    ------------------------------------------------------------------------------
    
    
    . logit nev_mar ib1.collgrad ib1.c_city
    
    Iteration 0:   log likelihood = -15362.312  
    Iteration 1:   log likelihood = -15076.255  
    Iteration 2:   log likelihood = -15073.457  
    Iteration 3:   log likelihood = -15073.457  
    
    Logistic regression                                     Number of obs = 28,510
                                                            LR chi2(2)    = 577.71
                                                            Prob > chi2   = 0.0000
    Log likelihood = -15073.457                             Pseudo R2     = 0.0188
    
    ------------------------------------------------------------------------------
         nev_mar | Coefficient  Std. err.      z    P>|z|     [95% conf. interval]
    -------------+----------------------------------------------------------------
      0.collgrad |  -.1196052   .0372292    -3.21   0.001    -.1925731   -.0466373
        0.c_city |  -.6840089   .0286618   -23.86   0.000    -.7401849   -.6278329
           _cons |  -.7004968   .0372216   -18.82   0.000    -.7734497   -.6275439
    ------------------------------------------------------------------------------
    
    .
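
    If you would rather keep the original coding and simply see the omitted category listed, the baselevels display option reports base levels explicitly in the output (a reporting option only; the estimates are unchanged):

    Code:
    . * same model; the base category now appears as an explicit (base) row
    . logit nev_mar i.collgrad i.c_city, baselevels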

    • #3
      Thank you very much, Nick.
