
  • Problem with latent class estat lcmean

    Hi all,

    I am having a problem getting the command "estat lcmean" to work for a latent class model. It works for a very simple model, but for the somewhat more complex one I am trying to fit, it simply spins indefinitely. I have left it running for several hours with no output. (Note: there are some comments [but as yet no answer that I see] regarding an error that some users are getting. I have not gotten an error. Just...nothing. No output. It runs until I hit Break.) The estat lcgof command works fine, as does estat lcprob (slow: minutes, not hours).

    Specifics: The model is a bit complex but not ridiculously so: 1,555 cases, 21 binary variables. Code is below. I am running Stata/IC 15.1 on a Mac (actually two: I have tried both desktop and laptop). I could not get the model to estimate at all without the nonrtolerance option, which I found in a separate topic. Not sure why that would be an issue here, though.

    Code:
    gsem (v41safetymuslims v41moralsmuslims v41intolermuslims v41politicsmuslims v41commmuslims v41jobsmuslims v41welfaremuslims ///
    v41safetyjews v41moralsjews v41intolerjews v41politicsjews v41commjews v41jobsjews v41welfarejews ///
    v41safetyafams v41moralsafams v41intolerafams v41politicsafams v41commafams v41jobsafams v41welfareafams <- _cons) ///
    [pweight=weight1] if zWhiteR==1, logit startvalues(randomid, draws(8) seed(193693)) em(iter(5)) lclass(A 5) nodvheader nonrtolerance

    Thanks for any help/advice!

  • #2
    ... Sorry let me try that code line again...

    Code:
    gsem (v41safetymuslims v41moralsmuslims v41intolermuslims v41politicsmuslims v41commmuslims v41jobsmuslims v41welfaremuslims ///
    v41safetyjews v41moralsjews v41intolerjews v41politicsjews v41commjews v41jobsjews v41welfarejews ///
    v41safetyafams v41moralsafams v41intolerafams v41politicsafams v41commafams v41jobsafams v41welfareafams <- _cons) ///
    [pweight=weight1] if zWhiteR==1, logit startvalues(randomid, draws(8) seed(193693)) em(iter(5)) lclass(A 5) ///
    nodvheader nonrtolerance
    
    estat lcgof
    
    estat lcprob
    
    estat lcmean



    • #3
      Were any of the logit intercepts near +15 or -15? For the record, those correspond to item endorsement probabilities of almost 1 and almost 0, respectively. Mplus, it seems, automatically constrains logit intercepts when they hit +/-15. In Stata, we have to do it manually. Logit intercepts at those levels can cause some estimation trouble.
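      For a concrete check of that correspondence, Stata's invlogit() function converts a logit intercept into a probability (a quick sketch, nothing model-specific):

      Code:
      * convert logit intercepts to endorsement probabilities
      display invlogit(15)    // roughly .9999997, i.e. almost 1
      display invlogit(-15)   // roughly 3.1e-07, i.e. almost 0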

      When that happened to me recently, I believe it also slowed the estat commands down. You should check whether those probabilities are reasonable given your theoretical knowledge, or whether the latent class has very few members. Nonetheless, try constraining the intercepts and see if that helps. The procedure was outlined in the post you mentioned, but it would look something like this. Say that for class 5, you saw that the intercepts for the last 3 items had gone over 15:

      Code:
       
       gsem (v41safetymuslims v41moralsmuslims v41intolermuslims v41politicsmuslims v41commmuslims v41jobsmuslims v41welfaremuslims ///
       v41safetyjews v41moralsjews v41intolerjews v41politicsjews v41commjews v41jobsjews v41welfarejews ///
       v41safetyafams v41moralsafams v41intolerafams v41politicsafams v41commafams v41jobsafams v41welfareafams <- _cons) ///
       (5: v41commafams <- _cons@15) (5: v41jobsafams <- _cons@15) (5: v41welfareafams <- _cons@15) ///
       [pweight=weight1] if zWhiteR==1, logit startvalues(randomid, draws(8) seed(193693)) em(iter(5)) lclass(A 5) ///
       nodvheader nonrtolerance
      Also, once you have hit what you believe to be the global maximum with the -nonrtolerance- option on, my reading of current best practice is to save the parameter estimates into a matrix, then run the model again (with the above constraints) but without -nonrtolerance-. And as you increase the number of latent classes, you will likely want to make more random draws, and/or run the model a large number of times (one random draw each time) to make sure you have hit the global maximum.
      Be aware that it can be very hard to answer a question without sample data. You can use the dataex command for this. Type help dataex at the command line.

      When presenting code or results, please use code delimiters to format them. Use the # button on the formatting toolbar, between the " (double quote) and <> buttons.



      • #4
        Did one of the logit intercepts hit 15 or -15? Those correspond to item endorsement probabilities of near 1 or near 0. The last time that happened to me, I believe it slowed the estat commands down a lot. If this did happen, then the post I think you are referring to also outlined how to constrain the intercepts to +15 or -15, but here's the link anyway.



        • #5
          Yes, that's the link I was referring to. That's what prompted me to put the nonrtolerance option in there, following 's example. Though several of my intercepts hit >+15/-15, none were blanked as his were (see below). Still, I can get estat lcgof and estat lcprob to work. Just not estat lcmean, which runs forever without a solution. I don't know why.

          Code:
          Class          : 5
          
          ------------------------------------------------------------------------------------
                             |               Robust
                             |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
          -------------------+----------------------------------------------------------------
          v41safetymuslims   |
                       _cons |  -5.136787   .8698296    -5.91   0.000    -6.841622   -3.431953
          -------------------+----------------------------------------------------------------
          v41moralsmuslims   |
                       _cons |  -2.322696   .2019414   -11.50   0.000    -2.718493   -1.926898
          -------------------+----------------------------------------------------------------
          v41intolermuslims  |
                       _cons |  -2.490241    .240002   -10.38   0.000    -2.960637   -2.019846
          -------------------+----------------------------------------------------------------
          v41politicsmuslims |
                       _cons |  -4.639323   .4057539   -11.43   0.000    -5.434586    -3.84406
          -------------------+----------------------------------------------------------------
          v41commmuslims     |
                       _cons |  -5.981032   .9836477    -6.08   0.000    -7.908946   -4.053118
          -------------------+----------------------------------------------------------------
          v41jobsmuslims     |
                       _cons |   -5.90705   .8558976    -6.90   0.000    -7.584578   -4.229521
          -------------------+----------------------------------------------------------------
          v41welfaremuslims  |
                       _cons |  -17.99317   1.496781   -12.02   0.000    -20.92681   -15.05953
          -------------------+----------------------------------------------------------------
          v41safetyjews      |
                       _cons |   -15.0661   .0507429  -296.91   0.000    -15.16555   -14.96664
          -------------------+----------------------------------------------------------------
          v41moralsjews      |
                       _cons |  -4.890215   .6771554    -7.22   0.000    -6.217415   -3.563014
          -------------------+----------------------------------------------------------------
          v41intolerjews     |
                       _cons |  -5.276898   1.007116    -5.24   0.000    -7.250809   -3.302987
          -------------------+----------------------------------------------------------------
          v41politicsjews    |
                       _cons |   -5.38869   .5918635    -9.10   0.000    -6.548721   -4.228659
          -------------------+----------------------------------------------------------------
          v41commjews        |
                       _cons |  -16.62212   .1029196  -161.51   0.000    -16.82384    -16.4204
          -------------------+----------------------------------------------------------------
          v41jobsjews        |
                       _cons |  -17.03696    .044703  -381.11   0.000    -17.12457   -16.94934
          -------------------+----------------------------------------------------------------
          v41welfarejews     |
                       _cons |  -16.33492   5.848252    -2.79   0.005    -27.79728   -4.872556
          -------------------+----------------------------------------------------------------
          v41safetyafams     |
                       _cons |  -17.26967   .9628075   -17.94   0.000    -19.15674    -15.3826
          -------------------+----------------------------------------------------------------
          v41moralsafams     |
                       _cons |   -6.48992   1.252093    -5.18   0.000    -8.943976   -4.035863
          -------------------+----------------------------------------------------------------
          v41intolerafams    |
                       _cons |  -3.756009   .4297711    -8.74   0.000    -4.598345   -2.913673
          -------------------+----------------------------------------------------------------
          v41politicsafams   |
                       _cons |   -4.88863   .4974805    -9.83   0.000    -5.863674   -3.913586
          -------------------+----------------------------------------------------------------
          v41commafams       |
                       _cons |  -16.65207   .0748952  -222.34   0.000    -16.79886   -16.50528
          -------------------+----------------------------------------------------------------
          v41jobsafams       |
                       _cons |  -6.029268   .7646754    -7.88   0.000    -7.528005   -4.530532
          -------------------+----------------------------------------------------------------
          v41welfareafams    |
                       _cons |  -2.320263   .1618666   -14.33   0.000    -2.637516   -2.003011
          ------------------------------------------------------------------------------------




          • #6
            I'm not sure why, but I suspect that -estat lcmean- is having trouble estimating the margins when the intercepts are at +/-15. Why don't you try re-estimating the model with the appropriate constraints imposed, and see if -estat lcmean- runs more smoothly? As I said, I could have sworn that when I had a model with some intercepts at that value, -estat lcmean- ran very slowly. I'm aware that your SEs are not missing, but try it anyway. The model should converge within a few seconds if you supply the start parameters from the last model plus the constraints.

            As a side note, when I have produced a latent class model with missing SEs, I have usually been unable to replicate that model (i.e., the model didn't converge when re-estimated without -nonrtolerance-, and no other attempts converged at that log-likelihood value). I would urge people to look for missing SEs and to check their model outputs before considering them final.



            • #7
              Thank you, this is helpful. I am a newbie in this sort of analysis. I will have to figure out how to do this, but I will give it a try.



              • #8
                Joseph, here's how you would add a constraint; the added constraint lines are the edits. The annoying thing is that you need to add one constraint per parameter per class in which you have an issue.

                Originally posted by Joseph Gerteis:

                Code:
                gsem (...)
                
                matrix b = e(b)
                
                gsem (v41safetymuslims v41moralsmuslims v41intolermuslims v41politicsmuslims v41commmuslims v41jobsmuslims v41welfaremuslims ///
                v41safetyjews v41moralsjews v41intolerjews v41politicsjews v41commjews v41jobsjews v41welfarejews ///
                v41safetyafams v41moralsafams v41intolerafams v41politicsafams v41commafams v41jobsafams v41welfareafams <- _cons) ///
                (5: v41welfaremuslims <- _cons@-15) (5: v41safetyjews <- _cons@-15) ///
                [pweight=weight1] if zWhiteR==1, logit from(b) lclass(A 5) ///
                nodvheader
                That code presumes you saved your previous command's parameter vector as the matrix b. Not vetted for typos. I put in the first 2 of your constrained parameters. I also removed the -nonrtolerance- option. As I said before, if this model is identified, it should converge quickly with -nonrtolerance- off and the start values supplied.

                As I implied earlier, the logit intercepts imply that close to 0% of this class think that the safety of Jews and Muslims is a problem, or that they are safe, or whatever it is that the questions mean as they're coded. I would encourage you to think substantively if that's plausible given your subject-matter knowledge. Also, if this class is a small one, I'd urge a bit of caution; it could be a spurious result.

                Last, in maximum likelihood estimation, we usually require both the change in the log likelihood and the gradient to be under certain tolerances. -nonrtolerance- disables the gradient check. Based on my limited understanding of how MLE works, this means you aren't guaranteed to be at a global maximum. In situations like that, I've run repeated attempts with varied start values (I've used the -jitter- option, but I'm not clear on the scale of the jittering; the -randomid- option should also work) and tried to make sure that several model runs hit my supposed global maximum likelihood. I've also re-run models with -nonrtolerance- off. From my limited experience so far, missing SEs are definitely a bad sign, and I haven't been able to replicate any model with missing SEs. All my models that look like a replicable global maximum have been able to converge with -nonrtolerance- off.
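                To make the "many runs, one draw each" idea concrete, here's a minimal sketch. Note that "ydepvars" is a placeholder for the full 21-variable list from the model above; adjust to your data:

                Code:
                * Sketch: refit under many seeds, one random draw each, then
                * compare log likelihoods to check for a global maximum.
                * "ydepvars" stands in for the full list of 21 indicators.
                forvalues seed = 1/20 {
                    capture quietly gsem (ydepvars <- _cons) [pweight=weight1] if zWhiteR==1, ///
                        logit lclass(A 5) startvalues(randomid, draws(1) seed(`seed')) ///
                        em(iter(5)) nodvheader nonrtolerance
                    if _rc == 0 display "seed `seed': ll = " %12.4f e(ll)
                }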



                • #9
                  I may need to correct my previous statement. I found that constraining the suspect intercept actually didn't make a difference in estimation time.

                  In example 54g in the SEM manual, Stata says:

                  The estimation of standard errors for marginal means and marginal probabilities can be time-consuming with large models. If you are interested only in the means and probabilities, you can specify the nose option with estat lcmean and estat lcprob to speed up estimation. With this option, no standard errors, test statistics, or confidence intervals are reported.
                  In my case, I had 13 binary variables and 5 classes. I let the model sit for maybe 10 or 20 minutes before realizing I didn't need standard errors; I was producing a profile plot, and many overlapping CIs would be hard to read. Joseph Gerteis, your model is much larger, with 21 variables and 5 classes. If my earlier suggestion didn't produce results for you, it may be worth thinking about whether you absolutely need the standard errors. I can't say why it takes Stata so long to, essentially, estimate margins repeatedly.
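                  In other words, per the manual passage quoted above, suppressing the standard errors is just a matter of adding the nose option:

                  Code:
                  * nose skips the SE/test/CI computation, which is the slow part
                  estat lcmean, nose
                  estat lcprob, nose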



                  • #10
                    Adding some info:

                    For a model with 13 binary variables, one continuous variable, and 6 latent classes, -estat lcmean- took 16,403 seconds to run. That's 273 minutes - over 4 and a half hours. I kid you not.

                    Code:
                    . timer on 1
                    
                    . estat lcmean, post
                    
                    Latent class marginal means                     Number of obs     =     11,532
                    
                    -------------------------------------------------------------------------------
                                  |            Delta-method
                                  |     Margin   Std. Err.      z    P>|z|     [95% Conf. Interval]
                    --------------+----------------------------------------------------------------
                    1             |
                            I5950 |   .2038064   .0271905                      .1556324    .2622604
                            I6000 |   .6807568   .0497712                      .5764923    .7696105
                            I5900 |   .2025474   .0264965                      .1555174    .2594305
                          anxiety |   .3230197   .0322956                      .2632029     .389249
                       depression |   .4522196   .0351345                      .3846887    .5215569
                    cognitive_mod |   .1162298   .0293093                      .0699235    .1870354
                           E0100A |   .3376039   .0379324                       .267676    .4154367
                           E0100B |    .530779   .0416969                      .4489611    .6109752
                       E0200A_bin |   .0036171   .0037939                       .000461    .0277804
                       E0200B_bin |   .1381118   .0234107                      .0982764    .1906796
                       E0200C_bin |   .2983752   .0301442                       .242823    .3605842
                        E0800_bin |   .1480758   .0226992                      .1088592    .1982768
                        E0900_bin |   .0371043   .0132709                      .0182668    .0739057
                            D0300 |   1.429195   .2577147     5.55   0.000      .924083    1.934306
                    --------------+----------------------------------------------------------------
                    2             |
                            I5950 |   .1994888   .0243895                      .1559295    .2515896
                            I6000 |   .1268881   .0301566                      .0785463    .1985706
                            I5900 |   .0444509   .0123137                       .025678    .0758793
                          anxiety |   .4031807   .0285576                      .3486818    .4601793
                       depression |   .5998575   .0290571                      .5418032    .6552367
                    cognitive_mod |   .5177771   .0413508                      .4369731     .597662
                           E0100A |   .2194998   .0297845                      .1666816    .2833638
                           E0100B |    .350119   .0338655                       .286953    .4190187
                       E0200A_bin |   .2779026   .0405854                      .2056603    .3638961
                       E0200B_bin |   .5238594    .047826                      .4303815    .6156949
                       E0200C_bin |   .4227725   .0363005                      .3536695    .4950361
                        E0800_bin |   .4635224   .0426838                      .3816395    .5474209
                        E0900_bin |   .3044995   .0327529                      .2443306    .3721878
                            D0300 |   5.401224   .2939648    18.37   0.000     4.825063    5.977384
                    --------------+----------------------------------------------------------------
                    3             |
                            I5950 |   .1449133   .0154842                      .1171206    .1779718
                            I6000 |   .0322897   .0080767                      .0197086    .0524721
                            I5900 |   .0172943   .0057404                      .0089959    .0329926
                          anxiety |    .282033   .0196175                      .2452176    .3220175
                       depression |   .5650847   .0212579                       .523057    .6061973
                    cognitive_mod |    .705365   .0199046                       .664912    .7428234
                           E0100A |   .0384848   .0090806                      .0241455    .0608091
                           E0100B |   .1317581   .0151061                      .1048638    .1642842
                       E0200A_bin |   .3629864   .0252727                      .3150431    .4138177
                       E0200B_bin |   .5501218   .0286778                      .4935131    .6054616
                       E0200C_bin |   .4085544   .0244283                      .3616785    .4571547
                        E0800_bin |    .502877   .0255989                      .4528431    .5528535
                        E0900_bin |   .3588886    .020637                      .3195188     .400256
                            D0300 |   .6233647   .0772223     8.07   0.000     .4720117    .7747177
                    --------------+----------------------------------------------------------------
                    4             |
                            I5950 |   .0732236    .011351                      .0538586    .0988241
                            I6000 |   .0966291   .0135177                      .0731849     .126558
                            I5900 |   .0858699   .0110394                      .0665595    .1101217
                          anxiety |   .4307685   .0203581                      .3913978    .4710344
                       depression |   .7164611   .0190746                      .6776385    .7523172
                    cognitive_mod |   .1645179   .0164017                      .1348353    .1992302
                           E0100A |   .0368049   .0090251                       .022674     .059209
                           E0100B |   .0794655   .0135886                      .0565874    .1105096
                       E0200A_bin |   .0179026   .0062308                      .0090194    .0352239
                       E0200B_bin |   .0951689   .0135679                      .0716969     .125288
                       E0200C_bin |   .0970546   .0126264                      .0749672     .124772
                        E0800_bin |    .112244   .0135842                      .0882481    .1417502
                        E0900_bin |   .0217148   .0068803                      .0116285     .040194
                            D0300 |   11.24836   .2110747    53.29   0.000     10.83466    11.66206
                    --------------+----------------------------------------------------------------
                    5             |
                            I5950 |   .0385186   .0036532                      .0319636    .0463535
                            I6000 |   .0301492   .0037921                      .0235414    .0385385
                            I5900 |   .0231126   .0028327                      .0181656    .0293664
                          anxiety |   .2523124   .0076212                        .23767    .2675403
                       depression |   .5326802   .0086805                      .5156353    .5496492
                    cognitive_mod |   .2294378   .0077322                      .2146352    .2449428
                           E0100A |   .0038906   .0015188                      .0018086    .0083493
                           E0100B |   .0125013   .0025725                      .0083445    .0186897
                       E0200A_bin |   .0013471   .0010493                      .0002924    .0061832
                       E0200B_bin |   .0331668   .0036724                      .0266764    .0411696
                       E0200C_bin |   .0210572   .0031222                      .0157336    .0281305
                        E0800_bin |   .0484905   .0042033                      .0408855    .0574254
                        E0900_bin |   .0137006   .0025433                      .0095143    .0196924
                            D0300 |   3.547132   .0566133    62.66   0.000     3.436172    3.658091
                    --------------+----------------------------------------------------------------
                    6             |
                            I5950 |   .0383131   .0037875                      .0315425    .0464674
                            I6000 |   .0291776   .0039848                      .0223034    .0380881
                            I5900 |   .0204192   .0027738                      .0156356    .0266267
                          anxiety |   .1662433   .0068554                      .1532379    .1801176
                       depression |    .404784   .0087171                      .3878205    .4219781
                    cognitive_mod |    .285601   .0087946                      .2686804    .3031456
                           E0100A |   .0051369   .0014492                      .0029531    .0089211
                           E0100B |   .0127266   .0025803                      .0085457     .018914
                       E0200A_bin |   .0043589   .0015697                      .0021502    .0088165
                       E0200B_bin |   .0289839   .0036484                      .0226281    .0370573
                       E0200C_bin |   .0164133   .0029898                      .0114744    .0234276
                        E0800_bin |   .0433909   .0042921                      .0357151    .0526264
                        E0900_bin |   .0206006   .0034877                      .0147679    .0286699
                            D0300 |   .1971032   .0141975    13.88   0.000     .1692765    .2249298
                    -------------------------------------------------------------------------------
                    
                    . timer off 1
                    
                    . timer list
                       1:  16402.94 /        1 =   16402.9420
