  • Smarter way to

    While learning probit, mprobit, logit, mlogit, ivprobit, and cmp, with commands like
    Code:
    mprobit Y X
    I have noticed that the likelihood maximization often gets "backed up". I then have to add the option nrtolerance(1e-4) (or some other value) and rerun, adjusting the value until I find one that is neither too large nor too small to get past the problem.

    This is a tedious process. Is there a way to do something like the following? If the calculation hits "backed up" 3 times in a row, as in
    Code:
    Iteration 6: log likelihood = -45564.029 (backed up)
    Iteration 7: log likelihood = -45564.029 (backed up)
    Iteration 8: log likelihood = -45564.029 (backed up)
    then it automatically increases nrtolerance, tries again, and if it hits "backed up" 3 more times, it automatically increases nrtolerance again, and so on, until the problem is solved.

    Would there be such a solution?
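
    For concreteness, here is a minimal sketch of the kind of wrapper I have in mind, assuming one loosens nrtolerance() step by step and checks e(converged) after each attempt (counting "backed up" messages directly would require parsing the iteration log, which this sketch does not attempt):
    Code:
    foreach tol in 1e-5 1e-4 1e-3 {
        capture noisily mprobit Y X, nrtolerance(`tol')
        if _rc == 0 & e(converged) {
            display "converged with nrtolerance(`tol')"
            continue, break
        }
    }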

  • #2
    I apologize for the badly formed title of this thread; I forgot to complete it, and it seems I can't edit the title or delete the thread. The title I wanted was "Smarter way to circumvent "backed up" issue?" If other visitors would prefer that I repost under the correct title, I will do so.

    • #3
      Do the iterations eventually converge to a maximum (without a "backed up" or other message at the final iteration)? If so, then I recommend not worrying about intermediate iterations that carry such a message.

      If not, then I would try just about any other device before fiddling with the convergence tolerances. Have you tried the difficult option, for example?
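
      With the example model from the original post, that would be
      Code:
      mprobit Y X, difficult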

      • #4
        It's not converging at all, even after tens of "backed up" messages. In my past experience with probit/logit-type regressions, if "backed up" didn't disappear within 3 iterations, it didn't disappear after tens or even hundreds of iterations either; that is why I chose 3.

        Yes, I am already adding the difficult option.

        • #5
          So, you see failure-to-converge with probit/logit-type regression models containing only a single predictor?

          For your first example, mprobit Y X, before successively loosening the tolerances, I would probably first try technique(), switching between algorithms every few iterations, to see whether the optimizer can break free of the problematic region of the log likelihood.
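
          For example, something along these lines (a sketch; the particular algorithms and iteration counts here are arbitrary choices):
          Code:
          mprobit Y X, technique(bfgs 10 nr 10) difficult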

          Perhaps even try a single integration point to get a first-approximation coefficient vector to use as starting values for a second attempt with more integration points.
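
          A sketch of that two-step approach, assuming mprobit will accept a single quadrature point in intpoints() and using the standard from() maximize option to pass the starting values:
          Code:
          mprobit Y X, intpoints(1)    // crude first approximation
          matrix b0 = e(b)
          mprobit Y X, from(b0, copy)  // refit with the default number of points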

          • #6
            Yes, when I get hit by "backed up" repeatedly, I also try changing technique(). But whether I change technique() or nrtolerance(), the problem is that I have to rerun the program from the start, over and over, to find a configuration that lets a single regression converge. So I am wondering whether there is a way to automate this: when "backed up" hits 3 times, the command would give up on the current configuration and try a different technique() or nrtolerance(). That would be immensely useful.
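
            In that spirit, the same capture-and-check idea from above could cycle over techniques as well; this sketch stops at the first configuration for which e(converged) is true, rather than counting "backed up" messages:
            Code:
            foreach tech in nr bfgs dfp {
                capture noisily mprobit Y X, technique(`tech') difficult
                if _rc == 0 & e(converged) {
                    display "converged with technique(`tech')"
                    continue, break
                }
            }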

            • #7
              I'm not aware of any way to automate what you're looking to do. If such an option isn't mentioned in the user's manual entry for ml, then it's unlikely to be available.

              I've had no experience with mprobit, but you might be better off with bayes: and a gently regularizing prior that helps assure some concavity in the search, rather than relaxing the criteria for declaring convergence to a maximum.

              I'm not sure what to suggest for conventional logit or probit; I don't recall having such a problem with single-predictor regression models for them.
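
              For an ordinary probit, that might look something like the following (a sketch; the normal(0, 4) prior scale is an arbitrary choice, and I have not checked whether the bayes prefix supports mprobit itself):
              Code:
              bayes, prior({Y:}, normal(0, 4)): probit Y X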
