  • teffects ipw - Overlap Assumption Violated - How to Proceed?

    Hi all,

    I'm using teffects ipw to fit a model that estimates the probability of treatment group assignment, with the goal of using the generated inverse-probability weights (IPWs) to adjust the outcome model. I have four treatment groups in my study and have identified a fairly long list of factors from the literature to include in the treatment model. The advice I received was to start with the long list and refine it until I arrived at a final model. However, when I include the long list, I get an error advising that the overlap assumption has been violated. I then tried a much smaller group of variables and received the same error. Here are the results from the reduced list of variables:

    Code:
    . teffects ipw (study_grp) (AGEcat BMI COMOR EXD PRIORDX)
    treatment 2 has 1 propensity score less than 1.00e-05
    treatment 3 has 7 propensity scores less than 1.00e-05
    treatment 4 has 11 propensity scores less than 1.00e-05
    treatment overlap assumption has been violated; use option osample() to
    identify the overlap violators
    I re-ran the same code using the osample() option, which adds a variable identifying the overlap violators. I understand the theory behind the overlap assumption, but I don't know what to do with the observations where the assumption is violated. There are about 35,000 participants in my study cohort, so these 18 observations aren't a significant portion of the cohort.
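
    For reference, the re-run looks something like this (overlap_flag is just a placeholder for whatever variable name is passed to osample()):

    Code:
    . teffects ipw (study_grp) (AGEcat BMI COMOR EXD PRIORDX), osample(overlap_flag)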

    My question: now that I know that the overlap assumption is violated and I know which observations violate the assumption, what should I do next?

    Thanks for your consideration!


  • #2
    Since only a tiny fraction of the observations violate the assumption, I wouldn't give it much thought: take the textbook approach and drop them from the analysis of the outcome. A sketch of that workflow is below.
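
    For example, a minimal do-file sketch under the same setup as #1, where overlap_flag is an assumed name for the variable passed to osample():

    Code:
    teffects ipw (study_grp) (AGEcat BMI COMOR EXD PRIORDX), osample(overlap_flag)  // flag overlap violators
    drop if overlap_flag == 1                                                       // drop the flagged observations
    teffects ipw (study_grp) (AGEcat BMI COMOR EXD PRIORDX)                         // re-estimate without them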



    • #3
      That sounds like a reasonable idea. I'll run this past my supervisor and see what he has to say. Thanks for the quick reply, Øyvind!



      • #4
        Hi Øyvind, everyone,

        A follow-up question: does anyone have experience with the osample() option? When I run the code, I get three values for the variable: ., 0, or 1. The relevant Stata documentation doesn't appear to describe what these values mean. I can guess, but I wondered if anyone might be able to point me to documentation that would save me from guessing!

        Thank you!



        • #5
          Does this return the same number as the count of . values in your osample() variable?
          Code:
          count if missing(study_grp, AGEcat, BMI, COMOR, EXD, PRIORDX)



          • #6
            Not precisely, but I was able to count the values: there are ~25,000 zeros, 20 ones, and ~14,000 missing values. The sum equals the size of my study cohort.
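
            For reference, a breakdown like that can be produced with something along these lines (again assuming overlap_flag is the name given to osample()):

            Code:
            tabulate overlap_flag, missing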



            • #7
              Following up on this after going down a side path in my research! Does anyone know how to interpret the values generated by osample()? I would have expected a binary variable, as noted above, but am unsure what to make of all the missing values. Documentation of how this works is a bit thin, unfortunately! Would welcome ideas!

              Last edited by Sean Hardiman; 29 Mar 2022, 22:23.



              • #8
                Hi Sean Hardiman, what I understand is that osample(newvar) creates a binary variable named newvar. When newvar = 1, it means that Stata could not find a match for that observation. Hope this helps!
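
                As a quick check (again assuming overlap_flag is the name passed to osample()), the flagged observations can be inspected with something like:

                Code:
                list study_grp AGEcat BMI COMOR EXD PRIORDX if overlap_flag == 1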



                • #9
                  Originally posted by Preethi Ramanujam View Post
                  Hi Sean Hardiman, what I understand is that osample(newvar) creates a binary variable named newvar. When newvar = 1, it means that Stata could not find a match for that observation. Hope this helps!
                  Thank you! I eventually figured it out as well and came to the same conclusion you did. Thanks again!

