  • #16
    Hi,
    here is the do-file. I'm sorry that it's in a total mess.

    Let me explain what I did:

    - I first aggregated my dataset from HS6 to HS2 (product level to chapter level).
    - I computed the RCA using the Balassa index on the overall dataset and then kept the Mauritian values.
    - Because Stata couldn't handle the number of dummies, I reduced the dataset by keeping the countries to which Mauritius sends more than 3% of its exports and those from which it receives more than 5% of its imports. I created a new "country" with code 999 corresponding to the rest of the world.
    - Afterwards, I created my dummies and ran the regression.
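
    The steps above could be sketched roughly like this (a sketch only, with hypothetical variable names hs6, exporter, export_value — adjust to the actual do-file):

    Code:
    * HS6 -> HS2: keep the first two digits of the six-digit code
    gen hs2 = floor(hs6/10000)
    collapse (sum) export_value, by(exporter hs2)
    * Balassa RCA: (x_cj / X_c) / (x_wj / X_w)
    bysort exporter: egen tot_c = total(export_value)
    bysort hs2: egen world_j = total(export_value)
    egen world_tot = total(export_value)
    gen rca = (export_value/tot_c) / (world_j/world_tot)
    keep if exporter == "MUS"   // keep the Mauritian values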
    Attached Files



    • #17
      Listed 100 out of 8367 observations
      Use the count() option to list more

      . dataex contribution_PG midline baseline treated inclusion treated_ml inclusion_ml

      ----------------------- copy starting from the next line -----------------------
      Code:
      * Example generated by -dataex-. To install: ssc install dataex
      clear
      input byte contribution_PG float(midline baseline treated inclusion treated_ml inclusion_ml)
       30 1 0 1 0 1 0
       30 0 1 1 0 0 0
      100 0 0 1 0 0 0
        0 0 1 1 0 0 0
      100 0 0 1 0 0 0
       50 0 0 1 0 0 0
      100 0 0 1 0 0 0
      100 0 0 1 0 0 0
       20 0 1 1 0 0 0
      100 0 0 1 0 0 0
       70 0 1 1 0 0 0
       95 1 0 1 0 1 0
       60 1 0 1 0 1 0
       25 0 1 1 0 0 0
       10 0 0 1 0 0 0
      100 0 0 1 0 0 0
      100 0 0 1 0 0 0
      100 0 0 1 0 0 0
        0 0 0 1 0 0 0
       60 1 0 1 0 1 0
       50 1 0 1 0 1 0
       40 0 1 1 0 0 0
       10 0 1 1 0 0 0
       50 1 0 1 0 1 0
       60 1 0 1 0 1 0
       45 0 1 1 0 0 0
       35 0 1 1 0 0 0
       10 0 0 1 0 0 0
       10 0 1 1 0 0 0
       30 1 0 1 0 1 0
       50 0 1 1 0 0 0
       40 1 0 1 0 1 0
       30 1 0 1 0 1 0
       10 1 0 1 0 1 0
       20 0 1 1 0 0 0
        5 0 0 1 0 0 0
       60 0 1 1 0 0 0
       30 0 1 1 0 0 0
       60 0 1 1 0 0 0
       20 0 1 1 0 0 0
       80 1 0 1 0 1 0
       20 0 0 1 0 0 0
       20 0 1 1 0 0 0
       20 0 1 1 0 0 0
       90 0 0 1 0 0 0
      100 0 0 1 0 0 0
       10 0 1 1 0 0 0
       50 1 0 1 0 1 0
       30 1 0 1 0 1 0
        0 0 1 1 0 0 0
       10 1 0 1 0 1 0
       50 0 1 1 0 0 0
       60 0 1 1 0 0 0
       50 0 1 1 0 0 0
       10 0 0 1 0 0 0
       10 1 0 1 0 1 0
      100 0 0 1 0 0 0
       60 0 0 1 0 0 0
       70 0 0 1 0 0 0
       50 0 1 1 0 0 0
       30 1 0 1 0 1 0
      100 0 0 1 0 0 0
       20 0 1 1 0 0 0
       20 0 1 1 0 0 0
       10 1 0 1 0 1 0
      100 0 1 1 0 0 0
       70 1 0 1 0 1 0
       40 1 0 1 0 1 0
      100 0 0 1 0 0 0
       30 1 0 1 0 1 0
       30 1 0 1 0 1 0
       20 0 1 1 0 0 0
       20 0 1 1 0 0 0
       10 0 1 1 0 0 0
       20 0 1 1 0 0 0
       30 0 1 1 0 0 0
      100 0 0 1 0 0 0
        5 0 0 1 0 0 0
      100 0 0 1 0 0 0
       20 0 1 1 0 0 0
       40 0 0 1 0 0 0
      100 1 0 1 0 1 0
       50 0 1 1 0 0 0
       90 0 1 1 0 0 0
       20 1 0 1 0 1 0
      100 0 0 1 0 0 0
       10 0 1 1 0 0 0
       60 0 1 1 0 0 0
       30 0 0 1 0 0 0
      100 1 0 1 0 1 0
       60 1 0 1 0 1 0
       80 1 0 1 0 1 0
       30 0 1 1 0 0 0
       60 1 0 1 0 1 0
        0 1 0 1 0 1 0
        0 0 1 1 0 0 0
       30 0 1 1 0 0 0
       10 1 0 1 0 1 0
      100 0 1 1 0 0 0
       30 0 1 1 0 0 0
      end
      label values treated treat
      label def treat 1 "Treatment", modify
      label values inclusion t
      label def t 0 "No Inclusion", modify

      I have the above data, where treated_ml = treated*midline and inclusion_ml = inclusion*midline, and I need to run the following specification:

      reg contribution_PG treated treated_ml midline inclusion inclusion_ml, cl(village_code)

      Stata omits treated as well as midline:
      note: treated omitted because of collinearity
      note: midline omitted because of collinearity

      I tried reg contribution_PG midline ibn.treated treated_ml inclusion inclusion_ml female poor, cl(village_code) noconstant baselevels, but it still shows:
      note: treated_ml omitted because of collinearity


      Please let me know which specification I should use so that no dummy is dropped due to collinearity.




      • #18
        Listed 100 out of 8367 observations
        Use the count() option to list more

        . dataex contribution_PG midline baseline treated inclusion treated_ml inclusion_ml
        Code:
        * Example generated by -dataex-. To install: ssc install dataex
        clear
        input byte contribution_PG float(midline baseline treated inclusion)
         30 1 0 1 0
         30 0 1 1 0
        100 0 0 1 0
          0 0 1 1 0
        100 0 0 1 0
         50 0 0 1 0
        100 0 0 1 0
        100 0 0 1 0
         20 0 1 1 0
        100 0 0 1 0
         70 0 1 1 0
         95 1 0 1 0
         60 1 0 1 0
         25 0 1 1 0
         10 0 0 1 0
        100 0 0 1 0
        100 0 0 1 0
        100 0 0 1 0
          0 0 0 1 0
         60 1 0 1 0
         50 1 0 1 0
         40 0 1 1 0
         10 0 1 1 0
         50 1 0 1 0
         60 1 0 1 0
         45 0 1 1 0
         35 0 1 1 0
         10 0 0 1 0
         10 0 1 1 0
         30 1 0 1 0
         50 0 1 1 0
         40 1 0 1 0
         30 1 0 1 0
         10 1 0 1 0
         20 0 1 1 0
          5 0 0 1 0
         60 0 1 1 0
         30 0 1 1 0
         60 0 1 1 0
         20 0 1 1 0
         80 1 0 1 0
         20 0 0 1 0
         20 0 1 1 0
         20 0 1 1 0
         90 0 0 1 0
        100 0 0 1 0
         10 0 1 1 0
         50 1 0 1 0
         30 1 0 1 0
          0 0 1 1 0
         10 1 0 1 0
         50 0 1 1 0
         60 0 1 1 0
         50 0 1 1 0
         10 0 0 1 0
         10 1 0 1 0
        100 0 0 1 0
         60 0 0 1 0
         70 0 0 1 0
         50 0 1 1 0
         30 1 0 1 0
        100 0 0 1 0
         20 0 1 1 0
         20 0 1 1 0
         10 1 0 1 0
        100 0 1 1 0
         70 1 0 1 0
         40 1 0 1 0
        100 0 0 1 0
         30 1 0 1 0
         30 1 0 1 0
         20 0 1 1 0
         20 0 1 1 0
         10 0 1 1 0
         20 0 1 1 0
         30 0 1 1 0
        100 0 0 1 0
          5 0 0 1 0
        100 0 0 1 0
         20 0 1 1 0
         40 0 0 1 0
        100 1 0 1 0
         50 0 1 1 0
         90 0 1 1 0
         20 1 0 1 0
        100 0 0 1 0
         10 0 1 1 0
         60 0 1 1 0
         30 0 0 1 0
        100 1 0 1 0
         60 1 0 1 0
         80 1 0 1 0
         30 0 1 1 0
         60 1 0 1 0
          0 1 0 1 0
          0 0 1 1 0
         30 0 1 1 0
         10 1 0 1 0
        100 0 1 1 0
         30 0 1 1 0
        end
        label values treated treat
        label def treat 1 "Treatment", modify
        label values inclusion t
        label def t 0 "No Inclusion", modify
        I have the above data, where treated_ml = treated*midline and inclusion_ml = inclusion*midline, and I need to run the following specification:

        reg contribution_PG treated treated_ml midline inclusion inclusion_ml, cl(village_code)

        Stata omits treated as well as midline:
        note: treated omitted because of collinearity
        note: midline omitted because of collinearity

        I tried reg contribution_PG midline ibn.treated treated_ml inclusion inclusion_ml female poor, cl(village_code) noconstant baselevels, but it still shows:
        note: treated_ml omitted because of collinearity


        Please let me know which specification I should use so that no dummy is dropped due to collinearity.
        Last edited by Abha Indurkar; 02 Sep 2020, 10:33.



        • #19
          If your example data is representative of your full data set, there is nothing you can do about this, because the data itself is the problem.

          In your example data these variables are constants: treated is always 1 and inclusion is always zero. It follows that inclusion_ml is also always zero. So they are omitted because they are all collinear with each other. If these facts obtain in your complete data set, then they will necessarily be omitted when you run your regression on the full data as well. There is no getting around this: it's linear algebra; you can't invert a singular matrix.

          So if your data are like this as a whole, then either the data management that created this data set is wrong and needs to be redone, or, if the data management is correct, these data are simply not amenable to the analysis you want to do. If your data are not like this as a whole, please post back with a new example that shows varying values of treated and inclusion and also reproduces this collinearity problem.
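
          A quick way to check whether this also holds in the full data set is to tabulate the suspect variables (a sketch; run it on the complete data before any sample reduction):

          Code:
          * Are treated and inclusion constant over the whole data set?
          tab treated, missing
          tab inclusion, missing
          * r(r) stores the number of distinct nonmissing values; 1 means constant
          quietly tab treated
          display r(r)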



          • #20
            Code:
            * Example generated by -dataex-. To install: ssc install dataex
            clear
            input byte contribution_PG float(treated inclusion baseline midline treated_ml inclusion_ml) int village_code
             30 1 0 0 1 1 0 91
             30 1 0 1 0 0 0 91
              0 1 0 1 0 0 0 91
             20 1 0 1 0 0 0 91
             70 1 0 1 0 0 0 91
             95 1 0 0 1 1 0 91
             60 1 0 0 1 1 0 91
             25 1 0 1 0 0 0 91
             60 1 0 0 1 1 0 91
             50 1 0 0 1 1 0 91
             40 1 0 1 0 0 0 91
             10 1 0 1 0 0 0 91
             50 1 0 0 1 1 0 91
             60 1 0 0 1 1 0 91
             45 1 0 1 0 0 0 91
             35 1 0 1 0 0 0 91
             10 1 0 1 0 0 0 91
             30 1 0 0 1 1 0 91
             50 1 0 1 0 0 0 91
             40 1 0 0 1 1 0 91
             30 1 0 0 1 1 0 91
             10 1 0 0 1 1 0 91
             20 1 0 1 0 0 0 91
             60 1 0 1 0 0 0 91
             30 1 0 1 0 0 0 91
             60 1 0 1 0 0 0 91
             20 1 0 1 0 0 0 91
             80 1 0 0 1 1 0 91
             20 1 0 1 0 0 0 91
             20 1 0 1 0 0 0 91
             10 1 0 1 0 0 0 91
             50 1 0 0 1 1 0 91
             30 1 0 0 1 1 0 91
              0 1 0 1 0 0 0 91
             10 1 0 0 1 1 0 91
             50 1 0 1 0 0 0 91
             60 1 0 1 0 0 0 91
             50 1 0 1 0 0 0 91
             10 1 0 0 1 1 0 91
             50 1 0 1 0 0 0 91
             30 1 0 0 1 1 0 91
             20 1 0 1 0 0 0 91
             20 1 0 1 0 0 0 91
             10 1 0 0 1 1 0 91
            100 1 0 1 0 0 0 91
             70 1 0 0 1 1 0 91
             40 1 0 0 1 1 0 91
             30 1 0 0 1 1 0 91
             30 1 0 0 1 1 0 91
             20 1 0 1 0 0 0 91
             20 1 0 1 0 0 0 91
             10 1 0 1 0 0 0 91
             20 1 0 1 0 0 0 91
             30 1 0 1 0 0 0 91
             20 1 0 1 0 0 0 91
            100 1 0 0 1 1 0 91
             50 1 0 1 0 0 0 91
             90 1 0 1 0 0 0 91
             20 1 0 0 1 1 0 91
             10 1 0 1 0 0 0 91
             60 1 0 1 0 0 0 91
            100 1 0 0 1 1 0 91
             60 1 0 0 1 1 0 91
             80 1 0 0 1 1 0 91
             30 1 0 1 0 0 0 91
             60 1 0 0 1 1 0 91
              0 1 0 0 1 1 0 91
              0 1 0 1 0 0 0 91
             30 1 0 1 0 0 0 91
             10 1 0 0 1 1 0 91
            100 1 0 1 0 0 0 91
             30 1 0 1 0 0 0 91
             80 1 0 1 0 0 0 91
             50 1 0 0 1 1 0 91
             80 1 0 0 1 1 0 91
             30 1 0 1 0 0 0 91
             20 1 0 1 0 0 0 91
             50 1 0 1 0 0 0 91
             20 1 0 1 0 0 0 91
             40 1 0 0 1 1 0 91
             90 1 0 0 1 1 0 91
             70 1 0 0 1 1 0 91
             20 1 0 1 0 0 0 91
             20 1 0 0 1 1 0 91
            100 0 . 0 1 0 . 92
             10 0 . 0 1 0 . 92
             40 0 . 1 0 0 . 92
             20 0 . 1 0 0 . 92
            100 0 . 0 1 0 . 92
             40 0 . 1 0 0 . 92
             50 0 . 1 0 0 . 92
             20 0 . 0 1 0 . 92
             10 0 . 0 1 0 . 92
              0 0 . 1 0 0 . 92
             70 0 . 0 1 0 . 92
             40 0 . 1 0 0 . 92
             30 0 . 1 0 0 . 92
             10 0 . 1 0 0 . 92
             60 0 . 1 0 0 . 92
            100 0 . 0 1 0 . 92
             60 0 . 1 0 0 . 92
             10 0 . 1 0 0 . 92
             50 0 . 0 1 0 . 92
             60 0 . 0 1 0 . 92
             40 0 . 1 0 0 . 92
              0 0 . 1 0 0 . 92
             40 0 . 1 0 0 . 92
             40 0 . 1 0 0 . 92
             10 0 . 0 1 0 . 92
             10 0 . 1 0 0 . 92
             30 0 . 0 1 0 . 92
             30 0 . 1 0 0 . 92
             50 0 . 1 0 0 . 92
            100 0 . 1 0 0 . 92
             90 0 . 0 1 0 . 92
             20 0 . 0 1 0 . 92
             20 0 . 1 0 0 . 92
             50 0 . 0 1 0 . 92
             40 0 . 1 0 0 . 92
             45 0 . 0 1 0 . 92
            100 0 . 0 1 0 . 92
             40 0 . 1 0 0 . 92
            100 0 . 0 1 0 . 92
             40 0 . 0 1 0 . 92
            100 0 . 0 1 0 . 92
             50 0 . 0 1 0 . 92
             70 0 . 0 1 0 . 92
             30 0 . 0 1 0 . 92
             40 0 . 0 1 0 . 92
             50 0 . 1 0 0 . 92
             20 0 . 1 0 0 . 92
             50 0 . 1 0 0 . 92
             35 0 . 1 0 0 . 92
             20 0 . 1 0 0 . 92
             50 0 . 0 1 0 . 92
             30 0 . 1 0 0 . 92
             50 0 . 1 0 0 . 92
            100 0 . 1 0 0 . 92
             10 0 . 0 1 0 . 92
             50 0 . 0 1 0 . 92
              0 0 . 1 0 0 . 92
             30 0 . 1 0 0 . 92
            100 0 . 0 1 0 . 92
             80 0 . 0 1 0 . 92
             90 0 . 1 0 0 . 92
             20 0 . 1 0 0 . 92
             50 0 . 1 0 0 . 92
             30 0 . 1 0 0 . 92
             50 0 . 0 1 0 . 92
             20 0 . 0 1 0 . 92
             25 0 . 1 0 0 . 92
            100 0 . 0 1 0 . 92
             70 0 . 0 1 0 . 92
             30 0 . 1 0 0 . 92
              0 0 . 1 0 0 . 92
             40 0 . 1 0 0 . 92
             20 0 . 1 0 0 . 92
             20 0 . 1 0 0 . 92
             10 0 . 1 0 0 . 92
             50 0 . 1 0 0 . 92
             30 0 . 1 0 0 . 92
              0 0 . 1 0 0 . 92
             20 0 . 1 0 0 . 92
             60 0 . 1 0 0 . 92
             20 1 0 0 1 1 0 93
             50 1 0 0 1 1 0 93
             20 1 0 0 1 1 0 93
             30 1 0 0 1 1 0 93
             90 1 0 1 0 0 0 93
             50 1 0 0 1 1 0 93
             60 1 0 1 0 0 0 93
             60 1 0 1 0 0 0 93
             30 1 0 1 0 0 0 93
            100 1 0 1 0 0 0 93
             50 1 0 1 0 0 0 93
            100 1 0 0 1 1 0 93
             20 1 0 1 0 0 0 93
             20 1 0 0 1 1 0 93
             50 1 0 1 0 0 0 93
             90 1 0 1 0 0 0 93
             40 1 0 1 0 0 0 93
             20 1 0 1 0 0 0 93
             40 1 0 1 0 0 0 93
             20 1 0 1 0 0 0 93
             60 1 0 1 0 0 0 93
             10 1 0 1 0 0 0 93
             10 1 0 1 0 0 0 93
             70 1 0 1 0 0 0 93
             50 1 0 1 0 0 0 93
              0 1 0 1 0 0 0 93
             90 1 0 1 0 0 0 93
             30 1 0 0 1 1 0 93
            100 1 0 0 1 1 0 93
             60 1 0 0 1 1 0 93
             50 1 0 1 0 0 0 93
            100 1 0 0 1 1 0 93
             50 1 0 0 1 1 0 93
             50 1 0 1 0 0 0 93
            100 1 0 0 1 1 0 93
             40 1 0 1 0 0 0 93
            100 1 0 1 0 0 0 93
             10 1 0 1 0 0 0 93
             10 1 0 0 1 1 0 93
             10 1 0 1 0 0 0 93
             40 1 0 1 0 0 0 93
             10 1 0 1 0 0 0 93
             50 1 0 0 1 1 0 93
             60 1 0 1 0 0 0 93
             80 1 0 0 1 1 0 93
             30 1 0 1 0 0 0 93
             30 1 0 1 0 0 0 93
             10 1 0 0 1 1 0 93
             30 1 0 1 0 0 0 93
            100 1 0 1 0 0 0 93
             50 1 0 0 1 1 0 93
             25 1 0 1 0 0 0 93
             10 1 0 0 1 1 0 93
             60 1 0 1 0 0 0 93
             60 1 0 1 0 0 0 93
             40 1 0 0 1 1 0 93
             45 1 0 0 1 1 0 93
             40 1 0 0 1 1 0 93
             40 1 0 1 0 0 0 93
              0 1 0 1 0 0 0 93
             50 1 0 1 0 0 0 93
              0 1 0 1 0 0 0 93
             50 1 0 0 1 1 0 93
             20 1 0 1 0 0 0 93
              0 1 0 1 0 0 0 93
             50 1 0 1 0 0 0 93
              0 1 0 1 0 0 0 93
            100 1 0 0 1 1 0 93
             20 1 0 1 0 0 0 93
            100 1 0 1 0 0 0 93
             30 1 0 1 0 0 0 93
             60 1 0 1 0 0 0 93
             10 1 0 1 0 0 0 94
             45 1 0 0 1 1 0 94
             70 1 0 1 0 0 0 94
             10 1 0 1 0 0 0 94
             50 1 0 0 1 1 0 94
             50 1 0 1 0 0 0 94
             25 1 0 1 0 0 0 94
            100 1 0 0 1 1 0 94
             40 1 0 1 0 0 0 94
             20 1 0 0 1 1 0 94
             60 1 0 0 1 1 0 94
             40 1 0 1 0 0 0 94
             30 1 0 1 0 0 0 94
             60 1 0 0 1 1 0 94
             30 1 0 1 0 0 0 94
             50 1 0 1 0 0 0 94
             30 1 0 1 0 0 0 94
             50 1 0 1 0 0 0 94
             20 1 0 1 0 0 0 94
             10 1 0 0 1 1 0 94
              0 1 0 1 0 0 0 94
             40 1 0 1 0 0 0 94
             20 1 0 1 0 0 0 94
             10 1 0 0 1 1 0 94
             10 1 0 1 0 0 0 94
             50 1 0 0 1 1 0 94
             20 1 0 1 0 0 0 94
             50 1 0 0 1 1 0 94
             30 1 0 1 0 0 0 94
             30 1 0 0 1 1 0 94
            100 1 0 1 0 0 0 94
             30 1 0 0 1 1 0 94
            100 1 0 1 0 0 0 94
              5 1 0 0 1 1 0 94
             30 1 0 1 0 0 0 94
             40 1 0 0 1 1 0 94
             40 1 0 0 1 1 0 94
             20 1 0 1 0 0 0 94
             80 1 0 0 1 1 0 94
             10 1 0 1 0 0 0 94
             10 1 0 1 0 0 0 94
             20 1 0 0 1 1 0 94
             10 1 0 1 0 0 0 94
             70 1 0 0 1 1 0 94
             60 1 0 1 0 0 0 94
             40 1 0 0 1 1 0 94
             40 1 0 1 0 0 0 94
             50 1 0 1 0 0 0 94
             30 1 0 1 0 0 0 94
            100 1 0 1 0 0 0 94
             20 1 0 1 0 0 0 94
             50 1 0 0 1 1 0 94
             50 1 0 0 1 1 0 94
             60 1 0 0 1 1 0 94
             90 1 0 1 0 0 0 94
            100 1 0 1 0 0 0 94
             60 1 0 0 1 1 0 94
             30 1 0 0 1 1 0 94
             50 1 0 1 0 0 0 94
             90 1 0 0 1 1 0 94
             70 1 0 0 1 1 0 94
             40 1 0 0 1 1 0 94
             90 1 0 0 1 1 0 94
             30 1 0 0 1 1 0 94
             50 1 0 1 0 0 0 94
              0 1 0 1 0 0 0 94
             30 1 0 0 1 1 0 94
             50 1 0 1 0 0 0 94
             50 1 0 0 1 1 0 94
             40 1 0 0 1 1 0 94
             95 1 0 1 0 0 0 94
             20 1 0 1 0 0 0 94
             90 1 0 0 1 1 0 94
            100 1 0 0 1 1 0 94
            100 1 0 0 1 1 0 94
             80 1 0 1 0 0 0 94
             30 1 0 0 1 1 0 94
             70 1 0 0 1 1 0 94
             10 1 0 1 0 0 0 94
             40 1 0 1 0 0 0 94
            100 1 0 1 0 0 0 94
             90 1 0 0 1 1 0 94
             40 1 0 1 0 0 0 94
             30 1 0 1 0 0 0 94
             20 1 0 0 1 1 0 94
             30 1 0 1 0 0 0 94
             90 1 0 0 1 1 0 94
             20 1 0 1 0 0 0 94
             30 1 1 1 0 0 0 95
             50 1 1 1 0 0 0 95
             30 1 1 0 1 1 1 95
            100 1 1 1 0 0 0 95
             40 1 1 1 0 0 0 95
             80 1 1 0 1 1 1 95
             60 1 1 1 0 0 0 95
              0 1 1 0 1 1 1 95
             30 1 1 0 1 1 1 95
             10 1 1 0 1 1 1 95
             90 1 1 0 1 1 1 95
             30 1 1 0 1 1 1 95
             30 1 1 1 0 0 0 95
              0 1 1 0 1 1 1 95
             30 1 1 0 1 1 1 95
             20 1 1 1 0 0 0 95
            100 1 1 0 1 1 1 95
             40 1 1 1 0 0 0 95
             30 1 1 0 1 1 1 95
             30 1 1 1 0 0 0 95
             10 1 1 1 0 0 0 95
             60 1 1 1 0 0 0 95
             25 1 1 1 0 0 0 95
             80 1 1 1 0 0 0 95
             10 1 1 1 0 0 0 95
             30 1 1 1 0 0 0 95
             10 1 1 1 0 0 0 95
             30 1 1 1 0 0 0 95
             10 1 1 1 0 0 0 95
             45 1 1 1 0 0 0 95
             50 1 1 0 1 1 1 95
            100 1 1 1 0 0 0 95
             50 1 1 1 0 0 0 95
             60 1 1 1 0 0 0 95
            100 1 1 0 1 1 1 95
             30 1 1 1 0 0 0 95
             60 1 1 0 1 1 1 95
            100 1 1 0 1 1 1 95
             20 1 1 1 0 0 0 95
             10 1 1 1 0 0 0 95
             30 1 1 0 1 1 1 95
             20 1 1 1 0 0 0 95
             90 1 1 1 0 0 0 95
             30 1 1 0 1 1 1 95
             50 1 1 0 1 1 1 95
             30 1 1 1 0 0 0 95
             10 1 1 1 0 0 0 95
             70 1 1 0 1 1 1 95
             50 1 1 1 0 0 0 95
             30 1 1 1 0 0 0 95
             50 1 1 0 1 1 1 95
             90 1 1 1 0 0 0 95
             50 1 1 1 0 0 0 95
             30 1 1 0 1 1 1 95
             10 1 1 1 0 0 0 95
             10 1 1 1 0 0 0 95
              0 1 1 1 0 0 0 95
             30 1 1 1 0 0 0 95
             30 1 1 0 1 1 1 95
              0 1 1 0 1 1 1 95
            100 1 1 1 0 0 0 95
             10 1 1 1 0 0 0 95
             30 1 1 0 1 1 1 95
             20 1 1 1 0 0 0 95
             80 1 1 0 1 1 1 95
             10 1 1 0 1 1 1 95
             90 1 1 1 0 0 0 95
             10 1 1 1 0 0 0 95
             50 1 1 1 0 0 0 95
             10 1 1 0 1 1 1 95
             20 1 1 1 0 0 0 95
             50 1 1 0 1 1 1 95
             20 1 1 1 0 0 0 95
            100 1 1 1 0 0 0 95
              0 1 1 1 0 0 0 95
             50 1 1 0 1 1 1 95
             30 1 1 0 1 1 1 95
            100 1 1 0 1 1 1 95
             90 1 1 0 1 1 1 95
            100 1 1 1 0 0 0 95
             50 1 1 1 0 0 0 96
             50 1 1 1 0 0 0 96
             20 1 1 0 1 1 1 96
             30 1 1 1 0 0 0 96
             10 1 1 0 1 1 1 96
              5 1 1 0 1 1 1 96
             10 1 1 0 1 1 1 96
             20 1 1 1 0 0 0 96
             30 1 1 1 0 0 0 96
             50 1 1 0 1 1 1 96
             20 1 1 0 1 1 1 96
             40 1 1 0 1 1 1 96
             70 1 1 1 0 0 0 96
             30 1 1 1 0 0 0 96
            100 1 1 1 0 0 0 96
             20 1 1 1 0 0 0 96
             50 1 1 0 1 1 1 96
             30 1 1 0 1 1 1 96
             30 1 1 1 0 0 0 96
             10 1 1 0 1 1 1 96
             10 1 1 0 1 1 1 96
             10 1 1 0 1 1 1 96
             40 1 1 0 1 1 1 96
             30 1 1 0 1 1 1 96
             50 1 1 1 0 0 0 96
             90 1 1 1 0 0 0 96
             20 1 1 1 0 0 0 96
             20 1 1 1 0 0 0 96
             30 1 1 1 0 0 0 96
             30 1 1 1 0 0 0 96
             10 1 1 0 1 1 1 96
             60 1 1 1 0 0 0 96
             15 1 1 0 1 1 1 96
            100 1 1 0 1 1 1 96
             30 1 1 1 0 0 0 96
              0 1 1 0 1 1 1 96
             40 1 1 1 0 0 0 96
             20 1 1 1 0 0 0 96
             10 1 1 0 1 1 1 96
             70 1 1 1 0 0 0 96
             10 1 1 1 0 0 0 96
             80 1 1 1 0 0 0 96
             60 1 1 0 1 1 1 96
             60 1 1 1 0 0 0 96
             10 1 1 0 1 1 1 96
              0 1 1 0 1 1 1 96
              0 1 1 1 0 0 0 96
            100 1 1 1 0 0 0 96
             20 1 1 1 0 0 0 96
             50 1 1 1 0 0 0 96
              5 1 1 0 1 1 1 96
             50 1 1 0 1 1 1 96
             10 1 1 1 0 0 0 96
             20 1 1 1 0 0 0 96
             20 1 1 0 1 1 1 96
             60 1 1 0 1 1 1 96
             30 1 1 1 0 0 0 96
             10 1 1 0 1 1 1 96
             40 1 1 0 1 1 1 96
              5 1 1 0 1 1 1 96
             90 1 1 1 0 0 0 96
             10 1 1 0 1 1 1 96
            100 1 1 1 0 0 0 96
             10 1 1 1 0 0 0 96
             30 1 1 1 0 0 0 96
             20 1 1 1 0 0 0 96
             30 1 1 1 0 0 0 96
             10 1 1 0 1 1 1 96
             90 1 1 1 0 0 0 96
             60 1 1 1 0 0 0 96
            100 1 1 0 1 1 1 96
             50 1 1 1 0 0 0 96
             30 1 1 1 0 0 0 96
             60 1 1 0 1 1 1 96
              0 1 1 0 1 1 1 96
             10 1 1 0 1 1 1 96
             10 1 1 0 1 1 1 96
              5 1 1 0 1 1 1 96
             30 1 1 1 0 0 0 96
             80 1 1 1 0 0 0 96
            100 1 1 1 0 0 0 96
              0 1 1 1 0 0 0 96
             50 1 1 0 1 1 1 96
             20 1 1 1 0 0 0 96
             20 1 1 1 0 0 0 96
             40 1 1 0 1 1 1 96
             10 1 1 0 1 1 1 96
             50 1 1 1 0 0 0 96
             80 0 . 1 0 0 . 97
             80 0 . 0 1 0 . 97
            100 0 . 1 0 0 . 97
             30 0 . 1 0 0 . 97
             20 0 . 0 1 0 . 97
             50 0 . 0 1 0 . 97
             10 0 . 1 0 0 . 97
             10 0 . 0 1 0 . 97
            end
            label values treated treat
            label def treat 0 "Control", modify
            label def treat 1 "Treatment", modify
            label values inclusion t
            label def t 0 "No Inclusion", modify
            label def t 1 "Inclusion", modify
            ------------------ copy up to and including the previous line ------------------


            In the above data, treated_ml = treated*midline and inclusion_ml = inclusion*midline.
            I am trying to run the following specification:

            reg contribution_PG midline ibn.treated treated_ml inclusion inclusion_ml, cl(village_code)

            The results show:

            note: treated omitted because of collinearity
            note: midline omitted because of collinearity

            So when I run the following:
            reg contribution_PG midline ibn.treated treated_ml inclusion inclusion_ml female poor, cl(village_code) noconstant baselevels

            the results show:
            note: treated_ml omitted because of collinearity

            What would be the correct specification such that no variable is dropped due to collinearity? I would really appreciate any help with this. Thank you!

            Last edited by Abha Indurkar; 03 Sep 2020, 01:03.



            • #21
              There is no such specification at all.

              The problem arises because the variable inclusion always has a missing value in the control group. Since inclusion is a variable in the regression, all of those observations are omitted from the analysis. In the observations that remain, treated is always "Treatment". Thus the variable treated carries no information in your analysis. That makes your design matrix singular, and something has to go. Because treated_ml is the rightmost-mentioned variable that participates in the collinearity, it is the one Stata chooses to omit (Stata omits from right to left until the singularity is resolved).

              Why is the variable inclusion always missing in the control group? If it is supposed to be that way, then your data are simply unsuitable for this kind of analysis, and what you are trying to do is impossible in principle. If it is not supposed to be that way, then you need to fix the data management that produced the bad data.
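
              A sketch of how you might diagnose the missingness, and, once the data are repaired, how factor-variable notation lets Stata build the interaction terms itself (the ## syntax below would replace the hand-made treated_ml and inclusion_ml):

              Code:
              * Does inclusion go missing only in the control group?
              count if missing(inclusion) & treated == 0
              count if missing(inclusion) & treated == 1
              misstable summarize inclusion

              * After the data are fixed, factor-variable notation builds the
              * interactions automatically, avoiding hand-made product terms:
              reg contribution_PG i.treated##i.midline i.inclusion##i.midline, vce(cluster village_code)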
