  • #16
    Thanks, dear Fernando, for such nice and valuable guidelines.

    I have one more request, about the graphical presentation: could you please guide me on the graphs presented in Firpo et al. (2018) on pages 27 and 28 regarding the decompositions? It would be very helpful if you could give some commands or syntax to generate such graphs, or something similar.

    Thanks again and best wishes,
    Atta





      • #18
        There isn't ready-made code for it.
        You have to run all the models (across all the quantiles you want), collect all the coefficients and standard errors, and plot them.
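
        A minimal sketch of that workflow, assuming the rif package (rifhdreg) is installed; the dataset (nlsw88), covariates, quantile grid, and file name below are purely illustrative, and the exact rif() options should be checked in the rifhdreg helpfile:
        Code:
        * Run RIF (unconditional quantile) regressions across several quantiles,
        * collect the coefficient and standard error on "union", and plot them.
        webuse nlsw88, clear
        gen lwage = ln(wage)

        tempname memhold
        postfile `memhold' q b se using rif_by_quantile, replace
        forvalues q = 10(10)90 {
            rifhdreg lwage union ttl_exp tenure, rif(q(`q'))
            post `memhold' (`q') (_b[union]) (_se[union])
        }
        postclose `memhold'

        use rif_by_quantile, clear
        gen lb = b - 1.96*se
        gen ub = b + 1.96*se
        twoway (rcap ub lb q) (connected b q), ///
            xtitle("Quantile") ytitle("Union coefficient") legend(off)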



        • #19
          Dear Fernando,
          thank you very much for a very useful suite of RIF-related commands. It would be great if at some point you could add support for the -svy- prefix.
          Best regards,
          Michal



          • #20
            Thanks, Michal.
            That is actually next on my to-do list.
            Right now, the only command that allows for this is rifhdreg, with the option "svy" (please see the helpfile).
            But I do that the hard way; my plan is to improve on it.
            The one that may be more difficult is oaxaca_rif, since I need to understand better what "oaxaca" does to handle svy.
            Best
            Fernando
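
            For what it is worth, a hypothetical sketch of how that might look in practice; the survey-design variables come from nhanes2f, and whether the svy option picks up the svyset design this way should be checked against the rifhdreg helpfile, as suggested above:
            Code:
            * Sketch only: declare the survey design, then request survey-adjusted
            * results through rifhdreg's svy option (see the rifhdreg helpfile).
            webuse nhanes2f, clear
            svyset psuid [pw=finalwgt], strata(stratid)
            rifhdreg height female black age, rif(mean) svy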



            • #21
              Yes, that might be problematic. I remember that "svy" is used as an option in "oaxaca", not as a prefix due to some technical difficulties.
              Still, "oaxaca_rif" is very helpful in my research - thanks again!
              Best,
              Michal



              • #22
                Originally posted by FernandoRios
                Hi Atta
                I see. If I understand the process correctly, it would have to be something as follows:
                Code:
                webuse nhanes2f, clear
                oprobit health female black age c.age#c.age
                predict xb_pr,xb
                sum xb_pr
                gen rchealth=(xb_pr-r(min))/(r(max)-r(min))
                rifhdreg rchealth female black age c.age#c.age, rif(eindex(rchealth) lb(0) ub(1) )
                The only difference is that within the option "rif" you would need to use the income or wealth variable instead of the rchealth variable.

                I would also still read carefully the paper they cite:
                Van Doorslaer E, Jones AM. Inequalities in self-reported health: validation of a new approach to measurement. J Health Econ. 2003;22:61–87.
                It may explain alternative ways to implement the initial transformation.

                Best
                Hi FernandoRios,

                What would it mean, or how can we interpret xb_pr, if it contains negative predicted values? I am not sure whether this can happen with a health variable, but I encounter this issue with an income variable. Specifically, I have an income variable consisting of 5 categories (e.g., 5-10; 10-15; 15-20; 20-30; 30-40), and I followed the same procedure as you did in #4 (only lines 1-2). Interestingly, I got both negative and positive predicted values of income. Do you have any suggestions on how to interpret those predicted values?

                Thanks.

                DL



                • #23
                  Can you post your results and the command you used to get them?



                  • #24
                    Originally posted by FernandoRios
                    Can you post your results and the command you used to get them?
                    Thanks for your response! The following is an example of my data and code:

                    Data
                    Code:
                    * Example generated by -dataex-. To install: ssc install dataex
                    clear
                    input byte D20 float age3 byte(sex mstt) float hhscl byte area float alone byte work float edu byte pinc float(club fsupp) byte psrh double weight_respondent
                    4 1 2 2 2.4627664 1 0 0 1 0 1 1 0  .6976066284430985
                    4 2 1 2    2.7275 1 0 0 3 1 0 1 1  .6976066284430985
                    2 0 2 0    2.7275 1 0 0 1 0 0 1 2 1.5542141366555642
                    2 1 2 2 2.4627664 1 0 0 0 1 0 1 1  .6976066284430985
                    2 1 2 2 2.4627664 1 0 0 0 0 0 1 .  .6976066284430985
                    2 1 2 2    2.7275 1 0 0 0 0 1 1 0  .6976066284430985
                    2 0 2 0    2.7275 1 0 0 1 0 1 1 2 2.3313212049833463
                    5 0 2 0  1.850069 1 0 0 1 0 1 1 1 1.5542141366555642
                    3 1 2 2 2.9734106 1 0 0 2 0 0 1 1  .6976066284430985
                    4 0 2 0    2.7275 1 0 0 4 0 0 1 2 1.5542141366555642
                    2 1 1 2 2.1734698 1 0 1 1 0 1 0 1  .6976066284430985
                    1 0 1 2 2.1734698 1 0 1 1 0 1 0 0  .7771070683277821
                    2 0 2 2 2.1734698 1 0 0 0 0 1 1 1  .7771070683277821
                    5 0 2 2 3.2042794 1 0 0 1 0 1 1 0  .7771070683277821
                    2 0 2 2  1.850069 1 0 1 1 0 1 1 1  .7771070683277821
                    5 0 2 2  1.850069 1 0 0 4 0 1 1 1  .7771070683277821
                    5 1 1 0    2.7275 1 0 1 4 0 0 1 1  1.395213256886197
                    4 0 1 0  1.850069 0 0 0 2 0 0 1 1 1.5542141366555642
                    2 1 1 2 2.4627664 0 0 0 3 0 1 1 1  .6976066284430985
                    2 2 1 0 2.4627664 0 0 0 3 0 0 1 1  1.395213256886197
                    2 2 1 0  1.850069 0 0 0 0 0 1 1 2  1.395213256886197
                    3 0 1 0 3.2042794 0 0 1 1 1 1 1 1  .7771070683277821
                    4 2 1 2 2.1734698 0 0 0 3 0 0 0 0  1.395213256886197
                    5 0 2 0    2.7275 0 0 0 0 1 0 1 1 1.5542141366555642
                    2 2 2 2  1.850069 0 0 0 0 0 1 1 2  .6976066284430985
                    2 1 2 2 1.4742693 0 0 0 1 0 0 1 1  .6976066284430985
                    3 0 2 2  1.850069 0 0 0 0 1 1 1 1  .7771070683277821
                    4 0 1 2 2.1734698 0 0 1 2 1 0 1 1  .7771070683277821
                    3 0 2 2  1.850069 0 0 1 4 0 1 1 1  .7771070683277821
                    4 2 2 2 2.9734106 0 0 0 0 0 0 0 1  .6976066284430985
                    1 2 2 2 1.4742693 0 0 0 0 0 1 1 1  .6976066284430985
                    2 0 2 0    2.7275 0 0 0 0 1 0 1 2 1.5542141366555642
                    4 2 2 2 2.4627664 0 0 0 0 0 1 1 0  .6976066284430985
                    3 2 1 0 2.9734106 0 0 0 3 1 1 1 0  1.395213256886197
                    4 1 2 2 2.1734698 0 0 0 0 0 1 1 2  .6976066284430985
                    1 0 1 0  1.850069 0 0 1 0 1 0 0 1  .7771070683277821
                    1 0 2 2         1 0 1 1 0 1 1 0 0  .7771070683277821
                    3 0 2 2 1.4742693 0 0 0 0 1 1 1 0  .7771070683277821
                    3 0 1 2 4.2054095 0 0 1 1 1 1 1 2  .7771070683277821
                    1 1 2 2    2.7275 0 0 0 3 1 1 1 0  .6976066284430985
                    1 1 1 0 2.9734106 0 0 0 0 0 1 1 1  1.395213256886197
                    2 2 2 2 2.4627664 0 0 0 0 0 1 1 1 2.0928198853292956
                    1 0 2 2 2.1734698 0 0 0 0 0 1 1 1  .7771070683277821
                    1 1 1 0 2.4627664 0 0 1 3 1 0 0 1  1.395213256886197
                    1 2 2 2 2.1734698 0 0 0 0 1 1 1 0  .6976066284430985
                    1 1 1 0 2.1734698 0 0 0 0 1 1 1 0  1.395213256886197
                    1 0 1 0 1.4742693 0 0 0 3 1 0 1 1 1.5542141366555642
                    1 2 1 2    2.7275 0 0 0 0 1 1 1 0  .6976066284430985
                    2 1 1 2 2.1734698 0 0 0 3 0 1 1 0  .6976066284430985
                    2 0 1 0 2.9734106 0 0 1 0 0 1 1 1 1.5542141366555642
                    4 0 2 2  3.422755 0 0 1 0 1 1 1 1  .7771070683277821
                    1 1 1 2  1.850069 0 0 0 0 1 0 1 0  .6976066284430985
                    2 1 2 0  1.850069 0 0 0 0 0 1 1 0  .6976066284430985
                    1 0 1 0  3.422755 0 0 1 0 0 0 1 1 1.5542141366555642
                    1 0 2 0 2.1734698 0 0 0 0 0 0 1 0 1.5542141366555642
                    5 1 2 0 2.1734698 0 0 0 2 0 1 1 1  1.395213256886197
                    1 2 2 2    2.7275 0 0 0 0 1 1 1 1  .6976066284430985
                    4 0 2 0 1.4742693 0 0 0 2 0 1 0 2 1.5542141366555642
                    1 2 2 2  1.850069 0 0 0 0 1 1 1 2  .6976066284430985
                    2 1 1 0    2.7275 0 0 0 3 1 1 1 0  1.395213256886197
                    1 1 2 2 2.1734698 0 0 1 2 1 1 1 .  .6976066284430985
                    1 0 2 0 3.6307805 0 0 1 0 0 0 1 1 1.5542141366555642
                    2 1 2 0 2.4627664 0 0 0 0 0 0 1 0  .6976066284430985
                    1 0 2 2         1 0 1 0 0 0 1 1 1  .7771070683277821
                    3 0 1 0 2.4627664 0 0 1 2 0 1 1 1  .7771070683277821
                    2 0 1 0 1.4742693 0 0 1 0 0 1 1 2  .7771070683277821
                    1 0 1 0 2.9734106 0 0 0 0 0 1 1 1 1.5542141366555642
                    5 0 1 0    2.7275 0 0 1 3 0 1 1 1  .7771070683277821
                    2 0 1 2 2.4627664 1 0 0 4 0 1 1 0  .7771070683277821
                    2 0 1 0 1.4742693 1 0 1 0 0 1 1 1 1.5542141366555642
                    2 1 2 2 1.4742693 1 0 0 3 0 0 1 1  .6976066284430985
                    2 0 1 2 2.9734106 1 0 1 4 0 1 1 2  .7771070683277821
                    3 1 2 2 1.4742693 1 0 0 0 1 1 1 1  .6976066284430985
                    2 0 2 2  1.850069 1 0 0 4 0 1 1 1  .7771070683277821
                    1 0 1 0  1.850069 1 0 1 0 0 0 1 1  .7771070683277821
                    1 1 1 0    2.7275 1 0 0 1 0 0 1 1  1.395213256886197
                    1 2 2 2 2.9734106 1 0 0 0 0 1 1 0  .6976066284430985
                    2 0 2 2 1.4742693 1 0 0 4 0 1 1 0  .7771070683277821
                    2 1 1 0 1.4742693 1 0 1 1 1 0 1 1  .6976066284430985
                    2 2 2 2  1.850069 1 0 0 0 1 0 1 .  .6976066284430985
                    2 0 2 2 1.4742693 1 0 0 0 0 0 1 1  .7771070683277821
                    4 1 1 2 2.4627664 1 0 0 0 0 1 1 1  .6976066284430985
                    5 2 1 2 3.6307805 1 0 0 2 0 0 1 2  .6976066284430985
                    2 1 1 0 3.6307805 1 0 0 1 0 1 1 2  1.395213256886197
                    2 1 1 0 2.1734698 1 0 0 2 1 1 1 2  .6976066284430985
                    2 0 1 0 2.4627664 0 0 1 0 0 1 1 1 1.5542141366555642
                    1 0 2 2  1.850069 0 0 1 0 1 1 1 1  .7771070683277821
                    2 0 2 0 2.1734698 0 0 1 1 0 1 1 1 1.5542141366555642
                    1 0 1 0 1.4742693 0 0 1 3 0 1 1 1  .7771070683277821
                    1 1 1 2         1 0 1 0 0 1 0 1 0  .6976066284430985
                    2 0 1 0 2.9734106 0 0 1 0 0 1 1 1  .7771070683277821
                    4 0 1 0 1.4742693 0 0 1 0 1 1 1 1  .7771070683277821
                    4 0 2 0 3.8298335 0 0 0 0 1 1 1 0  .7771070683277821
                    4 0 2 0 1.4742693 0 0 1 0 1 1 1 1 1.5542141366555642
                    3 0 1 0 3.2042794 0 0 0 0 1 1 1 2  .7771070683277821
                    4 0 1 0 1.4742693 0 0 1 0 0 1 1 1  .7771070683277821
                    1 1 2 2 1.4742693 0 0 0 0 1 0 1 1  .6976066284430985
                    1 0 2 2 2.9734106 0 0 0 0 0 1 1 1  .7771070683277821
                    1 1 2 2         1 0 1 0 0 1 0 1 0  .6976066284430985
                    1 1 1 0 2.4627664 0 0 0 0 0 1 1 1  1.395213256886197
                    end
                    label values D20 labels86
                    label def labels86 1 "Less than Ks 25,000", modify
                    label def labels86 2 "Ks 25,001 - ks 50,000", modify
                    label def labels86 3 "Ks 50,001 - ks 75,000", modify
                    label def labels86 4 "Ks 75,001 - ks 100,000", modify
                    label def labels86 5 "Over ks 100,000", modify
                    label values age3 age3
                    label def age3 0 "60-69", modify
                    label def age3 1 "70-79", modify
                    label def age3 2 "80+", modify
                    label values sex labels10
                    label def labels10 1 "Men", modify
                    label def labels10 2 "Women", modify
                    label values mstt mstt
                    label def mstt 0 "Married", modify
                    label values area area
                    label def area 0 "Rural", modify
                    label def area 1 "Urban", modify
                    label values alone alone
                    label def alone 0 "Living with others", modify
                    label def alone 1 "Living alone", modify
                    label values work work
                    label def work 0 "No", modify
                    label def work 1 "Yes", modify
                    label values edu edu
                    label def edu 0 "No schooling/ uncompleted primary school", modify
                    label def edu 1 "Primary school", modify
                    label def edu 2 "Secondary school", modify
                    label def edu 3 "Monastic school", modify
                    label def edu 4 "High school and above", modify
                    label values pinc pinc
                    label values club club
                    label def club 0 "No", modify
                    label def club 1 "Yes", modify
                    label values fsupp fsupp
                    label def fsupp 0 "No", modify
                    label def fsupp 1 "yes", modify
                    label values psrh psrh
                    label def psrh 0 "Worse", modify
                    label def psrh 1 "The same", modify
                    label def psrh 2 "Better", modify
                    Code to estimate predicted values of income using the oprobit command
                    Code:
                    * D20 is an income variable in five categories
                    * I used a set of covariates because I needed them for my next analyses
                    qui oprobit D20 i.age3 i.sex i.mstt c.hhscl i.area i.alone i.work i.edu i.pinc i.club i.fsupp i.psrh [pw=weight_respondent], vce(robust)
                    predict yhat, xb
                    Results
                    Code:
                    . sum yhat, detail
                    
                               Linear prediction (cutpoints excluded)
                    -------------------------------------------------------------
                          Percentiles      Smallest
                     1%    -5.116871      -5.116871
                     5%     .0229456      -4.520415
                    10%      .780854      -3.947094       Obs                  97
                    25%     1.035661      -3.834116       Sum of Wgt.          97
                    
                    50%     1.433494                      Mean           1.320705
                                            Largest       Std. Dev.      1.347206
                    75%     1.899404       2.976089
                    90%     2.573335       2.986378       Variance       1.814963
                    95%     2.924136       3.032029       Skewness      -3.011974
                    99%     3.137269       3.137269       Kurtosis       14.17129



                    • #25
                      Hi Dung,
                      I had to go back to my own post to remember the procedure you describe.
                      First of all, to clarify, I didn't suggest the procedure; I simply offered my interpretation of how to implement the paper that was cited at the time.
                      Second, oprobit will always (or almost always) generate some negative predicted values, because it predicts a normalized latent variable; by construction, some values will be positive and some negative.
                      Third, in the procedure described, the next step is to generate an index (say, a 0-1 index) that can then be used for the inequality/RIF analysis.
                      That being said, I am not sure what it is you want to do. If your dependent variable is income quintiles, what will your ranking variable be?
                      What is your research question?

                      HTH
                      Fernando
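
                      For concreteness, the rescaling step described above, applied to the yhat from #24, would look something like this (a sketch only; the new variable name rcinc is made up, and the formula mirrors the rchealth line quoted in #22):
                      Code:
                      * Rescale the oprobit latent prediction to a 0-1 ranking index (cf. #22).
                      quietly summarize yhat
                      gen rcinc = (yhat - r(min)) / (r(max) - r(min))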



                      • #26
                        Originally posted by FernandoRios
                        Hi Dung,
                        I had to go back to my own post to remember the procedure you describe.
                        First of all, to clarify, I didn't suggest the procedure; I simply offered my interpretation of how to implement the paper that was cited at the time.
                        Second, oprobit will always (or almost always) generate some negative predicted values, because it predicts a normalized latent variable; by construction, some values will be positive and some negative.
                        Third, in the procedure described, the next step is to generate an index (say, a 0-1 index) that can then be used for the inequality/RIF analysis.
                        That being said, I am not sure what it is you want to do. If your dependent variable is income quintiles, what will your ranking variable be?
                        What is your research question?

                        HTH
                        Fernando
                        Hi Fernando,

                        Thank you for your clarification. The following is my response:
                        Code:
                        First of all, to clarify, I didn't suggest the procedure; I simply offered my interpretation of how to implement the paper that was cited at the time.
                        Yes, I totally understand your point.
                        Code:
                        Second, oprobit will always (or almost always) generate some negative predicted values, because it predicts a normalized latent variable; by construction, some values will be positive and some negative.
                        Thank you, this answered my question in #24.
                        Code:
                        If your dependent variable is income quintiles, what will your ranking variable be?
                        What is your research question?
                        No, income is not the dependent variable in my research; rather, it is the ranking variable. In fact, what I did was convert a categorical income variable to a continuous one, as suggested by this paper: https://www.sciencedirect.com/scienc...06919207000334. Please note that in that paper the authors transformed a dummy obesity variable (not income) into a continuous one by regressing that dummy variable on a set of exogenous determinants of obesity.



                        • #27
                          Hi Dung
                          In that case, I think what the authors of that paper did is slightly different from what you are trying to do.
                          This is what they say:
                          Instead of simply taking the midpoint of each income bracket, we employed an interval regression model based on information of the head of the household (age, gender, education, socio-economic status and region of residence) to obtain a continuous household income variable (see Appendix). Once net monthly household income was predicted we divided it by an equivalence factor (equal to the number of household members elevated to 0.5), to adjust for differences in household size.
                          So, instead of using oprobit, you need to use intreg.
                          HTH
                          Fernando
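
                          A hypothetical sketch of the intreg approach, using the bracket cutpoints implied by the D20 value labels in #24 (amounts in thousands of kyats; the covariates and the new variable names lo, hi, and inc_hat are purely illustrative):
                          Code:
                          * Build the interval bounds implied by the income brackets, fit intreg,
                          * and predict a continuous income measure (cutpoints from the D20 labels).
                          gen lo = .
                          gen hi = .
                          replace lo = 0   if D20 == 1
                          replace hi = 25  if D20 == 1
                          replace lo = 25  if D20 == 2
                          replace hi = 50  if D20 == 2
                          replace lo = 50  if D20 == 3
                          replace hi = 75  if D20 == 3
                          replace lo = 75  if D20 == 4
                          replace hi = 100 if D20 == 4
                          replace lo = 100 if D20 == 5   // hi stays missing: open-ended top bracket
                          intreg lo hi i.age3 i.sex i.edu [pw=weight_respondent], vce(robust)
                          predict inc_hat, xb            // continuous predicted income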



                          • #28
                            Originally posted by FernandoRios
                            Hi Dung
                            In that case, I think what the authors of that paper did is slightly different from what you are trying to do.
                            This is what they say:

                            So, instead of using oprobit, you need to use intreg.
                            HTH
                            Fernando
                            Hi Fernando,

                            Yes, I know that those authors used interval regression to transform their income variable. However, I compared model fit between oprobit and intreg using the log likelihood, AIC, and BIC, and the results showed that oprobit fitted the data better than intreg. In addition, the Stata manual suggests that "if oprobit fit much better, you might want to modify the intreg model (e.g., take logs of the interval points) or use oprobit or ologit or some other ordinal method instead". That is why I chose to use oprobit instead of intreg.

                            DL



                            • #29
                              Hi, respected FernandoRios,
                              First, I am very thankful for your constructive and comprehensive guidance.
                              Following the earlier discussion on this page, I have run the command you suggested and obtained the results of the RIF decomposition. I have two main questions.
                              1. The explained and unexplained parts of the decomposition have different signs. How may we interpret these results in terms of rural-urban disparities in child nutritional level (group two is rural)?
                              2. In the detailed decomposition I got results in which, for example, maternal education (v106) has a negative sign in the explained part but a positive sign in the unexplained part, whereas for wealth status (v190) the signs are the same. How may we interpret these results?
                              I have attached the detailed results for the Wagstaff and Erreygers indices. I am quite confused by these issues, so please guide me in this regard.

                              Thanks again for your cooperation


                              Attached Files



                              • #30
                                Hi Atta
                                I have not directly worked with the Wagstaff or Erreygers indices, so I cannot provide you with insights on how to interpret them.
                                Regarding the decomposition, it may become clearer if you think about how the different components are constructed.
                                Below is a refresher for the standard Oaxaca-Blinder decomposition (the reweighted version follows the same principle):
                                Code:
                                Composition (explained)    = DX = (X1 - X0)*B0
                                Structure (coefficients)   = DB = X1*(B1 - B0)
                                So, with this in mind, ask yourself: why would the composition effect be positive? (What values would X1, X0, and B0 need to have for this to happen?)
                                And what about the coefficient effect? What values would B1, B0, and X1 need to have?
                                In what scenario (again, think about the values) would you find one effect positive and the other negative?

                                HTH
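
                                For instance, a made-up numeric case in which the two components take opposite signs (X1 = 0.6, X0 = 0.4, B1 = 1, B0 = 2; the numbers are purely illustrative):
                                Code:
                                display "DX = " (0.6 - 0.4)*2    // composition effect:  +0.4
                                display "DB = " 0.6*(1 - 2)      // coefficient effect:  -0.6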

