  • Seemingly unrelated regression command

    Hello/Good Day. I would like to know how to control for, i.e. hold constant, certain predictor variables when using the -sureg- command. I am running an analysis in which I need to hold predictor/independent variables such as age, gender/sex, and year of study constant. I will then examine the effects on the response/dependent variable of adding two additional predictor/independent variables.
    Thank you.
    Nick

  • #2
    Nobody knows... huh?

    Comment


    • #3
      Your question isn't really clear. You can include whatever predictor covariates you like in the regression equations you specify in your -sureg- command in the usual way. It's no different from what you would do in simple -regress- commands: just add them to the varlist(s). That's the usual way to adjust for ("control" for) predictors.

      To estimate the effects on the response variable of including two additional predictor variables, all you need to do is run a single regression containing all the relevant variables and examine the results for the two additional variables of interest (probably including a test of their joint significance). No need for -sureg-. If you are interested in the change in R-square from adding the two variables, re-run the regression omitting those two variables, restricting to the estimation sample of the full model. That is,

      Code:
      regress response x1 x2 v1 v2 v3...        // full model with the two added predictors x1 x2
      test x1 x2                                // joint significance of x1 and x2
      regress response v1 v2 v3... if e(sample) // restricted model on the same estimation sample
      should solve your problem easily.

      That it's so easy suggests that you are asking something else, and when you refer to keeping variables constant, you seem to be confirming that. But it isn't clear what you mean by keeping variables constant. When you include predictors as covariates, that does not keep them constant in any realistic sense of the word: what it does is condition the regression on the values of those variables as they occur in the estimation sample.

      So if what I've suggested isn't what you need, please explain more fully.

      Comment


      • #4
        Hi Clyde, firstly thank you for responding to me on this. I am conducting research, and I am more familiar with SPSS and still very new to Stata. The research hypothesis I am testing was originally tested using Stata and seemingly unrelated regression. The study examines two types of social capital and the ability of certain variables to predict these types of social capital, so each type of social capital is a response variable. Seemingly unrelated regression was used because the regressions can be estimated simultaneously and the residuals for the two types of social capital are not independent.

        In the regression, I need to control for Age, Gender, Education, and Time (minutes per day), and then add two additional predictor variables into the regression to see their effects. In SPSS I would just use hierarchical linear regression, entering the control variables in the first block and the two additional IVs in block two. My question is how to accomplish this in Stata using seemingly unrelated regression. Based on your response, I would add everything into the initial -sureg-, then run it again without the two IVs of interest, restricting to the estimation sample of the full model, to get the R-square change information.

        Also, -sureg- doesn't seem to provide any information on collinearity estimates; the VIF command only works after standard multiple regression.

        Comment


        • #5
          Nick, thanks for the explanation of your project. I see now why you want to use -sureg-, and I agree that it is appropriate for the purpose. And, yes, I would appraise the impact of those two IVs by (a) testing their joint significance with a -test- command after -sureg-, and (b) looking at the change in R-square when you re-run -sureg- without them.

          As for collinearity, collinearity is collinearity, and it doesn't depend on the particular kind of regression. So if you want VIFs, just take the same models and run -regress- on them, followed by -vif-. It's kind of a pain that Stata doesn't let you run -vif- after other regression models--I guess I'd put it on my wishlist for Stata 14, but I don't seem to care about collinearity nearly as much as a lot of other people, so I don't really use -vif- that often, in any context.
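
          To make that workflow concrete, here is a minimal sketch. The variable names (bonding, bridging, iv1, iv2) are placeholders standing in for your actual variables, and -estat vif- is the post-estimation form of the VIF command after -regress-:

          Code:
          * full system including the two IVs of interest
          sureg (bonding age gender education time iv1 iv2) ///
                (bridging age gender education time iv1 iv2)
          * joint significance of the two IVs across both equations
          test [bonding]iv1 [bonding]iv2 [bridging]iv1 [bridging]iv2
          * restricted system on the same estimation sample, to get the R-square change
          sureg (bonding age gender education time) ///
                (bridging age gender education time) if e(sample)
          * collinearity check: VIFs from an ordinary regression on the same covariates
          regress bonding age gender education time iv1 iv2
          estat vif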

          Comment


          • #6
            Thank you Clyde for your help.

            Comment


            • #7
              Hi Clyde, I have another question on seemingly unrelated regression. Can I treat it the same way as standard multiple regression? Is there a limit to the number of regression equations that can be run at the same time?

              For example, I currently run a seemingly unrelated regression with two dependent variables (Bonding and Bridging Social Capital); both equations have the same independent variables. But I am not sure what to do when measuring maintained social capital.

              This may seem like an odd question, but I have replicated a study which used seemingly unrelated regression, and my contribution is to include and measure maintained social capital. I am not sure whether to do this with a separate multiple regression or to add it to the seemingly unrelated regression and run them all at the same time.
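
              For concreteness, the three-equation version I have in mind would look something like this (variable names are placeholders for my actual variables):

              Code:
              sureg (bonding age gender education time) ///
                    (bridging age gender education time) ///
                    (maintained age gender education time)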



              Comment
