  • #31
    Hi Carlo and Jeff,

    I'd like to seek help with two more questions about conditional logistic regression. Thank you in advance, and thank you for your earlier time and intellectual input!

    1. When would you say the Spearman and Pearson correlations are different enough that outliers become a substantial concern? Please take the following tables as an illustration.
    Pearson Corr. (1) (2) (3) (4) (5) (6) (7) (8) (9)
    (1) 1
    (2) 0.233*** 1
    (3) 0.038* 0.180*** 1
    (4) 0.061*** -0.016 0.025 1
    (5) 0.116*** 0.017 -0.032 0.043* 1
    (6) 0.163*** -0.086*** -0.008 0.047** 0.066*** 1
    (7) 0.247*** 0.330*** 0.016 0.009 0.066*** -0.176*** 1
    (8) 0.181*** 0.233*** 0.02 0.104*** 0.064*** -0.091*** 0.097*** 1
    (9) 0.053** 0.028 0.023 0.012 -0.008 -0.096*** 0.160*** -0.059*** 1
    Spearman (1) (2) (3) (4) (5) (6) (7) (8) (9)
    (1) 1
    (2) 0.2295*** 1
    (3) 0.1000*** 0.2231*** 1
    (4) 0.0608*** -0.0145 0.0075 1
    (5) 0.1156*** 0.014 -0.0196 0.044** 1
    (6) 0.1723*** -0.0978*** -0.0222 0.0545*** 0.0609*** 1
    (7) 0.3470*** 0.4623*** 0.1471*** 0.0531*** 0.1062*** -0.1666*** 1
    (8) 0.2393*** 0.3125*** 0.0818*** 0.1021*** 0.0937*** -0.0799*** 0.4556*** 1
    (9) 0.0505*** 0.0125 0.0487*** 0.0245 -0.0089 -0.0366** 0.1228*** -0.034* 1
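    For reference, a comparison like this can be produced along the following lines, where x1-x9 stand in for the nine variables:

    * Pearson correlations with pairwise significance levels (placeholder names)
    pwcorr x1-x9, sig star(0.05)
    * Spearman rank correlations with rho and p-values
    spearman x1-x9, stats(rho p)
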
    2. Further, when are outliers an issue for estimating a conditional logistic regression model, and in what way? How might I detect extreme observations that could influence the estimates, and how might I assess model fit with and without them?
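    To make question 2 concrete, something like the following is the kind of check I have in mind (all names are placeholders: y is the outcome, matchid the grouping variable, and x1 the covariate suspected of extreme values):

    * fit the full model and store the estimates
    clogit y x1 x2 x3, group(matchid)
    estimates store full
    * flag observations with extreme values of x1, e.g. beyond the 1st/99th percentiles
    _pctile x1, percentiles(1 99)
    gen byte extreme = (x1 < r(r1)) | (x1 > r(r2))
    * refit without the flagged observations and compare the coefficients
    clogit y x1 x2 x3 if !extreme, group(matchid)
    estimates store trimmed
    estimates table full trimmed, se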

    Thank you again!



    • #32
      What variable does this apply to? Because the dependent variable is binary, the notion of "an outlier" doesn't really apply. There can be a problem with seeing almost all ones or all zeros, but that's really different. By outlier are you referring to the explanatory variables?

      One of the virtues of fixed effects methods is that they put no restriction on the unobserved effect, and that, alone, can account for very large differences in the explanatory variables across units. Usually "outliers" aren't so much of an issue. In any case, just looking at the raw data doesn't tell you much about the kind of outliers that would matter for FE.
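      If it helps, one rough way to look at the variation the FE estimator actually uses is to within-demean a regressor and inspect the result. A minimal sketch, with id and x1 as placeholder names for the panel identifier and a regressor:

      * within-demean a regressor to see the variation the FE estimator uses
      bysort id: egen double x1_bar = mean(x1)
      generate double x1_within = x1 - x1_bar
      summarize x1_within, detail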



      • #33
        Originally posted by Jeff Wooldridge View Post
        What variable does this apply to? Because the dependent variable is binary, the notion of "an outlier" doesn't really apply. There can be a problem with seeing almost all ones or all zeros, but that's really different. By outlier are you referring to the explanatory variables?

        One of the virtues of fixed effects methods is that they put no restriction on the unobserved effect, and that, alone, can account for very large differences in the explanatory variables across units. Usually "outliers" aren't so much of an issue. In any case, just looking at the raw data doesn't tell you much about the kind of outliers that would matter for FE.
        Hi Jeff,

        Thank you very much for the explanation. These nine variables are all explanatory variables. By outliers, I mean observations whose values of the explanatory variables may substantially influence the estimation results (coefficients or odds ratios).

        I think my question might be reframed as: does the fact that my Spearman correlation table differs from the Pearson correlation table signal potential issues, or call for caution, in conditional logistic regression? I understand that the Pearson correlation captures only linear association. A related question is whether multicollinearity is a concern in conditional logistic regression. I ask because my model includes interaction terms between a categorical variable and two continuous variables, which mechanically induces collinearity, but I am not sure whether, and how, this might be a concern for the conditional logit model.
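        To make this concrete, the setup is roughly as follows (all names are placeholders):

        * conditional logit with a categorical variable interacted with two continuous variables
        clogit y i.cat##c.x1 i.cat##c.x2, group(matchid)
        * the terms i.cat#c.x1 and i.cat#c.x2 are by construction correlated with
        * x1 and x2, which is the mechanical collinearity I am referring to
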
        Thank you again. Your insights are highly appreciated.

        Best,
        Jessie



        • #34
          Is there an explanation for why the xtlogit, fe command drops these groups while xtreg, fe does not? I have a similar problem with xtlogit and, just out of curiosity, ran the same model with xtreg, fe; it did run with my entire dataset.



          • #35
            Sorry, I misread the question.
            Last edited by Chan Ge; 02 Mar 2024, 22:36.



            • #36
              Originally posted by Jeff Stu View Post
              Is there an explanation for why the xtlogit, fe command drops these groups while xtreg, fe does not? I have a similar problem with xtlogit and, just out of curiosity, ran the same model with xtreg, fe; it did run with my entire dataset.
              Apart from groups with no within-panel variance in the outcome, there is the issue of perfect prediction in logistic regression, which leads to observations being dropped. This does not affect linear regression. See https://stats.oarc.ucla.edu/other/mu...with-the-issue.
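              As a quick check of the first point, one can flag groups whose outcome never varies within the panel. A rough sketch, with y and id as placeholder names for the outcome and the panel identifier:

              * flag panels whose binary outcome is all 0s or all 1s; these carry no
              * information for the conditional logit and get dropped
              bysort id (y): generate byte no_within_var = (y[1] == y[_N])
              tabulate no_within_var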



              • #37
                Originally posted by Andrew Musau View Post

                Apart from groups with no within-panel variance in the outcome, there is the issue of perfect prediction in logistic regression, which leads to observations being dropped. This does not affect linear regression. See https://stats.oarc.ucla.edu/other/mu...with-the-issue.
                Thank you, Andrew Musau.

