
  • Omitted because of collinearity even though 'collin' and 'vif' commands show no collinearity

    Dear all,
    I have a challenge that I cannot understand or deal with. When I run my regression, I get notes saying 'omitted because of collinearity', even though the 'collin' and 'vif' commands detect no serious collinearity.


    . reg suic_new fear_z sadness_z shame_z despair_z anxiety_z
    note: fear_zscore omitted because of collinearity
    note: sadness_zscore omitted because of collinearity
    note: shame_zscore omitted because of collinearity
    note: despair_zscore omitted because of collinearity
    note: anxiety_zscore omitted because of collinearity

          Source |       SS           df       MS      Number of obs   =        94
    -------------+----------------------------------   F(0, 93)        =      0.00
           Model |           0         0           .   Prob > F        =         .
        Residual |  340.066465        93  3.65662865   R-squared       =    0.0000
    -------------+----------------------------------   Adj R-squared   =    0.0000
           Total |  340.066465        93  3.65662865   Root MSE        =    1.9122

    --------------------------------------------------------------------------------
          suic_new |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
    ---------------+----------------------------------------------------------------
       fear_zscore |          0  (omitted)
    sadness_zscore |          0  (omitted)
      shame_zscore |          0  (omitted)
    despair_zscore |          0  (omitted)
    anxiety_zscore |          0  (omitted)
             _cons |   11.36497   .1972316    57.62   0.000     10.97331    11.75664
    --------------------------------------------------------------------------------

    Collinearity Diagnostics

                            SQRT                   R-
      Variable      VIF     VIF    Tolerance    Squared
    ----------------------------------------------------
      suic_new      1.06    1.03    0.9470      0.0530
    fear_zscore     1.03    1.02    0.9698      0.0302
    sadness_zscore  1.07    1.03    0.9353      0.0647
     shame_zscore   1.06    1.03    0.9418      0.0582
    despair_zscore  1.05    1.02    0.9559      0.0441
    anxiety_zscore  1.02    1.01    0.9837      0.0163
    ----------------------------------------------------
      Mean VIF      1.05

                              Cond
           Eigenval          Index
    ---------------------------------
        1    2.1146         1.0000
        2    1.2727         1.2890
        3    1.0469         1.4212
        4    0.9718         1.4751
        5    0.8227         1.6033
        6    0.7583         1.6699
        7    0.0130        12.7747
    ---------------------------------
     Condition Number        12.7747
     Eigenvalues & Cond Index computed from scaled raw sscp (w/ intercept)
     Det(correlation matrix)  0.8707


    Moreover, even when I leave only one explanatory variable in the regression, I get the same message. I cannot see how a single explanatory variable can be collinear.

    . reg suic_new fear_z
    note: fear_zscore omitted because of collinearity

          Source |       SS           df       MS      Number of obs   =        94
    -------------+----------------------------------   F(0, 93)        =      0.00
           Model |           0         0           .   Prob > F        =         .
        Residual |  340.066465        93  3.65662865   R-squared       =    0.0000
    -------------+----------------------------------   Adj R-squared   =    0.0000
           Total |  340.066465        93  3.65662865   Root MSE        =    1.9122

    ------------------------------------------------------------------------------
        suic_new |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
    -------------+----------------------------------------------------------------
     fear_zscore |          0  (omitted)
           _cons |   11.36497   .1972316    57.62   0.000     10.97331    11.75664
    ------------------------------------------------------------------------------

    Now, this happens only when I standardize my explanatory variables, that is, when I transform them to their z-scores (I do all of this in Excel). Otherwise, everything runs smoothly. But I need the z-scores because I want to give all the variables equal weight. By the way, I am working with Google Ngram emotional/affective word frequencies in a time-series perspective, so the rows in the table below represent years.
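    (To be explicit, by z-scores I mean the usual transformation z = (x - mean(x)) / sd(x), computed separately for each column.)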

    list suic_new fear_z sadness_z general_dislike_z shame_z despair_z anxiety_z

    +-----------------------------------------------------------------------------------------+
    | suic_new fear_zsc~e sadness_~e general_~e shame_zs~e despair_~e anxiety_~e |
    |-----------------------------------------------------------------------------------------|
    1. | 5.7776296 9.437e-16 -6.606e-15 3.109e-15 1.776e-15 0 -8.327e-16 |
    2. | 7.3340097 -1.621e-14 -1.749e-15 -1.027e-14 -5.274e-16 1.887e-15 1.998e-15 |
    3. | 7.8030722 7.994e-15 -2.220e-15 -4.774e-15 -2.609e-15 0 -5.551e-16 |
    4. | 7.8814376 2.498e-15 -5.440e-15 -5.995e-15 1.027e-15 -1.998e-15 1.277e-15 |
    5. | 9.1750743 3.109e-15 9.437e-16 -1.565e-14 0 0 -2.442e-15 |
    |-----------------------------------------------------------------------------------------|
    6. | 8.9673257 -1.127e-14 3.331e-16 -5.773e-15 5.523e-15 -1.665e-15 0 |
    7. | 9.1869375 3.109e-15 -5.135e-15 2.665e-15 1.554e-15 1.443e-15 2.165e-15 |
    8. | 9.8899214 -2.998e-15 -2.304e-15 1.887e-15 4.441e-15 3.664e-15 1.388e-15 |
    9. | 9.9566368 0 1.971e-15 -1.410e-14 -2.998e-15 -2.442e-15 3.331e-15 |
    10. | 9.0044233 1.016e-14 -1.416e-15 -2.476e-14 -6.772e-15 -2.220e-15 3.497e-15 |
    |-----------------------------------------------------------------------------------------|
    11. | 8.8672193 -1.732e-14 -3.580e-15 5.107e-15 -4.441e-16 1.998e-15 -1.388e-15 |
    12. | 9.3839625 -6.439e-15 3.109e-15 1.588e-14 -5.718e-15 0 -5.052e-15 |
    13. | 9.1308341 -2.665e-15 3.830e-15 1.998e-15 -3.164e-15 2.442e-15 0 |
    14. | 8.2565447 5.662e-15 3.164e-15 -1.432e-14 -1.443e-15 0 4.441e-15 |
    15. | 10.048094 1.038e-14 -1.166e-15 1.155e-14 2.054e-15 -2.220e-15 4.052e-15 |
    |-----------------------------------------------------------------------------------------|
    16. | 9.8828704 9.548e-15 0 -1.621e-14 3.109e-15 2.442e-15 0 |
    17. | 9.9118333 -8.826e-15 -2.554e-15 0 2.498e-16 -1.776e-15 0 |
    18. | 10.381302 1.954e-14 -7.133e-15 1.121e-14 1.554e-15 0 5.107e-15 |
    19. | 10.540538 0 -2.220e-15 1.188e-14 1.943e-15 -3.553e-15 -1.443e-15 |
    20. | 11.143385 -3.331e-15 -2.220e-15 -1.943e-14 -3.331e-15 0 -2.054e-15 |
    |-----------------------------------------------------------------------------------------|
    21. | 11.841895 9.714e-15 -1.499e-15 5.773e-15 -4.163e-16 2.109e-15 3.497e-15 |
    22. | 12.770001 9.437e-16 2.054e-15 1.110e-14 0 -3.553e-15 5.385e-15 |
    23. | 13.176805 7.216e-16 -6.023e-15 1.432e-14 6.661e-16 -4.108e-15 9.881e-15 |
    24. | 14.88746 1.277e-15 -3.997e-15 -1.388e-14 0 -9.992e-16 -8.882e-16 |
    25. | 15.968281 -8.993e-15 7.772e-16 6.883e-15 -4.441e-16 0 -2.720e-15 |
    |-----------------------------------------------------------------------------------------|
    26. | 16.537906 -1.632e-14 3.331e-15 -1.144e-14 -2.193e-15 2.554e-15 -7.216e-15 |
    27. | 15.920686 7.327e-15 -3.053e-15 5.551e-15 0 -8.882e-16 -3.775e-15 |
    28. | 14.898661 7.216e-16 -1.776e-15 0 -4.829e-15 5.440e-15 2.776e-15 |
    29. | 14.31353 -1.094e-14 7.216e-15 -1.887e-15 1.860e-15 8.882e-16 7.050e-15 |
    30. | 14.286252 -2.442e-15 2.803e-15 1.110e-15 -2.220e-16 0 -2.387e-15 |
    |-----------------------------------------------------------------------------------------|
    31. | 14.976927 -4.219e-15 6.189e-15 6.439e-15 0 -2.442e-15 -3.719e-15 |
    32. | 15.252848 6.717e-15 2.331e-15 -7.994e-15 2.276e-15 0 -1.776e-15 |
    33. | 14.143521 9.381e-15 3.830e-15 1.998e-14 -1.915e-15 1.776e-15 7.216e-16 |
    34. | 14.31021 6.939e-15 -3.886e-16 -1.465e-14 -6.106e-16 -2.109e-15 2.165e-15 |
    35. | 12.819852 4.330e-15 -2.609e-15 9.992e-15 -2.748e-15 -1.332e-15 9.770e-15 |
    |-----------------------------------------------------------------------------------------|
    36. | 11.950952 9.881e-15 -3.331e-16 0 -8.604e-16 -3.109e-15 2.887e-15 |
    37. | 10.037345 -4.607e-15 1.110e-15 -3.331e-15 -4.607e-15 2.554e-15 0 |
    38. | 9.5601545 7.494e-15 9.631e-15 0 0 0 -1.388e-15 |
    39. | 10.563992 -8.604e-15 2.692e-15 1.243e-14 -5.274e-16 -2.776e-15 -5.218e-15 |
    40. | 11.423837 -1.271e-14 8.049e-16 -6.439e-15 0 0 -1.610e-15 |
    |-----------------------------------------------------------------------------------------|
    41. | 11.474676 8.715e-15 -3.192e-15 1.066e-14 -2.276e-15 -9.992e-16 -5.329e-15 |
    42. | 11.153144 -1.832e-14 6.939e-16 6.439e-15 -2.831e-15 0 5.274e-15 |
    43. | 11.390316 5.107e-15 -4.913e-15 3.331e-15 7.494e-16 -2.554e-15 -6.606e-15 |
    44. | 11.2595 -3.275e-15 4.607e-15 1.910e-14 1.943e-15 -2.665e-15 0 |
    45. | 10.271963 0 8.327e-16 5.107e-15 -2.831e-15 2.220e-15 -1.332e-15 |
    |-----------------------------------------------------------------------------------------|
    46. | 9.880501 8.882e-16 -3.192e-15 1.732e-14 3.053e-15 -3.331e-15 1.332e-15 |
    47. | 9.9554143 7.772e-15 5.024e-15 -9.104e-15 3.914e-15 -2.665e-15 -2.220e-15 |
    48. | 10.032764 -6.051e-15 -2.803e-15 2.043e-14 0 4.663e-15 2.498e-15 |
    49. | 10.100572 -3.109e-15 -5.496e-15 -1.199e-14 -1.665e-15 0 2.442e-15 |
    50. | 9.9033155 2.276e-15 -3.331e-15 -4.663e-15 3.331e-16 1.221e-15 -2.276e-15 |
    |-----------------------------------------------------------------------------------------|
    51. | 9.6706597 1.832e-15 -6.689e-15 0 0 0 -4.829e-15 |
    52. | 10.589432 -3.497e-15 -5.385e-15 -1.066e-14 1.110e-15 0 7.605e-15 |
    53. | 10.478007 -5.274e-15 1.360e-15 3.997e-15 1.138e-15 2.887e-15 -3.331e-15 |
    54. | 10.539037 -1.338e-14 3.109e-15 8.660e-15 -3.192e-15 4.441e-15 1.554e-15 |
    55. | 10.342886 4.441e-15 -3.109e-15 2.243e-14 3.442e-15 -3.553e-15 0 |
    |-----------------------------------------------------------------------------------------|
    56. | 10.832661 6.661e-16 -4.163e-16 2.376e-14 4.996e-16 2.109e-15 -1.887e-15 |
    57. | 11.00444 -2.071e-14 1.582e-15 -3.775e-15 6.661e-16 8.882e-16 6.883e-15 |
    58. | 10.729131 1.610e-15 2.748e-15 -5.329e-15 2.609e-15 4.330e-15 -3.886e-15 |
    59. | 11.068797 1.332e-15 -3.303e-15 0 7.772e-16 -1.665e-15 1.005e-14 |
    60. | 10.826701 6.883e-15 2.609e-15 -4.885e-15 -3.691e-15 8.882e-16 0 |
    |-----------------------------------------------------------------------------------------|
    61. | 10.731609 -1.227e-14 -1.610e-15 -6.661e-15 -1.332e-15 -8.882e-16 5.995e-15 |
    62. | 10.648408 -5.884e-15 -2.720e-15 0 -2.359e-15 -1.776e-15 3.719e-15 |
    63. | 11.034309 1.371e-14 3.719e-15 1.799e-14 -8.882e-16 0 3.220e-15 |
    64. | 11.450744 5.995e-15 -4.358e-15 0 -1.943e-15 1.665e-15 -3.442e-15 |
    65. | 11.601619 -1.149e-14 -3.358e-15 0 -3.386e-15 0 7.438e-15 |
    |-----------------------------------------------------------------------------------------|
    66. | 11.912565 -1.033e-14 4.413e-15 0 1.388e-15 -3.664e-15 -4.774e-15 |
    67. | 11.853213 1.127e-14 -5.551e-16 8.882e-15 -1.915e-15 0 0 |
    68. | 12.009599 0 8.021e-15 1.688e-14 -2.859e-15 -9.992e-16 5.829e-15 |
    69. | 12.530721 -3.386e-15 4.885e-15 -7.550e-15 1.471e-15 0 2.609e-15 |
    70. | 12.306272 -3.386e-15 5.940e-15 -7.105e-15 5.551e-16 -1.887e-15 4.385e-15 |
    |-----------------------------------------------------------------------------------------|
    71. | 13.022646 6.661e-16 1.138e-15 7.550e-15 -1.221e-15 1.665e-15 4.441e-15 |
    72. | 12.262307 8.826e-15 -4.663e-15 -3.997e-15 1.998e-15 0 -1.260e-14 |
    73. | 12.088574 7.605e-15 4.385e-15 -7.105e-15 0 2.109e-15 -1.832e-15 |
    74. | 11.82486 -8.993e-15 -3.358e-15 0 -1.943e-15 1.443e-15 2.665e-15 |
    75. | 12.026198 -4.940e-15 -8.188e-15 -1.288e-14 -3.053e-16 3.664e-15 -3.442e-15 |
    |-----------------------------------------------------------------------------------------|
    76. | 12.190908 -6.661e-16 -4.829e-15 0 -2.387e-15 -1.776e-15 -2.331e-15 |
    77. | 12.102639 2.942e-15 -8.188e-15 0 -6.384e-16 -1.665e-15 -3.664e-15 |
    78. | 12.418536 -2.276e-15 -1.499e-15 0 2.581e-15 1.110e-15 1.721e-15 |
    79. | 12.379174 -1.021e-14 5.607e-15 1.688e-14 0 -1.554e-15 0 |
    80. | 12.869541 1.832e-15 -8.882e-16 9.326e-15 -2.776e-15 2.665e-15 1.277e-15 |
    |-----------------------------------------------------------------------------------------|
    81. | 12.710445 3.664e-15 0 0 -1.527e-15 5.329e-15 1.110e-14 |
    82. | 12.436453 -1.021e-14 3.164e-15 3.997e-15 -6.106e-16 0 -7.716e-15 |
    83. | 12.24864 1.277e-15 2.776e-15 -3.997e-15 -3.803e-15 0 4.052e-15 |
    84. | 12.388942 1.776e-15 0 4.885e-15 1.305e-15 -2.442e-15 2.554e-15 |
    85. | 12.218767 -4.330e-15 -4.746e-15 8.882e-15 -3.386e-15 -4.219e-15 -2.554e-15 |
    |-----------------------------------------------------------------------------------------|
    86. | 11.953118 -1.499e-15 -1.860e-15 -7.994e-15 2.026e-15 -1.221e-15 -5.218e-15 |
    87. | 12.065205 -9.159e-15 3.303e-15 2.309e-14 -3.525e-15 2.220e-15 3.830e-15 |
    88. | 11.962646 -8.771e-15 -1.249e-15 -1.110e-14 0 -4.219e-15 0 |
    89. | 11.903961 -6.661e-15 1.166e-15 0 0 -2.554e-15 -4.829e-15 |
    90. | 11.65146 -1.221e-15 0 0 -2.054e-15 0 -6.051e-15 |
    |-----------------------------------------------------------------------------------------|
    91. | 11.402864 1.282e-14 1.638e-15 8.882e-15 6.661e-16 1.332e-15 7.105e-15 |
    92. | 11.313682 -3.941e-15 -2.692e-15 5.773e-15 4.163e-16 1.110e-15 -7.105e-15 |
    93. | 10.700764 5.884e-15 -3.608e-16 9.326e-15 7.494e-16 -1.221e-15 1.388e-15 |
    94. | 10.418166 -2.276e-15 -1.332e-15 -1.310e-14 -2.387e-15 1.998e-15 -4.441e-16 |
    +-----------------------------------------------------------------------------------------+

    I understand that there is potentially some redundancy among my explanatory variables (all of the measured words refer, to some extent, to the emotional/affective lexicon), but that is more or less a given in my case, and my goal is to minimize it as much as possible. Still, if they were really redundant, would I not see that in the 'collin' and 'vif' results?

    Do you have any advice for me? Is there a way to deal with this issue? I have a hunch that I have overlooked something, but I cannot yet figure out what.
    Thank you for your help!
    Kind regards,
    Paulius Pranskevicius

  • #2
    You did something seriously wrong when standardizing those variables: you made them all essentially 0. For example, -2.276e-15 means -0.000000000000002276. You should not get that many values of that magnitude in a standardized variable; that is something you need to fix. The reason you get the collinearity notes is presumably that these numbers are so small that they approach the limit of what a computer can distinguish from 0, so you lose precision in the computations and run into this kind of problem. A variable that is constant, and yours are constant at 0 up to rounding noise, is perfectly collinear with the intercept, which is also why even a single regressor gets dropped. But the notes themselves are not the real issue. The real problem is that your variables are not what you think they are: you put garbage in, and, like any good statistical procedure, you get garbage out.
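
    A quick way to see this is to summarize the standardized variables (a sketch; the variable names are taken from your regression output):

        * a correct z-score has mean ~0 and standard deviation exactly 1;
        * a standard deviation on the order of 1e-15 means the variable is
        * zero up to floating-point rounding noise
        summarize fear_zscore sadness_zscore shame_zscore despair_zscore anxiety_zscore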

    What worries me is that you use Excel for your data preparation. Preparing your data is the most important step of an analysis, and, as you just noticed, it has a profound influence on your results. It must be meticulously documented; otherwise your results are impossible to replicate, and an analysis that cannot be replicated is worthless. The required level of documentation is impossible to achieve in Excel. Instead, you should do all your data preparation in Stata using .do files, or in any other serious data-analysis package that lets you record your steps in some form of script. For more, see here: https://www.stata.com/bookstore/work...nalysis-stata/
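
    For example, the standardization itself takes only a few lines in a do-file. A minimal sketch, assuming your raw (unstandardized) frequency variables are named fear, sadness, shame, despair, and anxiety:

        * create z-scores (mean 0, sd 1) from the raw frequencies
        foreach v in fear sadness shame despair anxiety {
            egen `v'_z = std(`v')
        }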
    ---------------------------------
    Maarten L. Buis
    University of Konstanz
    Department of history and sociology
    box 40
    78457 Konstanz
    Germany
    http://www.maartenbuis.nl
    ---------------------------------



    • #3
      Maarten, thank you for the help!

      I had a feeling that these extremely small values might be a problem. I followed the standardization guidelines from this article: https://kops.uni-konstanz.de/handle/123456789/45613. It is called "Guideline for improving the reliability of Google Ngram studies : Evidence from religious terms". I am not sure how to apply these recommendations then. Do you have any advice on how I could, so to speak, make my values bigger?

      I will re-check all my data preparation steps and do them in Stata instead. Thanks again!



      P.S. I just now noticed that the article comes from the same University of Konstanz! What a coincidence, haha.
      Kind regards,
      Paulius Pranskevicius
