
  • Factor Analysis

    Hi,
    I have a survey of 92 questions, grouped into 5 main topics. I applied factor analysis and retained 5 factors. The cumulative variance explained is very good; however, the loadings of the questions on each factor are not as expected. For example, I expected the first 5 questions to load on one factor, but the loadings say otherwise. The uniqueness of the last 15 questions (the last group of questions, shown in red) is very high, which means these questions are not well covered by the model. Results below:

    Code:
    . rotate
    
    Factor analysis/correlation                      Number of obs    =        107
        Method: maximum likelihood                   Retained factors =          5
        Rotation: orthogonal varimax (Kaiser off)    Number of params =        390
                                                     Schwarz's BIC    =    6662.08
        Log likelihood =  -2419.84                   (Akaike's) AIC   =    5619.68
    
        --------------------------------------------------------------------------
             Factor  |     Variance   Difference        Proportion   Cumulative
        -------------+------------------------------------------------------------
            Factor1  |      9.76978      3.99666            0.3438       0.3438
            Factor2  |      5.77312      0.72991            0.2032       0.5470
            Factor3  |      5.04321      0.62778            0.1775       0.7245
            Factor4  |      4.41544      1.00390            0.1554       0.8799
            Factor5  |      3.41153            .            0.1201       1.0000
        --------------------------------------------------------------------------
        LR test: independent vs. saturated: chi2(3160)= 5991.86 Prob>chi2 = 0.0000
        LR test:   5 factors vs. saturated: chi2(2770)= 3445.07 Prob>chi2 = 0.0000
    
    Rotated factor loadings (pattern matrix) and unique variances
    
        ----------------------------------------------------------------
            Variable |  Factor1   Factor2   Factor3   Factor4   Factor5 
        -------------+--------------------------------------------------
         encodeing12 |  -0.0933   -0.2025    0.7176    0.0954   -0.0035 
         encodeing13 |  -0.0060    0.0348    0.6730    0.0628   -0.0350 
         encodeing14 |  -0.2131   -0.0385    0.6665    0.1257   -0.1781 
         encodeing15 |  -0.1644   -0.1365    0.7539   -0.0583   -0.0574 
         encodeing16 |  -0.1210   -0.0632    0.7266    0.1039    0.0141 
         encodeing17 |  -0.1758   -0.3353    0.6521    0.0968    0.0621 
         encodeing18 |  -0.1480   -0.0096    0.7198    0.0769   -0.0376 
         encodeing19 |  -0.0433   -0.0126    0.6249    0.0949   -0.0739 
         encodeing20 |   0.0297   -0.0140    0.0281   -0.7945   -0.1221 
         encodeing21 |   0.1817   -0.0366    0.0609    0.7061    0.1150 
         encodeing22 |   0.2049   -0.1139   -0.0053   -0.6944   -0.1016 
         encodeing23 |   0.0412   -0.0193    0.1738    0.7179    0.1783 
         encodeing24 |   0.0198   -0.0954   -0.1089   -0.6883   -0.0292 
         encodeing25 |   0.0673    0.0067    0.1591    0.7478    0.1597 
         encodeing26 |   0.0276    0.4931   -0.0794    0.0198    0.1129 
         encodeing27 |   0.7345    0.0422   -0.2093   -0.2005   -0.0407 
         encodeing28 |   0.3384    0.5893   -0.1641   -0.0453   -0.1942 
         encodeing29 |   0.6874   -0.0409   -0.2290    0.0457    0.1772 
         encodeing30 |  -0.0795    0.0479   -0.0152    0.1677    0.8016 
         encodeing31 |   0.7249    0.2455   -0.0511    0.0353   -0.1157 
         encodeing32 |   0.5734    0.3136   -0.0815   -0.0961   -0.1156 
         encodeing33 |   0.2444    0.7825   -0.0986    0.1070    0.0586 
         encodeing34 |   0.7946    0.0901   -0.1248    0.0311    0.0515 
         encodeing35 |   0.3595    0.7682   -0.1376    0.0611   -0.0371 
         encodeing36 |   0.7243    0.0355    0.0206   -0.0759    0.2035 
         encodeing37 |   0.1281    0.0431    0.0237    0.0702    0.7182 
         encodeing38 |   0.7104    0.2707   -0.0639    0.0011    0.0806 
         encodeing39 |   0.5694    0.5148    0.1337    0.0782    0.0226 
         encodeing40 |   0.3474    0.7560   -0.0110   -0.0628   -0.0175 
         encodeing41 |   0.8014    0.2557   -0.1367    0.0336   -0.2327 
         encodeing42 |   0.4425    0.6586   -0.2039   -0.0276   -0.0595 
         encodeing43 |   0.7487    0.0703    0.0217    0.0663    0.2633 
         encodeing44 |   0.0204   -0.0701   -0.0976    0.1184    0.8814 
         encodeing45 |   0.8228    0.1766    0.0565   -0.0251   -0.0043 
         encodeing46 |   0.5410    0.4289   -0.0317   -0.0022    0.0575 
         encodeing47 |   0.1849    0.8610   -0.0721    0.0367    0.0424 
         encodeing48 |   0.7296    0.1653   -0.1379    0.0115   -0.1239 
         encodeing49 |   0.4605    0.6015    0.0221   -0.0184    0.0123 
         encodeing50 |   0.7759    0.0855   -0.0240   -0.0277    0.0702 
         encodeing51 |   0.0111    0.0218   -0.0771    0.2192    0.7068 
         encodeing52 |   0.7309    0.3467   -0.0704    0.0629   -0.0539 
         encodeing53 |   0.6506    0.3757   -0.1006    0.0315   -0.0423 
         encodeing55 |   0.0615   -0.3325   -0.0857    0.0439   -0.1348 
         encodeing56 |  -0.1050   -0.0652   -0.0817    0.0533    0.1155 
         encodeing57 |  -0.0596   -0.1929   -0.1452    0.0668    0.1267 
         encodeing58 |   0.0081   -0.1263   -0.1560    0.1220   -0.0887 
         encodeing59 |  -0.1684   -0.0496   -0.1086    0.2456   -0.0305 
         encodeing60 |  -0.1445   -0.1584   -0.0315   -0.0169   -0.1545 
         encodeing61 |  -0.2128   -0.1675   -0.1999    0.2662    0.1324 
         encodeing62 |  -0.0970   -0.1510    0.0655    0.0133    0.0314 
         encodeing63 |  -0.0798    0.0276    0.1638    0.3008   -0.0075 
         encodeing64 |   0.0241    0.0371   -0.0123    0.3091    0.0805 
         encodeing65 |  -0.0862   -0.0971   -0.1231   -0.0212    0.0879 
         encodeing66 |   0.0142   -0.1229   -0.1073   -0.0330   -0.1951 
         encodeing67 |  -0.0125    0.0174    0.0721    0.4038    0.0140 
         encodeing68 |   0.2272   -0.1879   -0.1279    0.0673   -0.0675 
         encodeing69 |  -0.0217   -0.0251    0.0973    0.1196   -0.0671 
         encodeing70 |  -0.1718    0.0344    0.1047    0.2735   -0.0600 
         encodeing71 |  -0.0811    0.0439    0.0830    0.2436    0.0784 
         encodeing72 |  -0.1058    0.0983    0.0148    0.0503    0.1055 
         encodeing73 |   0.0941    0.0488   -0.0371   -0.0983   -0.0688 
         encodeing74 |   0.0908    0.0539    0.0537   -0.0080    0.1881 
         encodeing75 |   0.0224   -0.0082   -0.2226   -0.0494    0.0207 
         encodeing76 |   0.0228   -0.1094    0.0138    0.0212    0.0122 
         encodeing77 |   0.0565    0.0049   -0.2808   -0.3047   -0.1632 
         encodeing78 |  -0.0620   -0.0437   -0.0242   -0.0906   -0.0087 
         encodeing79 |  -0.0147    0.0658   -0.3803    0.0084   -0.1257 
         encodeing80 |   0.0543   -0.1547   -0.1792    0.1212    0.0023 
         encodeing81 |  -0.1151    0.0126   -0.1887    0.1657   -0.0796 
         encodeing82 |   0.1014   -0.0599    0.0642    0.0208    0.0358 
         encodeing83 |   0.1059   -0.1553   -0.1794   -0.0557   -0.1282 
         encodeing84 |  -0.0267    0.0558   -0.1108    0.1628   -0.1582 
         encodeing85 |   0.1482    0.1727    0.1949    0.0727    0.0377 
         encodeing86 |   0.1873   -0.0787   -0.1296    0.0834   -0.1218 
         encodeing87 |   0.3671    0.0060   -0.0290    0.0598   -0.0889 
         encodeing88 |  -0.0561    0.0642    0.0121    0.1115   -0.0932 
         encodeing89 |   0.0136    0.0755   -0.0269    0.3223   -0.1165 
         encodeing90 |  -0.0509    0.3005    0.0851    0.0935    0.1193 
         encodeing91 |  -0.0264    0.1701   -0.2679    0.0205   -0.2584 
         encodeing92 |  -0.1803    0.0515   -0.2301   -0.0452   -0.2613 
        ----------------------------------------------------------------
    
        ----------------------------
            Variable |   Uniqueness 
        -------------+--------------
         encodeing12 |      0.4262  
         encodeing13 |      0.5407  
         encodeing14 |      0.4613  
         encodeing15 |      0.3793  
         encodeing16 |      0.4424  
         encodeing17 |      0.4183  
         encodeing18 |      0.4525  
         encodeing19 |      0.5931  
         encodeing20 |      0.3520  
         encodeing21 |      0.4502  
         encodeing22 |      0.4525  
         encodeing23 |      0.4206  
         encodeing24 |      0.5041  
         encodeing25 |      0.3854  
         encodeing26 |      0.7366  
         encodeing27 |      0.3732  
         encodeing28 |      0.4716  
         encodeing29 |      0.4399  
         encodeing30 |      0.3204  
         encodeing31 |      0.3969  
         encodeing32 |      0.5436  
         encodeing33 |      0.3034  
         encodeing34 |      0.3413  
         encodeing35 |      0.2566  
         encodeing36 |      0.4265  
         encodeing37 |      0.4604  
         encodeing38 |      0.4115  
         encodeing39 |      0.3863  
         encodeing40 |      0.3034  
         encodeing41 |      0.2185  
         encodeing42 |      0.3246  
         encodeing43 |      0.3603  
         encodeing44 |      0.1942  
         encodeing45 |      0.2880  
         encodeing46 |      0.5191  
         encodeing47 |      0.2161  
         encodeing48 |      0.4059  
         encodeing49 |      0.4252  
         encodeing50 |      0.3844  
         encodeing51 |      0.4459  
         encodeing52 |      0.3338  
         encodeing53 |      0.4227  
         encodeing55 |      0.8582  
         encodeing56 |      0.9619  
         encodeing57 |      0.9176  
         encodeing58 |      0.9369  
         encodeing59 |      0.8962  
         encodeing60 |      0.9289  
         encodeing61 |      0.7983  
         encodeing62 |      0.9623  
         encodeing63 |      0.8755  
         encodeing64 |      0.8959  
         encodeing65 |      0.9598  
         encodeing66 |      0.9340  
         encodeing67 |      0.8311  
         encodeing68 |      0.8876  
         encodeing69 |      0.9706  
         encodeing70 |      0.8799  
         encodeing71 |      0.9191  
         encodeing72 |      0.9653  
         encodeing73 |      0.9730  
         encodeing74 |      0.9505  
         encodeing75 |      0.9470  
         encodeing76 |      0.9867  
         encodeing77 |      0.7985  
         encodeing78 |      0.9854  
         encodeing79 |      0.8350  
         encodeing80 |      0.9263  
         encodeing81 |      0.9172  
         encodeing82 |      0.9803  
         encodeing83 |      0.9129  
         encodeing84 |      0.9324  
         encodeing85 |      0.9035  
         encodeing86 |      0.9201  
         encodeing87 |      0.8529  
         encodeing88 |      0.9715  
         encodeing89 |      0.8760  
         encodeing90 |      0.8769  
         encodeing91 |      0.8314  
         encodeing92 |      0.8416  
        ----------------------------
    I cannot get the factors I expected. Any recommendations would be appreciated.


  • #2
    What format are the question answers?



    • #3
      They are categorical answers ranging from 1 to 5. Actually, I switched to CFA instead of EFA, and I have some general questions:
      1. Can I use the factor scores in regression analysis?
      2. I am getting small negative values for some factor scores. What do negative values mean? Do they mean that these participants answered negatively on that factor?
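
      As a sketch of what is being asked (not the thread's actual variable names -- q1-q92 and outcome are placeholders): in Stata, factor scores can be obtained with -predict- after -factor- and -rotate-, and then used like any other regressors. Scores from the regression method are standardized around zero, so a negative score simply means "below average on that factor", not that the participant answered negatively.

      ```
      * Hypothetical sketch; q1-q92 and outcome are placeholder names
      factor q1-q92, ml factors(5)
      rotate
      predict f1 f2 f3 f4 f5, regression   // factor scores, mean approx. 0
      * negative scores = below-average standing on that factor
      regress outcome f1 f2 f3 f4 f5       // scores used as predictors
      ```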



      • #4
        [deleted]
        Last edited by Dirk Enzmann; 23 Feb 2023, 20:02.



        • #5
          [deleted -- I don't know why the post appeared twice]



          • #6
            [Now the post has been deleted completely -- frustrating!]



            • #7
              Several comments:
              • You should ask yourself whether the number of cases per item is sufficient in your case. A ratio of about 1:1 is extremely low; rules of thumb recommend a ratio of about 10:1 -- but it depends on whether your data are "well behaved", i.e. no cross-loadings, high communalities, and several high item loadings (> .50) per factor. Additionally, in psychometrics it is rare to have measurement instruments with more than 30 items.
              • In EFA you should carefully decide which criterion to use to determine the number of factors -- I would recommend parallel analysis.
              • In EFA, again, you should have very good reasons to assume uncorrelated factors (you are using varimax rotation) -- otherwise you should rotate the solution obliquely and examine the correlation matrix of factors using the post-estimation command -estat common-.
              • Especially with many items you should give the variables names that allow you to immediately identify the sub-scale or suspected factor to which each item belongs, and you should use the post-estimation command -sortl- (from SSC), which lets you quickly see whether the factor loadings fit the expected pattern.
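              A sketch of these suggestions in Stata (q1-q92 are placeholder item names; -paran- and -sortl- are user-written commands from SSC):

              ```
              * Hypothetical sketch of the advice above; q1-q92 are placeholders
              ssc install paran            // parallel analysis (user-written, SSC)
              ssc install sortl            // sorted loadings (user-written, SSC)
              paran q1-q92                 // parallel analysis to choose the number of factors
              factor q1-q92, ml factors(5)
              rotate, promax               // oblique rotation instead of varimax
              estat common                 // correlation matrix of the rotated common factors
              sortl                        // loadings sorted by size, expected patterns stand out
              ```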
              The following two introductory articles can give you a basic understanding of these issues:
              Last edited by Dirk Enzmann; 23 Feb 2023, 20:59. Reason: Corrected the title of Preacher & MacCallum's article

