  • CFA: How to set variance of factor to one?

    Hi

    I am working on a measurement model in Stata. I want to extract the standardized factor score with mean 0 and SD 1.

    Stata by default sets the first factor loading to 1. I have spent quite some time searching for how to set the variance of the factor to 1 instead, but I cannot find a solution to this seemingly basic adjustment to my model. Can anyone help out?

    Best
    Tarjei

  • #2
    Your question isn't clear without more detail; at a minimum, it is too difficult to guess at a good answer from what you have shared. Please help us help you. Show us the command you ran. Show us the output Stata showed you. Do this by copying your command and output from the Results window and pasting it into a post using code delimiters [CODE] and [/CODE].

    Comment


    • #3
      Add the option -var(F@1)-, replacing F by the name of the latent factor itself, to your -sem- command. (Note: Sometimes a model that converged under the usual constraint of first loading = 1 will fail to converge with this constraint. I don't know why that is, nor what to do about it if this happens to you.)
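
      In case it helps, a minimal sketch (F and the item names stand in for your own):
      Code:
      * Frees all the loadings and fixes the variance of F at 1
      * (instead of the default constraint of first loading = 1):
      sem (F -> x1 x2 x3 x4 x5), var(F@1)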

      Comment


      • #4
        Thank you, Clyde! This was the sort of easy solution I was hoping for (and thankfully my model converged as well). For completeness, I will add a code example:
        Code:
        * Model:
        sem (Diagnostic_symp -> item1-item5), var(Diagnostic_symp@1)
        * Obtain standardized factor score:
        predict Diagnostic_symp_score, latent(Diagnostic_symp)
        I obtain a standardized score with mean .006 and SD .95, which I take to be practically mean 0 and SD 1. To be sure, I will cross-check. (The alternative, the default of setting the first loading to 1, gave mean .002 and SD .4.)
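
        For the cross-check, something along these lines (using the score variable created above):
        Code:
        * Descriptive statistics for the predicted score:
        summarize Diagnostic_symp_score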

        Comment


        • #5
          Clyde Schechter, a follow-up question that perhaps you or someone else may know the answer to:

          I tried the exact same code as in post #4 on another data set. We aim to obtain two standardized factor scores from two separate CFA analyses. When using this code, one factor score has mean ~0 and SD .6 while the other has mean ~0 and SD .8. Since standardized factor scores should by definition have SD 1, I wonder whether something is missing in our -predict- specification. The help file does not seem to provide further guidance on this.

          Comment


          • #6
            I'm not sure I know the answer to your question. But when you say you are getting these results, I assume you are referring to descriptive statistics for the predicted scores. Those are not the same thing as the latent factor variable itself. When you use -predict- on a latent variable, you are not getting the actual value of the latent variable: that value is unknown and unknowable. You are getting an estimate of it from the data and the model. The mean and variance of such estimates should come out close to 0 and 1, but they will not generally be exactly that. If you look at the output of the -sem- command itself, near the bottom of the regression table, I am sure you will find a row showing that the variance of the latent variable itself is, in fact, 1.
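
            To see both things side by side, something like this (placeholder names again):
            Code:
            * The variance row for F in the -sem- output will show 1 (constrained):
            sem (F -> x1 x2 x3 x4 x5), var(F@1)
            * The prediction is only an estimate of F, so its SD typically
            * comes out somewhat below 1:
            predict Fhat, latent(F)
            summarize Fhat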


            Comment


            • #7
              Thanks for your quick and helpful reply. You are right: I am referring to descriptive statistics for the predicted scores, and it makes perfect sense that we obtain an estimate that can deviate somewhat from mean 0 and SD 1. The output of the -sem- command states that the variance of the latent variable is 1.

              Comment
