# Estimation of the structural model
PSR2 <- '
  # measurement model
  ECO  =~ ECO_01 + ECO_02 + ECO_03
  LEG  =~ LEG_01 + LEG_02 + LEG_03
  ETI  =~ ETI_01 + ETI_02 + ETI_03 + ETI_04
  FIL  =~ FIL_01 + FIL_02 + FIL_03 + FIL_04 + FIL_05
  MED  =~ MED_01 + MED_02 + MED_03 + MED_04
  SAT  =~ SAT_01 + SAT_02 + SAT_03 + SAT_04 + SAT_05
  SEST =~ SEST_01 + SEST_02 + SEST_03 + SEST_04
  # formative construct
  PSR <~ ECO + LEG + ETI + FIL + MED
  # regressions
  PSR ~ SAT + SEST
'
fitS1 <- sem(PSR2, data = dataM)
summary(fitS1, fit.measures = TRUE)
fitMeasures(fitS1)
inspect(fitS1, "cor.lv")
standardizedSolution(fitS1)
MI <- modificationIndices(fitS1)
subset(MI, mi > 10)
But then I got the following messages:

Warning messages:
1: In lav_model_vcov(lavmodel = lavmodel, lavsamplestats = lavsamplestats, :
  lavaan WARNING: could not compute standard errors!
  lavaan NOTE: this may be a symptom that the model is not identified.
2: In lav_model_test(lavmodel = lavmodel, lavpartable = lavpartable, :
  lavaan WARNING: could not compute scaled test statistic
Does anyone have an idea what the problem is? The formative latent variable (PSR) has no indicators of its own, which is why we are including two reflectively measured constructs (SAT, SEST).
I would appreciate it if you could help me.
Thank you very much in advance.
Sylvia.
--
You received this message because you are subscribed to the Google Groups "lavaan" group.
If you mean for the two factors SAT and SEST to assist in identification of the parameters related to PSR, then they have to be dependent on PSR, as in
SAT + SEST ~ PSR
not PSR ~ SAT + SEST
--Ed Rigdon
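Concretely, Ed's fix amounts to a MIMIC-style respecification in which PSR receives the formative composites and, in turn, predicts the two reflective factors. A minimal sketch of the revised model string (untested against the actual data; it reuses the variable names and the dataM object from Sylvia's post):

```r
library(lavaan)

# MIMIC-style respecification (sketch): PSR receives the formative
# composites and emits effects on the two reflective outcome factors.
PSR3 <- '
  # measurement model (unchanged)
  ECO  =~ ECO_01 + ECO_02 + ECO_03
  LEG  =~ LEG_01 + LEG_02 + LEG_03
  ETI  =~ ETI_01 + ETI_02 + ETI_03 + ETI_04
  FIL  =~ FIL_01 + FIL_02 + FIL_03 + FIL_04 + FIL_05
  MED  =~ MED_01 + MED_02 + MED_03 + MED_04
  SAT  =~ SAT_01 + SAT_02 + SAT_03 + SAT_04 + SAT_05
  SEST =~ SEST_01 + SEST_02 + SEST_03 + SEST_04
  # formative part (unchanged)
  PSR <~ ECO + LEG + ETI + FIL + MED
  # outcomes now depend on PSR, not the other way around
  SAT + SEST ~ PSR
'
fitS2 <- sem(PSR3, data = dataM)
summary(fitS2, fit.measures = TRUE)
```

With SAT and SEST downstream of PSR, their indicator covariances carry information about PSR's parameters, which is what the identification argument below requires.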
Sylvia—
Yes, you must have outcomes to identify a factor model, for two reasons. Effectively, we are representing the covariance matrix of the variables as functions of model parameters and then solving for the model parameters as functions of the elements of the covariance matrix. With the focal variable itself not observed, we must find the parameters related to it in the variances and covariances of other variables.
So the first reason is that you will not find all of the needed parameters represented in this way. In particular, because regression assumes the residual is orthogonal to the predictors, the residual variance of the focal variable never appears in the variances and covariances of its antecedents.
The second reason is that an element of the covariance matrix is consumed or expended each time we identify a parameter. The elements must be described by few enough parameters that we can borrow a covariance from over *here* to estimate a parameter from over *there*. With antecedents, their variances and covariances are just enough to cover their own parameters. But with multiple dependent variables and a constrained factor model, we often have covariances left over, among the outcomes, that we can use to identify parameters related to unobserved variables. Without outcomes, however, these opportunities do not exist.
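This bookkeeping can be made concrete with the t-rule: a covariance structure model can be identified only if the number of free parameters does not exceed the number of non-redundant elements of the observed covariance matrix, p(p+1)/2 for p observed variables. A quick count for the 28 indicators in Sylvia's model (simple arithmetic only; the rule is a necessary condition, not a sufficient one):

```r
# t-rule bookkeeping (sketch): count the non-redundant elements of the
# observed covariance matrix available for identifying parameters.
p <- 3 + 3 + 4 + 5 + 4 + 5 + 4   # indicators of ECO, LEG, ETI, FIL, MED, SAT, SEST
moments <- p * (p + 1) / 2       # p = 28, so 28 * 29 / 2 = 406 moments
```

Even with moments to spare globally, the parameters attached to PSR remain locally unidentified when PSR has no outcomes, because, as described above, none of them appear in any of those moment equations.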