Hello again,
I have a problem with the covariates in my model, but I don't know whether it is a question of understanding the method or the package. In either case, thank you in advance for your help.
I have a model with a second-order latent variable (depression) and four outcomes. I have three adjustment variables: age, sex, and education in 3 categories (so I created 2 binary variables, educ2 and educ3). I adjust for them both when defining the second-order latent variable and in the outcome regressions:
depression =~ latent1 + latent2 + age + sex + educ2 + educ3
outcome1 ~ depression + age + educ2 + educ3 + sex
outcome2 ~ depression + age + educ2 + educ3 + sex
outcome3 ~ depression + age + educ2 + educ3 + sex
outcome4 ~ depression + age + educ2 + educ3 + sex
The age-adjusted model (without sex and education) works, as does the fully adjusted model. On the other hand, the model adjusted only for age and education (without sex) does not work:
Warning messages:
1: In lav_data_full(data = data, group = group, cluster = cluster, :
lavaan WARNING: some observed variances are (at least) a factor 1000 times larger than others; use varTable(fit) to investigate
2: In sqrt(A1[[g]]) : NaN production
3: In lav_model_vcov(lavmodel = lavmodel, lavsamplestats = lavsamplestats, :
lavaan WARNING:
Could not compute standard errors! The information matrix could
not be inverted. This may be a symptom that the model is not
identified.
4: In sqrt(A1[[g]]) : NaN production
5: In lav_test_satorra_bentler(lavobject = NULL, lavsamplestats = lavsamplestats, : lavaan WARNING: could not invert information matrix

Somewhat by chance, I tried adding ordered = c(..., "educ2", "educ3") to the call, and the model runs again. But if I compare the fully adjusted model with and without this argument, the results are different...
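In case it helps, the two calls I am comparing look roughly like this (mydata is a placeholder for my data frame, and I am not certain the estimator I show is the one that matters):

```r
library(lavaan)

model <- '
  depression =~ latent1 + latent2 + age + sex + educ2 + educ3
  outcome1 ~ depression + age + educ2 + educ3 + sex
  outcome2 ~ depression + age + educ2 + educ3 + sex
  outcome3 ~ depression + age + educ2 + educ3 + sex
  outcome4 ~ depression + age + educ2 + educ3 + sex
'

# Fully adjusted model, educ2/educ3 treated as numeric (default)
fit1 <- sem(model, data = mydata)

# Same model, but with educ2/educ3 declared as ordered,
# which makes lavaan use a categorical estimation approach
fit2 <- sem(model, data = mydata, ordered = c("educ2", "educ3"))

summary(fit1)
summary(fit2)
```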
1) What is the difference between these two models?
2) It seemed to me that when the variables were exogenous, it was not necessary to declare them as "ordered". Was I wrong?
3) Also, why is adding sex to the model enough to solve this problem?
Thank you very much
Emmanuel