good model fit, nonsignificant loadings, second-order model


Sara Esposito

Dec 29, 2021, 10:39:28 AM
to lavaan
Hi,

My question is not very syntax-related, I apologise for that. However, I could not find the answer elsewhere, and I hope some of you could help me understand my CFA output.

I ran a series of comparisons between nested models (estimator = MLR). The output from semTools::compareFit shows that the best model is a second-order model. However, some of the item-level loadings are not significant (even though they are > .7), so I cannot accept this model. I then tried a bifactor model (without orthogonalising the factors), and the same thing happened.
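
For context, this is roughly what I ran (a sketch: the factor names, item names, and data frame dat below are placeholders, not my actual variables):

library(lavaan)
library(semTools)

# First-order model: 6 factors, 3 items each (placeholder names)
model.first <- '
  F1 =~ y1 + y2 + y3
  F2 =~ y4 + y5 + y6
  F3 =~ y7 + y8 + y9
  F4 =~ y10 + y11 + y12
  F5 =~ y13 + y14 + y15
  F6 =~ y16 + y17 + y18
'

# Second-order model: two G factors, three first-order factors each
model.second <- paste(model.first, '
  G1 =~ F1 + F2 + F3
  G2 =~ F4 + F5 + F6
')

fit.first  <- cfa(model.first,  data = dat, estimator = "MLR", std.lv = TRUE)
fit.second <- cfa(model.second, data = dat, estimator = "MLR", std.lv = TRUE)

compareFit(fit.first, fit.second, nested = TRUE)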

In contrast, the model containing only first-order factors has both good loadings and good model fit.

This happens only when std.lv = TRUE. With std.lv = FALSE (the default), the loadings in the second-order and bifactor models are significant. However, the estimated variances of 2 of the 7 first-order factors become nonsignificant and very low. The same variances are significant and well explained (i.e., > .4) when the marker approach is used in the model without the G factors.
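
Concretely, the two scalings I compared look like this (sketch, reusing the placeholder syntax above):

# Same second-order model under both identification choices
fit.std    <- cfa(model.second, data = dat, estimator = "MLR", std.lv = TRUE)  # (residual) latent variances fixed to 1
fit.marker <- cfa(model.second, data = dat, estimator = "MLR")                 # default: first loading fixed to 1
summary(fit.std,    standardized = TRUE)
summary(fit.marker, standardized = TRUE)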

I have 6 first-order factors with three items each. The second-order model adds two G factors (three first-order factors load on each G factor). N = 400.

Discriminant validity is moderate for two of the first-order factors (i.e., the upper limit of their covariance in the CFA with std.lv = TRUE is > .8 and < .9). One of these first-order factors is the same one whose estimated variance becomes negligible when setting std.lv = FALSE in the second-order model. However, as I said before, the same first-order factor does not show this problem when the G factors are excluded, regardless of how the model is scaled. Also, I did not see any relevant residual correlations.
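
This is how I checked it (sketch, using the placeholder objects above):

# Factor correlations with confidence intervals (first-order model, std.lv = TRUE)
std <- standardizedSolution(fit.first)
subset(std, op == "~~" & lhs != rhs,
       select = c(lhs, rhs, est.std, ci.lower, ci.upper))

# Correlation residuals for the second-order model
residuals(fit.second, type = "cor")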

Do you have any theoretical explanation, and possibly a reference, for why the loadings are not significant when the G factors are included? And why is the model fit good despite the nonsignificant loadings?

Thank you all!
Sara
