Constraining all error variances to be positive


alodie
Jan 31, 2018, 9:51:14 AM
to lavaan

Dear all,

 

I’d like to constrain all error variances to be positive. Therefore, I used the following model syntax:

m2 <- 'inhib =~ antisaccade + stopsignal + colorstroop + simon + numberstroop + arrowflanker

        # constraints
        antisaccade ~~ b1*antisaccade
        stopsignal ~~ b2*stopsignal
        colorstroop ~~ b3*colorstroop
        simon ~~ b4*simon
        numberstroop ~~ b5*numberstroop
        arrowflanker ~~ b6*arrowflanker

        b1 > 0
        b2 > 0
        b3 > 0
        b4 > 0
        b5 > 0
        b6 > 0'

 

fit2 <- sem(m2, data=d, std.lv = TRUE)

summary(fit2, standardized = TRUE)

 

Variances:
                    Estimate  Std.Err  z-value  P(>|z|)   Std.lv  Std.all
   .antisaccd (b1)   379.343   39.876    9.513    0.000  379.343    0.975
   .stopsignl (b2)   369.442   38.835    9.513    0.000  369.442    0.978
   .colorstrp (b3)    73.121    7.686    9.513    0.000   73.121    0.987
   .simon     (b4)    -0.000       NA                     -0.000   -0.000
   .numbrstrp (b5)    90.735    9.538    9.513    0.000   90.735    0.989
   .arrwflnkr (b6)   205.233   21.574    9.513    0.000  205.233    0.995
    inhib             1.000                               1.000    1.000

 

Here, I got the warning that some estimated ov (observed-variable) variances are negative, even though all error variances were constrained to be positive.

How is this possible? What is wrong with the model syntax of m2?

Thanks in advance for all your responses!
 
Best,
Alodie

Terrence Jorgensen
Feb 12, 2018, 10:19:59 AM
to lavaan

                    Estimate  Std.Err  z-value  P(>|z|)   Std.lv  Std.all
   .simon     (b4)    -0.000       NA                     -0.000   -0.000

Here, I got the warning that some estimated ov (observed-variable) variances are negative, even though all error variances were constrained to be positive.

How is this possible? What is wrong with the model syntax of m2?

The syntax is correct, but clearly a positive variance is not the maximum-likelihood estimate for your data. As you can see in your output, the estimate is essentially zero; due to machine precision it simply landed a tiny bit to the left of zero rather than a tiny bit to the right.

Constrained estimation is not a good idea. It is better to test whether the negative variance can be explained by sampling error.

If your model fits well and the CI includes positive values, then you can dismiss it as sampling error. Constraining it to be positive will just bias your other estimates and the model-fit test statistic.
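As a concrete illustration, here is a minimal sketch of that check, assuming the same data frame d and indicator names as in the original post: fit the model without the inequality constraints and look at the confidence interval of the simon residual variance via parameterEstimates().

library(lavaan)

# same measurement model, but without the inequality constraints
m1 <- 'inhib =~ antisaccade + stopsignal + colorstroop + simon + numberstroop + arrowflanker'
fit1 <- sem(m1, data = d, std.lv = TRUE)

# 95% Wald CI for the (negative) residual variance of simon
pe <- parameterEstimates(fit1, ci = TRUE, level = 0.95)
subset(pe, lhs == "simon" & op == "~~" & rhs == "simon")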

Terrence D. Jorgensen
Postdoctoral Researcher, Methods and Statistics
Research Institute for Child Development and Education, the University of Amsterdam

Franz
Nov 9, 2022, 11:54:14 AM
to lavaan
Dear Dr Jorgensen,

I would highly appreciate your further guidance on this topic: I have such a Heywood case in an SEM project, too. If I understand you correctly, one can test whether the Heywood case occurred due to sampling error via the CIs. This would be good news for me! I have two related questions: I assume you refer to the CI of the (negative) variance estimate of the observed variable, correct (and if not, which CI)? Further, do you know of any lavaan-related R function that allows the calculation of such CIs for ov variances?

I would be very happy to hear from you. Thanks for your time!

Best,
Franz

Terrence Jorgensen
Nov 10, 2022, 1:07:55 AM
to lavaan
I assume you refer to the CI of the (negative) variance estimate of the observed variable

Yes, the CI of whatever parameter is the Heywood case.

which CI?

The paper I linked to recommends a robust one (e.g., when you set estimator = "MLM" or "MLR"); see the paper for details and discussion.
  
Further, do you know of any lavaan-related R-function that allows the calculation of such CIs for ov variances?

parameterEstimates()
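For example, a small sketch, reusing the unconstrained model m1 and data frame d assumed earlier in this thread: refit with a robust estimator, then pull the CIs for the observed-variable residual variances out of parameterEstimates().

# refit with robust (sandwich) standard errors; the Wald CIs returned by
# parameterEstimates() are then based on those robust SEs
fitR <- sem(m1, data = d, std.lv = TRUE, estimator = "MLR")
pe <- parameterEstimates(fitR, ci = TRUE, level = 0.95)

# keep only the residual variances of the observed variables
subset(pe, op == "~~" & lhs == rhs & lhs != "inhib")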

Terrence D. Jorgensen
Assistant Professor, Methods and Statistics