
I am getting an error message when I do a reliability analysis


E S

Aug 18, 2009, 11:56:48 PM
Error Message: "The value is negative due to a negative average
covariance among items. This violates reliability model assumptions.
You may want to check item codings."
Case Description:
I am trying to run a reliability analysis, and the above error message
appears with my results, in relation to the Cronbach's alpha if item
deleted. I am certain that everything has been reverse scored, so I
can't figure out why this message is appearing. Any ideas?

Thanks

Frank

Aug 19, 2009, 10:52:37 PM

I can help you if you can send me your data set.
Frank

Rich Ulrich

Aug 20, 2009, 12:08:38 AM
On Tue, 18 Aug 2009 20:56:48 -0700 (PDT), E S <evmi...@gmail.com>
wrote:

Look at the correlation matrix and SEE what is negative.

While you are at it, make sure that you know what data
you have on hand: look at Frequencies on all the variables,
check that your MISSING values are all coded as Missing,
and that there are no oddball scores.

If correlations of good data scatter randomly around zero,
then you apparently do not have a scale with internal
reliability worth mentioning. (That ordinarily would
seem to indicate that you have totally *misread* your
input data, or you are using the wrong variables.)

Of course, if reverse-scoring *everything* was a precise
statement, then it achieved nothing, and you need to go
back and reverse-score only the subset that scales the
latent dimension in a consistent direction.

--
Rich Ulrich

Ray Koopman

Aug 20, 2009, 4:55:07 AM
On Aug 18, 8:56 pm, E S <evmis...@gmail.com> wrote:
> Error Message: "The value is negative due to a negative average
> covariance among items. This violates reliability model assumptions.
> You may want to check item codings."

The error message is wrong. A negative alpha does not violate any
assumptions of the reliability model. The error is in interpreting
alpha as the reliability. In general, alpha is only a lower bound
for the reliability. Only in the special case where the items are
essentially tau-equivalent does alpha equal the reliability. Since a
reliability cannot be negative, a negative alpha simply provides no
information about the reliability. See Lord & Novick, _Statistical
Theories of Mental Test Scores_, Theorem 4.4.3 and Corollary 4.4.3b.
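Ray's point can be illustrated numerically. The following is a minimal sketch (not from the thread, with made-up data) that computes Cronbach's alpha from its variance-components definition; when one item is coded in the opposite direction, the average inter-item covariance goes negative and alpha comes out well below zero:

```python
# Sketch: Cronbach's alpha from its definition,
# alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)).
# A negative average inter-item covariance can push alpha below zero.

def cronbach_alpha(items):
    """items: list of equal-length score lists, one per item."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Three hypothetical items measuring the same trait, consistently scored:
good = [[1, 2, 3, 4, 5],
        [1, 2, 3, 4, 4],
        [2, 2, 3, 5, 5]]

# Same data, but with the third item left in its reversed (1-5) coding:
bad = good[:2] + [[6 - x for x in good[2]]]

print(round(cronbach_alpha(good), 3))  # -> 0.979
print(round(cronbach_alpha(bad), 3))   # -> -2.739
```

As the second result shows, nothing in the arithmetic stops alpha from being strongly negative; it is only as an estimate of reliability that a negative value is uninformative.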

Rich Ulrich

Aug 21, 2009, 12:49:59 AM
On Thu, 20 Aug 2009 01:55:07 -0700 (PDT), Ray Koopman <koo...@sfu.ca>
wrote:

Ray,
I'm not unhappy with the message.

I think that my "model assumption" includes the idea that
there is a latent dimension which is readily discernible.
Doesn't that say, "There isn't"?

--
Rich Ulrich

Ray Koopman

Aug 21, 2009, 3:43:09 AM
On Aug 20, 9:49 pm, Rich Ulrich <rich.ulr...@comcast.net> wrote:

A readily discernible latent dimension may be desirable, but it's not
part of the reliability model. At the univariate level, for each item
separately, the model is tautological and contains no assumptions. At
the multivariate level, when the items are considered together, the
only assumption is that the errors of measurement are mutually
uncorrelated.

ijji_...@hotmail.com

Jun 25, 2013, 9:10:46 AM
Can anyone help me see what the problem is with my reliability analysis? I keep getting negative results on my variables.

Rich Ulrich

Jun 25, 2013, 8:42:27 PM
On Tue, 25 Jun 2013 06:10:46 -0700 (PDT), ijji_...@hotmail.com
wrote:
WHERE DOES THE SCALE COME FROM?
- Reliability looks at "internal consistency", so it expects
all the items to be measuring a common dimension.
- On the other hand it *is* possible to have a scale that
measures items that are *supposed* to be separate and
independent, or even negatively correlated. You would have
no expectation that Reliability would say anything useful
about that sort of scale.

If this is scaling a single, simple dimension, you should be able to
tell whether High vs. Low always codes to mean the same thing
simply by reading the items.

Otherwise, mechanically:
Look at the correlation matrix of your items. There should be
zero or very few negative r's among the items that are
supposed to make up one scale.

If "Item_07" has mostly negative correlations, then
you want to use, instead, the reverse-scored version
of it, Ritem_07.

If the scoring is, for example, 1-4, you can reverse it with
COMPUTE Ritem_07= 5- Item_07 .
- If it was 0-4, then you subtract from 4 instead, so the new
minimum will be 0.

--
Rich Ulrich


sairah...@gmail.com

Jul 20, 2013, 4:33:49 AM

Art Kendall

Jul 20, 2013, 6:41:40 AM
Use the options to see the correlation matrix. Look at items that have
negative correlations. Are you sure you used the reflected variables
and not the original variables?

Art Kendall
Social Research Consultants

artur.cha...@gmail.com

Apr 1, 2014, 3:25:31 PM
On Thursday, August 20, 2009 at 5:52:37 UTC+3, Frank wrote:
I have the same problem. Maybe you can help me?

Art Kendall

Apr 1, 2014, 5:28:55 PM
The usual cause of this is that some items have negative correlations
with other items.

Good scale construction practice is to reduce response bias by wording
some of the items in a scale so that agreement (or high extent)
indicates the high end of the construct, and other items so that
agreement (or high extent) indicates the low end of the construct.

Check your scoring key: should some of the items be "reflected"?
If the items are grouped via factor analysis, did some load positively
and some negatively on the factor that was used to build the scale?

Art Kendall
Social Research Consultants

Rich Ulrich

Apr 1, 2014, 5:53:12 PM
On Tue, 1 Apr 2014 12:25:31 -0700 (PDT), artur.cha...@gmail.com
wrote:

>On Thursday, August 20, 2009 at 5:52:37 UTC+3, Frank wrote:
Check your syntax:

Did you create new variables for a few items that
are reversed? Does Freq show that the Missings are
handled properly for the new ones?

Did you use *those* variables (instead of the old ones)
in Reliability? ... so your list in a scale might read
"var1 to var4, revers5 to revers8, var9 var10".

Look at the correlations in the correlation matrix, when
you request it. How many are negative? ... if there are
more than a couple, figure out what is wrong with the
coding of the biggest negative r or two.

--
Rich Ulrich

masvelous

Nov 30, 2014, 9:18:09 PM
Hi Frank,

I'm currently having the same problem too. Can you help fix mine as well?

David Marso

Dec 1, 2014, 9:06:58 AM
Perhaps you should pursue some of the advice provided by several people in this thread? At minimum post your correlation matrix?

mfch...@gmail.com

Jul 20, 2015, 9:03:06 PM
Hi,

Can anyone help me? Please~
I have the same problem too.
Before reverse coding, Cronbach's alpha for my 10 attitude items is 0.764.
But after I reverse-coded the 5 negatively worded items, Cronbach's alpha drops to 0.331.

Thanks.

Rich Ulrich

Jul 21, 2015, 3:20:01 AM
Dammit, if you are going to write a Reply to a message from
2009, please have the awareness to know that WE do not
read via Google-Groups, so WE do not have the text of that message
unless our ISP has saved it. Here:
****from 2009.
****end 2009.

0) (Your report of numerical results seems unlikely
to be fully accurate.)

1) I'm surprised if there wasn't a Reply in 2009. Did you look
for one?

2) ERROR says you must have some negative correlations.


So, LOOK AT the correlation matrix. Is everything
positive? It is an option in Reliability, or you can just
get correlations.

The popular error is to compute the reversed score
with a new name (good idea) but to feed Reliability
the original set of names. Oops!

Less often, you may have merely screwed up while
trying to reverse the scores. As a check: The
original matrix (with some reversed scores) should
be exactly the same as the corrected matrix, except
for reversed signs wherever only one of the variables
had to be reversed.

--
Rich Ulrich




Rich Ulrich

Jul 21, 2015, 3:23:07 AM
On Mon, 20 Jul 2015 18:03:03 -0700 (PDT), mfch...@gmail.com wrote:

- I see that I also answered this in 2013 and 2014.
Does Google Groups hide those previous answers?


--
Rich Ulrich

abnea...@gmail.com

Oct 4, 2015, 12:26:49 PM
Hey can you please help me. I'm getting negative sum in my SPSS relability test. And I'd checked all my data is correct. What should I do to make this answer positive?

Rich Ulrich

Oct 4, 2015, 2:19:54 PM
On Sun, 4 Oct 2015 09:26:42 -0700 (PDT), abnea...@gmail.com wrote:

>Hey can you please help me. I'm getting negative sum in my SPSS relability test. And I'd checked all my data is correct. What should I do to make this answer positive?

Does Google-Groups really make it difficult to look for the
messages up-thread?

Three minutes before I posted what your Reply references
directly, I posted (a repetition of answers from years ago)
some details that include -

****From July, 2015
2) ERROR says you must have some negative correlations.


So, LOOK AT the correlation matrix. Is everything
positive? It is an option in Reliability, or you can just
get correlations.

The popular error is to compute the reversed score
with a new name (good idea) but to feed Reliability
the original set of names. Oops!

Less often, you may have merely screwed up while
trying to reverse the scores. As a check: The
original matrix (with some reversed scores) should
be exactly the same as the corrected matrix, except
for reversed signs wherever only one of the variables
had to be reversed.
****End excerpt.

--
Rich Ulrich

Bruce Weaver

Oct 5, 2015, 2:42:10 PM
On Sunday, October 4, 2015 at 2:19:54 PM UTC-4, Rich Ulrich wrote:
> On Sun, 4 Oct 2015 09:26:42 -0700 (PDT), abnea...@gmail.com wrote:
>
> >Hey can you please help me. I'm getting negative sum in my SPSS relability test. And I'd checked all my data is correct. What should I do to make this answer positive?
>
> Does Google-Groups reallly make it difficult to look for the
> messages up-thread?

--- snip ---

No, it does not! I am reading via Google Groups right now, and can see a message dated 18-Aug-2009 that appears to be the original message in the thread.

https://groups.google.com/forum/#!original/comp.soft-sys.stat.spss/IVp-wCTSxR8/alQsUu4hFIEJ

HTH.

Rich Ulrich

Oct 12, 2015, 7:11:35 PM
Thanks.

Someone in the alt.usage.english group has suggested
how these posts appear, and why our responses to them
are useless.

A Google search, if it is particular enough, may bring up
some old usenet post. - When I do this, there is a button
up on the right that, when I pass the cursor over it,
asks me to log in if I wish to reply. (Is it more obvious if
you are already logged in to Google?)

When the searcher knows nothing about Usenet groups,
that person who posts a Reply is almost surely expecting
an email response; and they don't know a thing about
"reading a group", such as, "reading the previous post."

In the case of the folks who are posting in reply to
these spammed lists of manuals, we may notice that they
are clueless enough that they are *not* following the
instructions, to click on the site or to use the email address.

--
Rich Ulrich

pujiru...@gmail.com

Jan 6, 2016, 2:13:16 AM

I analysed Cronbach's alpha, but it showed that "The value is negative due to a negative average covariance among items. This violates reliability model assumptions. You may want to check item codings." I'm using a scale of 0 to 5. Perhaps the problem is the scale starting at 0. Please comment and share your thoughts on this.



David Marso

Jan 6, 2016, 1:03:55 PM
You obviously have something reverse scaled resulting in negative correlations/covariances (or your measurement instrument is terribly flawed). Please check the signs of the values in the correlation matrix and take corrective action.

Rich Ulrich

Jan 12, 2016, 1:01:25 AM
Other than a bad scale: I have seen people fail to
exclude "Missing", and thus treat -9 or 999 (or some
such wild numbers) as if they were scores.

--
Rich Ulrich

ryant...@gmail.com

Dec 20, 2017, 2:42:31 PM
On Wednesday, August 19, 2009 at 11:56:48 AM UTC+8, E S wrote:
> Error Message: "The value is negative due to a negative average
> covariance among items. This violates reliability model assumptions.
> You may want to check item codings."
> Case Description:
> I am trying to run a reliability analysis and the above error message
> appears with my results in relation to the cronbach's alpha if item
> deleted. I am certain that everything has been reverse scored, so I
> can't figure out why this message is appearing. any ideas?
>
> Thanks

I am currently facing the same problem, but my case is that I am trying to test a "test" paper which has correct and incorrect answers. When I ran the data in SPSS, my alpha came back as -2.4. I don't know if I should test the reliability of a graded (right and wrong) test using SPSS, or if there is a different way to go about it. HELP!

Rich Ulrich

Dec 23, 2017, 9:23:45 PM
I don't remember for sure whether negative r's make the formula
go that thoroughly kaput. Except for that, getting an alpha
like -2.4 would sound like you have a corrupted SPSS installation, which
would need to be re-installed. Are other procedures running
okay?

I hope you have read the other posts in this usenet thread, which
mention things like reverse-scoring items that need it.

Cronbach's alpha represents a transformation of the average
correlation. For positive correlations, it shows a shrinkage of the
"error" measured by (1- (average r)squared). This shrinkage depends
on the number of items; so the proper alpha is something larger
than the average (positive) correlation. So - Look at your matrix
of correlations. If there is an expectation of useful "internal
reliability", that means that you expect all the individual r's to be
positive.

The Wikipedia article on Cronbach's alpha has a simple formula
for using the average r's.
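The formula Rich alludes to is the standardized alpha built from the average inter-item correlation r_bar: alpha = k*r_bar / (1 + (k-1)*r_bar). A short sketch (with an assumed r_bar of 0.3) showing how alpha grows past the average correlation as the number of items k increases:

```python
# Standardized alpha from the average inter-item correlation r_bar:
# alpha = k*r_bar / (1 + (k-1)*r_bar).
# For a fixed positive r_bar, alpha increases with the number of items k.

def standardized_alpha(r_bar, k):
    return k * r_bar / (1 + (k - 1) * r_bar)

for k in (2, 5, 10, 20):
    print(k, round(standardized_alpha(0.3, k), 3))
# -> 2 0.462
#    5 0.682
#    10 0.811
#    20 0.896
```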

Also, for a dichotomy, I would probably not use Cronbach's, since
it is apt to come out smaller (and seemingly less impressive) than
the alphas which folks are used to seeing for scaled data.

[If the test is very long, I would be tempted to divide it into
subscales - according to a-priori judgment or factor analysis -
and then obtain the alpha across the subscales.]
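For right/wrong items specifically, the standard form of this coefficient is KR-20 (Kuder-Richardson Formula 20), which is algebraically Cronbach's alpha specialized to 0/1 scoring. A sketch with made-up test data:

```python
# KR-20 (Kuder-Richardson 20): Cronbach's alpha specialized to 0/1 items.
# KR20 = k/(k-1) * (1 - sum(p*q) / var(total)), where p is the proportion
# answering an item correctly and q = 1 - p.

def kr20(items):
    """items: list of 0/1 score lists, one per item."""
    k = len(items)
    n = len(items[0])
    p = [sum(it) / n for it in items]
    pq = sum(pi * (1 - pi) for pi in p)
    totals = [sum(it[i] for it in items) for i in range(n)]
    m = sum(totals) / n
    var_total = sum((t - m) ** 2 for t in totals) / n
    return k / (k - 1) * (1 - pq / var_total)

# Hypothetical 4-item test, 5 examinees (1 = correct, 0 = wrong)
scores = [[1, 1, 0, 1, 0],
          [1, 1, 0, 0, 0],
          [1, 0, 1, 1, 0],
          [1, 1, 0, 1, 0]]
print(round(kr20(scores), 3))  # -> 0.741
```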

--
Rich Ulrich

yewwe...@gmail.com

Jan 18, 2018, 10:48:13 PM
I'm having the same problem currently!
Can anyone please help me?
Thanks a lot...

singh23...@gmail.com

Mar 15, 2020, 3:37:15 AM
I am trying to apply a reliability analysis, but I get a negative reliability.

Rich Ulrich

Mar 17, 2020, 12:50:16 PM
On Sun, 15 Mar 2020 00:37:12 -0700 (PDT), singh23...@gmail.com
wrote:

>I am trying to apply reliability analysis but I get reliability in negative

Rescore the particular items which are opposite in direction
from how you want to interpret your overall score.

If an item is scored from 1-5, you can reverse the scoring by
COMPUTE rvar= 6- var . And so on.

In order to keep your file accurate with its original variables,
you should NOT simply reverse-score an item in place, unless
you make sure NOT to save the file with the reversed score.

When there are only a couple of vars that need reversing, I've
been satisfied with creating new versions of only those couple,
so my file might end up with (say) "var1 to var10, rvar4, rvar8".

Reliability is computed from formulas that have several important
assumptions, and add and subtract "variance components".
The important failure-of-assumption is that your correlation
matrix apparently does have sizable negative r's.

LOOK at the correlation matrix that can come with the analysis,
to make sure that you are looking at a consistent set.

--
Rich Ulrich

istiqama...@gmail.com

Apr 3, 2020, 9:08:51 PM
Me too

Rich Ulrich

Apr 6, 2020, 6:34:06 PM
On Fri, 3 Apr 2020 18:08:49 -0700 (PDT), istiqama...@gmail.com
wrote:

>Me too

I see approximately 33 follow-ups to this thread from its
first posting in 2009. See -

https://groups.google.com/forum/#!topic/comp.soft-sys.stat.spss/IVp-wCTSxR8

I found this by Googling on
< "error message when I do a reliability analysis" >


--
Rich Ulrich

Amy Hester

Jan 7, 2022, 5:33:46 PM
I followed all the guidance here, but my problem was different. When I exported my dataset from the survey software used to collect the instrument scoring responses, it coded the responses accurately, but the dataset came out transposed: the columns should have been the rows, and the rows should have been the columns. This may be what others above referred to as reflection or reflecting. I was testing a gold-standard tool and had the same issues; I swapped my columns and rows, and it worked perfectly.