different Eigenvalue results Julia vs Matlab


Dennis Eckmeier

Sep 10, 2016, 7:34:51 AM
to julia-users
Hi,

I am new to Julia and rather a layman in math. For practice, I am translating a Matlab script (written by somebody else) and comparing the result of each step between Matlab and Julia.

The Matlab script uses  [V,D] = eig(covX);

which I translated to Julia as: (D,V) = eig(covX)

However, the outcomes don't match, although covX is the same.

D.


Michele Zaffalon

Sep 10, 2016, 7:48:17 AM
to julia...@googlegroups.com
You mean they are not simply permuted? Can you report the MATLAB and Julia results in the two cases for a small covX matrix?

Dennis Eckmeier

Sep 10, 2016, 11:11:46 AM
to julia-users
Hi!

Here is a simple example:

Matlab:
(all types are double)
 covX(1:4,1:4)

  1.0e+006 *

    3.8626    3.4157    2.4049    1.2403
    3.4157    3.7375    3.3395    2.3899
    2.4049    3.3395    3.7372    3.4033
    1.2403    2.3899    3.4033    3.8548

[V,D] = eig(covX(1:4,1:4))

V =

    0.2583    0.5402   -0.6576    0.4571
   -0.6622   -0.4518   -0.2557    0.5404
    0.6565   -0.4604    0.2552    0.5403
   -0.2525    0.5404    0.6611    0.4551


D =

  1.0e+007 *

    0.0006         0         0         0
         0    0.0197         0         0
         0         0    0.3010         0
         0         0         0    1.1978



Julia:
(all types are Float64)
D = 
5759.7016...
197430.4962...
3.0104794 ... e6
1.1978 ... e7

V =
   0.2583    0.5402    -0.6576     -0.4571
   -0.6622   -0.4518   -0.2557    -0.5404
    0.6565   -0.4604    0.2552    -0.5403
   -0.2525    0.5404    0.6611    -0.4551

cheers,

D

Tracy Wadleigh

Sep 10, 2016, 11:15:23 AM
to julia-users

Looks good to me. The eigenvalues look the same up to the precision shown and eigenvectors are only unique up to a scalar multiple.
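If you want to convince yourself numerically, something like this does it (a sketch in current Julia, where the old `eig` has become `eigen` in the LinearAlgebra stdlib; the 2x2 matrix is just a made-up example):

```julia
using LinearAlgebra

# A small symmetric example matrix (made up for illustration).
A = [2.0 1.0; 1.0 2.0]

vals, vecs = eigen(Symmetric(A))

# Every column v satisfies A*v = λ*v, and so does -v, so either
# sign is an equally valid eigenvector.
for (i, λ) in enumerate(vals)
    v = vecs[:, i]
    @assert A * v ≈ λ * v
    @assert A * (-v) ≈ λ * (-v)
end
```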

Dennis Eckmeier

Sep 10, 2016, 11:26:55 AM
to julia-users
Thanks, that's reassuring. 

Stuart Brorson

Sep 10, 2016, 12:20:25 PM
to julia-users
Just a question from a non-mathematician. Also, this is a math
question, not a Julia/Matlab question.

I agree that Matlab and Julia are both correct -- within the
definitions of eigenvector and eigenvalue they compute, it's OK that
one eigenvector differs between the two by a factor of -1. They're
probably calling the same library anyway.

However, from a physics perspective, this bugs me. In physics, an
important feature of a space is its orientation. For example,
physics and engineering undergrads learn early on about the
"right-hand rule" for taking cross products of vectors. This type of
product imposes an orientation (or chirality) on the vector space
under consideration. Real mathematicians may be familiar with
the wedge product in exterior algebra, which (I believe) also imposes
an orientation on a vector space. Wikipedia says:

https://en.wikipedia.org/wiki/Orientation_(vector_space)

The problem I have with Julia and Matlab returning a set of
eigenvectors where only one eigenvector is the negative of the other
is that the two eigenvector sets span spaces of opposite
orientation, so the returns are -- in some sense -- not the same.

So my question: Is there anything like a set of "oriented"
eigenvectors? Is this even possible? Or am I totally wrong, and
the two different sets of eigenvectors span the same space independent
of concerns about orientation?

Stuart

Yichao Yu

Sep 10, 2016, 1:06:31 PM
to Julia Users
On Sat, Sep 10, 2016 at 12:20 PM, Stuart Brorson <s...@cloud9.net> wrote:
> So my question: Is there anything like a set of "oriented"
> eigenvectors?

I don't think you can define that in a continuous way.
In general, if your application relies on a certain property of the basis, you should just normalize it that way. If you don't have such a requirement, then you shouldn't worry about it.
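For what it's worth, one common normalization (just an illustration, not something `eig` promises) is to flip each eigenvector so that its largest-magnitude entry is positive; applied to the output of either language, it should then give identical signs:

```julia
using LinearAlgebra

# Flip each column so its largest-magnitude entry is positive.
# This is one ad-hoc convention, not something eig/eigen guarantees.
function fix_signs!(V::AbstractMatrix)
    for j in axes(V, 2)
        i = argmax(abs.(view(V, :, j)))
        if V[i, j] < 0
            V[:, j] .= -V[:, j]
        end
    end
    return V
end

A = [2.0 1.0; 1.0 2.0]          # made-up symmetric example
_, V = eigen(Symmetric(A))
fix_signs!(V)
```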

Stuart Brorson

Sep 10, 2016, 2:13:17 PM
to Julia Users
> I don't think you can define that in a continuous way.
> In general, if your application relies on a certain property of the basis,
> you should just normalize it that way. If you don't have such a
> requirement, then you shouldn't worry about it.

Thanks for the thoughts.

I did a little more thinking and Googling. My conclusion:

Yes, you can define the concept of orientation very clearly. The
orientation of a vector space is given by the sign of the determinant
of the matrix formed by its basis set (here, the eigenvectors). The
determinant will be either +1 or -1 for an orthonormal set of basis
vectors. The sign of the determinant divides the orthonormal bases
into two equivalence classes, one with + orientation, the other with -.

I guess the concept of eigenvector/eigenvalue is more general than the
concept of orientation. Put another way, you can have a basis set
(eigenvectors) without defining the orientation property, but you
can't have orientation without first having a basis set.

The OP's results from Matlab and Julia belong to opposite
equivalence classes, which is what bothers me. However,
from the standpoint of getting a basis, the concept of orientation is
not required. One would need to impose some additional constraint on
the eigenvalue decomposition algorithm in order to always get an
eigenvector set with the same orientation, and I guess that additional
constraint isn't important for the problems usually solved by
eigendecomposition.
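To put that in code: the determinant test, and the extra constraint one could impose, might look like this (a sketch using `eigen`, the current spelling of `eig`; the 3x3 matrix is a made-up example):

```julia
using LinearAlgebra

A = Symmetric([4.0 1.0 0.0; 1.0 3.0 1.0; 0.0 1.0 2.0])  # made-up example
_, V = eigen(A)

d = det(V)
@assert abs(d) ≈ 1.0        # orthonormal basis => det is +1 or -1

# Imposing a "+" orientation: if det(V) < 0, flip any one column.
# The columns are still orthonormal eigenvectors of A.
if d < 0
    V[:, 1] .= -V[:, 1]
end
@assert det(V) > 0
```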

Thanks for making me think about this,

Stuart

Yichao Yu

Sep 10, 2016, 2:19:02 PM
to Julia Users
On Sat, Sep 10, 2016 at 2:12 PM, Stuart Brorson <s...@cloud9.net> wrote:
> Yes, you can define the concept of orientation very clearly.

What I actually mean is that you can constrain the result of eigs.

Yichao Yu

Sep 10, 2016, 2:24:02 PM
to Julia Users
On Sat, Sep 10, 2016 at 2:18 PM, Yichao Yu <yyc...@gmail.com> wrote:
> What I actually mean is that you can constrain the result of eigs.

Obviously I meant "can't". You can certainly flip signs to get the orientation you want, but not in a meaningful and unique way.

Christoph Ortner

Sep 10, 2016, 5:08:02 PM
to julia-users
Yichao is right, you cannot give eigenvectors an orientation; a good way to think of them is as defining linear subspaces.

So what is unique is the projector (v/|v|) \otimes (v/|v|), or, in the case of repeated eigenvalues, the projector onto the eigenspace, \sum_i v_i \otimes v_i -- but never the eigenvectors themselves.
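Concretely, the projector is insensitive to the sign choice (a small sketch with a made-up matrix, using `eigen` from current Julia's LinearAlgebra):

```julia
using LinearAlgebra

A = [2.0 1.0; 1.0 2.0]          # made-up symmetric example
_, V = eigen(Symmetric(A))

v = V[:, 1]                      # unit eigenvector (either sign)
P  = v * v'                      # projector onto span(v)
P2 = (-v) * (-v)'                # flipping the sign cancels out
@assert P ≈ P2
@assert P * P ≈ P                # idempotent, as a projector should be
```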

Steven G. Johnson

Sep 10, 2016, 6:32:51 PM
to julia-users


On Saturday, September 10, 2016 at 12:20:25 PM UTC-4, Stuart Brorson wrote:
However, from a physics perspective, this bugs me.  In physics, an
important feature of a space is its orientation.

I should also remind you that a lot of modern physics tries to separate the laws of physics from particular choices of basis or coordinate system.  (Think of differential geometry in relativity or Dirac notation in quantum mechanics.)    Similarly, if you have an expression that depends on the precise choice of basis for the eigenvectors of a matrix, it is likely that you have made a mistake, or at the very least you have to be very careful.

(Yes, the matrix of eigenvectors of a real-symmetric problem can be chosen orthonormal, with determinant ±1, and of course it is possible to choose the eigenvectors so that the determinant is +1, but this still does not uniquely determine the eigenvectors: in an NxN orthonormal matrix, there are 2^(N-1) choices of signs that lead to the same sign of the determinant.  And if some eigenvalues have multiplicity > 1 you have even more choices.)
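(The sign-choice count is easy to check numerically: flipping the signs of any even number of columns leaves both the determinant and the decomposition intact. A sketch with a made-up matrix, using `eigen` from current Julia's LinearAlgebra:)

```julia
using LinearAlgebra

A = Symmetric([4.0 1.0 0.0; 1.0 3.0 1.0; 0.0 1.0 2.0])  # made-up example
vals, V = eigen(A)

W = copy(V)
W[:, 1] .= -W[:, 1]   # each column flip multiplies det by -1,
W[:, 2] .= -W[:, 2]   # so flipping two leaves the orientation unchanged

@assert det(W) ≈ det(V)
@assert A * W ≈ W * Diagonal(vals)   # still a valid set of eigenvectors
```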

In short, don't get hung up on this.

colint...@gmail.com

Sep 10, 2016, 7:12:22 PM
to julia-users
This question pops up on StackOverflow fairly frequently, although usually in relation to Mathematica, which uses a totally different normalisation to most other languages. I was surprised by your question actually, since I thought Matlab and Julia used *exactly* the same normalisation, but obviously there are some minor differences in the algorithm. Anyway, here is a link to a StackOverflow question if you want to read about possible normalisations a little further:

https://stackoverflow.com/questions/13041178/could-we-get-different-solutions-for-eigenvectors-from-a-matrix

Cheers,

Colin

Tracy Wadleigh

Sep 10, 2016, 7:22:52 PM
to julia-users

+10**10**10
