Does anybody know how to prove, or at least know a reference that
shows, that the special pseudo-orthogonal group has exactly two
connected components?
By special pseudo-orthogonal I mean the group of all linear
transformations of determinant one which preserve a symmetric
nondegenerate bilinear form, or more explicitly the set of matrices
which preserve the matrix with only pluses and minuses along the
diagonal.
BTW, are there general theorems, techniques, algorithms... which are
used to establish the number of connected components of general Lie
groups?
Thanks!
Tim
Consider the action of this group's elements on the points of the
two sheet hyperboloid x_1^2 - x_2^2 - x_3^2 - ... = 1. There are
group elements that map each sheet of the hyperboloid into itself, and
those that map one sheet into the other. The identity element belongs
to the former set and cannot be connected by a continuous curve to any
element of the latter set. Indeed, if there were such a curve, then the
image of any point on one sheet of the hyperboloid would make a
continuous transition to the other sheet under the action of the group
elements lying on this curve. This, however, is impossible, since the
two sheets of the hyperboloid are not connected to each other.
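Here is a minimal numerical illustration of this for SO(1,2), assuming
numpy is available; the particular boost and the sheet-swapping matrix
below are just convenient examples, not the only ones.

    import numpy as np

    # The form x_1^2 - x_2^2 - x_3^2 from above, for SO(1,2).
    J = np.diag([1.0, -1.0, -1.0])

    def in_so_1_2(g):
        """g preserves the form (g' J g = J) and has determinant 1."""
        return np.allclose(g.T @ J @ g, J) and np.isclose(np.linalg.det(g), 1.0)

    a = 0.7
    boost = np.array([[np.cosh(a), np.sinh(a), 0.0],
                      [np.sinh(a), np.cosh(a), 0.0],
                      [0.0,        0.0,        1.0]])
    swap = np.diag([-1.0, -1.0, 1.0])    # reverses the sign of x_1, det = +1

    x = np.array([1.0, 0.0, 0.0])        # a point on the upper sheet (x_1 > 0)
    assert in_so_1_2(boost) and in_so_1_2(swap)
    print((boost @ x)[0])                # > 0: the boost keeps x on the upper sheet
    print((swap @ x)[0])                 # < 0: swap moves x to the lower sheet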
> BTW, are there general theorems, techniques, algorithms... which are
> used to establish the number of connected components of general Lie
> groups?
In general you can try to find continuous paths that connect the
identity element to any other given group element. Whenever you
succeed, the given group element is in the same connected component as
the identity. Whenever you can show that no such path exists, it is in
a different connected component. In the proof above, I showed that
there are at least two connected components. To prove that there are
only two, you have to do a bit more work. One thing you can do is try
to interpolate the eigenvalues between those of the identity element
and those of an element that maps each sheet of the hyperboloid into
itself. You should succeed.
Hope this helps.
Igor
Your proof only works for the Lorentz group, i.e. for the group that
preserves the metric -++++... (one minus and all the other signs plus).
What I asked for was a proof for the general metric. For example, how
would you do it for SO(2,2)? It's not clear to me that the surface
a^2 + b^2 - c^2 - d^2 = 1 is disconnected. In fact, I have reasons to
believe that it IS connected. There are other groups as well, SO(3,2),
SO(5,9), and so on. How does one handle them?
> One thing you can do is try to interpolate the eigenvalues between
> those of the identity element and those of an element that maps each
> sheet of the hyperboloid into itself. You should succeed.
I didn't understand what you meant by this sentence.
Regards,
Tim
To establish connectedness, find a connecting path of group elements
between any given group element and the identity. To establish
the number of components, find discriminating invariants.
In particular, SO(p,q) is connected if the product pq is even,
and has two connected components otherwise. The connected
subgroup preserves in the latter case the sign of the quadratic form.
The book by
R. Gilmore,
Lie groups, Lie algebras, and some of their applications
Wiley, New York 1974
contains a lot of material about specific classical groups.
The above result is stated there (without proof) on p. 199.
Arnold Neumaier
Sorry, I presumed that you would be most interested in the Lorentz
group, which is of greatest importance in physics.
> I wrote:
>> One thing you can do is try to interpolate the eigenvalues between
>> those of the identity element and those of an element that maps each
>> sheet of the hyperboloid into itself. You should succeed.
>
> I didn't understand what did you mean by this sentence.
This involves finding the eigenvalues and eigenvectors of a given group
element. Here's how it works for SO(3). Any matrix in SO(3) has a real
eigenvector with eigenvalue 1; this eigenvector is the axis of
rotation. The other two eigenvalues are in general complex and are
complex conjugates of each other. The 2-dimensional invariant subspace
corresponding to these eigenvalues is the plane of rotation. Since the
modulus of the complex eigenvalues must be 1 as well, we can write them
as exp(+-i*theta).
Given a group element, we can find the corresponding axis and plane of
rotation as well as this number, theta, which represents the angle of
rotation. It's clear that if we continuously change theta from the
given value to 0, while keeping everything else the same, we obtain a
continuous curve in SO(3) from the given group element to the identity.
This shows that SO(3) is connected.
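Here is a small sketch of that interpolation in code, assuming numpy
and scipy are available; the Euler angles below are an arbitrary choice
of group element, and the rotation vector (axis times theta) is just a
convenient container for the axis and angle.

    import numpy as np
    from scipy.spatial.transform import Rotation

    # An arbitrary element of SO(3).
    R = Rotation.from_euler('xyz', [0.4, -1.1, 2.0]).as_matrix()

    # Read off axis and angle: the rotation vector is axis * theta.
    rotvec = Rotation.from_matrix(R).as_rotvec()

    def path(t):
        """Shrink theta to 0, axis fixed: path(0) = R, path(1) = identity."""
        return Rotation.from_rotvec((1.0 - t) * rotvec).as_matrix()

    # Every point of the path is a genuine element of SO(3).
    for t in np.linspace(0.0, 1.0, 7):
        g = path(t)
        assert np.allclose(g.T @ g, np.eye(3))
        assert np.isclose(np.linalg.det(g), 1.0)
    assert np.allclose(path(0.0), R) and np.allclose(path(1.0), np.eye(3))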
If we had started with O(3) instead of SO(3), then the real eigenvalue
would have two possibilities, either +1 or -1. It would not have been
possible to continuously interpolate between an orthogonal matrix with
eigenvalue -1 and the identity. This failure indicates (and with a
slightly more sophisticated argument, proves) that O(3) is not
connected.
To generalize this method to other groups, you must be able to
diagonalize individual elements. For orthogonal groups this is not
hard. For pseudo-orthogonal groups, it's not so clear, but probably not
impossible. Whether this is the most efficient way to show
connectedness, I don't know.
Hope this helps.
Igor
I've looked through that book, and couldn't find even a hint on how to
proceed with the proof or why the statement is true. So, my question
still stands.
>The connected
>subgroup preserves in the latter case the sign of the quadratic form.
What do you mean by this, and why is this useful? Doesn't it follow
immediately from the definition of SO that the quadratic form is
preserved?
Regards,
Tim
>Does anybody know how to prove, or at least know a reference that
>shows, that the special pseudo-orthogonal group has exactly two
>connected components?
>By special pseudo-orthogonal I mean the group of all linear
>transformations of determinant one which preserve a symmetric
>nondegenerate bilinear form, or more explicitly the set of matrices
>which preserve the matrix with only pluses and minuses along the
>diagonal.
I don't know a reference, but it is not hard to prove that SO(p,q) has
exactly two connected components if pq>0. Details follow.
O(p,q) is the set of all block matrices
/A B\
\C D/
where A is p x p, B is p x q, C is q x p, D is q x q, such that
-A' A + C' C = -1
-A' B + C' D = 0
-B' B + D' D = 1 ;
here ' denotes the transpose of a matrix. (So p is in this convention
the number of *negative* entries in a diagonalisation of the bilinear
form.) SO(p,q) consists of the elements of O(p,q) with positive determinant.
If ((A,B),(C,D)) is in O(p,q), then det(A)^2 = det(A'A) = det(1+C'C) >0
since 1+C'C is positive definite. Similarly det(D)^2 >0.
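As a numerical sanity check, here is a short sketch, assuming numpy;
the boost is just one convenient element of O(1,2). The three block
equations above together say g' J g = J with J = diag(-1_p, 1_q).

    import numpy as np

    p, q = 1, 2
    J = np.diag([-1.0] * p + [1.0] * q)   # p negative entries first, as above

    a = 0.9
    g = np.array([[np.cosh(a), np.sinh(a), 0.0],
                  [np.sinh(a), np.cosh(a), 0.0],
                  [0.0,        0.0,        1.0]])   # a boost; lies in O(1,2)

    A, B = g[:p, :p], g[:p, p:]
    C, D = g[p:, :p], g[p:, p:]

    # The three block equations, equivalently g' J g = J.
    assert np.allclose(-A.T @ A + C.T @ C, -np.eye(p))
    assert np.allclose(-A.T @ B + C.T @ D, np.zeros((p, q)))
    assert np.allclose(-B.T @ B + D.T @ D, np.eye(q))
    assert np.allclose(g.T @ J @ g, J)

    # det(A)^2 = det(1+C'C) > 0, so A (and likewise D) is invertible.
    print(np.linalg.det(A)**2, np.linalg.det(np.eye(p) + C.T @ C))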
Let SOpos(p,q) denote the set of elements ((A,B),(C,D)) of SO(p,q) with
det(A)>0. Let SOneg(p,q) denote the set of elements ((A,B),(C,D)) of
SO(p,q) with det(A)<0.
These two disjoint sets are open in SO(p,q) and their union is SO(p,q).
The set SOpos(p,q) contains the identity. If pq>0, then there exists an
element A of O(p) with det(A)<0, and an element D of O(q) with det(D)<0;
the matrix ((A,0),(0,D)) lies in SOneg(p,q). So both sets are nonempty,
and the only fact that remains to show is that they are connected.
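Continuing the sketch above, an explicit element of SOneg(p,q) of the
advertised block-diagonal form, again for p=1 and q=2 (assuming numpy):

    import numpy as np

    p, q = 1, 2
    J = np.diag([-1.0] * p + [1.0] * q)

    # Block diag(A, D) with A = (-1) in O(1), D = diag(-1, 1) in O(2),
    # so det(A) < 0 and det(D) < 0.
    g = np.diag([-1.0, -1.0, 1.0])

    assert np.allclose(g.T @ J @ g, J)        # g is in O(1,2)
    assert np.isclose(np.linalg.det(g), 1.0)  # det(g) = +1, so g is in SO(1,2)
    print(np.linalg.det(g[:p, :p]))           # -1 < 0, so g is in SOneg(1,2)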
CLAIM: Each element of O(p,q) can be connected by a path in O(p,q) to an
element of the subgroup of O(p,q) consisting of all matrices
((U,0),(0,V)) with U in O(p) and V in O(q).
COROLLARY: SOpos(p,q) and SOneg(p,q) are connected.
PROOF of the corollary: Let M=((A0,B0),(C0,D0)) and N=((E0,F0),(G0,H0))
be in SO(p,q) such that det(A0) and det(E0) have the same sign s. By the
CLAIM, M can be connected in O(p,q) to ((A1,0),(0,D1)) with A1 in O(p),
D1 in O(q). The path does actually run through SO(p,q) since the sign of
the determinant of the block matrix is constant along the path. Since
the sign of the determinant of each diagonal block is constant along the
path, the signs of det(A1) and det(D1) are equal to s. Analogously N can
be connected in SO(p,q) to a matrix ((E1,0),(0,H1)) with sgn(det(E1)) =
sgn(det(H1)) = s. Because SO(k) and {M in O(k) | det(M)<0} are
connected, there is a path P in O(p) from A1 to E1, and there is a path
Q in O(q) from D1 to H1. The path which maps t to ((P(t),0),(0,Q(t)))
runs obviously through SO(p,q). Hence M and N can be connected by a path
in SO(p,q). This proves the corollary.
PROOF of the CLAIM. Let ((A,B),(C,D)) be an element of O(p,q). By polar
decomposition, there is a matrix U in O(p) such that A = U sqrt(1+C'C).
There is a matrix V in O(q) such that D = V sqrt(1+B'B). For every t in
[0,1], we define
B(t) = tB ,
D(t) = V sqrt[1+B(t)'B(t)] ,
N(t) = [D(t)']^{-1} B(t)' U .
Now we show that for every p x q matrix F (e.g. F=B(t)), the matrix 1-
F(1+F'F)^{-1}F' is positive definite: Let v be in R^p. Then v = Fx +w
for some x in R^q and some w in R^p which is orthogonal to the image of
F. Let G denote the positive semidefinite matrix F'F. We have to show that
<v,v> - <F(1+G)^{-1}F'v,v>
is positive if v is nonzero. Since F'w = 0, we obtain
<v,v> - <F(1+G)^{-1}F'v,v> = <w,w> + <Fx,Fx> - <F(1+G)^{-1}F'Fx, Fx> ,
so it suffices to prove that <Gx,x> - <(1+G)^{-1}Gx, Gx> is positive if
x is nonzero. Let x be nonzero, y = (1+G)^{-1}x. Then
0 < <Gy, y> + <GGy, y>
= <G(1+G)y, y>
= <Gx, (1+G)^{-1}x>
= <Gx, (1+G)^{-1}(1+G)x -(1+G)^{-1}Gx>
= <Gx, x> - <Gx, (1+G)^{-1}Gx> ,
as claimed. Thus 1- F(1+F'F)^{-1}F' is indeed positive definite.
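A quick numerical spot check of this fact, assuming numpy; a random F
is of course no substitute for the proof, it merely illustrates the
claim.

    import numpy as np

    rng = np.random.default_rng(0)
    p, q = 3, 2
    F = rng.normal(size=(p, q))           # an arbitrary p x q matrix

    M = np.eye(p) - F @ np.linalg.inv(np.eye(q) + F.T @ F) @ F.T
    print(np.linalg.eigvalsh(M))          # all eigenvalues lie in (0, 1]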
Hence for each t in [0,1] the matrix
1-N(t)'N(t)
= 1-U' B(t) D(t)^{-1} D(t)'^{-1} B(t)' U
= U' [1-B(t)[D(t)'D(t)]^{-1}B(t)'] U
= U' [1-B(t)[1+B(t)'B(t)]^{-1}B(t)'] U
is positive definite. We define
C(t) = N(t) sqrt[1-N(t)'N(t)]^{-1} ,
A(t) = U sqrt[1+C(t)'C(t)] .
Thus
1+C(t)'C(t)
= 1+ sqrt[1-N(t)'N(t)]^{-1} N(t)'N(t) sqrt[1-N(t)'N(t)]^{-1}
= 1+ [1-N(t)'N(t)]^{-1} N(t)'N(t)   (functions of N(t)'N(t) commute)
= [1-N(t)'N(t)]^{-1}
and therefore
D(t)'C(t)
= D(t)' N(t) sqrt[1+C(t)'C(t)]
= B(t)' U sqrt[1+C(t)'C(t)]
= B(t)' A(t) .
So we obtain
-A(t)' A(t) + C(t)' C(t) = -1
-A(t)' B(t) + C(t)' D(t) = 0
-B(t)' B(t) + D(t)' D(t) = 1 .
I.e., the path w given by w(t) = ((A(t), B(t)),(C(t), D(t))) runs
through O(p,q).
We have B(1) = B and D(1) = D. Moreover
C(1) sqrt[1+C(1)'C(1)]^{-1} = D'^{-1}B' U = C sqrt[1+C'C] , hence (by an
easy computation)
C(1)'C(1) = C'C , hence (by inserting this in the previous line)
C(1) = C.
Finally A(1) = U sqrt[1+C(1)'C(1)] = U sqrt[1+C'C] = A.
Obviously B(0) = 0 and D(0) = V and N(0) = 0 and C(0) = 0 and A(0) = U.
Putting everything together, the path w connects ((A,B),(C,D)) to
((U,0),(0,V)) inside O(p,q). This completes the proof of the CLAIM and
the whole argument.
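To make the construction concrete, here is a numerical sketch of the
path w, assuming numpy; spd_sqrt is a small helper of mine for the
matrix square roots, and the sample element of O(1,2) is an arbitrary
boost composed with a rotation.

    import numpy as np

    def spd_sqrt(M):
        """Square root of a symmetric positive definite matrix."""
        vals, Q = np.linalg.eigh(M)
        return Q @ np.diag(np.sqrt(vals)) @ Q.T

    p, q = 1, 2
    J = np.diag([-1.0] * p + [1.0] * q)

    # A sample element of O(1,2): a boost times a rotation in the q block.
    a, th = 0.8, 0.5
    boost = np.array([[np.cosh(a), np.sinh(a), 0.0],
                      [np.sinh(a), np.cosh(a), 0.0],
                      [0.0,        0.0,        1.0]])
    rot = np.array([[1.0, 0.0,         0.0],
                    [0.0, np.cos(th), -np.sin(th)],
                    [0.0, np.sin(th),  np.cos(th)]])
    g = boost @ rot
    A, B = g[:p, :p], g[:p, p:]
    C, D = g[p:, :p], g[p:, p:]

    # Polar decompositions A = U sqrt(1+C'C), D = V sqrt(1+B'B).
    U = A @ np.linalg.inv(spd_sqrt(np.eye(p) + C.T @ C))
    V = D @ np.linalg.inv(spd_sqrt(np.eye(q) + B.T @ B))

    def w(t):
        """The path from the CLAIM: w(1) = g and w(0) = ((U,0),(0,V))."""
        Bt = t * B
        Dt = V @ spd_sqrt(np.eye(q) + Bt.T @ Bt)
        Nt = np.linalg.inv(Dt.T) @ Bt.T @ U
        Ct = Nt @ np.linalg.inv(spd_sqrt(np.eye(p) - Nt.T @ Nt))
        At = U @ spd_sqrt(np.eye(p) + Ct.T @ Ct)
        return np.block([[At, Bt], [Ct, Dt]])

    for t in np.linspace(0.0, 1.0, 6):
        assert np.allclose(w(t).T @ J @ w(t), J)    # w(t) stays in O(p,q)
    assert np.allclose(w(1.0), g)                   # the path ends at g ...
    assert np.allclose(w(0.0), np.block([[U, np.zeros((p, q))],
                                         [np.zeros((q, p)), V]]))
    print("path check passed")                      # ... and starts block diagonal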
-- Marc Nardmann
> In particular, SO(p,q) is connected if the product pq is even,
> and has two connected components otherwise.
This is not true: SO(p,q) always has two connected components if pq>0,
at least with the standard definition of SO(p,q) that I repeat in
another message in this thread. In fact, it is very easy to see that
there are always *at least* two connected components; cf. my other
message.
> The book by
> R. Gilmore,
> Lie groups, Lie algebras, and some of their applications
> Wiley, New York 1974
> contains a lot of material about specific classical groups.
> The above result is stated there (without proof) on p. 199.
I haven't looked it up. Maybe a clash of notation?
-- Marc Nardmann
>Now we show that for every p x q matrix F (e.g. F=B(t)), the matrix 1-
>F(1+F'F)^{-1}F' is positive definite: Let v be in R^p. Then v = Fx +w
>for some x in R^q and some w in R^p which is orthogonal to the image of
>F. Let G denote the positive semidefinite matrix F'F. We have to show that
>
><v,v> - <F(1+G)^{-1}F'v,v>
>
>is positive if v is nonzero. Since F'w = 0, we obtain
>
><v,v> - <F(1+G)^{-1}F'v,v> = <w,w> + <Fx,Fx> - <F(1+G)^{-1}F'Fx, Fx> ,
>
>so it suffices to prove that <Gx,x> - <(1+G)^{-1}Gx, Gx> is positive if
>x is nonzero. Let x be nonzero, y = (1+G)^{-1}x. Then
>
>0 < <Gy, y> + <GGy, y>
> = <G(1+G)y, y>
> = <Gx, (1+G)^{-1}x>
> = <Gx, (1+G)^{-1}(1+G)x -(1+G)^{-1}Gx>
> = <Gx, x> - <Gx, (1+G)^{-1}Gx> ,
>
>as claimed. Thus 1- F(1+F'F)^{-1}F' is indeed positive definite.
>
>
Oops, not quite. What I should have said is this:
> [...] so it suffices to prove that <Gx,x> - <(1+G)^{-1}Gx, Gx> is
> positive if
> Fx is nonzero. Let Fx be nonzero, y = (1+G)^{-1}x. Then Gx is nonzero
> because <Gx,x> = <Fx,Fx> is nonzero. Hence
>
> <Gy,Gx> = <G(1+G)^{-1}x, Gx> = <(1+G)^{-1}Gx,Gx> > 0 ,
>
> so Gy is nonzero. Thus
>
> 0 < <Gy,y> + <Gy,Gy> = <Gy, y> + <GGy, y>
>
> [...]
Near the end of the message, there was a typo:
>C(1) sqrt[1+C(1)'C(1)]^{-1} = D'^{-1}B' U = C sqrt[1+C'C] , hence
>
should be:
>C(1) sqrt[1+C(1)'C(1)]^{-1} = D'^{-1}B' U = C sqrt[1+C'C]^{-1} , hence
>
-- Marc Nardmann