Index order in xCoba


Salva Mengual Sendra

Feb 8, 2023, 7:56:06 AM
to xAct Tensor Computer Algebra
Hello everyone,

I have what I think should be some very basic questions, but I honestly can't figure them out:

Consider that a metric and a chart are defined on a given manifold, and then you define two 2-tensors A and B. In the attached notebook I use the Schwarzschild metric and two random 2-tensors as an example (a rough sketch of the setup is included after the questions). The questions are:

1.- Why does the contraction A[-a,-i] B[i,b] give something of the form CTensor[...][b,-a] instead of CTensor[...][-a,b]? (as in [15] in the notebook)

2.- Why doesn't HeadOfTensor[ A[-a,-i] B[i,b] ,{-a,b}][b,-a] give the same result as before? (see [16] in the notebook)

3.- Why do I need to write HeadOfTensor[ A[-a,-i] B[i,b] ,{-a,b}][-a,b] to get what I expected to obtain from  A[-a,-i] B[i,b]? (see [17])

Something similar happens with more complicated contractions of tensors with more indices, and I don't know what is going on.
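
For concreteness, the setup is roughly the following (all names are placeholders, and I leave out the actual Schwarzschild metric since the index-order behaviour does not seem to depend on it):

    << xAct`xCoba`

    (* Manifold, chart, and two 2-tensors given directly as CTensor objects;
       the entries aa[m, n] and bb[m, n] are just symbolic placeholders *)
    DefManifold[M4, 4, {a, b, c, d, i}];
    DefChart[ch, M4, {0, 1, 2, 3}, {t[], r[], θ[], φ[]}];

    A = CTensor[Array[aa, {4, 4}], {-ch, -ch}];
    B = CTensor[Array[bb, {4, 4}], {ch, ch}];

    (* The contraction of question 1: it comes back as a single CTensor,
       but applied to the indices in the order [b, -a] *)
    A[-a, -i] B[i, b]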

Thank you very much in advance for your help.

Cheers,
Salva
Attachment: Example-Indices.nb

Jose

Feb 8, 2023, 8:14:14 PM
to xAct Tensor Computer Algebra
Hi,

Answers inlined.

On Wednesday, February 8, 2023 at 6:56:06 AM UTC-6 salvam...@hotmail.com wrote:
Hello everyone,

I have what I think should be some very basic questions, but I honestly can't figure them out:

Consider that a metric and a chart are defined on a given manifold, and then you define two 2-tensors A and B. In the attached notebook I use the Schwarzschild metric and two random 2-tensors as an example. The questions are:

1.- Why does the contraction A[-a,-i] B[i,b] give something of the form CTensor[...][b,-a] instead of CTensor[...][-a,b]? (as in [15] in the notebook)

The order of indices is irrelevant in the following sense: CTensor[{{1, 2}, {3, 4}}, {B, B}][b, a] is the same tensor as CTensor[{{1, 3}, {2, 4}}, {B, B}][a, b]. There is nothing wrong with choosing one or the other, and xCoba chooses one more or less arbitrarily, without worrying about order. If you then want a canonical order, use the function ToCCanonical, which is a bit like ToCanonical in rearranging things into a canonical form, but for CTensor objects. ToCanonical and ToCCanonical do have a preferred order of indices, but most of the rest of the system does not: keeping track of index order takes time.
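
As a quick check of that equivalence (here B is just some basis you have already defined, e.g. a chart):

    t1 = CTensor[{{1, 2}, {3, 4}}, {B, B}][b, a];
    t2 = CTensor[{{1, 3}, {2, 4}}, {B, B}][a, b];

    (* Both should canonicalize to the same expression, with the indices
       sorted as {a, b} and the first array transposed accordingly *)
    ToCCanonical[t1]
    ToCCanonical[t2]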
 

2.- Why doesn't HeadOfTensor[ A[-a,-i] B[i,b] ,{-a,b}][b,-a] give the same result as before? (see [16] in the notebook)

Because HeadOfTensor[ctensor, {a, b}][b, a] is effectively a transposition of ctensor. You are indicating that the matrix of components should be extracted from ctensor with index configuration {a, b}, but then you are reconstructing a different tensor from that array with indices {b, a}. In other words, for a given matrix M, the tensors CTensor[M, {B, B}][a, b] and CTensor[M, {B, B}][b, a] are transposes of each other. Think of the operation T[a, b] - T[b, a] for the same T: don't we expect to get twice the antisymmetric part of T? This interpretation is the same for abstract tensors (defined with DefTensor) and component tensors (i.e. CTensor objects).
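
For instance, with a generic CTensor T (again, B stands for whatever basis you are using):

    T = CTensor[{{1, 2}, {3, 4}}, {B, B}];

    (* Extracting the array with index order {a, b} and reapplying the
       indices as {b, a} rebuilds the transposed tensor *)
    HeadOfTensor[T[a, b], {a, b}][b, a]

    (* and T[a, b] - T[b, a] should reduce to twice the antisymmetric part,
       i.e. the array {{0, -1}, {1, 0}} on the indices [a, b] *)
    ToCCanonical[T[a, b] - T[b, a]]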
 

3.- Why do I need to write HeadOfTensor[ A[-a,-i] B[i,b] ,{-a,b}][-a,b] to get what I expected to obtain from  A[-a,-i] B[i,b]? (see [17])

I think this is the same question as 2., and my answer is the same.

In practical terms, when you perform a component computation, you need to decide on the order of indices in which you want your final result. It is standard to work with a sorted collection of indices, so that the result CTensor[array, bases][a, b, c, ...] contains the array with no extra transpositions. This is what ToCCanonical will achieve at the end of an xCoba computation, again a bit like what ToCanonical does at the end of an xTensor computation. But you could have decided to use any other order, and then, to get the correct array at the end, you would have to extract it with HeadOfTensor[..., indices], specifying what your non-canonical order of indices is.
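
In your example that would look roughly like this (A and B are now the two tensors from your notebook, and the canonical order is presumably {-a, b}):

    res = A[-a, -i] B[i, b];   (* comes back with some index order, e.g. [b, -a] *)

    (* canonical (sorted) order: *)
    ToCCanonical[res]

    (* or extract the head with the array stored in a non-canonical order
       of your choice, e.g. {b, -a}: *)
    HeadOfTensor[res, {b, -a}]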

Hope this clarifies things.

Cheers,
Jose.