In some treatments of tensors, they're described as linear maps. So, in GR, if we have a linear map described as a 4x4 matrix of real numbers, which operates on a 4-vector described as a column matrix with entries (ct, x, y, z), which transforms to another 4-vector, what must be added in this description to claim that the linear transformation satisfies the definition of a tensor? TY, AG
Vectors are used to represent things like motion, force, flow. You think of them as little arrows that show the direction of the motion, force or flow, and the length of the arrow tells you the magnitude of the motion, force or flow. So vectors have components, one for each dimension of the space they are in. In the plane, which is two dimensional, vectors have two components. But the values of the components change depending on which coordinate system you choose to represent them. The coordinates are something we impose to facilitate our calculations. But the vector or tensor is a THING that is independent of the coordinates we use to describe it. Just as I could give directions from this building to the upper parking lot by saying it's 250 ft that way, or I could say it's 200 ft north and 150 ft west. The distance and direction would be the same, only the description is different.
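The two descriptions of that displacement agree numerically, since by Pythagoras

\[
  \sqrt{(200\,\text{ft})^{2} + (150\,\text{ft})^{2}} = \sqrt{62500}\,\text{ft} = 250\,\text{ft}.
\]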
Tensors are just another step up from vectors. They describe how vectors are changing. Here's a good example of a vector field. It's called a field because there's a vector at each point. It shows the flow of water out of Monterey Bay at a particular moment.

For any particular flow field we can ask how the vectors around a particular bit of water change as that bit of water is carried along by the flow. The answer is a tensor. Just as a vector is the abstraction of a little arrow, the tensor can be thought of as an abstraction of a little circle. And then the question becomes how this circle gets distorted as the water flows.

If the water is moving uniformly then the circle doesn't get distorted. This is like the no-curvature tensor. But the flow in Monterey Bay is not uniform, so the circle/tensor gets distorted. For flow like this in two dimensions the tensor is just an ellipse. It has two directions, corresponding to the axes of the ellipse, and it has a size, corresponding to the strength of the flow. So it only takes three numbers to describe it: the Txy component is the same as the Tyx component. But this is a tensor FIELD, so there's a different tensor at each point. It takes three numbers at each point.

If we choose some particular coordinate system then the components have interpretations like, "How much does the x flow speed change as you move in the y direction."
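A minimal numerical sketch of that picture (the velocity field and point below are invented for illustration, not taken from the Monterey Bay data): form the symmetric part of the velocity gradient at a point, and its eigenvectors and eigenvalues give the axes of the distorted circle and how strongly it is stretched along each axis.

import numpy as np

# Invented 2-D velocity field u(x, y) = (u_x, u_y); any smooth field works here.
def velocity(x, y):
    return np.array([0.3 * x + 0.8 * y,   # u_x
                     0.1 * x - 0.3 * y])  # u_y

def strain_rate(x, y, h=1e-6):
    """Symmetric part of the velocity gradient, estimated by finite differences."""
    du_dx = (velocity(x + h, y) - velocity(x - h, y)) / (2 * h)
    du_dy = (velocity(x, y + h) - velocity(x, y - h)) / (2 * h)
    grad = np.column_stack([du_dx, du_dy])   # grad[i, j] = d u_i / d x_j
    return 0.5 * (grad + grad.T)             # symmetric: T_xy == T_yx

T = strain_rate(1.0, 2.0)
print(T)                        # three independent numbers at this point: T_xx, T_yy, T_xy

vals, vecs = np.linalg.eigh(T)  # eigenvectors: ellipse axes; eigenvalues: stretch/squeeze strength
print(vals, vecs)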
On 11/9/2025 1:11 AM, Alan Grayson wrote:
On Saturday, November 8, 2025 at 6:25:17 AM UTC-7 Alan Grayson wrote:
In some treatments of tensors, they're described as linear maps. So, in GR, if we have a linear map described as a 4x4 matrix of real numbers, which operates on a 4-vector described as a column matrix with entries (ct, x, y, z), which transforms to another 4-vector, what must be added in this description to claim that the linear transformation satisfies the definition of a tensor? TY, AG
Let's call the linear transformation T; then the answer to my question might be that T is a tensor iff it has a continuous inverse. I'm not sure if this is correct, but I seem to recall this claim in a video about tensors I viewed in another life. But even if it's true, it seems to conflict with the claim that an ordinary vector in Euclidean space is a tensor because it's invariant under linear (?) transformations. In this formulation it is the argument of T, which we can call V, that is invariant, not the map T. I'd appreciate it if someone here could clarify my confusion. TY, AG

A tensor is a geometric object (possibly in an abstract space). It transforms covariantly, which means that changes in coordinates (even non-linear changes in coordinates) leave it the same. A physical vector, like velocity, is a one-dimensional tensor. The same vector, or tensor, has different representations in different coordinate systems; the excerpt from my general relativity lectures above illustrates this.
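In symbols, the missing ingredient in the original question is a transformation rule tying the 4x4 array to the coordinates. One standard way to state it (the Lambda notation is the usual textbook convention, not something from this thread): under a change of coordinates x'^mu = Lambda^mu_nu x^nu (in general Lambda^mu_nu = dx'^mu/dx^nu), the components must obey

\[
  V'^{\mu} = \Lambda^{\mu}{}_{\nu}\, V^{\nu},
  \qquad
  T'^{\mu}{}_{\nu} = \Lambda^{\mu}{}_{\alpha}\,(\Lambda^{-1})^{\beta}{}_{\nu}\, T^{\alpha}{}_{\beta},
\]

so that W^mu = T^mu_nu V^nu comes out as the same geometric 4-vector in every frame. A bare 4x4 array of numbers with no such rule attached is not yet a tensor.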
On Sun, Nov 09, 2025 at 01:11:47AM -0800, Alan Grayson wrote:
> Let's call the linear transformation T; then the answer to my question might
> be that T is a tensor iff it has a continuous inverse. [...]
It's got nothing to do with being invertible (which is the conjunction of being 1:1 and onto).

Rather, a tensor is a multilinear map: a map with multiple arguments that is linear in each argument separately. A standard linear map R^n -> R^n is therefore a rank 2 tensor; we generally recognise these as matrices. Vectors correspond to linear maps by transposing them and forming the inner product, i.e. a linear map from R^n -> R, and so are rank 1 tensors.
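As a concrete sketch of "multilinear map" (the matrix M below is arbitrary, chosen only for illustration): a matrix defines a bilinear map (u, v) -> u^T M v, and one can check numerically that it is linear in each argument separately.

import numpy as np

rng = np.random.default_rng(0)
M = rng.normal(size=(4, 4))          # an arbitrary 4x4 array of numbers

def T(u, v):
    """Bilinear map defined by M: linear in u and in v separately."""
    return u @ M @ v

u, v, w = rng.normal(size=(3, 4))
a, b = 2.0, -3.0

# Linearity in the first argument ...
assert np.isclose(T(a * u + b * w, v), a * T(u, v) + b * T(w, v))
# ... and in the second argument.
assert np.isclose(T(u, a * v + b * w), a * T(u, v) + b * T(u, w))

# The same M, used as a linear map R^4 -> R^4, just acts on column vectors:
print(M @ v)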
On Sun, Nov 09, 2025 at 03:56:16PM -0800, Alan Grayson wrote:
>
> If it's a map, how can an ordinary vector in Euclidean space be a tensor?
> Such vectors are NOT maps! See my problem? AG
I did explain that in my post if you read it. In an inner product space, every vector is isomorphic to a linear map from the space to its field, e.g. R^n -> R in the case of the space R^n. That linear map is the rank 1 tensor. In mathematics, something that walks and quacks like a duck is a duck.

Even the inner product operation is an example of a bilinear map, hence a rank 2 tensor. In Minkowski spacetime, the inner product is known as the metric tensor (the Minkowski metric).
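A small sketch of that isomorphism (using the ordinary Euclidean dot product, purely for illustration): each vector v gives rise to the linear map w -> <v, w>, and that map is linear.

import numpy as np

def as_functional(v):
    """The rank-1 tensor associated with v: the linear map w -> <v, w>."""
    return lambda w: np.dot(v, w)

v = np.array([1.0, 2.0, 3.0])
f = as_functional(v)          # f is a map R^3 -> R, not a list of numbers

w1 = np.array([0.5, -1.0, 2.0])
w2 = np.array([4.0, 0.0, -1.0])
a, b = 3.0, -2.0

# f is linear: f(a*w1 + b*w2) == a*f(w1) + b*f(w2)
assert np.isclose(f(a * w1 + b * w2), a * f(w1) + b * f(w2))

# The inner product itself, (u, w) -> u . w, is bilinear, hence a rank-2 tensor.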
On Sunday, November 9, 2025 at 5:12:54 PM UTC-7 Russell Standish wrote:
> Even the inner product operation is an example of a bilinear map,
> hence a rank 2 tensor.

So a tensor is nothing more than a multilinear map to the reals? But if we represent a tensor by a matrix, will it be automatically invariant under coordinate transformations? Do we need an inner product space to define a tensor? TY, AG
On Sunday, November 9, 2025 at 2:07:30 PM UTC-7 Brent Meeker wrote:
> A tensor is a geometric object (possibly in an abstract space). It transforms
> covariantly, which means that changes in coordinates (even non-linear changes
> in coordinates) leave it the same.

Round and round we go, but what a tensor is remains elusive! Please define the property that allows it to transform covariantly. Is it a map represented by a matrix, and if so, what property must it have that allows it to transform covariantly? AG
On Sunday, November 9, 2025 at 6:16:15 PM UTC-7 Alan Grayson wrote:
> So a tensor is nothing more than a multilinear map to the reals? But if we
> represent a tensor by a matrix, will it be automatically invariant under
> coordinate transformations? Do we need an inner product space to define a
> tensor? TY, AG

If the tensor, represented by a matrix, is "unchanged" under a coordinate transformation, does this mean its determinant is unchanged? AG
On 11/9/2025 5:24 PM, Alan Grayson wrote:
> If the tensor, represented by a matrix, is "unchanged" under a coordinate
> transformation, does this mean its determinant is unchanged? AG

No, in general it transforms like a density, so it's only unchanged if the determinant of the transformation matrix is 1.

Brent
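A quick numerical check of that point, with randomly chosen matrices (the distinction between tensor types is an added detail, not spelled out in the thread): a (1,1) tensor transforms by a similarity and its determinant is invariant, while a (0,2) tensor such as a metric transforms as J^T g J and its determinant picks up a factor det(J)^2, which is the density-like behaviour.

import numpy as np

rng = np.random.default_rng(1)
J = rng.normal(size=(4, 4))              # an arbitrary (almost surely invertible) transformation matrix
T = rng.normal(size=(4, 4))              # components of a (1,1) tensor in the old coordinates
g = rng.normal(size=(4, 4)); g = g + g.T # components of a symmetric (0,2) tensor (metric-like)

T_new = J @ T @ np.linalg.inv(J)         # (1,1) tensor: similarity transform
g_new = J.T @ g @ J                      # (0,2) tensor: both slots transform covariantly

assert np.isclose(np.linalg.det(T_new), np.linalg.det(T))                          # invariant
assert np.isclose(np.linalg.det(g_new), np.linalg.det(J) ** 2 * np.linalg.det(g))  # density-like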
On Sun, Nov 09, 2025 at 09:55:15PM -0800, Alan Grayson wrote:
>
>
> Someday I might find a teacher who can really define tensors, but that day has
> yet to arrive. Standish seems to come close, but does every linear multivariate
> function define a tensor? I'm waiting to see his reply. AG
Well I did say multilinear function, but the answer is yes, every
multilinear function on a vector space is a tensor, and vice-versa.
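In component form this is standard textbook bookkeeping, sketched here for the bilinear case: pick a basis, define the components by feeding the map basis vectors, and the transformation rule follows from linearity.

For a bilinear map f : V x V -> R, define components in a basis {e_i} by

\[
  f_{ij} = f(e_i, e_j), \qquad f(u, v) = f_{ij}\, u^{i} v^{j}.
\]

If the basis is changed to e'_i = A^j_i e_j, bilinearity forces

\[
  f'_{ij} = f(e'_i, e'_j) = A^{k}{}_{i}\, A^{l}{}_{j}\, f_{kl},
\]

i.e. the components transform covariantly while the map f itself never changes.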
On Sun, Nov 09, 2025 at 10:34:38PM -0800, Alan Grayson wrote:
> How does one prove that every multilinear function on a vector space is
> invariant under a change in coordinates? What exactly happens to its matrix
> representation? And

A vector is a geometric quantity having direction and length. As such it is independent of any coordinate system that might be applied to the space (although the list of numbers representing the components of the vector in a given coordinate system must vary covariantly as the coordinate system varies). A function operating on vectors, and returning vectors or scalars, must therefore also be independent of the coordinate system.
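A numerical sanity check of that claim (random matrices stand in for an arbitrary bilinear map and an arbitrary change of basis): transform the vector components and the map's components consistently, and the value of the map comes out the same.

import numpy as np

rng = np.random.default_rng(2)
n = 4
f = rng.normal(size=(n, n))   # components f_ij of a bilinear map in the old basis
u = rng.normal(size=n)        # components of two vectors in the old basis
v = rng.normal(size=n)
A = rng.normal(size=(n, n))   # change of basis: columns of A are the new basis vectors in old components

# Vector components transform contravariantly (with A^{-1}),
# the bilinear map's components transform covariantly (with A on each slot).
A_inv = np.linalg.inv(A)
u_new = A_inv @ u
v_new = A_inv @ v
f_new = A.T @ f @ A

value_old = u @ f @ v
value_new = u_new @ f_new @ v_new
assert np.isclose(value_old, value_new)   # same number: the map itself didn't change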
On Mon, Nov 10, 2025 at 12:19:49AM -0800, Alan Grayson wrote:
>
> It's an ALGEBRAIC object, NOT a geometric object, defined on a vector space, as
> a multilinear function which maps to a real number.
Calling it an _algebraic_ object is exactly what obfuscates what
tensors are all about. Tensors are not a collection of numbers, just
as vectors are not collections of numbers. Vectors are geometric
objects, as are tensors.
> I did write an 8 page article appearing in our student rag "The
> Occasional Quark" when I was a physics student, which was my attempt
> at explaining General Relativity when I was disgusted by the hash job
> done by our professor. I haven't really thought about it much since
> that time, though. I can also recommend the heavy tome by Misner,
> Thorne and Wheeler.
>
> I could scan the article and post it to this list, but not today - I
> have a few other things on my plate before finishing up.