Senseless leading_<stuff> methods introduced on all matrices


Johan S. H. Rosenkilde

Aug 18, 2017, 9:48:08 AM8/18/17
to sage-devel
Compared to Sage 8.0, matrices in Sage 8.1.beta1 now have a host of new
methods:

- leading_coefficient
- leading_item
- leading_monomial
- leading_support
- leading_term

These are inherited from the category of finite dimensional modules with
basis, of which matrix spaces are now members.

The semantics is that the space of, e.g., 3x3 matrices over R is an
R-module with the basis

[1 0 0] [0 1 0] [0 0 1]
[0 0 0], [0 0 0], [0 0 0], ...
[0 0 0] [0 0 0] [0 0 0]

and hence, M.leading_coefficient() on such a matrix returns M[2,2] if
this is non-zero, otherwise M[2,1] if this is non-zero, etc.
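To make that scan order concrete, here is a minimal plain-Python sketch of the semantics (a hypothetical re-implementation, not Sage's actual code; a matrix is modelled as a list of rows):

```python
def leading_item(M):
    # Scan entries in reverse row-major order -- (2,2), (2,1), ..., (0,0)
    # for a 3x3 matrix -- and return the first nonzero one found.
    for i in reversed(range(len(M))):
        for j in reversed(range(len(M[i]))):
            if M[i][j] != 0:
                return (i, j), M[i][j]
    raise ValueError("the zero matrix has no leading item")

def leading_coefficient(M):
    # The leading coefficient is the value at the leading position.
    return leading_item(M)[1]
```

For M = [[1, 2, 0], [0, 3, 0], [0, 0, 0]] this scans M[2][2], M[2][1], ... and stops at M[1][1] = 3.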

While it is arguably too rigid to say that this is "senseless" (as I
wrote in the subject), I believe that the use of these functions for
matrices is very narrow. And since matrices are an extremely central
object that beginners immediately start playing around with, it is
unfortunate that they will pollute the tab-completion to such a degree,
and with doc strings which are not very helpful to the algebraically
uninitiated.

My question here is whether it is really intended that all matrices get
stuck with these (almost) senseless methods?

(this came up during #23619 where we are introducing "leading_matrix"
and "leading_position" for matrices over a univariate polynomial ring.)

Best,
Johan

John Cremona

Aug 18, 2017, 10:01:19 AM8/18/17
to SAGE devel
On 18 August 2017 at 14:48, Johan S. H. Rosenkilde <mail...@atuin.dk> wrote:
> Compared to Sage 8.0, matrices in Sage 8.1.beta1 now have a host of new
> methods:
>
> - leading_coefficient
> - leading_item
> - leading_monomial
> - leading_support
> - leading_term
>
> These are inherited from the category of finite dimensional modules with
> basis of which matrix spaces are now members.
>
> The semantics is that an e.g. 3x3 matrix over R is an R-module over the
> basis
>
> [1 0 0] [0 1 0] [0 0 1]
> [0 0 0], [0 0 0], [0 0 0], ...
> [0 0 0] [0 0 0] [0 0 0]
>
> and hence, M.leading_coefficient() on such a matrix returns M[2,2] if
> this is non-zero, otherwise M[2,1] if this is non-zero, etc.
>
> While it is arguably too rigid to say that this is "senseless" (as I
> wrote in the subject), I believe that the use of these functions for
> matrices is very narrow. And since matrices are an extremely central
> object that beginners immediately start playing around with, it is
> unfortunate that they will pollute the tab-completion to such a degree,
> and with doc strings which are not very helpful to the algebraically
> uninitiated.

I entirely agree. If you had asked me what these methods
('leading_term') etc meant for matrices I would have had absolutely no
idea. But then the same would be true if you had asked me what the
"leading term" was of an element of the free module R^n (for any ring
R). So my recommendation would be to get rid of these at that level.
They may have meaning for some modules which happen to be free
R-modules of finite rank (such as polynomials over R of fixed degree,
where they do of course make sense) but not on __all__ R-modules!
They have been put in the wrong place. [In my opinion!]

>
> My question here is whether it is really intended that all matrices get
> stuck with these (almost) senseless methods?
>
> (this came up during #23619 where we are introducing "leading_matrix"
> and "leading_position" for matrices over a univariate polynomial ring.)
>
> Best,
> Johan
>

Simon King

Aug 18, 2017, 10:05:41 AM8/18/17
to sage-...@googlegroups.com
Hi Johan,

On 2017-08-18, Johan S. H. Rosenkilde <mail...@atuin.dk> wrote:
> My question here is whether it is really intended that all matrices get
> stuck with these (almost) senseless methods?

I didn't introduce them, but I believe it would be quite useful for my
applications (provided it is a *fast* method!). Sketch: I am using
Gröbner basis methods for modules over finite dimensional quotients
of path algebras.
"Gröbner basis methods" means that one has all these notions leading_*,
and "finite dimensional" means that elements can be represented by
matrices. By taking a basis that corresponds to the standard monomials
of a Gröbner basis for the quotient relations, the leading_* of matrices
exactly corresponds to the leading_* notion in the sense of Gröbner
bases.

Best regards,
Simon

Travis Scrimshaw

Aug 19, 2017, 9:59:02 PM8/19/17
to sage-devel, mail...@atuin.dk

Johan S. H. Rosenkilde wrote:

> Compared to Sage 8.0, matrices in Sage 8.1.beta1 now have a host of new
> methods:
>
>   - leading_coefficient
>   - leading_item
>   - leading_monomial
>   - leading_support
>   - leading_term
>
> These are inherited from the category of finite dimensional modules with
> basis of which matrix spaces are now members.

As they should be.

> The semantics is that an e.g. 3x3 matrix over R is an R-module over the
> basis
>
>    [1 0 0]  [0 1 0]  [0 0 1]
>    [0 0 0], [0 0 0], [0 0 0], ...
>    [0 0 0]  [0 0 0]  [0 0 0]
>
> and hence, M.leading_coefficient() on such a matrix returns M[2,2] if
> this is non-zero, otherwise M[2,1] if this is non-zero, etc.
>
> While it is arguably too rigid to say that this is "senseless" (as I
> wrote in the subject),

You already did that, and because you started off calling them "senseless," you have polluted this issue with your heavily loaded question. That is unfair and demeaning.
 
> I believe that the use of these functions for
> matrices is very narrow.

That is untrue. They are very useful to create any sort of submodule/algebra of the matrix module/algebra to be consistent with all other modules or algebras with a distinguished basis. There is also a natural order on the basis too, so things like triangularity of module morphisms need such methods. So perhaps the field of representation theory and triangularity of morphisms is "very narrow."
 
> And since matrices are an extremely central
> object that beginners immediately start playing around with, it is
> unfortunate that they will pollute the tab-completion to such a degree,
> and with doc strings which are not very helpful to the algebraically
> uninitiated.

Let's get rid of hamming_weight for polynomials since polynomials are also extremely central objects that beginners start playing around with. Of course, I am not actually proposing this, but this is in the same spirit.

> My question here is whether it is really intended that all matrices get
> stuck with these (almost) senseless methods?

Yes, and because they are there, I can do things like easily construct an inverse of a module morphism. (Although there is an issue with the fact that the matrices are not immutable that we have to work around.)

> (this came up during #23619 where we are introducing "leading_matrix"
> and "leading_position" for matrices over a univariate polynomial ring.)

I think you should be more explicit with your method names and call the first one leading_term_matrix(); as for the second one, I have no idea what it would do.


John Cremona wrote:

> I entirely agree.  If you had asked me what these methods
> ('leading_term') etc meant for matrices I would have had absolutely no
> idea.  But then the same would be true if you had asked me what the
> "leading term" was of an element of the free module R^n (for any ring
> R).  So my recommendation would be to get rid of these at that level.
> They may have meaning for some modules which happen to be free
> R-modules of finite rank (such as polynomials over R of fixed degree,
> where they do of course make sense) but not on __all__ R-modules!
> They have been put in the wrong place.  [In my opinion!]
 
Last time I checked, R^n would be a (finite-rank) free R-module. So I don't follow your argument. In fact, these methods make sense for any free R-module where there is a (partial) order on the basis. The category ModulesWithBasis means objects that are free R-modules. Yet we don't have a good way to check when the basis is indexed by something with a fixed partial order, so we just assume that (which is almost always true). So I believe they are in the correct place; moreover, I don't see any other reasonable place to put these (very useful) methods.

Best,
Travis

Johan S. H. Rosenkilde

Aug 20, 2017, 4:41:42 AM8/20/17
to sage-devel

Travis Scrimshaw writes:
>> While it is arguably too rigid to say that this is "senseless" (as I
>> wrote in the subject),
>
> You already did that, and because you started off calling them "senseless,"
> you have polluted this issue with your heavily loaded question. That is
> unfair and demeaning.

Yes, I'm guilty, I gave the thread a provocative name to make people
read it...


>> I believe that the use of these functions for
>> matrices is very narrow.
>
> That is untrue. They are very useful to create any sort of
> submodule/algebra of the matrix module/algebra to be consistent with all
> other modules or algebras with a distinguished basis. There is also a
> natural order on the basis too, so things like triangularity of module
> morphisms need such methods. So perhaps the field of representation theory
> and triangularity of morphisms is "very narrow."

Good point, as was Simon's concrete example.

> Let's get rid of hamming_weight for polynomials since polynomials are also
> extremely central objects that beginners start playing around with. Of
> course, I am not actually proposing this, but this is in the same spirit.

I disagree that this is in the same spirit: the Hamming weight/sparsity
of a polynomial is a readily understandable concept, pertaining to
polynomials directly, which anyone "playing around" with polynomials
will recognise. These generic finite-rank module methods, by contrast,
seem useful not for matrices themselves, but primarily as glue in the
much more abstract context you described above.


>> (this came up during #23619 where we are introducing "leading_matrix"
>> and "leading_position" for matrices over a univariate polynomial ring.)
>>
> I think you should be more explicit with your method names and call the
> first one leading_term_matrix() and the second one I have no idea why it
> would do.

I would argue that the burden of explicit method names should be *much
higher* on abstract, inherited methods! While I can see the argument for
preferring "leading_term_matrix" in favour of "leading_matrix", it is
important to keep in mind that this method will only appear for matrices
over univariate polynomials, and the user therefore expects its methods
to be named and behave accordingly.

Compare this with "leading_coefficient" on a matrix; most users would
have no idea what that is before they read the doc, and even then -
since the doc is written with totally different examples in mind - they
have to call that method and perhaps "leading_position" several times
before they can figure it out.


I won't argue further against whether the methods should be there
or not. If several mathematicians feel they are really useful, I'm sure
you're right. But then I would argue that they should be renamed into
something much more precise, or perhaps hidden behind an indirection.
Perhaps something like:

sage: M.as_finite_rank_module.leading_<stuff>?

(except that this sounds like the matrix becomes a finite rank module
which is nonsense.)

Best,
Johan

Vincent Delecroix

Aug 20, 2017, 6:49:06 AM8/20/17
to sage-...@googlegroups.com
If the basis of a "Finite dimensional module with basis" is always
assumed to be ordered, then such methods make sense. However, the
terminology is quite strange. I see 1+1/2 ambiguities for matrices over
a polynomial ring such as Mat(ZZ[X], 3).

1) leading_coefficient might be a termwise application of
leading_coefficient:

   [1    X^2+X+1]
   [X+3  2*X-3  ]

would result in

   [1 1]
   [1 2]

2) There is additional trouble if Mat(ZZ[X], 3) is intended to be
equivalent to the polynomial ring over matrices, Mat(ZZ, 3)[X]. Such a
matrix can naturally be written as

   M0 + M1 X + M2 X^2 + ... + Md X^d

where M0, M1, ..., Md are matrices with coefficients in ZZ. With the
above writing, the leading coefficient is Md.
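The two readings can be contrasted with a small plain-Python sketch (hypothetical helper names, not Sage code; a polynomial is modelled as a coefficient list, so [3, 1] stands for X + 3, and entries are assumed nonzero):

```python
def deg(p):
    # Degree of a coefficient list [c0, c1, ...]; -1 for the zero polynomial.
    for k in reversed(range(len(p))):
        if p[k] != 0:
            return k
    return -1

def entrywise_leading(M):
    # Reading (1): apply leading_coefficient to each entry separately.
    return [[p[deg(p)] for p in row] for row in M]

def polynomial_leading(M):
    # Reading (2): view M as M0 + M1 X + ... + Md X^d with integer
    # matrices Mk, and return the top coefficient matrix Md.
    d = max(deg(p) for row in M for p in row)
    return [[p[d] if d < len(p) else 0 for p in row] for row in M]
```

For the matrix [1, X^2+X+1; X+3, 2*X-3] above, reading (1) gives [1, 1; 1, 2] while reading (2) gives [0, 1; 0, 0].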

Vincent

Johan S. H. Rosenkilde

Aug 20, 2017, 12:47:39 PM8/20/17
to sage-...@googlegroups.com

Vincent Delecroix writes:

> If the basis of a "Finite dimensional module with basis" is always assumed to be
> ordered, then such method make sense. However, the terminology is quite strange.
> I see 1+1/2 ambiguities for matrices over polynomial ring such as Mat(ZZ[X], 3).
>
> 1) leading_coefficient might be a termwise application of leading_coefficient
>
> [1 X^2+X+1]
> [X+3 2*X-3 ]
>
> would result in
>
> [1 1]
> [1 2]
>
> 2) There is an additional trouble if Mat(ZZ[X], 3) is intended to be equivalent
> to polynomial ring over matrices Mat(ZZ, 3)[X]. Such matrix can naturally be
> written as
>
> M0 + M1 X + M2 X^2 + ... + Md X^d
>
> where M0, M1, ..., Md are matrices with coefficients in ZZ. With the above
> writing, the leading coefficient is Md.

Indeed. And then, as a third option, the one we are introducing in
#23619 as leading_matrix, where we take, for each row v, the
coefficients of the terms of degree deg(v):

                    [1    X^2+X+1]     [0 1]
    leading_matrix( [X+3  2*X-3  ] ) = [1 2]
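A rough plain-Python sketch of this row-wise convention (an illustration under a coefficient-list encoding of polynomials, not the actual #23619 code; rows are assumed nonzero):

```python
def deg(p):
    # Degree of a coefficient list [c0, c1, ...]; -1 for the zero polynomial.
    for k in reversed(range(len(p))):
        if p[k] != 0:
            return k
    return -1

def leading_matrix(M):
    # For each row, keep the coefficients of the row's top degree d:
    # entry (i, j) is the coefficient of X^d in M[i][j].
    out = []
    for row in M:
        d = max(deg(p) for p in row)
        out.append([p[d] if d < len(p) else 0 for p in row])
    return out
```

Row 1 of the example has top degree 2 (only X^2+X+1 reaches it), while row 2 has top degree 1, contributing the X-coefficients 1 and 2.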

Best,
Johan

John Cremona

Aug 20, 2017, 1:02:32 PM8/20/17
to SAGE devel
Don't get too carried away with thinking up ways in which users might
conceivably get confused! The rings "matrices over polynomials over
F" and "polynomials over matrices over F" are isomorphic but not the
same.

More to the point, a free module over a ring R will know what its base
ring and rank are, in computer science terms, even if in mathematical
terms it might have the structure of a free module over different
rings. (For example every QQ-vector space is a free module over QQ,
and also over ZZ.)

My criticism was robustly countered. I'll go back to just saying that
if you had asked me what the "leading monomial" of an element of R^n
is, I might have guessed it (probably not, or wrongly), but I have
never heard of this expression in this context. And I have been
around for a while, teaching graduate courses in algebra etc.

John

Dima Pasechnik

Aug 20, 2017, 4:49:44 PM8/20/17
to sage-devel
Hmm, D.Eisenbud, "Commutative algebra with ...", Chapter 15, talks about this stuff at length.
(Of course this book is too thick for a UK graduate course ;-))

Travis Scrimshaw

Aug 21, 2017, 11:31:02 AM8/21/17
to sage-devel


On Sunday, August 20, 2017 at 12:02:32 PM UTC-5, John Cremona wrote:

> Don't get too carried away with thinking up ways in which users might
> conceivably get confused!   The rings "matrices over polynomials over
> F" and "polynomials over matrices over F" are isomorphic but not the
> same.

I think a really good example of this is the four isomorphic-but-not-identical rings:

ZZ['x']['y']
ZZ['y']['x']
ZZ['x,y']
ZZ['y,x']

Each will behave in slightly different ways with respect to the usual polynomial methods.

Best,
Travis
