Commenting on the Nemo 0.4 release announcement


Jakob Kroeker

Dec 11, 2015, 6:22:04 PM
to sage-flame


If we can trust the report at danluu.com about the Julia language (I didn't check),
there is likely little reason for Sage developers to fear competition from Nemo (just in case) in the near future:
Julia language development seems to have serious quality issues.

On the other hand, it is sad that another computer algebra package (Nemo),
potentially reusable in other CASes, might struggle with software development issues...

Bill Hart

Dec 11, 2015, 7:58:27 PM
to sage-flame
I don't think I'd pay too much attention to a report written by someone with a potential conflict of interest.

Besides, that report was written a long time ago. For example, Julia has since started using continuous integration and taken numerous other measures, such as introducing a code of conduct.

But as I said, that was a long time ago. If Julia is good enough for the Reserve Bank of New York, I'm pretty sure that old report is pretty irrelevant.

You might be interested to know that Julia just got a modest sized grant to bring the language to version 1.0 within two years. Probably circulating the one old negative report about Julia instead of any of the positive ones isn't going to change their bank balance.

Bill.

--
You received this message because you are subscribed to the Google Groups "sage-flame" group.
To unsubscribe from this group and stop receiving emails from it, send an email to sage-flame+...@googlegroups.com.
To post to this group, send email to sage-...@googlegroups.com.
Visit this group at http://groups.google.com/group/sage-flame.
For more options, visit https://groups.google.com/d/optout.

Richard Fateman

Dec 11, 2015, 8:13:11 PM
to sage-...@googlegroups.com
On 12/11/2015 3:22 PM, Jakob Kroeker wrote:


If we can trust the report at danluu.com about the julia language (I didn't check),
there is likely little reason for sage developers to fear competition with Nemo (just in case) in the near future:
the julia language development seems to have serious quality issues.
Looking at the description of Nemo here
http://nemocas.org/

I would not call it a computer algebra system (CAS) as usually used to denote
symbolic and algebraic manipulation programs like Maxima, Mathematica, Maple.

As a computer algebra package it makes use of
Flint, Antic, Pari, GMP/MPIR, MPFR, Singular and Arb.
all of which are presumably in Sage already.


On the other side it is sad, that another computer algebra package (Nemo),
 potentially reusable in other CAS might struggle on software development issues...

Frankly, any application package that asserts in the first line of its description
that it is "for the X programming language"  is likely to be written to appeal to
programmers using X.   If Sage appeals to pythonistas and Nemo to
julianistas, there is no conflict.

I assume that the people doing Nemo have some reason to do it,
otherwise endlessly rewriting the basic  routines used for  (say) polynomial arithmetic
does not advance the understanding of such software.  Maybe the speed is
improved and this makes people happy.  Sometimes that makes me happy, not
that I am so impatient to get results out of my computer programs.

Bill Hart

Dec 11, 2015, 8:25:49 PM
to sage-flame
So now sage-flame is used for flaming Nemo? Why the sudden interest? Bored with life? Can't sleep?

On 12 December 2015 at 02:13, Richard Fateman <fat...@berkeley.edu> wrote:
On 12/11/2015 3:22 PM, Jakob Kroeker wrote:


If we can trust the report at danluu.com about the julia language (I didn't check),
there is likely little reason for sage developers to fear competition with Nemo (just in case) in the near future:
the julia language development seems to have serious quality issues.
Looking at the description of Nemo here
http://nemocas.org/

I would not call it a computer algebra system (CAS) as usually used to denote
symbolic and algebraic manipulation programs like Maxima, Mathematica, Maple.

Correct. It is not a computer algebra system. Which is why we call it a "computer algebra package" with the aim of eventually covering commutative algebra (also often called computer algebra), number theory and group theory (most of which it does not cover yet, because you know, time and all). 

As a computer algebra package it makes use of
Flint, Antic, Pari, GMP/MPIR, MPFR, Singular and Arb.
all of which are presumably in Sage already.

And yet Singular is for computer algebra, Pari for number theory, etc.

Damned if you do, damned if you don't.
 


On the other side it is sad, that another computer algebra package (Nemo),
 potentially reusable in other CAS might struggle on software development issues...

Frankly, any application package that asserts in the first line of its description
that it is "for the X programming language"  is likely to be written to appeal to
programmers using X.

Really.
 
   If Sage appeals to pythonistas and Nemo to
julianistas, there is no conflict.

I assume that the people doing Nemo have some reason to do it,
otherwise endlessly rewriting the basic  routines used for  (say) polynomial arithmetic
does not advance the understanding of such software.  Maybe the speed is
improved and this makes people happy.  Sometimes that makes me happy, not
that I am so impatient to get results out of my computer programs.
      

Well, you might be impatient if the computations took longer than the age of the universe to complete.

It's certainly faster for certain things, even at this very early stage (take a look at our benchmarks page). But there are other reasons for doing it, some of which are even helpfully spelled out in the first few pages of our documentation.

The Julia people have for some time been considering rewriting their compiler in Lisp, by the way (seriously). Perhaps we should refer to it as a "computer algebra package for Julia, based on Lisp?"

Bill.


William Stein

Dec 11, 2015, 8:43:56 PM
to sage-flame
On Fri, Dec 11, 2015 at 5:25 PM, 'Bill Hart' via sage-flame
<sage-...@googlegroups.com> wrote:
> So now sage-flame is used for flaming Nemo? Why the sudden interest? Bored
> with life? Can't sleep?
>
> On 12 December 2015 at 02:13, Richard Fateman <fat...@berkeley.edu> wrote:
>>
>> On 12/11/2015 3:22 PM, Jakob Kroeker wrote:
>>
>>
>>
>> If we can trust the report at danluu.com about the julia language (I
>> didn't check),
>> there is likely little reason for sage developers to fear competition with
>> Nemo (just in case) in the near future:
>> the julia language development seems to have serious quality issues.
>>
>> Looking at the description of Nemo here
>> http://nemocas.org/
>>
>> I would not call it a computer algebra system (CAS) as usually used to
>> denote
>> symbolic and algebraic manipulation programs like Maxima, Mathematica,
>> Maple.
>
>
> Correct. It is not a computer algebra system. Which is why we call it a
> "computer algebra package" with the aim of eventually covering commutative
> algebra (also often called computer algebra), number theory and group theory
> (most of which it does not cover yet, because you know, time and all).

"Computer algebra package" versus "computer algebra system" is not
really much of a distinction.

Everywhere I always try to call Sage "mathematical software". Calling
it "mathematical software" is something I just made up, since I
personally don't like the term "computer algebra" to describe Sage (or
Nemo), as it's misleading.

>>
>>
>> As a computer algebra package it makes use of
>> Flint, Antic, Pari, GMP/MPIR, MPFR, Singular and Arb.
>> all of which are presumably in Sage already.
>
>
> And yet Singular is for computer algebra, Pari for number theory, etc.
>
> Damned if you do, damned if you don't.

Don't -- just use "mathematical software" instead.

Or you could call it "The world's definitive system for modern
technical computing"...



--
William (http://wstein.org)

William Stein

Dec 11, 2015, 8:48:47 PM
to sage-flame
On Fri, Dec 11, 2015 at 3:22 PM, Jakob Kroeker
<google...@spaceship-earth.net> wrote:
>
>
> If we can trust the report at danluu.com about the julia language (I didn't
> check),
> there is likely little reason for sage developers to fear competition with
> Nemo (just in case) in the near future:

Sage developers are on the same team as Julia developers -- there's no
fear here. When Nemo has substantial 10x-better functionality than
Sage in enough ways, it'll just be included in Sage; our frickin'
source download is 500MB after all.

I fear competition with Magma, Mathematica, Matlab, and maybe Maple much more.

For what it's worth, I appreciate you posting that blog post; I read
it carefully, along with some others on the guy's page. He basically
just described how every rapidly developing software project looks
during its early days, and says so very clearly at the end. The best
approach to organizing the development of software depends hugely on
the age of the software, just as the best approach to treating a
child depends on their age, or to running a company on its age
and size (and industry), etc. The only thing that is definitely
wrong is trying to apply a one-size-fits-all approach... to anything.

> the julia language development seems to have serious quality issues.
>
> On the other side it is sad, that another computer algebra package (Nemo),
> potentially reusable in other CAS might struggle on software development
> issues...
>



--
William (http://wstein.org)

Bill Hart

Dec 11, 2015, 10:17:37 PM
to sage-flame
On 12 December 2015 at 02:43, William Stein <wst...@gmail.com> wrote:
On Fri, Dec 11, 2015 at 5:25 PM, 'Bill Hart' via sage-flame
<sage-...@googlegroups.com> wrote:
> So now sage-flame is used for flaming Nemo? Why the sudden interest? Bored
> with life? Can't sleep?
>
> On 12 December 2015 at 02:13, Richard Fateman <fat...@berkeley.edu> wrote:
>>
>> On 12/11/2015 3:22 PM, Jakob Kroeker wrote:
>>
>>
>>
>> If we can trust the report at danluu.com about the julia language (I
>> didn't check),
>> there is likely little reason for sage developers to fear competition with
>> Nemo (just in case) in the near future:
>> the julia language development seems to have serious quality issues.
>>
>> Looking at the description of Nemo here
>> http://nemocas.org/
>>
>> I would not call it a computer algebra system (CAS) as usually used to
>> denote
>> symbolic and algebraic manipulation programs like Maxima, Mathematica,
>> Maple.
>
>
> Correct. It is not a computer algebra system. Which is why we call it a
> "computer algebra package" with the aim of eventually covering commutative
> algebra (also often called computer algebra), number theory and group theory
> (most of which it does not cover yet, because you know, time and all).

"Computer algebra package" versus "computer algebra system" is not
really much of a distinction.

No, we really intend to target computer algebra eventually.

The description is not a mistake but quite deliberate; it's just that people are drawing natural, but ultimately wrong, conclusions. Hopefully what Nemo is really all about will become clearer to people over time. It's certainly not clear just yet, since we actually have very little of anything at the moment.

I'd like to clarify things more now, but actually there are quite a few decisions yet to be made. We can simply say what we currently intend.


Everywhere I always try to call Sage "mathematical software".  Calling
"mathematical software" is something I just made up, since I
personally don't like the term "computer algebra" to describe Sage (or
Nemo), since it's misleading.

Confusing perhaps. I wouldn't say it was misleading. Computer algebra as a term is used in at least two different ways.
 

>>
>>
>> As a computer algebra package it makes use of
>> Flint, Antic, Pari, GMP/MPIR, MPFR, Singular and Arb.
>> all of which are presumably in Sage already.
>
>
> And yet Singular is for computer algebra, Pari for number theory, etc.
>
> Damned if you do, damned if you don't.

Don't -- just use "mathematical software" instead.

There's a very solid tradition where I work. We won't be changing the terminology.
 

Or you could call it "The world's definitive system for modern
technical computing"...

Ha ha.

Bill.
 



--
William (http://wstein.org)

Richard Fateman

Dec 12, 2015, 12:00:28 AM
to sage-...@googlegroups.com
On 12/11/2015 5:43 PM, William Stein wrote:
> On Fri, Dec 11, 2015 at 5:25 PM, 'Bill Hart' via sage-flame
> <sage-...@googlegroups.com> wrote:
>> So now sage-flame is used for flaming Nemo? Why the sudden interest? Bored
>> with life? Can't sleep?
>>
>> On 12 December 2015 at 02:13, Richard Fateman <fat...@berkeley.edu> wrote:
>>> On 12/11/2015 3:22 PM, Jakob Kroeker wrote:
>>>
>>>
>>>
>>> If we can trust the report at danluu.com about the julia language (I
>>> didn't check),
>>> there is likely little reason for sage developers to fear competition with
>>> Nemo (just in case) in the near future:
>>> the julia language development seems to have serious quality issues.
>>>
>>> Looking at the description of Nemo here
>>> http://nemocas.org/
>>>
>>> I would not call it a computer algebra system (CAS) as usually used to
>>> denote
>>> symbolic and algebraic manipulation programs like Maxima, Mathematica,
>>> Maple.
>>
>> Correct. It is not a computer algebra system. Which is why we call it a
>> "computer algebra package" with the aim of eventually covering commutative
>> algebra (also often called computer algebra), number theory and group theory
>> (most of which it does not cover yet, because you know, time and all).
> "Computer algebra package" versus "computer algebra system" is not
> really much of a distinction.
I agree with Bill Hart, disagree with William Stein.
You see, there is a history to this. The Association for Computing
Machinery (ACM), the generally recognized main professional
association for computer scientists, has a special interest group
(SIG) called SIGSAM. The SAM stands for Symbolic and Algebraic
Manipulation. Certain persons launched a campaign to change the name
of SIGSAM but failed. They nevertheless managed to make the phrase
Computer Algebra System more common. The word Manipulation bugged
them.

One could, I think, defend the assertion that anything that is done
systematically and algorithmically with symbols, including, say,
parts of analysis, geometry, topology, ..., is done by virtue of
converting the problem and the solution to algebra (some algebra).
Thus the computer algebra name could be defended that way. It's not
that (say) Maxima isn't doing, oh, differential geometry, or
asymptotic analysis, or logic, etc., but that it is doing "the
algebra of these things".

A computer system that was circumscribed by, say, the contents of a
textbook on Algebra
(say Birkhoff/MacLane: Survey of Modern Algebra, which I have on my
shelf) would
be much different from Maxima and its ilk. I might call such a system
"algorithms for computations in modern algebra". One might call that
"computer algebra" but that name has already been taken for something else.

Computer "algebra package" describes Nemo. I'm mostly OK with that.

"Computer algebra" package, if it is a synonym for Computer Algebra
System (CAS), is not Nemo, so far at least.


>
> Everywhere I always try to call Sage "mathematical software". Calling
> "mathematical software" is something I just made up, since I
> personally don't like the term "computer algebra" to describe Sage (or
> Nemo), since it's misleading.
If you think you made up the term mathematical software, you didn't do
due diligence in looking up the term.

See, for example, the book by John Rice, "Mathematical Software",
Academic Press, 1971.
There were 2 more books by Rice, and then of course ACM's
Transactions on Mathematical Software, as well as conferences etc.

I think the intersection of Sage and the traditional notion of
mathematical software is nearly empty. So maybe you shouldn't
use that term either. Math software is more numerical
computation (floating-point libraries and algorithms and such;
not to exclude integer, discrete, and combinatorial algorithms,
but not so much symbolic/algebraic).





>
>>>
>>> As a computer algebra package it makes use of
>>> Flint, Antic, Pari, GMP/MPIR, MPFR, Singular and Arb.
>>> all of which are presumably in Sage already.
>>
>> And yet Singular is for computer algebra, Pari for number theory, etc.
>>
>> Damned if you do, damned if you don't.
> Don't -- just use "mathematical software" instead.
>
> Or you could call it "The world's definitive system for modern
> technical computing"...

Maybe equating Sage's current domain with constructive mathematics
would be clearer.
RJF


PS: Nice to know that Julia has defenders too.


>
>
>

Bill Hart

Dec 12, 2015, 12:01:10 AM
to sage-flame
On 12 December 2015 at 02:48, William Stein <wst...@gmail.com> wrote:
On Fri, Dec 11, 2015 at 3:22 PM, Jakob Kroeker
<google...@spaceship-earth.net> wrote:
>
>
> If we can trust the report at danluu.com about the julia language (I didn't
> check),
> there is likely little reason for sage developers to fear competition with
> Nemo (just in case) in the near future:

Sage developers are on the same team as Julia developers -- there's no
fear here.  When Nemo has substantial 10x-better functionality than
Sage in enough ways, it'll just be included in Sage; our frickin'
source download is 500MB after all.

Julia plus its dependencies would add quite a lot to that. But since some of those dependencies are already in Sage, it probably wouldn't be a huge hit.

It should in theory be possible to embed Julia and use it from within Sage fairly easily should that ever be desirable. We'll see about the 10x-better functionality. That's a pretty tall order.

I'm not sure how motivated people will be to incorporate ideas from Nemo into Sage. That'll probably dictate whether Nemo ends up being part of Sage or not.
 

I fear competition with Magma, Mathematica, Matlab, and maybe Maple much more.

Maple, Mathematica and Matlab have vastly more resources, so they will be much more competition for Sage.

Nemo on the other hand isn't even trying to be the next Mathematica, Maple or Matlab. 

We are much more interested in targeting the research community, like Magma. Magma of course has something like a 30 year head start, so we have our work cut out for us if we even want to think about emulating the success of Magma.

We obviously need a much more efficient way of getting to our goals, otherwise Magma will always be 30 years ahead. But we are certainly working on a strategy for long term success.

What's surprising me the most is just how sophisticated the requirements of serious mathematicians are. It's actually very hard to convince them that a standard programming language is sufficient for doing serious maths. The GAP programming language, for example, is significantly more powerful than just about any general-purpose language out there. And there are others with equally sophisticated object models and dispatch mechanisms.

For this reason, Nemo doesn't aspire to be a single solution for everyone, but to fill a niche that exists. We want to make it possible to implement very high performance, mid-level, generic algorithms.
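The phrase "mid-level, generic algorithms" can be made concrete with a small sketch. This is illustrative only, and in Python rather than Julia (Nemo's actual language); `generic_pow` and `Mat2` are hypothetical names, not Nemo API. The point is that a single implementation of repeated squaring serves any coefficient ring that supplies `*` and an identity element:

```python
from fractions import Fraction

def generic_pow(x, n, one):
    """Compute x**n by repeated squaring, using only `*` and a supplied
    identity element -- so the algorithm is generic over the ring."""
    if n < 0:
        raise ValueError("negative exponents not supported in this sketch")
    result = one
    base = x
    while n:
        if n & 1:
            result = result * base
        base = base * base
        n >>= 1
    return result

# Works over the integers...
print(generic_pow(3, 13, 1))                         # 1594323

# ...over the rationals...
print(generic_pow(Fraction(1, 2), 10, Fraction(1)))  # 1/1024

# ...and over 2x2 integer matrices (here: Fibonacci via matrix powers).
class Mat2:
    def __init__(self, a, b, c, d):
        self.a, self.b, self.c, self.d = a, b, c, d
    def __mul__(self, o):
        return Mat2(self.a*o.a + self.b*o.c, self.a*o.b + self.b*o.d,
                    self.c*o.a + self.d*o.c, self.c*o.b + self.d*o.d)

F = generic_pow(Mat2(1, 1, 1, 0), 10, Mat2(1, 0, 0, 1))
print(F.b)                                           # F(10) = 55
```

In Julia the same idea is expressed through parametric types and multiple dispatch, which (roughly speaking) lets such generic code compile down to specialized code for each ring.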

That will eventually have to be combined with a mathematical front end for which no general purpose programming language currently available is sufficient.


For what it's worth, I appreciate you posting that blog post; I read
it carefully, along with some others on the guy's page.   He basically
just described how every rapidly developing software project is during
the early days, and says so very clearly at the end.   The best
approach to organizing the development of software depends hugely on
the age of the software, just like the best approach to treating a
child depends on their age, to running a company depends on its age
and size (and industry), etc. 

My own take on it is a bit different. 

I observe that Julia development is extremely rapid due to the unusually high productivity of its developers, who also often have their fingers in more than one technological pie.

Some pundits are used to working on projects where things remain on the wishlist for years or even decades. They aren't used to projects that can completely rewrite an entire subsystem just to fix a handful of design issues, within a few weeks or months, often before the pundit finishes complaining.

On the other hand, some pundits come from commercial environments working with languages like Java and expect Julia to be the same as Java in every possible way (insert whatever language you prefer here).

In short, Julia works as a project really, really effectively. They actually aren't in need of advice, yet they keep getting it. You can imagine the disruption that would be caused if they put down their tools every time this happened.

I'm sure the Julia developers' own take on it would be very different to mine. But that's my take on it. They are more or less graciously handling things whilst keeping productive and creative output at a maximum.

  The only thing that is definitely
wrong is trying to apply one-size-fits-all approaches... to anything.

Absolutely.

Bill.

Bill Hart

Dec 12, 2015, 12:46:01 AM
to sage-flame
What Nemo currently intends by "computer algebra" is basically commutative algebra on a computer. 

There is however a long tradition here in Germany and many parts of Europe, defending a specific definition of the term. It certainly includes symbolic manipulation for some, but as I understand it is essentially synonymous with commutative algebra on a computer.

I work in a "Computer Algebra Centre", in the "Computer Algebra group", which also produces the "Computer Algebra System", Singular, which is 30 years old. So it seems to be a well defended term. I've come to understand computer algebra proper as being something that you probably need a Groebner basis for.
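For readers unfamiliar with the term, here is a tiny illustration of what a Groebner basis buys you. The sketch below uses SymPy purely for illustration (SymPy is not part of the Nemo/Singular stack under discussion): given generators of an ideal, the basis reduces ideal membership to a mechanical division test.

```python
from sympy import symbols, groebner

x, y = symbols('x y')

# Ideal generated by the circle x^2 + y^2 - 1 and the line x - y,
# with the lexicographic order x > y.
G = groebner([x**2 + y**2 - 1, x - y], x, y, order='lex')
print(list(G.exprs))

# Membership in the ideal is now decidable by reduction against G:
print(G.contains(x**2 + y**2 - 1))   # a generator, so True
print(G.contains(x + y))             # reduces to 2*y != 0, so False
```

Computations like this, at scale, are the sort of thing Singular's Groebner engine provides.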

Someone asked me the other day whether we would be implementing noncommutative, differential algebras. At this stage I would say not. But Singular does, for example.

Nemo fully and deliberately intends to be a meld of commutative algebra, number theory and group theory, eventually. If that makes it a computer algebra system, so be it. However, that phrase might suggest to people that we are going to include various things that we are not.

Bill.

Bill Hart

Dec 12, 2015, 12:59:56 AM
to sage-flame
Here, for example, is a set of notes in English by one of our lecturers/researchers, called "Computer Algebra".


It outlines the elementary parts of the subject pretty well.

Bear in mind that we intend to rely on Singular for the majority of the computer algebra we provide in Nemo.

Specifically, its Groebner basis engine is crucial for what we have in mind. We also intend to supply access to the many Singular libraries written in the Singular language, modulo many improvements we have planned. It's obviously not going to happen overnight.

Bill.

William Stein

Dec 12, 2015, 1:52:51 AM
to sage-flame
>> Everywhere I always try to call Sage "mathematical software". Calling
>> "mathematical software" is something I just made up, since I
>> personally don't like the term "computer algebra" to describe Sage (or
>> Nemo), since it's misleading.
>
> If you think you made up the term mathematical software, you didn't do
> due diligence in looking up the term.

No, I don't think I magically made the term "mathematical software"
up! Give me a break. When I wrote 'Calling it "mathematical
software" is something I just made up', I meant that I didn't do it
because somebody told me to do so, or because I was purposely copying
some existing standard.

That said, a Google search for "mathematical software" turns up a lot
of things I wrote...

> I think the intersection of Sage and the traditional notion of
> mathematical software is nearly empty. So maybe you shouldn't
> use that term either. math software is more numerical
> computation (floating-point libraries and algorithms and such.
> Not to exclude integer algorithms, discrete, combinatorial,
> etc. but not so much symbolic/algebraic.)

"Mathematical software" is by far the best two words to describe what
Sage is. I couldn't care less how that phrase might have been used in
some book from before I was born.

Mathematical Software is a great term for software that helps one in
doing mathematics. At least the ACM thinks so too, with their "ACM
Transactions on Mathematical Software (TOMS)", which "specializes in
research in the methodology and practice of mathematical computer
program development, evaluation and use."

http://toms.acm.org/


> The Association for Computing Machinery (ACM), the generally
> recognized main professional association for computer scientists, has a
> special interest group (SIG) called SIGSAM. The SAM stands for
> Symbolic and Algebraic Manipulation. Certain persons launched a
> campaign to change the name of SIGSAM but failed. They nevertheless
> managed to make the phrase Computer Algebra System more common. The
> word Manipulation bugged them.

Thanks for the history. I have often wondered why people still use
the awkward, misleading phrase "Computer Algebra System"....

-- William

>
>
>
>
>
>>
>>>>
>>>> As a computer algebra package it makes use of
>>>> Flint, Antic, Pari, GMP/MPIR, MPFR, Singular and Arb.
>>>> all of which are presumably in Sage already.
>>>
>>>
>>> And yet Singular is for computer algebra, Pari for number theory, etc.
>>>
>>> Damned if you do, damned if you don't.
>>
>> Don't -- just use "mathematical software" instead.
>>
>> Or you could call it "The world's definitive system for modern
>> technical computing"...
>
>
> maybe equating Sage's current domain with constructive mathematics would be
> clearer.
> RJF
>
>
> PS. nice to know that Julia has defenders too.
>
>
>



--
William (http://wstein.org)

William Stein

Dec 12, 2015, 1:58:16 AM
to sage-flame
On Fri, Dec 11, 2015 at 9:01 PM, 'Bill Hart' via sage-flame
<sage-...@googlegroups.com> wrote:
>> Sage developers are on the same team as Julia developers -- there's no
>> fear here. When Nemo has substantial 10x-better functionality than
>> Sage in enough ways, it'll just be included in Sage; our frickin'
>> source download is 500MB after all.
>
>
> Julia plus its dependencies would add quite a lot to that. But since some of
> those dependencies are already in Sage, it probably wouldn't be a huge hit.

How much? I would rather know sooner rather than later...

> It should in theory be possible to embed Julia and use it from within Sage
> fairly easily should that ever be desirable. We'll see about the 10x-better
> functionality. That's a pretty tall order.
> I'm not sure how motivated people will be to incorporate ideas from Nemo
> into Sage. That'll probably dictate whether Nemo ends up being part of Sage
> or not.

A huge point of Sage is incorporating ideas from other software, at
least if the software is sufficiently good. Building the car, not
reinventing the wheel. We'll want Nemo.



--
William (http://wstein.org)

Bill Hart

Dec 12, 2015, 3:05:06 AM
to sage-flame
On 12 December 2015 at 07:57, William Stein <wst...@gmail.com> wrote:
On Fri, Dec 11, 2015 at 9:01 PM, 'Bill Hart' via sage-flame
<sage-...@googlegroups.com> wrote:
>> Sage developers are on the same team as Julia developers -- there's no
>> fear here.  When Nemo has substantial 10x-better functionality than
>> Sage in enough ways, it'll just be included in Sage; our frickin'
>> source download is 500MB after all.
>
>
> Julia plus its dependencies would add quite a lot to that. But since some of
> those dependencies are already in Sage, it probably wouldn't be a huge hit.

How much?  I would rather know sooner rather than later...

I think it'll come in under 100 MB. But it's certainly not going down and is still changing.

It's the standard Julia source tree plus dependencies, plus Cxx, libclang, lldb, llvm and of course Nemo and Antic. Those are I think the dependencies you don't have at the moment.
 

> It should in theory be possible to embed Julia and use it from within Sage
> fairly easily should that ever be desirable. We'll see about the 10x-better
> functionality. That's a pretty tall order.
> I'm not sure how motivated people will be to incorporate ideas from Nemo
> into Sage. That'll probably dictate whether Nemo ends up being part of Sage
> or not.

A huge point of Sage is incorporating ideas from other software, at
least if the software is sufficiently good.   Building the car, not
reinventing the wheel.  We'll want Nemo.

What I meant is whether Sage spends its programmer cycles speeding up existing things in Sage or implementing new things, say by learning from what we did in Nemo, or whether it just includes Nemo holus-bolus.

I don't think coming up with a port of an existing implementation is quite the same as reinventing the wheel.

I don't mind either way, it's just that doing things one way may not be any more complicated than doing it the other. Time will tell.

Bill.

Bill Hart

Dec 12, 2015, 3:32:56 AM
to sage-flame
On 12 December 2015 at 09:05, Bill Hart <goodwi...@googlemail.com> wrote:


On 12 December 2015 at 07:57, William Stein <wst...@gmail.com> wrote:
On Fri, Dec 11, 2015 at 9:01 PM, 'Bill Hart' via sage-flame
<sage-...@googlegroups.com> wrote:
>> Sage developers are on the same team as Julia developers -- there's no
>> fear here.  When Nemo has substantial 10x-better functionality than
>> Sage in enough ways, it'll just be included in Sage; our frickin'
>> source download is 500MB after all.
>
>
> Julia plus its dependencies would add quite a lot to that. But since some of
> those dependencies are already in Sage, it probably wouldn't be a huge hit.

How much?  I would rather know sooner rather than later...

I think it'll come in under 100mb. But it's certainly not going down and is still changing.

I think the figures are as follows:

75.9 MB : Julia + dependencies (OpenBLAS, etc.) + LLVM + LLDB
8.7 MB : libclang
87 KB : Cxx
127 KB : Antic
234 KB : Nemo

Cxx also has some dependencies. They include libedit-dev and libncurses5-dev.

You already have Flint, NTL, Pari, Arb, Singular, MPIR, MPFR, Gap and various other dependencies we plan to add. So they won't add to the budget.

Bill.

Richard Fateman

Dec 12, 2015, 12:30:51 PM12/12/15
to sage-...@googlegroups.com
On 12/11/2015 9:59 PM, 'Bill Hart' via sage-flame wrote:
Here for example is a set of notes in English by one of our lecturers/researchers, called "Computer Algebra".


It outlines the elementary parts of the subject pretty well.
Well, if you choose to define computer algebra that way, then it defines it.
I would not agree at all.

It treats "symbolic integration" in about 1.5 pages out of 222, and there it is misleading,
not distinguishing the two rather different problems of symbolic definite integration
and symbolic (indefinite) integration in terms of elementary functions. It is also incorrect
in asserting that Risch's methods are an algorithm. And (I suspect) is incorrect in
claiming Axiom is best on this.

It omits entirely some very significant mathematics, e.g. look up Schanuel's conjecture and
results of Daniel Richardson.

It omits mention of Collins, Brown, Zippel, etc regarding computation of polynomial GCDs.
In fact, it seems to omit complexity entirely.  It also omits entirely mention of the FFT,
an important underlying (discrete or continuous) method.  I think I could find other
omissions if I compared a course outline of mine to the notes.

This appears to be a course on algebra.  It occasionally mentions computer programs, but
it is a course in mathematics essentially devoid of the (in my opinion) far more interesting
components of the subject (Doing Mathematics on a Computer), whatever you might wish to call it. 

Mr Boehm's notes are not unique in their style of coverage. Just (again, my opinion)
unappealing.  As a teacher of a graduate course with a similar title, I could never
use these notes.  My course was in Computer Science, not Mathematics.

While we are dealing with naming issues, it was my impression that the objection to
the term "manipulation" came from German-speaking participants at a conference,
and I was told that it had, in German, uncomfortable associations.  Now that Google translate
is available, I tried some translations.   Manipulieren doesn't sound so bad. Frisieren
has implications of "trickery". I was under the impression that it might have been worse,
that the translation was too close to Masturbieren  for comfort. 

Maybe it was just that manipulation sounds non-rigorous, and not a proper subject for
study?  I have certainly noticed the emphasis on rigor in teaching from our visitors
from Germany.  Unless they can change their approach,
it  makes them generally ineffective as undergraduate instructors.

There is the example of a large apparently well-researched book titled "Modern Computer Algebra"
whose first edition omitted most of the work on symbolic indefinite integration  (beyond rational
functions), because the complexity of the Risch "algorithm" was not yet analyzed.
(The fact that Risch relies on recursively non-computable sub-tasks is a hazard).
This book does, however, represent a point of view that is also dubious, which is
that the proper study of the subject should be dominated by complexity analysis of
worst cases.  While it does occasionally delve into timing, its analysis is
not (in my opinion) credible.



Bear in mind that we intend to rely on Singular for the majority of the computer algebra we intend to provide in Nemo.

Specifically, its Groebner basis engine is crucial for what we have in mind. We also intend to supply access to the many Singular libraries written in the Singular language, modulo many improvements we have planned. It's obviously not going to happen overnight.
Oddly, Groebner basis calculations form a very small part of the course I usually taught, though.
Henrik Lenstra teaches (or taught) about applications of Groebner calcs.

(We are both emeriti now.)

Bill Hart

Dec 12, 2015, 1:48:28 PM12/12/15
to sage-flame
On 12 December 2015 at 18:30, Richard Fateman <fat...@berkeley.edu> wrote:
On 12/11/2015 9:59 PM, 'Bill Hart' via sage-flame wrote:
Here for example is a set of notes in English by one of our lecturers/researchers, called "Computer Algebra".


It outlines the elementary parts of the subject pretty well.
Well, if you choose to define computer algebra that way, then it defines it.
I would not agree at all.

I think I did say it covers only the elementary parts of the subject. Much of the stuff you mention would be suitable for a graduate course maybe. I wouldn't teach it to undergraduates.
 

It treats "symbolic integration" in about 1.5 pages out of 222, and there it is misleading,
not distinguishing the two rather different problems of symbolic definite integration
and symbolic (indefinite) integration in terms of elementary functions. It is also incorrect
in asserting that Risch's methods are an algorithm. And (I suspect) is incorrect in
claiming Axiom is best on this.

It omits entirely some very significant mathematics, e.g. look up Schanuel's conjecture and
results of Daniel Richardson.

Honestly, that just sounds like two random results out of many. You really think such a course should include mention of every such thing?
 

It omits mention of Collins, Brown, Zippel, etc regarding computation of polynomial GCDs.
In fact, it seems to omit complexity entirely. 

I actually agree with you that the course could benefit from the addition of these. In fact, we were discussing this the other day at lunch (not with Dr. Boehm though). But there is the matter of the appetite of a student to be taken into account. On a first reading they find some things too tedious.
 
It also omits entirely mention of the FFT,
an important underlying (discrete or continuous) method.  I think I could find other
omissions if I compared a course outline of mine to the notes.

You could fill a book with results on that. There are over 800 papers in the literature on this. Probably its omission says more about Dr. Boehm's interests than about Computer Algebra as a subject.
 

This appears to be a course on algebra.  It occasionally mentions computer programs, but
it is a course in mathematics essentially devoid of the (in my opinion) far more interesting
components of the subject (Doing Mathematics on a Computer), whatever you might wish to call it. 

Oh, I don't agree with that.
 

Mr Boehm's notes are not unique in their style of coverage. Just (again, my opinion)
unappealing.  As a teacher of a graduate course with a similar title, I could never
use these notes.  My course was in Computer Science, not Mathematics.

While we are dealing with naming issues, it was my impression that the objection to
the term "manipulation" came from German-speaking participants at a conference,
and I was told that it had, in German, uncomfortable associations.  Now that Google translate
is available, I tried some translations.   Manipulieren doesn't sound so bad. Frisieren
has implications of "trickery". I was under the impression that it might have been worse,
that the translation was too close to Masturbieren  for comfort. 

Yeah, I don't know anything about that. Never heard anyone use any other term except computer algebra.
 

Maybe it was just that manipulation sounds non-rigorous, and not a proper subject for
study?  I have certainly noticed the emphasis on rigor in teaching from our visitors
from Germany.  Unless they can change their approach,
it  makes them generally ineffective as undergraduate instructors.

They prepare the students well. I don't think I can agree with you there. I'm sure they make lousy US undergraduate instructors. Germans start University one year later, I think, and so their students have a certain maturity. They also don't threaten to sue their instructors or expect them to hold their hands, either.
 

There is the example of a large apparently well-researched book titled "Modern Computer Algebra"
whose first edition omitted most of the work on symbolic indefinite integration  (beyond rational
functions), because the complexity of the Risch "algorithm" was not yet analyzed.
(The fact that Risch relies on recursively non-computable sub-tasks is a hazard).
This book does, however, represent a point of view that is also dubious, which is
that the proper study of the subject should be dominated by complexity analysis of
worst cases.  While it does occasionally delve into timing, its analysis is
not (in my opinion) credible.

Yes, I'd forgotten about this book. Another good example of the term being defended. Probably the canonical example really.
 



Bear in mind that we intend to rely on Singular for the majority of the computer algebra we intend to provide in Nemo.

Specifically, its Groebner basis engine is crucial for what we have in mind. We also intend to supply access to the many Singular libraries written in the Singular language, modulo many improvements we have planned. It's obviously not going to happen overnight.
Oddly, Groebner basis calculations form a very small part of the course I usually taught, though.
Henrik Lenstra teaches (or taught) about applications of Groebner calcs.

Well, it's essential for computations in commutative algebra, and certainly in Kaiserslautern this and its applications are front and centre.

Bill.

Bill Hart

Dec 12, 2015, 2:11:45 PM12/12/15
to sage-flame
On 12 December 2015 at 19:48, Bill Hart <goodwi...@googlemail.com> wrote:


On 12 December 2015 at 18:30, Richard Fateman <fat...@berkeley.edu> wrote:
On 12/11/2015 9:59 PM, 'Bill Hart' via sage-flame wrote:
Here for example is a set of notes in English by one of our lecturers/researchers, called "Computer Algebra".


It outlines the elementary parts of the subject pretty well.
Well, if you choose to define computer algebra that way, then it defines it.
I would not agree at all.

I think I did say it covers only the elementary parts of the subject. Much of the stuff you mention would be suitable for a graduate course maybe. I wouldn't teach it to undergraduates.
 

It treats "symbolic integration" in about 1.5 pages out of 222, and there it is misleading,
not distinguishing the two rather different problems of symbolic definite integration
and symbolic (indefinite) integration in terms of elementary functions. It is also incorrect
in asserting that Risch's methods are an algorithm. And (I suspect) is incorrect in
claiming Axiom is best on this.

It omits entirely some very significant mathematics, e.g. look up Schanuel's conjecture and
results of Daniel Richardson.

Honestly, that just sounds like two random results out of many. You really think such a course should include mention of every such thing?
 

It omits mention of Collins, Brown, Zippel, etc regarding computation of polynomial GCDs.
In fact, it seems to omit complexity entirely. 

I actually agree with you that the course could benefit from the addition of these. In fact, we were discussing this the other day at lunch (not with Dr. Boehm though). But there is the matter of the appetite of a student to be taken into account. On a first reading they find some things too tedious.
 
It also omits entirely mention of the FFT,
an important underlying (discrete or continuous) method.  I think I could find other
omissions if I compared a course outline of mine to the notes.

You could fill a book with results on that. There are over 800 papers in the literature on this. Probably its omission says more about Dr. Boehm's interests than about Computer Algebra as a subject.

Though I should add that I agree that the definitive course on Computer Algebra probably should cover some of the elementary aspects of the FFT.

Richard Fateman

Dec 12, 2015, 3:28:01 PM12/12/15
to sage-...@googlegroups.com
(starting fresh without including quotes)..

I disagree that Boehm's course notes are somehow foundational and other
topics that might be
in a graduate course would depend on these. One could presumably
construct such a course,
but it would not be one I would care to teach (or study). A few
sections of the notes are useful,
maybe.

What I find important about FFT is that it is possible to multiply
polynomials surprisingly
fast that way, (and with a bit of handwaving one can point out that it
can be used to multiply long numbers.)
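A minimal sketch of that surprisingly-fast multiplication, using numpy's FFT (my illustration, not anything from the thread; the rounding step assumes integer coefficients small enough for floating-point accuracy):

```python
import numpy as np

def poly_mul_fft(a, b):
    """Multiply two integer-coefficient polynomials via the FFT.

    a and b are coefficient lists, lowest degree first. Pointwise
    multiplication of the padded spectra gives the linear convolution,
    i.e. the product polynomial's coefficients.
    """
    n = len(a) + len(b) - 1       # degree bound of the product
    size = 1
    while size < n:               # pad to a power of two >= n
        size *= 2
    fa = np.fft.rfft(a, size)
    fb = np.fft.rfft(b, size)
    prod = np.fft.irfft(fa * fb, size)[:n]
    return [int(round(c)) for c in prod]

print(poly_mul_fft([1, 2, 3], [4, 5]))  # (1+2x+3x^2)(4+5x) -> [4, 13, 22, 15]
```

The same trick multiplies long integers by treating digit strings as polynomial coefficients and handling carries at the end, which is the handwaving Fateman alludes to.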

I think students find it interesting that there is no algorithmic
solution to the
problem of "is f(x) always zero" for a modest class of expressions
defining f.
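Because zero-equivalence is undecidable in general (Richardson's results), systems fall back on heuristics. A tiny sketch of an easy instance where the heuristics succeed, using sympy purely as an illustration (not something cited in the thread):

```python
from sympy import symbols, sin, cos, simplify

x = symbols('x')

# This expression is identically zero; the heuristic simplifier finds
# a proof via the Pythagorean identity. For a modest class of
# expressions, no algorithm can decide this question in general.
expr = sin(x)**2 + cos(x)**2 - 1
print(simplify(expr))  # 0
```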

The question that should be in students' minds is
"Can we mechanize all of mathematics?"


Bill Hart

Dec 12, 2015, 3:30:52 PM12/12/15
to sage-flame
After reflecting on your comments more carefully, I now think I see more clearly what is happening.

Dr. Boehm's definition of computer algebra focuses on commutative algebra (for which pretty much everything that is serious requires a Groebner basis) because he is a mathematician, and your definition focuses slightly more on complexity, symbolic manipulation (and the Risch methods) and fundamental algorithms (Collins, Brown, FFT, etc.) because you are a computer scientist.

The Computer Algebra Centre I work in combines mathematics, computer science and computer engineering departments. And I suspect if I went down to the computer science department, I'd get a slightly different definition of the subject.

I now see more clearly that both are essential to the definition of the subject.

Bill.

Bill Hart

Dec 12, 2015, 3:37:05 PM12/12/15
to sage-flame
On 12 December 2015 at 21:28, Richard Fateman <fat...@berkeley.edu> wrote:
(starting fresh without including quotes)..

I disagree that Boehm's course notes are somehow foundational and other topics that might be
in a graduate course would depend on these.

I think that in our department, the commutative algebra stuff is very much foundational. I agree that there are other areas of the subject that are not discussed in detail, or indeed not even mentioned in this course.
 
  One could presumably construct such a course,
but it would not be one I would care to teach (or study).  A few sections of the notes are useful,
maybe.

What I find important about FFT is that it is possible to multiply polynomials surprisingly
fast that way,  (and with a bit of handwaving one can point out that it can be used to multiply long numbers.)

Sure, I know all about the FFT. I've spent a considerable portion of my career implementing them.

I agree the FFT is important to the subject, and I'd like to see it in such a course. However this may not be the focus they want in our department, where the fundamental technique is the Groebner basis. As I say, if I go down to the computer science department, I might see a completely different attitude.
 

I think students find it interesting that there is no algorithmic solution to the
problem of   "is f(x) always zero"    for a modest class of expressions defining f.

The question that should be in students' minds is
   "Can we mechanize all of mathematics?"

Bill. 

Dima Pasechnik

Dec 12, 2015, 4:14:24 PM12/12/15
to sage-flame


On Saturday, 12 December 2015 17:30:51 UTC, fateman wrote:
On 12/11/2015 9:59 PM, 'Bill Hart' via sage-flame wrote:
Here for example is a set of notes in English by one of our lecturers/researchers, called "Computer Algebra".


It outlines the elementary parts of the subject pretty well.
Well, if you choose to define computer algebra that way, then it defines it.
I would not agree at all.

Indeed, this is a course on computational commutative algebra (that's what they are good
at over there). It skims over other topics in the introduction, but otherwise it's just mistitled.
 

It treats "symbolic integration" in about 1.5 pages out of 222, and there it is misleading,
not distinguishing the two rather different problems of symbolic definite integration
and symbolic (indefinite) integration in terms of elementary functions. It is also incorrect
in asserting that Risch's methods are an algorithm.

I thought that Bronstein basically made it into an algorithm, no?
 
 
And (I suspect) is incorrect in
claiming Axiom is best on this.
Nobody seems to know a better implementation; at least things like rational functions
are integrated by Axiom (well, FriCAS, to be precise) better than anything else, i.e. it does not
introduce unnecessary field extensions.
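As a small illustration of the field-extension point (a sympy sketch of my own, not a claim about Axiom/FriCAS internals): partial fractions over C would express this integral with complex logarithms, while a real-aware integrator stays in the ground field and returns an arctangent.

```python
from sympy import symbols, integrate, atan

x = symbols('x')

# Naive factoring over C gives (i/2)*log((x-i)/(x+i)); a good
# implementation avoids that field extension and returns atan(x).
result = integrate(1/(x**2 + 1), x)
print(result)  # atan(x)
```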

 

This is a book written by an eminent complexity theorist and his PhD student...
Again, just as with the lecture notes cited, you got a lot of bias.
 


Bear in mind that we intend to rely on Singular for the majority of the computer algebra we intend to provide in Nemo.

Specifically, its Groebner basis engine is crucial for what we have in mind. We also intend to supply access to the many Singular libraries written in the Singular language, modulo many improvements we have planned. Its obviously not going to happen overnight.
Oddly, Groebner basis calculation form a very small part of the course I usually taught, though.
Henrik Lenstra teaches (or taught) about applications of Groebner calcs.

I saw Lenstra on campus of Leiden University this summer, so he's not 100% gardening, for sure...

 

Bill Hart

Dec 12, 2015, 4:28:33 PM12/12/15
to sage-flame
On 12 December 2015 at 22:14, Dima Pasechnik <dim...@gmail.com> wrote:


On Saturday, 12 December 2015 17:30:51 UTC, fateman wrote:
On 12/11/2015 9:59 PM, 'Bill Hart' via sage-flame wrote:
Here for example is a set of notes in English by one of our lecturers/researchers, called "Computer Algebra".


It outlines the elementary parts of the subject pretty well.
Well, if you choose to define computer algebra that way, then it defines it.
I would not agree at all.

Indeed, this is a course on computational commutative algebra (that's what they are good
at over there).

No I used to think that too, but after attending the Mathemagix conference some years back (the best conference I have ever attended, without a doubt, and by a big margin), I realised that there was much more to the subject than I had originally thought.

This was independently confirmed by people like Joris van der Hoeven, Stephen Watt, Dan Grayson and others, who talked about computer algebra topics of which I was completely unaware at the time. They spoke of them as though they were somehow quite central to the topic. I was confused at the time because I had been (mis)led into believing that computer algebra was a branch of theoretical computer science. I also didn't know anything at all about these topics at the time and was quite blindsided by their discussions. They seemed to assume I also knew all about those topics, and yet somehow I had never been exposed to them.

I see now that it is the commutative algebra in the subject that makes it mathematical and not just a branch of theoretical computer science, which is incidentally what I always thought, but couldn't justify to people.

So unless you want to exclude the entire German Fachgruppe Computeralgebra, Stephen Watt, the Macaulay 2 group, Joris van der Hoeven and collaborators and many others in Asia and Europe, you can't justify that those lecture notes are not on computer algebra.

It skims over other topics in the introduction, but otherwise it's just mistitled.

It does certainly skim over some of the other topics in the subject. But the title is well-defended.
 

Dima Pasechnik

Dec 12, 2015, 5:01:56 PM12/12/15
to sage-flame


On Saturday, 12 December 2015 21:28:33 UTC, Bill Hart wrote:


On 12 December 2015 at 22:14, Dima Pasechnik <dim...@gmail.com> wrote:


On Saturday, 12 December 2015 17:30:51 UTC, fateman wrote:
On 12/11/2015 9:59 PM, 'Bill Hart' via sage-flame wrote:
Here for example is a set of notes in English by one of our lecturers/researchers, called "Computer Algebra".


It outlines the elementary parts of the subject pretty well.
Well, if you choose to define computer algebra that way, then it defines it.
I would not agree at all.

Indeed, this is a course on computational commutative algebra (that's what they are good
at over there).

No I used to think that too, but after attending the Mathemagix conference some years back (the best conference I have ever attended, without a doubt, and by a big margin), I realised that there was much more to the subject than I had originally thought.

This was independently confirmed by people like Joris van der Hoeven, Stephen Watt, Dan Grayson and others, who talked about computer algebra topics of which I was completely unaware at the time. They spoke of them as though they were somehow quite central to the topic. I was confused at the time because I had been (mis)led into believing that computer algebra was a branch of theoretical computer science. I also didn't know anything at all about these topics at the time and was quite blindsided by their discussions. They seemed to assume I also knew all about those topics, and yet somehow I had never been exposed to them.

I see now that it is the commutative algebra in the subject that makes it mathematical and not just a branch of theoretical computer science, which is incidentally what I always thought, but couldn't justify to people.

So unless you want to exclude the entire German Fachgruppe Computeralgebra, Stephen Watt, the Macaulay 2 group, Joris van der Hoeven and collaborators and many others in Asia and Europe, you can't justify that those lecture notes are not on computer algebra.
 
The notes are on a topic of computer algebra called computational commutative algebra/algebraic geometry.
It is a big topic, and among the central ones, but not the only one. Computational number theory or
computational group theory can get away without a lot of computational commutative algebra.
And I don't see why computational differential Galois theory (also known as symbolic integration) is brushed aside.
(this is just pure ignorance, if you ask me).
Or computational topology.

No, really, the Groebner basis is a nice idea, but it's not something one cannot live without; even in computational
commutative algebra one can get quite far with resultants and various geometric tricks.

I am afraid to a large extent it has to do with politics in this science :-(
People want to partition a broad area into small subjects and fight over grant money
and jobs. And to win this fight, one concentrates on narrow topics...
Leaving aside benefits of knowing tools from far away areas,
one almost certainly does not need Groebner bases if one is doing computations in finite groups.
And someone who does commutative algebra does not need strong generating sets
for permutation groups, or power-commutator presentations
(while these topics are something that they swear by in computational group theory)

And in computational real algebraic geometry Groebner bases are of very limited use,
believe me, I work in this area. As soon as you want to find out whether a system of
polynomial inequalities determines an empty set, you are almost certainly seriously out
of luck with Groebner bases.


It skims over other topics in the introduction, but otherwise it's just mistitled.

It does certainly skim over some of the other topics in the subject. But the title is well-defended.

That title is highly political. Like the title of "Pravda" newspaper ;-)
I imagine lecture notes on Computer Algebra in Aachen might not even mention
Groebner bases at all...

Bill Hart

Dec 12, 2015, 5:14:52 PM12/12/15
to sage-flame
On 12 December 2015 at 23:01, Dima Pasechnik <dim...@gmail.com> wrote:


On Saturday, 12 December 2015 21:28:33 UTC, Bill Hart wrote:


On 12 December 2015 at 22:14, Dima Pasechnik <dim...@gmail.com> wrote:


On Saturday, 12 December 2015 17:30:51 UTC, fateman wrote:
On 12/11/2015 9:59 PM, 'Bill Hart' via sage-flame wrote:
Here for example is a set of notes in English by one of our lecturers/researchers, called "Computer Algebra".


It outlines the elementary parts of the subject pretty well.
Well, if you choose to define computer algebra that way, then it defines it.
I would not agree at all.

Indeed, this is a course on computational commutative algebra (that's what they are good
at over there).

No I used to think that too, but after attending the Mathemagix conference some years back (the best conference I have ever attended, without a doubt, and by a big margin), I realised that there was much more to the subject than I had originally thought.

This was independently confirmed by people like Joris van der Hoeven, Stephen Watt, Dan Grayson and others, who talked about computer algebra topics of which I was completely unaware at the time. They spoke of them as though they were somehow quite central to the topic. I was confused at the time because I had been (mis)led into believing that computer algebra was a branch of theoretical computer science. I also didn't know anything at all about these topics at the time and was quite blindsided by their discussions. They seemed to assume I also knew all about those topics, and yet somehow I had never been exposed to them.

I see now that it is the commutative algebra in the subject that makes it mathematical and not just a branch of theoretical computer science, which is incidentally what I always thought, but couldn't justify to people.

So unless you want to exclude the entire German Fachgruppe Computeralgebra, Stephen Watt, the Macaulay 2 group, Joris van der Hoeven and collaborators and many others in Asia and Europe, you can't justify that those lecture notes are not on computer algebra.
 
The notes are on a topic of computer algebra called computational commutative algebra/algebraic geometry.
It is a big topic, and among the central ones, but not the only one. Computational number theory or
computational group theory can get away without a lot of computational commutative algebra.

Agreed, which is why Nemo will focus on all three subjects.
 
And I don't see why computational differential Galois theory (also known as symbolic integration) is brushed aside.
(this is just pure ignorance, if you ask me).
Or computational topology.

Probably just not so relevant to the research interests here.
 

No, really, the Groebner basis is a nice idea, but it's not something one cannot live without; even in computational
commutative algebra one can get quite far with resultants and various geometric tricks.

No that's not true. You severely constrain yourself if you don't have Groebner bases. You can do some tricks in Euclidean domains, that's about it. 95% of the serious stuff requires Groebner basis techniques.
 

I am afraid to a large extent it has to do with politics in this science :-(
People want to partition a broad area into small subjects and fight over grant money
and jobs. And to win this fight, one concentrates on narrow topics...

Here we go again...
 
Leaving aside benefits of knowing tools from far away areas,
one almost certainly does not need Groebner bases if one is doing computations in finite groups.

Computational group theory only partially overlaps with commutative algebra. It's really its own topic.
 
And someone who does commutative algebra does not need strong generating sets
for permutation groups, or power-commutator presentations
(while these topics are something that they swear by in computational group theory)

Of course.
 

And in computational real algebraic geometry Groebner bases are of very limited use,
believe me, I work in this area. As soon as you want to find out whether a system of
polynomial inequalities determines an empty set, you are almost certainly seriously out
of luck with Groebner bases.

Sure.
 


It skims over other topics in the introduction, but otherwise it's just mistitled.

It does certainly skim over some of the other topics in the subject. But the title is well-defended.

That title is highly political. Like the title of "Pravda" newspaper ;-)
I imagine lecture notes on Computer Algebra in Aachen might not even mention
Groebner bases at all...

Bill. 

Dima Pasechnik

Dec 12, 2015, 5:37:55 PM12/12/15
to sage-flame

surely serious ivory-tower commutative algebra stuff, or perhaps crypto, might need it all, and more. 
But over a sufficiently nice field you can get away with computing
critical points on a deformed hypersurface: g(X) = e*(sum_k X_k^{2m}) + (1-e)*f(X) = 0, with deg(f) < 2m; take the partial derivatives d/dX_k g(X) = 0, k > 1;
together with g(X), they already form a Groebner basis (if we must mention it, although we don't need to).
Find the roots of this system using the Stickelberger Lemma (linear algebra, more or less), and then take lim_{e->0}.
Funnily enough, this is, in theory, the fastest known way to solve polynomial systems over the reals...
See https://perso.univ-rennes1.fr/marie-francoise.roy/bpr-ed2-posted1.html if you don't believe me.
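For readers skimming past the inline notation, here is the deformation scheme restated in displayed form (my transcription only; $\varepsilon$ stands for the deformation parameter e):

```latex
g_\varepsilon(X) = \varepsilon \sum_k X_k^{2m} + (1-\varepsilon)\, f(X),
\qquad \deg f < 2m,
```

then solve the critical-point system

```latex
g_\varepsilon(X) = 0, \qquad
\frac{\partial g_\varepsilon}{\partial X_k}(X) = 0 \quad (k > 1),
```

via the Stickelberger Lemma and take $\lim_{\varepsilon \to 0}$ of the resulting points.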

 

Bill Hart

Dec 12, 2015, 5:55:19 PM12/12/15
to sage-flame
Sure, over a sufficiently nice field you can do other things.
 
critical points on a deformed hypersurface: g(X)=e(sum_k X_k^{2m})+(1-e)f(X)=0, with deg(f)<2m, take partial derivatives d/dX_k g(X)=0, k>1;
together with g(X), they already form a Groebner basis (if we must mention it, although we don't need to).
 Find roots of this system using Stickelberger Lemma (linear algebra, more or less), and then take lim_{e->0}.
Funnily enough, this is the fastest, in theory, known way to solve polynomial systems over reals...

Sure. The other methods you mentioned are also faster when applicable.
But judging the subject by what you do in real algebraic geometry is pretty pointless. I suspect that the vast majority of people doing computer algebra are not doing real algebraic geometry as an application.

The fact remains that a number of very fundamental computations that you want to do in commutative algebra more or less require Groebner bases. They drop out of the ideal membership problem quite naturally and almost fundamentally (apart from still essentially depending on an ordering).
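To make the ideal-membership remark concrete, a small sketch in sympy (my own toy ideal, not an example from the thread): membership testing drops out of a Groebner basis because reduction by the basis leaves remainder zero exactly for elements of the ideal.

```python
from sympy import symbols, groebner

x, y = symbols('x y')

# Groebner basis of the ideal <x^2 - y, y^2 - x> in Q[x, y], lex order.
G = groebner([x**2 - y, y**2 - x], x, y, order='lex')

# x^4 - x = (x^2 + y)*(x^2 - y) + (y^2 - x), so it lies in the ideal;
# x + y does not (it vanishes at (0,0) but not at the variety point (1,1)).
print(G.contains(x**4 - x))  # True
print(G.contains(x + y))     # False
```

The dependence on a monomial ordering that Bill mentions is visible in the `order='lex'` argument: a different ordering gives a different basis for the same ideal.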

That's a bad thing of course. The oftentimes double exponential nature of the algorithms means that we'd really like better algorithms than we currently have. Any time we can avoid using a Groebner basis, we should. But you certainly can't do away with them. They are absolutely required for many applications, even fundamental ones, as things are currently.

If that's not the case, then we'd really like to know about it. We want to implement such fundamental things in Nemo, and after looking around for people who have described alternative methods that are faster, we don't see much on offer. For specific applications and problem domains, other techniques exist. But for many things you are completely constrained and must use GBs.
 

 
 

I am afraid to a large extent it has to do with politics in this science :-(
People want to partition a broad area into small subjects and fight over grant money
and jobs. And to win this fight, one concentrates on narrow topics...

Here we go again...
 
Leaving aside benefits of knowing tools from far away areas,
one almost certainly does not need Grobner bases if one is doing computations in finite groups.

Computational group theory only partially overlaps with commutative algebra. It's really its own topic.
 
And someone who does commutative algebra does not need strong generating sets
for permutation groups, or power-commutator persentations
(while these topics are something that they swear by in computational group theory)

Of course.
 

And in computational real algebraic geometry Groebner bases are of very limited use,
believe me, I work in this area. As soon as you want to find out whether a system of
polynomial inequalities determines an empty set, you are almost certainly seriously out
of luck with Groebner bases.

Sure.
 


It skims over other topics in the introduction, but otherwise it's just mistitled.

It does certainly skim over some of the other topics in the subject. But the title is well-defended.

That title is highly political. Like the title of "Pravda" newspaper ;-)
I imagine lecture notes on Computer Algebra in Aachen might not even mention
Groebner bases at all...

Bill. 

--

Dima Pasechnik

unread,
Dec 12, 2015, 6:53:04 PM12/12/15
to sage-flame

the vast majority of people *doing* computer algebra are doing what they are good at (I suppose at U-KL it is commutative algebra and its offshoots...).
But the vast majority of people *using* computer algebra do not do computational commutative algebra.
Probably more of them will want to *do* systems of non-linear inequalities than systems of polynomial equations
over an algebraically closed field. Thus saying that real algebraic geometry (i.e. the theory of semialgebraic sets, sets of solutions
of systems of polynomial inequalities) is pointless for computer algebra is quite a bit of a stretch.
 

The fact remains that a number of very fundamental computations that you want to do in commutative algebra more or less require Groebner bases. They drop out of the ideal membership problem quite naturally and almost fundamentally (apart from still essentially depending on an ordering).

That's a bad thing of course. The oftentimes double exponential nature of the algorithms means that we'd really like better algorithms than we currently have. Any time we can avoid using a Groebner basis, we should. But you certainly can't do away with them. They are absolutely required for many applications, even fundamental ones, as things are currently.

If that's not the case, then we'd really like to know about it. We want to implement such fundamental things in Nemo, and after looking around for people who have described alternative methods that are faster, we don't see much on offer. For specific applications and problem domains, other techniques exist. But for many things you are completely constrained and must use GBs.

Well, I am not saying that I know faster methods to do ideal membership, etc, (especially not in characteristic p)
but there are many things out there that don't need ideal membership, e.g. real domain things
such as solutions of systems of polynomial inequalities (i.e. semi-algebraic sets),
yet they are fundamental for many applications of computer algebra.
A fundamental problem is to have an effective Positivstellensatz implementation, something that
has a lot to do with nonnegative polynomials and sums of squares. Another fundamental problem is to have
quantifier elimination over reals.

Similarly, symbolic integration is what computer algebra systems are used for a lot, and no system out there
does it quite right...

Bill Hart

unread,
Dec 12, 2015, 7:03:23 PM12/12/15
to sage-flame
I never said it was pointless for computer algebra. I said that judging what the subject is about by only looking at what you can get away with in real algebraic geometry is pointless. You miss many of the deep and interesting applications to pure maths that way.
 
 

The fact remains that a number of very fundamental computations that you want to do in commutative algebra more or less require Groebner bases. They drop out of the ideal membership problem quite naturally and almost fundamentally (apart from still essentially depending on an ordering).

That's a bad thing of course. The oftentimes double exponential nature of the algorithms means that we'd really like better algorithms than we currently have. Any time we can avoid using a Groebner basis, we should. But you certainly can't do away with them. They are absolutely required for many applications, even fundamental ones, as things are currently.

If that's not the case, then we'd really like to know about it. We want to implement such fundamental things in Nemo, and after looking around for people who have described alternative methods that are faster, we don't see much on offer. For specific applications and problem domains, other techniques exist. But for many things you are completely constrained and must use GBs.

Well, I am not saying that I know faster methods to do ideal membership, etc, (especially not in characteristic p)
but there are many things out there that don't need ideal membership, e.g. real domain things
such as solutions of systems of polynomial inequalities (i.e. semi-algebraic sets),
yet they are fundamental for many applications of computer algebra.

You use the word application slightly differently to me. 

Without knowing much about your specific area, I can't comment on how important it is to real world applications. But having them should make it relatively easier to get grant and commercial funding I guess.
 
A fundamental problem is to have an effective Positivstellensatz implementation, something that
has a lot to do with nonnegative polynomials and sums of squares. Another fundamental problem is to have
quantifier elimination over reals.

Similarly, symbolic integration is what computer algebra systems are used for a lot, and no system out there
does it quite right...

I've no doubt.

Bill. 

Richard Fateman

unread,
Dec 12, 2015, 7:29:24 PM12/12/15
to sage-...@googlegroups.com
Again, trimming all the quotes.

1. Perhaps my own prejudice, but I view Groebner basis computation as a
last
resort. That is, if the task can be accomplished by any other method,
that method
is faster. There are several classic tasks that have very high
complexity, and also
exhibit high computation times in practice, even on what most people
would think
are rather small examples. GB for one. Cylindrical Algebraic Decomposition.
Resultant calculations, which can be slow if you have a bad GCD algorithm,
can be used instead of GB for some tasks.
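As an illustration of the point about resultants, here is a minimal sketch (SymPy is my choice of system for the example, not one taken from the thread) of eliminating a variable with a resultant rather than a Groebner basis:

```python
# Variable elimination via a resultant instead of a Groebner basis:
# an illustrative sketch in SymPy.
from sympy import symbols, resultant, expand

x, y = symbols('x y')
f = x**2 + y**2 - 1   # unit circle
g = x - y             # line x = y

# res_x(f, g) is a polynomial in y alone that vanishes at the
# y-coordinate of every common root of f and g.
r = resultant(f, g, x)
assert expand(r - (2*y**2 - 1)) == 0
```

As the thread notes, the practical cost of this hinges on having a good GCD (and subresultant) implementation underneath.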

2. Risch integration is not an algorithm because it requires determining
the algebraic
independence of elements constructed in a differential field by adjoining
various functions. This can require that we know things that we don't know
and/or provably cannot compute, like is some f(x) identically zero. (D.
Richardson
results say -- no.) Schanuel's conjecture involves the independence of
transcendental extensions exp(z) over a base field. We do not know
even that e^e is not rational, or e+pi.   So this conjecture is not some random
number theory conjecture, but one that has important bearing on the
constructive
nature of certain computations that are part of the Risch "algorithm". The
questions also come up in simplification or "zero-equivalence".
As you know, certain tasks require you to know the degree of a polynomial.
If the leading term's coefficient is zero, then it is not the leading term.
If the coefficients of all the exponential terms are zero, maybe what you
have left is a polynomial? etc..
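The pitfall described here (a formal leading term whose coefficient might vanish) can be seen concretely. A small illustrative sketch using SymPy (my choice for the example, not a system from the thread):

```python
# Degree of a polynomial with symbolic coefficients: a hedged sketch.
from sympy import symbols, degree

a, b, x = symbols('a b x')
p = a*x**2 + b*x + 1

# SymPy reports the formal degree, tacitly assuming the leading
# coefficient a is nonzero...
assert degree(p, x) == 2

# ...but if a happens to be zero, the true degree drops, and any
# algorithm that branched on "degree 2" was working with a wrong premise.
assert degree(p.subs(a, 0), x) == 1
```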

(If you care, the Wikipedia article on this, which I just
glanced at, seems brief/easy reading/correct as far as I can tell.)

So even if Bronstein stamped out all the other nuances of what is needed to
do Risch integration, his methods still must rely on functionality that
is not constructive.
I am sure he was well aware of this.

I suspect that the Mathematica integration program is quite strong, but I
am insufficiently motivated to try to compare it to others.

Besides which, indefinite integration in finite terms is rarely of interest.
Definite integration, ESPECIALLY when it cannot be done by simply
substituting limits into the indefinite integral, is far more important.
It is also usually done quite accurately and fast in practical circumstances
by numerical programs.

To some extent CAS are used to just do Freshman calculus homework
or to do impressive parlor tricks. Wolfram made a great impression on
the media by showing 3D plots of math functions.

What, after all this, should we be impressed by? What is a task that
is hard, important, can be done by a CAS and probably not without it?

I think that one area is proving the correctness of important (numerical)
programs.

Too bad this interesting discussion (a) has diverged from the title, and
(b) is even in the wrong newsgroup.

RJF

Richard Fateman

unread,
Dec 12, 2015, 7:45:28 PM12/12/15
to sage-...@googlegroups.com
Again, tossing out the quotes.

In the relatively early days of building systems for mathematics, the
motivation was to make systems that were experts in doing what
mathematicians do. The thought was that by building computer
experts in different fields one could link them together and build
something that effectively mimicked human behavior in a non-trivial
domain.

A less ambitious version of this would be to figure out the
"Arithmetization" of mathematics. An area of interest going
back to G. Frege, Russell & Whitehead, and (today) MKM
researchers.

This too is (dare I say, impossibly) ambitious.

So what did people do that was not impossibly ambitious?

For the most part, they tried
to do interesting applications they thought could be automated, and found
difficulties (hence research on polynomial GCD,
polynomial factoring, integration, limits, series.)

Other people decided to do things motivated by their own
curiosity that they thought could be automated (applications to "pure
mathematics").
They had separate sessions in the early conferences
run by SIGSAM. There was very little cross fertilization with
the "general purpose" system builders.

I remember discussing a problem having to do with
simplification of "nested radicals" with John Tate,
then at Harvard. He suggested a simplification of the
problem which would make it quite doable but totally
uninteresting.

Bill Hart

unread,
Dec 12, 2015, 8:30:07 PM12/12/15
to sage-flame
On 13 December 2015 at 01:29, Richard Fateman <fat...@berkeley.edu> wrote:
Again, trimming all the quotes.

1.  Perhaps my own prejudice, but I view Groebner basis computation as a last
resort.  That is,  if the task can be accomplished by any other method, that method
is faster. 

I definitely agree with this. I don't know what Dr. Boehm would say on this, as I haven't asked him about this specifically.
 
There are several classic tasks that have very high complexity, and also
exhibit high computation times in practice, even on what most people would think
are rather small examples.  GB for one. Cylindrical Algebraic Decomposition.

Interesting.
 
Resultant calculations, which can be slow if you have a bad GCD algorithm,

Yes, we discovered this the hard way in Nemo. We still need heaps of improvement here actually. We know what needs to be done, just haven't done it yet.

Of course in general commutative algebra, resultant can't be done with polynomial algorithms.
 
can be used instead of GB for some tasks.

Some, indeed.
 

2. Risch integration is not an algorithm because it requires determining the algebraic
independence of elements constructed in a differential field by adjoining
various functions.  This can require that we know things that we don't know
and/or provably cannot compute, like is some f(x) identically zero. (D. Richardson
results say -- no.) 

Interesting. I know approximately nothing about the methods.
 
Schanuel's conjecture involves the independence of
transcendental extensions   exp(z)  over a base field.  We do not know
even that e^e is not rational, or e+pi.   So this conjecture is not some random
number theory conjecture, but one that has important bearing on the constructive
nature of certain computations that are part of the Risch "algorithm". 

Yes, I see where you are coming from.
 
The
questions also come up in simplification or "zero-equivalence".
As you know, certain tasks require you to know the degree of a polynomial.
If the leading term's coefficient is zero, then it is not the leading term.
If the coefficients of all the exponential terms are zero, maybe what you
have left is a polynomial?  etc..

 (If you care, the Wikipedia article on this, which I just
glanced at, seems brief/easy reading/correct as far as I can tell.)

So even if Bronstein stamped out all the other nuances of what is needed to
do Risch integration, his methods still must rely on functionality that is not constructive.
I am sure he was well aware of this.

I suspect that the Mathematica integration program is quite strong, but I
am insufficiently motivated to try to compare it to others.

Besides which, indefinite integration in finite terms is rarely of interest.
Definite integration,  ESPECIALLY when it cannot be done by simply
substituting limits into the indefinite integral,  is far more important.
It is also usually done quite accurately and fast in practical circumstances
by numerical programs.

To some extent  CAS are used to just do Freshman calculus homework
or to do impressive parlor tricks.

Some are. I doubt Singular is used for that. There are many hundreds of references to Singular in the scientific literature, with many, many more omitted I am sure.

There are also piles of graduate and fachpraktikum theses every year which are really quite serious. They are very specific applications, but they seem to have an endless supply of them around here.
 
  Wolfram made a great impression on
the media by showing 3D plots of math functions.

Yeah, if only he could have gotten "game of life", "pretty pictures", "tiling", "mandelbrot set" and "better way of teaching mathematics" in the same sentence, he would have practically been declared god by the media.

Unfortunately, the general public get a really distorted view of mathematics from the media. This is why there are so many cranks. They are somehow convinced that maths is actually really easy, even for a random member of the general public. It's just that we are pretending that it is hard.
 

What, after all this, should we be impressed by?  What is a task that
is hard, important, can be done by a CAS and probably not without it?

Factoring polynomials maybe? Any task that takes sufficient machinery to make it interesting and non-trivial to implement, but which has wide enough applicability to either theoreticians who want to do experiments or commercial enterprises that have hard problems to solve that require specialised computing tools.
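Polynomial factorisation as a benchmark CAS task can be shown briefly. A hedged sketch in SymPy (an arbitrary choice of system for the example):

```python
# Polynomial factorisation over the rationals: an illustrative sketch.
from sympy import symbols, factor, expand

x = symbols('x')
p = expand((x**2 + x + 1) * (x**3 - 2*x + 1))

# factor() recovers the irreducible factors over Q; note that
# x**3 - 2*x + 1 itself splits further, as (x - 1)*(x**2 + x - 1).
fp = factor(p)
assert expand(fp - p) == 0   # same polynomial
assert fp.is_Mul             # a nontrivial factorisation was found
```

The point stands: doing this reliably for large inputs takes real machinery (modular factorisation, Hensel lifting, lattice methods), yet the operation is useful to almost every class of user.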
 

I think that one area is proving the correctness of important (numerical)
programs.

Too bad this interesting discussion (a) has diverged from the title, and
(b) is even in the wrong newsgroup.

Oh I don't know. Not discussing Sage on a list set up to discuss Sage seems like enough of a flame to me. The irony burns. :-)

Bill.

Bill Hart

unread,
Dec 12, 2015, 8:54:05 PM12/12/15
to sage-flame
On 13 December 2015 at 01:45, Richard Fateman <fat...@berkeley.edu> wrote:
Again, tossing out the quotes.

In the relatively early days of building systems for mathematics, the
motivation was to make systems that were experts in doing what
mathematicians do. The thought was that by building computer
experts in different fields one could link them together and build
something that effectively mimicked human behavior in a non-trivial
domain.

A less ambitious version of this would be to figure out the
"Arithmetization" of mathematics.   An area of interest going
back to G. Frege, Russell &  Whitehead, and (today) MKM
researchers.

This too is (dare I say, impossibly) ambitious.

So what did people do that was not impossibly ambitious?

  For the most part, they tried
to do interesting applications they thought could be automated, and found
difficulties  (hence research on polynomial GCD,
polynomial factoring, integration, limits, series.)

Other people decided to do things motivated by their own
curiosity that they thought could be automated  (applications to "pure mathematics").
They had separate sessions in the early conferences
run by SIGSAM.  There was very little cross fertilization with
the "general purpose" system builders.

It's not up to me solely, but if I had my way, Nemo would support both deep, specialised applications and broad, general purpose system development. But I would personally make a very clear distinction between these so that people know what to expect when running the code.

What I kind of have in mind is a kind of "projects" folder within Nemo, where people can implement their own specialised project. (Likely there will be lots of external libraries that build on Nemo as well.)

William Stein, if I recall correctly, tried to solve this problem with Sage itself. Mathematicians working on a personal research interest with little interest or expertise in general purpose system building are critical to the lifeblood of a genuine computer algebra community. Those are the smart pure mathematicians who you want to be using your system, promoting your system and to share their expertise with those implementing the general purpose parts of your system (often they have good ideas, motivated by their specific application, even if they aren't interested in implementing those themselves).

But by and large, no one is really interested in using their code because it may have been implemented badly, was only intended to provide a few examples for a paper and not really intended for general purpose use, was not documented well, if at all, and it has never been run on more than a handful of examples (no one knows how to check it in other cases, if it even terminates or uses a reasonable amount of ram, or is even theoretically applicable to more than a handful of examples). And it's too specialised for anyone to really understand or review properly anyway.

As I recall, William started psage to encourage at least slightly more pure maths oriented applications of Sage, without all the code review and doctesting and software engineering that goes into the main project. Unfortunately I don't think psage survived.

So it's an unsolved problem really. How does one encourage mathematicians with very specialised scripts to contribute them without messing up the rest of the project.

Actually, I find it really, really hard to convince people that they do not need open commit access to the repository and the ability to add whatever they want, whenever they want, in any state that they want, in order to make use of a computer algebra project. Mathematicians invariably argue for more genericity and less optimisation, and for far fewer software engineering restrictions, all of which run counter to good general purpose system development goal and sound software engineering principles.

Somehow the Magma project has managed this to some degree. But I don't really understand how. On the other hand, we only have their word for it that the project is managed carefully from a software engineering perspective and is actually fit for use by a wide variety of users. We can't see all their source code to verify this. So maybe they are just cheating. For all we know, Magma is full of algorithms that don't even guarantee a correct answer. Who would know? What kind of review has most of it been subjected to? Certainly not public scrutiny.


I remember discussing a problem having to do with
simplification of "nested radicals" with John Tate,
then at Harvard.  He suggested a simplification of the
problem which would make it quite doable but totally
uninteresting.


Interesting anecdote.

Bill. 

Dima Pasechnik

unread,
Dec 13, 2015, 5:14:52 AM12/13/15
to sage-flame

I don't really have a specific area.
Lately I did use resultants in some relatively esoteric algebra paper: http://www.sciencedirect.com/science/article/pii/S0021869310003182
and some Grobner bases in a complex analysis paper: http://arxiv.org/abs/1209.4014
and some linear optimisation (had to get a student to write an interface for PPL in Sage) in http://arxiv.org/abs/1207.6512 for coding theory.
and graph theory and algebra (such as Smith normal forms) in Sage/GAP here: http://arxiv.org/abs/1405.0113 - originally to explain some strange coincidence in OEIS.

hopefully this allows me to comment on computer algebra without a bias from a particular narrowly defined... :-)

Richard Fateman

unread,
Dec 13, 2015, 11:02:45 AM12/13/15
to sage-...@googlegroups.com
The difficulty (historically at least) in supporting "user written" code has several facets.

1. Many mathematicians do not have a programming/algorithmic mind set.
  Why?  Unclear.  But look at the books they write, even for calculus.  Instead of
describing algorithms, they give examples and theorems (probably some exceptions now;
I haven't taught calculus in 4 decades...).  What calculus text explains Risch?
Arguably, mathematicians with an algorithm mind set are attracted to computer
science where they can a better paying job :)   The Macsyma project at MIT,
and also our various projects at Berkeley hired itinerant mathematicians to
work full time to develop code. Three come to mind.  Results: almost total failure.
One bizarrely insisted that one couldn't do math on a computer keyboard
because it didn't have keys for alpha, integral, etc.  He owned two "multiplex"
custom typewriters that had such facilities -- two "baskets" of keys and a
mechanism that could shift the carriage from one to another.  The electric
model he considered too dangerous to use.  I think that only the British know
how to nurture such eccentrics.

So to summarize this point:  Mathematicians often cannot reduce their
expertise to programs.

2. (Blame the computer types)  Traditional programming languages do
not support symbolic mathematics directly, and so some additional
design and explanation of that design is required.  There are people who
claim that slight twiddles of a conventional language are sufficient.
FORMAC was a twiddle on FORTRAN, and then PL/I.  Pascal ABC
on Pascal,  Macsyma on Algol-60+Lisp, Reduce on Lisp+Algol,
 Maple (loosely on Algol), Sage on Python, Nemo on Julia, various
experiments on Lisp CLOS (object system)  ...
Others claim that new languages specially designed will do the job.
Like Mathematica, Maple, SAC-2, Altran,  especially Axiom, ...
And the perennial debate as to whether there should be two languages
or one:  user and developer.   (e.g.  "Mathematica Wolfram Language"
vs some dialect of C;   "Maxima top level language" or Lisp, or
just Maple + maple core,  Or many:  Sage-python, but calling libraries
in Lisp, C, python(sympy, numpy), assembler (parts of GMP)...

So to summarize this point: Mathematical syntax and semantics is
not modeled by any existing programming system in spite of decades
of attempts at it.  As a computer scientist primarily, I would blame
the mathematicians (as a group) for using ambiguous or conflicting
syntax and semantics,  and blame the CS types for under-estimating
the need to (force)  mathematicians to think concretely about
data, algorithms, complexity.  Doing mathematics on a computer
can look like magic, but for developing practical re-usable packages,
you need a better understanding.  (And, personally, if a researcher
is too fearful of it,  or somehow or other cannot, learn to program,
the computer algebra system building people should find
someone else.)



What I kind of have in mind is a kind of "projects" folder within Nemo, where people can implement their own specialised project. (Likely there will be lots of external libraries that build on Nemo as well.)

William Stein, if I recall correctly, tried to solve this problem with Sage itself. Mathematicians working on a personal research interest with little interest or expertise in general purpose system building are critical to the lifeblood of a genuine computer algebra community. Those are the smart pure mathematicians who you want to be using your system, promoting your system and to share their expertise with those implementing the general purpose parts of your system (often they have good ideas, motivated by their specific application, even if they aren't interested in implementing those themselves).

This is, to me, a questionable approach, based on past experience.



But by and large, no one is really interested in using their code because it may have been implemented badly, was only intended to provide a few examples for a paper and not really intended for general purpose use, was not documented well, if at all, and it has never been run on more than a handful of examples (no one knows how to check it in other cases, if it even terminates or uses a reasonable amount of ram, or is even theoretically applicable to more than a handful of examples). And it's too specialised for anyone to really understand or review properly anyway.

This person is way more useful than the one who might just "have good ideas".
A careful programmer, or at least one who understands what
it would mean to write a "good" program, is valuable.  This goes
along with a mindset and training that is not a requirement for
doing mathematics, so far as I can tell.



As I recall, William started psage to encourage at least slightly more pure maths oriented applications of Sage, without all the code review and doctesting and software engineering that goes into the main project. Unfortunately I don't think psage survived.

So it's an unsolved problem really. How does one encourage mathematicians with very specialised scripts to contribute them without messing up the rest of the project.
Eh, this is half the problem.  Consider the programs written by the people who
mess up the project because they are prolific programmers but
are totally lacking in "taste".  So the facilities they embed in the
system are, at best,  difficult to explain.  At worst, they mischaracterize
a subfield of mathematical computation.



Actually, I find it really, really hard to convince people that they do not need open commit access to the repository and the ability to add whatever they want, whenever they want, in any state that they want, in order to make use of a computer algebra project. Mathematicians invariably argue for more genericity and less optimisation, and for far fewer software engineering restrictions, all of which run counter to good general purpose system development goal and sound software engineering principles.

This is something that (at least with Macsyma/Maxima) we have not faced.
If you want to load your own code on top of the standard system, you do
so by inserting a line or more  like  load("mydirectory/myreplacementcode")
in the file maxima.init.
You can, if you wish, replace all or part of the parser, evaluator, simplifier,
and add arbitrary code for new functionality.  It can be run interpretively
or highly-optimized compiled.   You can fix small buggy spots (no need
to recompile "the system")  or totally change important components.
Whether or not the main system is open or not should be irrelevant.

I, for one, refuse to put stuff in Maxima github.  If I can't convince at
least one other person that it is worth including, it shouldn't be included.


Somehow the Magma project has managed this to some degree. But I don't really understand how. On the other hand, we only have their word for it that the project is managed carefully from a software engineering perspective and is actually fit for use by a wide variety of users.
I haven't looked at Magma in a long while, but
I was not impressed with the system design/language.


We can't see all their source code to verify this. So maybe they are just cheating. For all we know, Magma is full of algorithms that don't even guarantee a correct answer.
Even if the algorithm guaranteed a correct answer, you
still wouldn't know if the implementation was correct.

You may correctly observe that these are two different
issues, but from the perspective of someone using the
program as a building block in something larger,
it may not matter. 
There are many (all?) CAS that do a half-assed implementation of
symbolic integration.  Do you think that looking at a few
hundred thousand lines of code in Mathematica would allow
you to verify that it is correct?



Who would know? What kind of review has most of it been subjected to? Certainly not public scrutiny.

People use the open source of Maxima to fix bugs that have
been revealed by mis-behavior.  I do not recall anyone
saying that he/she was studying (apparently working) code
to prove that it was correct.

(There are even some people who say that the code is
correct if it compiles.  That is, it does something.  It does
what the programmer wrote;  it has correct typing.  It
is a correct reflection of the program text... This is silly,
but I've seen it happen.  E.g., I say "There is a bug, the
program should return XXX not YYY" and the response
is, "no, the answer YYY is the correct answer. I looked
at the program and it computes YYY."  [Labeling a
bug as a feature.])


I remember discussing a problem having to do with
simplification of "nested radicals" with John Tate,
then at Harvard.  He suggested a simplification of the
problem which would make it quite doable but totally
uninteresting.


Interesting anecdote.
I think he proposed that instead of reducing radical expressions in
variables, x,y,.... variables over the reals,
that I work on the problem if x,y, ... were variables over GF(2).

RJF


Ted Kosan

unread,
Dec 13, 2015, 11:28:26 AM12/13/15
to sage-flame
Parts of this discussion seem to indicate that many developers who
work on computer algebra systems don’t seem to have a clear
understanding of what a computer algebra system is essentially. I
suspect this may be because they don’t have a clear understanding of
what algebra, computation, and related concepts are. Is anyone on this
list able to answer the following questions precisely?

1) What is the difference between a calculus and an algebra? (The
mathematicians at the university where I teach were unable to provide
a clear answer to this question)

2) What exactly is computation?

3) What exactly is an algorithm?

4) What is the difference between a variable in mathematics and a
programming language variable?

Ted

Richard Fateman

unread,
Dec 13, 2015, 11:40:46 AM12/13/15
to sage-...@googlegroups.com
On 12/13/2015 8:28 AM, Ted Kosan wrote:
Parts of this discussion seem to indicate that many developers who
work on computer algebra systems don’t seem to have a clear
understanding of what a computer algebra system is essentially. 

 I think they have a clear idea, but they just don't agree on the idea.

I
suspect this may be because they don’t have a clear understanding of
what algebra, computation, and related concepts are. Is anyone on this
list able to answer the following questions precisely?
They don't need to agree on these terms or concepts either :)


1) What is the difference between a calculus and an algebra? (The
mathematicians at the university where I teach were unable to provide
a clear answer to this question)
It seems to me you should be able to look up the definitions, but even so, people will disagree.
Off the top of my head,  calculus (from "pebbles"?)  should be a computational method,  but
algebra is an abstract structure.  So not even close.

 Unfortunately, calculus has become "the course you take
in your high school senior year if you are good at math".  And algebra is also some kind of
introductory elementary school "arithmetic with symbols".


2) What exactly is computation?
I think this is pretty vague, but some other people should offer a definition.


3) What exactly is an algorithm?
I think you can find this defined clearly by almost any text in computer science. e.g. Donald Knuth probably
does a good job.


4) What is the difference between a variable in mathematics and a
programming language variable?
Partly, it depends on the programming language.  Lisp may be clearest on this, in
that there are symbols that can be associated with values in various ways.
Symbols can have properties.  You might call them variables, but that would be
only one use.

Maybe a variable in math is an indeterminate?


Ted


Ted Kosan

unread,
Dec 14, 2015, 1:49:16 AM12/14/15
to sage-flame
Richard wrote:

>> 1) What is the difference between a calculus and
>> an algebra? (The mathematicians at the university
>> where I teach were unable to provide a clear
>> answer to this question)
>
> It seems to me you should be able to look up the
> definitions, but even so, people will disagree.
> Off the top of my head, calculus (from
> "pebbles"?) should be a computational method, but
> algebra is an abstract structure. So not even
> close.

One would think clear definitions for “a calculus” and “an algebra”
would be easy to look up. However, I have found this to not be the
case. After much searching, the best definitions I have been able to
find so far are:

“A language for which syntactical rules are given is sometimes called
a calculus.”
(“Introduction to Symbolic Logic and its Applications”, Carnap,
Rudolf, 1958, p.80)

“An algebra is basically a model of a theory.” (“A Logical Approach to
Discrete Math”, Gries, David, 1993, p.388)



>> 2) What exactly is computation?
>
> I think this is pretty vague, but some other
> people should offer a definition.

“I will argue that the usual sharp distinction which is made between
the process of computation and deduction, is misleading. An
interpreter for a programming language, and a theorem-proving program
for a logical language, are structurally indistinguishable. [...] More
specifically: I will argue that computation is best regarded as a
process of controlled deduction.” (“Computation and Deduction”, Hayes,
P.J., Proc. 2nd MFCS Symp. Czechoslovak Academy of Sciences, 1973, pp.
105-118.)



>> 3) What exactly is an algorithm?
>
> I think you can find this defined clearly by
> almost any text in computer science. e.g. Donald
> Knuth probably does a good job.

“Conventional algorithms and programs expressed in conventional
programming languages combine the logic of the information to be used
in solving problems with the control over the manner in which the
information is put to use. This relationship can be expressed
symbolically by the equation Algorithm = Logic + Control. Logic
programs express only the logic component of algorithms. The control
component is exercised by the program executor...” ("Logic for Problem
Solving", Kowalski, Robert, p.125, 1979)

Donald Knuth has stated “I'm not strong on logic, so TAOCP treads
lightly on this sort of thing.”
(http://www.informit.com/articles/article.aspx?p=2213858). Having a
clear understanding of what an algorithm is depends on having a deep
understanding of logic. Since Knuth states that he is not strong on
logic, I conclude that he does not have a clear understanding of what
an algorithm is.



>> 4) What is the difference between a variable in
>> mathematics and a programming language variable?
>
> Partly, it depends on the programming language.
> Lisp may be clearest on this, in that there are
> symbols that can be associated with values in
> various ways. Symbols can have properties. You
> might call them variables, but that would be only
> one use.
>
> Maybe a variable in math is an indeterminate?

A variable in math being an indeterminate appears to be accurate.
Edsger Dijkstra states “... as a rule, a mathematical variable
represents no specific value at all...” and that programming language
variables are actually “changeable constants”. (“A discipline of
Programming”, Dijkstra, Edsger, p.11, 1976)
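Dijkstra's distinction can be made concrete in a small sketch (illustrative
only; Python here, and the helper name `identity_holds` is invented for the
example):

```python
# A programming-language variable is a "changeable constant": a name that
# denotes one specific value at a time, and can be rebound.
x = 3
x = x + 1          # the "constant" changes; x now denotes 4
assert x == 4

# A mathematical variable denotes no specific value: an identity such as
# (x + y)^2 = x^2 + 2xy + y^2 is a claim about all values at once.  Plain
# Python can only approximate that universal reading by sampling instances.
def identity_holds(x, y):
    return (x + y) ** 2 == x ** 2 + 2 * x * y + y ** 2

assert all(identity_holds(a, b)
           for a in range(-10, 11)
           for b in range(-10, 11))
```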


Ted

Bill Hart

Dec 14, 2015, 2:05:02 AM12/14/15
to sage-flame
And at this point I stopped reading anything you wrote.
 


Richard Fateman

Dec 14, 2015, 10:08:05 AM12/14/15
to sage-...@googlegroups.com
On 12/13/2015 10:49 PM, Ted Kosan wrote:

<snip>
>

.. ok, so you seem to be happy with calculus vs algebra.
I suppose that in their particular contexts, those statements are fine.
>>> 2) What exactly is computation?
>> I think this is pretty vague, but some other
>> people should offer a definition.
> “I will argue that the usual sharp distinction which is made between
> the process of computation and deduction, is misleading. An
> interpreter for a programming language, and a theorem-proving program
> for a logical language, are structurally indistinguishable. [...] More
> specifically: I will argue that computation is best regarded as a
> process of controlled deduction.” (“Computation and Deduction”, Hayes,
> P.J., Proc. 2nd MFCS Symp. Czechoslovak Academy of Sciences, 1973, pp.
> 105-118.)
Sounds to me like <mumble> = <mumble>, clarifying neither.

If I were to guess at what "a theorem-proving program for a logical
language" or "a logical language"
meant, I would not come up with "computation". Indeed, the claim that
there is a "usual sharp distinction" is key: I (apparently along with Knuth) don't
care, and would not bother thinking about this.


>
>
>
>>> 3) What exactly is an algorithm?
>> I think you can find this defined clearly by
>> almost any text in computer science. e.g. Donald
>> Knuth probably does a good job.
> “Conventional algorithms and programs expressed in conventional
> programming languages combine the logic of the information to be used
> in solving problems with the control over the manner in which the
> information is put to use.
Sorry, this makes no sense. There is a book by N. Wirth, "Algorithms +
Data Structures = Programs", that you could look at for an explanation of
"algorithm" that is perhaps more discursive. I haven't looked at it, and
I am not a big fan of Wirth, but it seems to me the kind of thing he would
rattle on about.
> This relationship can be expressed
> symbolically by the equation Algorithm = Logic + Control.
So, using the "calculus" of arithmetic, Control = Algorithm - Logic?

> Logic
> programs express only the logic component of algorithms. The control
> component is exercised by the program executor...” ("Logic for Problem
> Solving", Kowalski, Robert, p.125, 1979)
I am not familiar with this book, but it sounds like someone is aiming
towards saying that
all you need to do is express rules in Prolog and let the Prolog
interpreter be the control, and
then you have magically made an algorithm.

How about this, for a practical explanation:
You know you have an algorithm if you have a computer program (in an
implemented
computer language on an existing computer) and a finite set of data to
use as input.
If the program executes without error and completes in a finite amount
of time
and produces the appropriate answers for each datum, then you have an
algorithm.

If you don't have this, you might not have an algorithm.
>
> Donald Knuth has stated “I'm not strong on logic, so TAOCP treads
> lightly on this sort of thing.”
> (http://www.informit.com/articles/article.aspx?p=2213858). Having a
> clear understanding of what an algorithm is depends on having a deep
> understanding of logic.
Nope. I can argue it needs an (exceedingly shallow) understanding of what
a computational "step" is, plus the (obvious) notion of deterministic
computing and the (obvious) notion of termination. See how simple a Turing
machine is, if you want to talk about theoretical notions. And reducing
everything to logic is just a particular bias. You could also, I expect,
reduce everything to set theory if you wanted.
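The simplicity being appealed to can be shown with a toy machine. The
following is a sketch only (a minimal deterministic one-tape machine in
Python; `run_tm` and the `flip` rule table are invented for the example,
not any standard formulation):

```python
def run_tm(input_tape, rules, state="start", max_steps=10_000):
    """Run a one-tape Turing machine.  `rules` maps (state, symbol) to
    (written_symbol, move, next_state); move is -1 (left) or +1 (right).
    Blank cells read as "_".  Returns the non-blank tape contents on halt,
    or None if the machine has not halted within max_steps steps."""
    tape = dict(enumerate(input_tape))
    pos = 0
    for _ in range(max_steps):
        if state == "halt":
            return "".join(tape[i] for i in sorted(tape) if tape[i] != "_")
        symbol = tape.get(pos, "_")
        if (state, symbol) not in rules:
            return None          # stuck: no applicable rule
        written, move, state = rules[(state, symbol)]
        tape[pos] = written
        pos += move
    return None                  # did not terminate within the step budget

# One deterministic step at a time: a machine that flips every bit, then
# halts when it reaches the first blank cell.
flip = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", "_"): ("_", +1, "halt"),
}

assert run_tm("1011", flip) == "0100"
```

Everything needed for "algorithm" in the informal sense above is visible:
a step, determinism, and termination.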
> Since Knuth states that he is not strong on
> logic, I conclude that he does not have a clear understanding of what
> an algorithm is.
I think you are using deduction instead of logic. If you were using
logic, you might
conclude that your premise is wrong.
>
>
>
>>> 4) What is the difference between a variable in
>>> mathematics and a programming language variable?
>> Partly, it depends on the programming language.
>> Lisp may be clearest on this, in that there are
>> symbols that can be associated with values in
>> various ways. Symbols can have properties. You
>> might call them variables, but that would be only
>> one use.
>>
>> Maybe a variable in math is an indeterminate?
> A variable in math being an indeterminate appears to be accurate.
> Edsger Dijkstra states “... as a rule, a mathematical variable
> represents no specific value at all...” and that programming language
> variables are actually “changeable constants”. (“A discipline of
> Programming”, Dijkstra, Edsger, p.11, 1976)

Quoting from Dijkstra gets you Dijkstra's opinion. He's not the boss of
all mathematicians.
Or the boss of all computer scientists. And if quoting him helps people
writing computer algebra systems to get around the difficulties that they
encounter and solve in their own ways, I don't see how.

RJF

>
>
> Ted
>

Dima Pasechnik

Dec 15, 2015, 7:16:08 PM12/15/15
to sage-flame
I shall ask the incoming Oxford's Waynflete Professor of Metaphysical Philosophy about algorithms vs logic, next
time we overlap while picking up our kids from the nursery...
I hope the margins of this newsreader will let me record the answer in sufficient detail...
 

Ted Kosan

Dec 16, 2015, 4:10:59 PM12/16/15
to sage-flame
Richard wrote:

>> “I will argue that computation is best regarded as a
>> process of controlled deduction.” (“Computation and Deduction”, Hayes,
>> P.J., Proc. 2nd MFCS Symp. Czechoslovak Academy of Sciences, 1973, pp.
>> 105-118.)
>
> Sounds to me like <mumble> = <mumble>, clarifying neither.

“Control” and “Deduction” are clearly defined in the literature of
mathematical logic.



> So, using the "calculus" of arithmetic, Control = Algorithm - Logic?

My understanding is this is correct (see below).



>> Logic programs express only the logic component of algorithms. The control
>> component is exercised by the program executor...” ("Logic for Problem
>> Solving", Kowalski, Robert, p.125, 1979)
>
> I am not familiar with this book, but it sounds like someone is aiming
> towards saying that all you need to do is express rules in Prolog and
> let the Prolog interpreter be the control, and then you have magically
> made an algorithm.

Robert Kowalski is the logician/computer scientist who did a
significant amount of the theoretical work upon which Prolog is based.
His explanations of algorithms are much deeper than any explanations I
have been able to find which were written by computer scientists who
are not logicians. I suppose that to computer scientists who are not
logicians, the writings of computer scientists who are logicians may
appear to be magical. Here is more of what Kowalski has to say about
the control and logic parts of algorithms:

“Logic programs express only the logic component L of
algorithms. The control component C is exercised by the
program executor, either following its own autonomously
determined control decisions or else following control
instructions provided by the programmer.

The conceptual separation of logic from control has
several advantages:

(1) Algorithms can be constructed by successive refinement,
designing the logic component before the control component.

(2) Algorithms can be improved by improving their control
component without changing the logic component at all.

(3) Algorithms can be generated from specifications, can be
verified and can be transformed into more efficient ones,
without considering the control component, by applying
deductive inference rules to the logic component alone.

[...]

In the systematic development of well-structured algorithms
it is appropriate for the logic component to be specified
before the control component. The logic component expresses
the domain-specific part of an algorithm. It both determines
the meaning of the algorithm and influences the way it
behaves.

The control component, on the other hand, determines the
general-purpose problem-solving strategy. It affects only
the efficiency of the algorithm without affecting its
meaning. Thus different algorithms A1 and A2 obtained by
applying different control C1 and C2 to the same logic L,
are equivalent in the sense that they solve the same
problems with the same results. Symbolically A1 and A2 are
equivalent if

A1 = L + C1
A2 = L + C2.

The equivalence of different algorithms having the same
logic can be used to improve the efficiency of an algorithm
by improving its control without changing its logic. In
particular, replacing bottom-up by top- down control often,
though not always, improves efficiency, whereas replacing
top-down sequential execution of procedure calls by top-down
consumer-producer and parallel execution almost always
improves efficiency, and never harms it.

The arguments for separating logic from control are like the
arguments for separating procedures from data structures.
When procedures are separated from data structures, it is
possible to distinguish what functions the data structures
perform from the manner in which they perform them.

An algorithm can be improved by replacing an inefficient
data structure by a more efficient one, provided that the
new data structure performs the same functions as the old
one. Similarly, when logic is separated from control, it is
possible to distinguish what the algorithm does, as
determined by the logic component, from the manner in which
it is done, as determined by the control component.

An algorithm can be improved by replacing an inefficient
control strategy by a more efficient one, provided that the
logic component is unaltered. In both cases, it is easier to
determine the meaning of the algorithm and to improve
efficiency without affecting meaning.

[...]

The analysis of algorithms into logic and control components
provides two distinct methods for improving the efficiency
of an algorithm. Given a fixed control component,
incorporated in a program executor with limited
problem-solving capabilities, efficiency can be improved by
changing the representation of the problem in the logic
component; or, given a fixed logic component, it can be
improved by improving the problem-solving capabilities of
the program executor.

Changing the logic component is a useful short-term
strategy, since the representation of the problem is
generally easier to change than the problem-solver. Changing
the control component, on the other hand, is a better
long-term solution, since improving the problem-solver
improves its performance for many different problems.”
("Logic for Problem Solving", Kowalski, Robert, pp.125-127, 1979)
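Kowalski's A = L + C slogan can be illustrated with a small sketch
(illustrative only; Python rather than a logic language, and the function
names `fib_c1`/`fib_c2` are invented here). The recurrence is the shared
logic L; naive recursion and memoized recursion are two controls C1 and C2
that solve the same problems with the same results but very different
efficiency:

```python
import functools

# Logic L, shared by both algorithms:
#   fib(0) = 0,  fib(1) = 1,  fib(n) = fib(n-1) + fib(n-2)

def fib_c1(n):
    # Control C1: plain top-down recursion; subgoals are re-derived
    # exponentially many times.
    return n if n < 2 else fib_c1(n - 1) + fib_c1(n - 2)

@functools.lru_cache(maxsize=None)
def fib_c2(n):
    # Control C2: the same defining equations, but each subgoal is
    # derived once and remembered (linear in n).
    return n if n < 2 else fib_c2(n - 1) + fib_c2(n - 2)

# A1 = L + C1 and A2 = L + C2 are equivalent in Kowalski's sense:
assert all(fib_c1(n) == fib_c2(n) for n in range(20))
```

In Kowalski's terms, replacing C1 by C2 improves the algorithm without
touching its logic component at all.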



> Quoting from Dijkstra gets you Dijkstra's opinion.
> He's not the boss of all mathematicians. Or the
> boss of all computer scientists.

Dijkstra was knowledgeable about mathematical logic. On topics related
to computer programming, I have found that the opinions of computer
scientists who are knowledgeable about mathematical logic are much
more informed than the opinions of computer scientists who are not
knowledgeable about mathematical logic.

Ted

Ted Kosan

Dec 16, 2015, 4:13:44 PM12/16/15
to sage-flame
Dima wrote:

> I shall ask the incoming Oxford's Waynflete
> Professor of Metaphysical Philosophy about
> algorithms vs logic, next time we overlap while
> picking up our kids from the nursery... I hope the
> margins of this newsreader will let me record the
> answer in sufficient detail...

Is the incoming Waynflete Professor Ofra Magidor
(http://users.ox.ac.uk/~ball1646/)? If so, I would be very interested
to hear what she has to say on algorithms vs. logic because she
appears to be both a computer scientist and a logician.

Ted

rjf

Dec 16, 2015, 6:59:54 PM12/16/15
to sage-flame
I am not sure exactly what tkosan is trying to say, but
if Kowalski is arguing that Logic = collection of Prolog rules
and Control = Prolog language interpreter,

then I would venture to say that the conclusions drawn about
validity, efficiency, logic and control pertain only to a limited
domain of algorithm development. And a not very interesting one
to people building large complex systems.

Here's why..  you can read this page


Now if Kowalski and TK are arguing about resolution theorem
proving, or perhaps some other general method, then the proposal
is akin to suggesting that algorithms = collection of all axioms
+ all initial conditions + all logical rules + search.

Not a great idea, at least if you are expecting results soon.

Now what about the claim that computer scientists who
doubt Kowalski simply don't understand logic.  Like D.E. Knuth.

Does implementing Prolog in an imperative language
(say, Lisp)  mean that someone understands logic?

My feeling, from reading the extensive quote from
Kowalski, is that he has a very limited concept of what
can be done with procedural state-based programming.
Furthermore, he is using the term algorithm to denote
something rather different from the usual one.  You may
think he is "deeper"  but I would say instead "narrower".

While Prolog continues to have its adherents (and I have
repeatedly taught about it -- and how to implement it),
it is a niche language. You may be aware of the Japanese
Fifth Generation Project (see wikipedia), which based
its approach on parallel computing and logic programming.
It was widely viewed as a failure, and the choice of
logic programming contributed to that failure.