--
You received this message because you are subscribed to the Google Groups "sage-flame" group.
To unsubscribe from this group and stop receiving emails from it, send an email to sage-flame+...@googlegroups.com.
To post to this group, send email to sage-...@googlegroups.com.
Visit this group at http://groups.google.com/group/sage-flame.
For more options, visit https://groups.google.com/d/optout.
On 12/11/2015 3:22 PM, Jakob Kroeker wrote:
> If we can trust the report at danluu.com about the julia language (I didn't check),
> there is likely little reason for sage developers to fear competition with Nemo (just in case) in the near future:
> the julia language development seems to have serious quality issues.
> On the other hand, it is sad that another computer algebra package (Nemo),
> potentially reusable in other CASes, might struggle with software development issues...

Looking at the description of Nemo here
http://nemocas.org/
I would not call it a computer algebra system (CAS), as that term is usually used to denote
symbolic and algebraic manipulation programs like Maxima, Mathematica, and Maple.
Frankly, any application package that asserts in the first line of its description
that it is "for the X programming language" is likely to be written to appeal to
programmers using X.
If Sage appeals to pythonistas and Nemo to
julianistas, there is no conflict.
I assume that the people doing Nemo have some reason to do it;
otherwise, endlessly rewriting the basic routines used for (say) polynomial arithmetic
does not advance the understanding of such software. Maybe the speed is
improved, and this makes people happy. Sometimes that makes me happy too, not
that I am so impatient to get results out of my computer programs.
On Fri, Dec 11, 2015 at 5:25 PM, 'Bill Hart' via sage-flame
<sage-...@googlegroups.com> wrote:
> So now sage-flame is used for flaming Nemo? Why the sudden interest? Bored
> with life? Can't sleep?
>
> On 12 December 2015 at 02:13, Richard Fateman <fat...@berkeley.edu> wrote:
>>
>> On 12/11/2015 3:22 PM, Jakob Kroeker wrote:
>>
>>
>>
>> If we can trust the report at danluu.com about the julia language (I
>> didn't check),
>> there is likely little reason for sage developers to fear competition with
>> Nemo (just in case) in the near future:
>> the julia language development seems to have serious quality issues.
>>
>> Looking at the description of Nemo here
>> http://nemocas.org/
>>
>> I would not call it a computer algebra system (CAS) as usually used to
>> denote
>> symbolic and algebraic manipulation programs like Maxima, Mathematica,
>> Maple.
>
>
> Correct. It is not a computer algebra system. Which is why we call it a
> "computer algebra package" with the aim of eventually covering commutative
> algebra (also often called computer algebra), number theory and group theory
> (most of which it does not cover yet, because you know, time and all).
"Computer algebra package" versus "computer algebra system" is not
really much of a distinction.
Everywhere, I always try to call Sage "mathematical software". Calling
it "mathematical software" is something I just made up, since I
personally don't like the term "computer algebra" to describe Sage (or
Nemo); it's misleading.
>>
>>
>> As a computer algebra package it makes use of
>> Flint, Antic, Pari, GMP/MPIR, MPFR, Singular and Arb,
>> all of which are presumably in Sage already.
>
>
> And yet Singular is for computer algebra, Pari for number theory, etc.
>
> Damned if you do, damned if you don't.
Don't -- just use "mathematical software" instead.
Or you could call it "The world's definitive system for modern
technical computing"...
On Fri, Dec 11, 2015 at 3:22 PM, Jakob Kroeker
<google...@spaceship-earth.net> wrote:
>
>
> If we can trust the report at danluu.com about the julia language (I didn't
> check),
> there is likely little reason for sage developers to fear competition with
> Nemo (just in case) in the near future:
Sage developers are on the same team as Julia developers -- there's no
fear here. When Nemo has substantial 10x-better functionality than
Sage in enough ways, it'll just be included in Sage; our frickin'
source download is 500MB after all.
I fear competition with Magma, Mathematica, Matlab, and maybe Maple much more.
For what it's worth, I appreciate you posting that blog post; I read
it carefully, along with some others on the guy's page. He basically
just described what every rapidly developing software project is like during
the early days, and he says so very clearly at the end. The best
approach to organizing the development of software depends hugely on
the age of the software, just like the best approach to treating a
child depends on their age, to running a company depends on its age
and size (and industry), etc.
The only thing that is definitely
wrong is trying to apply one-size-fits-all approaches... to anything.
On Fri, Dec 11, 2015 at 9:01 PM, 'Bill Hart' via sage-flame
<sage-...@googlegroups.com> wrote:
>> Sage developers are on the same team as Julia developers -- there's no
>> fear here. When Nemo has substantial 10x-better functionality than
>> Sage in enough ways, it'll just be included in Sage; our frickin'
>> source download is 500MB after all.
>
>
> Julia plus its dependencies would add quite a lot to that. But since some of
> those dependencies are already in Sage, it probably wouldn't be a huge hit.
How much? I would rather know sooner rather than later...
> It should in theory be possible to embed Julia and use it from within Sage
> fairly easily should that ever be desirable. We'll see about the 10x-better
> functionality. That's a pretty tall order.
> I'm not sure how motivated people will be to incorporate ideas from Nemo
> into Sage. That'll probably dictate whether Nemo ends up being part of Sage
> or not.
A huge point of Sage is incorporating ideas from other software, at
least if the software is sufficiently good. Building the car, not
reinventing the wheel. We'll want Nemo.
On 12 December 2015 at 07:57, William Stein <wst...@gmail.com> wrote:
On Fri, Dec 11, 2015 at 9:01 PM, 'Bill Hart' via sage-flame
<sage-...@googlegroups.com> wrote:
>> Sage developers are on the same team as Julia developers -- there's no
>> fear here. When Nemo has substantial 10x-better functionality than
>> Sage in enough ways, it'll just be included in Sage; our frickin'
>> source download is 500MB after all.
>
>
> Julia plus its dependencies would add quite a lot to that. But since some of
> those dependencies are already in Sage, it probably wouldn't be a huge hit.
> How much? I would rather know sooner rather than later...

I think it'll come in under 100MB. But it's certainly not going down and is still changing.
libedit-dev, libncurses5-dev.
Here for example is a set of notes in English by one of our lecturers/researchers, called "Computer Algebra".
It outlines the elementary parts of the subject pretty well.
Bear in mind that we intend to rely on Singular for the majority of the computer algebra we intend to provide in Nemo.
Specifically, its Groebner basis engine is crucial for what we have in mind. We also intend to supply access to the many Singular libraries written in the Singular language, modulo many improvements we have planned. It's obviously not going to happen overnight.
On 12/11/2015 9:59 PM, 'Bill Hart' via sage-flame wrote:
> Well, if you choose to define computer algebra that way, then it defines it.
> Here for example is a set of notes in English by one of our lecturers/researchers, called "Computer Algebra".
> It outlines the elementary parts of the subject pretty well.
I would not agree at all.
It treats "symbolic integration" in about 1.5 pages out of 222, and there it is misleading,
not distinguishing the two rather different problems of symbolic definite integration
and symbolic (indefinite) integration in terms of elementary functions. It is also incorrect
in asserting that Risch's methods are an algorithm. And (I suspect) is incorrect in
claiming Axiom is best on this.
It omits entirely some very significant mathematics, e.g. look up Schanuel's conjecture and
results of Daniel Richardson.
It omits mention of Collins, Brown, Zippel, etc regarding computation of polynomial GCDs.
In fact, it seems to omit complexity entirely.
It also omits entirely mention of the FFT,
an important underlying (discrete or continuous) method. I think I could find other
omissions if I compared a course outline of mine to the notes.
This appears to be a course on algebra. It occasionally mentions computer programs, but
it is a course in mathematics essentially devoid of the (in my opinion) far more interesting
components of the subject (Doing Mathematics on a Computer), whatever you might wish to call it.
Mr Boehm's notes are not unique in their style of coverage. Just (again, my opinion)
unappealing. As a teacher of a graduate course with a similar title, I could never
use these notes. My course was in Computer Science, not Mathematics.
While we are dealing with naming issues, it was my impression that the objection to
the term "manipulation" came from German-speaking participants at a conference,
and I was told that it had, in German, uncomfortable associations. Now that Google translate
is available, I tried some translations. Manipulieren doesn't sound so bad. Frisieren
has implications of "trickery". I was under the impression that it might have been worse,
that the translation was too close to Masturbieren for comfort.
Maybe it was just that manipulation sounds non-rigorous, and not a proper subject for
study? I have certainly noticed the emphasis on rigor in teaching from our visitors
from Germany. Unless they can change their approach,
this makes them generally ineffective as undergraduate instructors.
There is the example of a large apparently well-researched book titled "Modern Computer Algebra"
whose first edition omitted most of the work on symbolic indefinite integration (beyond rational
functions), because the complexity of the Risch "algorithm" was not yet analyzed.
(The fact that Risch relies on recursively non-computable sub-tasks is a hazard).
This book does, however, represent a point of view that is also dubious, which is
that the proper study of the subject should be dominated by complexity analysis of
worst cases. While it does occasionally delve into timing, its analysis is
not (in my opinion) credible.
Oddly, Groebner basis calculations formed a very small part of the course I usually taught, though.
> Bear in mind that we intend to rely on Singular for the majority of the computer algebra we intend to provide in Nemo.
> Specifically, its Groebner basis engine is crucial for what we have in mind. We also intend to supply access to the many Singular libraries written in the Singular language, modulo many improvements we have planned. It's obviously not going to happen overnight.
Henrik Lenstra teaches (or taught) about applications of Groebner calcs.
On 12 December 2015 at 18:30, Richard Fateman <fat...@berkeley.edu> wrote:
> On 12/11/2015 9:59 PM, 'Bill Hart' via sage-flame wrote:
>> Well, if you choose to define computer algebra that way, then it defines it.
>> Here for example is a set of notes in English by one of our lecturers/researchers, called "Computer Algebra".
>> It outlines the elementary parts of the subject pretty well.
> I would not agree at all.

I think I did say it covers only the elementary parts of the subject. Much of the stuff you mention would be suitable for a graduate course, maybe. I wouldn't teach it to undergraduates.

> It treats "symbolic integration" in about 1.5 pages out of 222, and there it is misleading,
> not distinguishing the two rather different problems of symbolic definite integration
> and symbolic (indefinite) integration in terms of elementary functions. It is also incorrect
> in asserting that Risch's methods are an algorithm. And (I suspect) is incorrect in
> claiming Axiom is best on this.
> It omits entirely some very significant mathematics, e.g. look up Schanuel's conjecture and
> results of Daniel Richardson.

Honestly, that just sounds like two random results out of many. Do you really think such a course should include mention of every such thing?

> It omits mention of Collins, Brown, Zippel, etc regarding computation of polynomial GCDs.
> In fact, it seems to omit complexity entirely.

I actually agree with you that the course could benefit from the addition of these. In fact, we were discussing this the other day at lunch (not with Dr. Boehm, though). But there is the matter of the appetite of a student to be taken into account. On a first reading they find some things too tedious.

> It also omits entirely mention of the FFT,
> an important underlying (discrete or continuous) method. I think I could find other
> omissions if I compared a course outline of mine to the notes.

You could fill a book with results on that. There are over 800 papers in the literature on this. Probably its omission says more about Dr. Boehm's interests than about computer algebra as a subject.
(Starting fresh, without including quotes.)
I disagree that Boehm's course notes are somehow foundational and other topics that might be
in a graduate course would depend on these.
One could presumably construct such a course,
but it would not be one I would care to teach (or study). A few sections of the notes are useful,
maybe.
What I find important about the FFT is that it makes it possible to multiply polynomials surprisingly
fast (and, with a bit of handwaving, one can point out that it can also be used to multiply long numbers).
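To make the FFT point concrete, here is a minimal numpy sketch (the name fft_polymul and the examples are made up for illustration; this is not code from any of the systems discussed here):

```python
import numpy as np

def fft_polymul(a, b):
    """Multiply two polynomials given as coefficient lists (lowest degree
    first) by evaluating at roots of unity via the FFT, multiplying
    pointwise, and interpolating back: O(n log n) vs. schoolbook O(n^2)."""
    n = len(a) + len(b) - 1                    # coefficients in the product
    size = 1 << max(1, (n - 1).bit_length())   # FFT length: next power of two
    fa = np.fft.rfft(a, size)
    fb = np.fft.rfft(b, size)
    prod = np.fft.irfft(fa * fb, size)[:n]
    return np.rint(prod).astype(int)           # integer inputs: round off FP noise

# (1 + 2x)(3 + x) = 3 + 7x + 2x^2
print(fft_polymul([1, 2], [3, 1]))             # -> [3 7 2]
```

The same trick handles long integers: treat the digits as polynomial coefficients in the base, multiply, then propagate carries.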
I think students find it interesting that there is no algorithmic solution to the
problem of "is f(x) always zero" for a modest class of expressions defining f.
The question that should be in students' minds is
"Can we mechanize all of mathematics?"
On Saturday, 12 December 2015 17:30:51 UTC, fateman wrote:
> On 12/11/2015 9:59 PM, 'Bill Hart' via sage-flame wrote:
>> Well, if you choose to define computer algebra that way, then it defines it.
>> Here for example is a set of notes in English by one of our lecturers/researchers, called "Computer Algebra".
>> It outlines the elementary parts of the subject pretty well.
> I would not agree at all.

Indeed, this is a course on computational commutative algebra (that's what they are good
at over there).
It skims over other topics in the introduction, but otherwise it's just mistitled.
On 12 December 2015 at 22:14, Dima Pasechnik <dim...@gmail.com> wrote:
> Indeed, this is a course on computational commutative algebra (that's what they are good
> at over there).

No, I used to think that too, but after attending the Mathemagix conference some years back (the best conference I have ever attended, without a doubt, and by a big margin), I realised that there was much more to the subject than I had originally thought.

This was independently confirmed by people like Joris van der Hoeven, Stephen Watt, Dan Grayson and others, who talked about computer algebra topics of which I was completely unaware at the time. They spoke of them as though they were quite central to the topic. I was confused at the time, because I had been (mis)led into believing that computer algebra was a branch of theoretical computer science. I also didn't know anything at all about these topics at the time and was quite blindsided by their discussions. They seemed to assume I knew all about those topics too, and yet somehow I had never been exposed to them.

I see now that it is the commutative algebra in the subject that makes it mathematical and not just a branch of theoretical computer science, which is incidentally what I always thought, but couldn't justify to people.

So unless you want to exclude the entire German Fachgruppe Computeralgebra, Stephen Watt, the Macaulay 2 group, Joris van der Hoeven and collaborators, and many others in Asia and Europe, you can't justify the claim that those lecture notes are not on computer algebra.

> It skims over other topics in the introduction, but otherwise it's just mistitled.

It does certainly skim over some of the other topics in the subject. But the title is well-defended.
On Saturday, 12 December 2015 21:28:33 UTC, Bill Hart wrote:
> No, I used to think that too, but after attending the Mathemagix conference some years back, I realised that there was much more to the subject than I had originally thought. [...]
> So unless you want to exclude the entire German Fachgruppe Computeralgebra, Stephen Watt, the Macaulay 2 group, Joris van der Hoeven and collaborators and many others in Asia and Europe, you can't justify that those lecture notes are not on computer algebra.

The notes are on a topic of computer algebra called computational commutative algebra/algebraic geometry.
It is a big topic, and among the central ones, but not the only one. Computational number theory or
computational group theory can get away without a lot of computational commutative algebra.
And I don't see why computational differential Galois theory (also known as symbolic integration) is brushed aside
(this is just pure ignorance, if you ask me).
Or computational topology.
No, really: the Groebner basis is a nice idea, but not something one cannot live without; even in computational
commutative algebra one can get quite far with resultants and various geometric tricks.
I am afraid to a large extent it has to do with politics in this science :-(
People want to partition a broad area into small subjects and fight over grant money
and jobs. And to win this fight, one concentrates on narrow topics...
Leaving aside the benefits of knowing tools from faraway areas,
one almost certainly does not need Groebner bases if one is doing computations in finite groups.
And someone who does commutative algebra does not need strong generating sets
for permutation groups, or power-commutator presentations
(while these topics are something that they swear by in computational group theory).
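(As a toy illustration of strong generating sets: they are what lets one answer basic questions about a permutation group in polynomial time, with no commutative algebra in sight. A sketch using sympy's combinatorics module, which is not one of the systems under discussion; the generators below are made up for the example.)

```python
from sympy.combinatorics import Permutation, PermutationGroup

# A 5-cycle and a transposition together generate the symmetric group S_5.
a = Permutation([1, 2, 3, 4, 0])   # maps i -> i+1 mod 5
b = Permutation([1, 0, 2, 3, 4])   # swaps 0 and 1

G = PermutationGroup([a, b])

# order() runs Schreier-Sims under the hood: it computes a base and a
# strong generating set, from which the group order is a product of
# transversal (orbit) sizes.
print(G.order())           # -> 120
print(len(G.strong_gens))  # size of the strong generating set it found
```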
And in computational real algebraic geometry Groebner bases are of very limited use,
believe me, I work in this area. As soon as you want to find out whether a system of
polynomial inequalities determines an empty set, you are almost certainly seriously out
of luck with Groebner bases.
>> It skims over other topics in the introduction, but otherwise it's just mistitled.
> It does certainly skim over some of the other topics in the subject. But the title is well-defended.
That title is highly political. Like the title of "Pravda" newspaper ;-)
I imagine lecture notes on Computer Algebra in Aachen might not even mention
Groebner bases at all...
Critical points on a deformed hypersurface: g(X) = e*(sum_k X_k^{2m}) + (1-e)*f(X) = 0, with deg(f) < 2m; take the partial derivatives d/dX_k g(X) = 0 for k > 1;
together with g(X), they already form a Groebner basis (if we must mention it, although we don't need to).
Find the roots of this system using the Stickelberger lemma (linear algebra, more or less), and then take the limit e -> 0.
Funnily enough, this is, in theory, the fastest known way to solve polynomial systems over the reals...
> I am afraid to a large extent it has to do with politics in this science :-(
> People want to partition a broad area into small subjects and fight over grant money
> and jobs. And to win this fight, one concentrates on narrow topics...

Here we go again...

> Leaving aside benefits of knowing tools from far away areas,
> one almost certainly does not need Groebner bases if one is doing computations in finite groups.

Computational group theory only partially overlaps with commutative algebra. It's really its own topic.

> And someone who does commutative algebra does not need strong generating sets
> for permutation groups, or power-commutator presentations
> (while these topics are something that they swear by in computational group theory)

Of course.

> And in computational real algebraic geometry Groebner bases are of very limited use,
> believe me, I work in this area. As soon as you want to find out whether a system of
> polynomial inequalities determines an empty set, you are almost certainly seriously out
> of luck with Groebner bases.

Sure.

> That title is highly political. Like the title of "Pravda" newspaper ;-)
> I imagine lecture notes on Computer Algebra in Aachen might not even mention
> Groebner bases at all...

Bill.
--
The fact remains that a number of very fundamental computations that you want to do in commutative algebra more or less require Groebner bases. They drop out of the ideal membership problem quite naturally and almost fundamentally (apart from still essentially depending on an ordering).

That's a bad thing, of course. The oftentimes double exponential nature of the algorithms means that we'd really like better algorithms than we currently have. Any time we can avoid using a Groebner basis, we should. But you certainly can't do away with them. They are absolutely required for many applications, even fundamental ones, as things are currently.

If that's not the case, then we'd really like to know about it. We want to implement such fundamental things in Nemo, and after looking around for people who have described alternative methods that are faster, we don't see much on offer. For specific applications and problem domains, other techniques exist. But for many things you are completely constrained and must use GBs.
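(The ideal membership test just described can be sketched in a few lines: reduce a polynomial modulo a Groebner basis of the ideal and check whether the remainder vanishes. A toy illustration using sympy rather than Singular or Nemo; the ideal and polynomials are made up for the example.)

```python
from sympy import symbols, groebner, expand

x, y = symbols('x y')

# Ideal I = <x^2 + y, x*y - 1>. A polynomial p lies in I exactly when its
# remainder on division by a Groebner basis of I is zero.
G = groebner([x**2 + y, x*y - 1], x, y, order='lex')

p = expand(y*(x**2 + y) + x*(x*y - 1))   # in I by construction
q = x + y                                # not in I

print(G.reduce(p)[1])   # -> 0 : p is a member of the ideal
print(G.reduce(q)[1])   # nonzero remainder: q is not
```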
Well, I am not saying that I know faster methods to do ideal membership, etc. (especially not in characteristic p),
but there are many things out there that don't need ideal membership, e.g. real-domain things
such as solutions of systems of polynomial inequalities (i.e. semi-algebraic sets),
yet they are fundamental for many applications of computer algebra.
A fundamental problem is to have an effective Positivstellensatz implementation, something that
has a lot to do with nonnegative polynomials and sums of squares. Another fundamental problem is to have
quantifier elimination over reals.
Similarly, symbolic integration is what computer algebra systems are used for a lot, and no system out there
does it quite right...
Again, trimming all the quotes.
1. Perhaps my own prejudice, but I view Groebner basis computation as a last
resort. That is, if the task can be accomplished by any other method, that method
is faster.
There are several classic tasks that have very high complexity, and also
exhibit high computation times in practice, even on what most people would think
are rather small examples. GB for one. Cylindrical Algebraic Decomposition.
Resultant calculations, which can be slow if you have a bad GCD algorithm,
can be used instead of GB for some tasks.
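(As a small illustration of the resultant point: eliminating a variable from a bivariate system takes only a determinant computation, no Groebner basis. A sympy sketch; the circle-and-line system is made up for the example.)

```python
from sympy import symbols, resultant, solve

x, y = symbols('x y')

f = x**2 + y**2 - 1    # a circle
g = x - y              # a line

# The resultant with respect to y eliminates y: it is the determinant of
# the Sylvester matrix of f and g, and vanishes exactly at the x-values
# of common roots.
r = resultant(f, g, y)
print(r)               # -> 2*x**2 - 1
print(solve(r, x))     # the two x-coordinates, +-sqrt(2)/2
```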
2. Risch integration is not an algorithm because it requires determining the algebraic
independence of elements constructed in a differential field by adjoining
various functions. This can require that we know things that we don't know
and/or provably cannot compute, like whether some f(x) is identically zero. (D. Richardson's
results say we cannot, in general.)
Schanuel's conjecture involves the independence of
transcendental extensions exp(z) over a base field. We do not even know
that e^e is irrational, or e + pi. So this conjecture is not some random
number theory conjecture, but one that has important bearing on the constructive
nature of certain computations that are part of the Risch "algorithm".
The
questions also come up in simplification or "zero-equivalence".
As you know, certain tasks require you to know the degree of a polynomial.
If the leading term's coefficient is zero, then it is not the leading term.
If the coefficients of all the exponential terms are zero, maybe what you
have left is a polynomial? etc..
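(The zero-equivalence problem lurking behind this, "is this coefficient actually zero?", shows up in even a toy sympy example: structural comparison misses an identically zero expression, while a heuristic simplifier catches this particular one. Richardson's theorem says no algorithm settles it for a rich enough expression class.)

```python
from sympy import symbols, sin, cos, simplify

x = symbols('x')

# Identically zero, but not syntactically zero:
c = sin(x)**2 + cos(x)**2 - 1

print(c == 0)            # -> False : structural comparison sees a nonzero tree
print(simplify(c) == 0)  # -> True  : a (heuristic) zero test succeeds here
```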
(If you care: the Wikipedia article on this, which I just
glanced at, seems brief, easy reading, and correct as far as I can tell.)
So even if Bronstein stamped out all the other nuances of what is needed to
do Risch integration, his methods still must rely on functionality that is not constructive.
I am sure he was well aware of this.
I suspect that the Mathematica integration program is quite strong, but I
am insufficiently motivated to try to compare it to others.
Besides which, indefinite integration in finite terms is rarely of interest.
Definite integration, ESPECIALLY when it cannot be done by simply
substituting limits into the indefinite integral, is far more important.
It is also usually done quite accurately and fast in practical circumstances
by numerical programs.
To some extent CAS are used to just do Freshman calculus homework
or to do impressive parlor tricks.
Wolfram made a great impression on
the media by showing 3D plots of math functions.
What, after all this, should we be impressed by? What is a task that
is hard, important, can be done by a CAS and probably not without it?
I think that one area is proving the correctness of important (numerical)
programs.
Too bad this interesting discussion (a) has diverged from the title, and
(b) is even in the wrong newsgroup.
Again, tossing out the quotes.
In the relatively early days of building systems for mathematics, the
motivation was to make systems that were experts in doing what
mathematicians do. The thought was that by building computer
experts in different fields one could link them together and build
something that effectively mimicked human behavior in a non-trivial
domain.
A less ambitious version of this would be to figure out the
"Arithmetization" of mathematics. An area of interest going
back to G. Frege, Russell & Whitehead, and (today) MKM
researchers.
This too is (dare I say, impossibly) ambitious.
So what did people do that was not impossibly ambitious?
For the most part, they tried
to do interesting applications they thought could be automated, and found
difficulties (hence research on polynomial GCD,
polynomial factoring, integration, limits, series.)
Other people decided to do things motivated by their own
curiosity that they thought could be automated (applications to "pure mathematics").
They had separate sessions in the early conferences
run by SIGSAM. There was very little cross fertilization with
the "general purpose" system builders.
I remember discussing a problem having to do with
simplification of "nested radicals" with John Tate,
then at Harvard. He suggested a simplification of the
problem which would make it quite doable but totally
uninteresting.
What I have in mind is a kind of "projects" folder within Nemo, where people can implement their own specialised projects. (Likely there will be lots of external libraries that build on Nemo as well.)
William Stein, if I recall correctly, tried to solve this problem with Sage itself. Mathematicians working on a personal research interest with little interest or expertise in general purpose system building are critical to the lifeblood of a genuine computer algebra community. Those are the smart pure mathematicians who you want to be using your system, promoting your system and to share their expertise with those implementing the general purpose parts of your system (often they have good ideas, motivated by their specific application, even if they aren't interested in implementing those themselves).
But by and large, no one is really interested in using their code: it may have been implemented badly; it was often only intended to provide a few examples for a paper, not for general purpose use; it was not documented well, if at all; and it has never been run on more than a handful of examples (no one knows how to check it in other cases, whether it even terminates or uses a reasonable amount of RAM, or whether it is even theoretically applicable beyond those examples). And it's too specialised for anyone to really understand or review properly anyway.
As I recall, William started psage to encourage at least slightly more pure maths oriented applications of Sage, without all the code review and doctesting and software engineering that goes into the main project. Unfortunately I don't think psage survived.
So it's an unsolved problem, really: how does one encourage mathematicians with very specialised scripts to contribute them without messing up the rest of the project?
Actually, I find it really hard to convince people that they do not need open commit access to the repository and the ability to add whatever they want, whenever they want, in any state that they want, in order to make use of a computer algebra project. Mathematicians invariably argue for more genericity and less optimisation, and for far fewer software engineering restrictions, all of which run counter to good general purpose system development goals and sound software engineering principles.
Somehow the Magma project has managed this to some degree. But I don't really understand how. On the other hand, we only have their word for it that the project is managed carefully from a software engineering perspective and is actually fit for use by a wide variety of users.
We can't see all their source code to verify this. So maybe they are just cheating. For all we know, Magma is full of algorithms that don't even guarantee a correct answer.
Who would know? What kind of review has most of it been subjected to? Certainly not public scrutiny.
> I remember discussing a problem having to do with
> simplification of "nested radicals" with John Tate,
> then at Harvard. He suggested a simplification of the
> problem which would make it quite doable but totally
> uninteresting.
Interesting anecdote.
Parts of this discussion seem to indicate that many developers who work on computer algebra systems lack a clear understanding of what a computer algebra system essentially is.
I suspect this may be because they don't have a clear understanding of what algebra, computation, and related concepts are. Is anyone on this list able to answer the following questions precisely?
1) What is the difference between a calculus and an algebra? (The mathematicians at the university where I teach were unable to provide a clear answer to this question)
2) What exactly is computation?
3) What exactly is an algorithm?
4) What is the difference between a variable in mathematics and a programming language variable?
Ted
>> 4) What is the difference between a variable in
>> mathematics and a programming language variable?
>
> Partly, it depends on the programming language.
> Lisp may be clearest on this, in that there are
> symbols that can be associated with values in
> various ways. Symbols can have properties. You
> might call them variables, but that would be only
> one use.
>
> Maybe a variable in math is an indeterminate?
That a variable in math is an indeterminate appears to be accurate.
Edsger Dijkstra states that "... as a rule, a mathematical variable
represents no specific value at all ..." and that programming language
variables are actually "changeable constants" ("A Discipline of
Programming", Dijkstra, 1976, p. 11).
Ted