Problem of reduction of rational functions


dhr

Apr 15, 2018, 3:39:36 PM
to sage-devel
Hi

Reduction of rational functions seems not to work in specific cases.
In the following output,

===================
sage: R.<t>=QQ[]
sage: (2*t+2)/(2*t)
(2*t + 2)/(2*t)
sage: (2*t+2)/(2)
t + 1
sage: (2*t^2+2*t)/(2*t)
t + 1

===================
The common factor 2 is not cancelled in the first calculation.

SageMath version 8.1

Vincent Delecroix

Apr 15, 2018, 4:27:40 PM
to sage-...@googlegroups.com
The representation is indeed not canonical, but the objects compare coherently:

sage: R.<t>=QQ[]
sage: (2*t+2)/(2*t)
(2*t + 2)/(2*t)
sage: (2*t+2)/(2*t) == (t+1)/t
True

The reason is that 2 is a unit in QQ. You can compare with

sage: R.<t>=ZZ[]
sage: (2*t+2)/(2*t)
(t + 1)/t
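
One can check directly that 2 is a unit in QQ but not in ZZ:

sage: QQ(2).is_unit()
True
sage: ZZ(2).is_unit()
False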

It would be nice to have better simplification rules for QQ (and more
generally fraction fields).

Vincent

Dima Pasechnik

Apr 15, 2018, 6:53:08 PM
to sage-devel


On Sunday, April 15, 2018 at 9:27:40 PM UTC+1, vdelecroix wrote:
> The representation is indeed not canonical, but the objects compare coherently:
>
> sage: R.<t>=QQ[]
> sage: (2*t+2)/(2*t)
> (2*t + 2)/(2*t)
> sage: (2*t+2)/(2*t) == (t+1)/t
> True
>
> The reason is that 2 is a unit in QQ. You can compare with
>
> sage: R.<t>=ZZ[]
> sage: (2*t+2)/(2*t)
> (t + 1)/t
>
> It would be nice to have better simplification rules for QQ (and more
> generally fraction fields).

I suppose it's only OK to have this as an option, since in general computing
such a canonical form would be slow, no?

Dima 

Nils Bruin

Apr 15, 2018, 7:06:28 PM
to sage-devel
On Sunday, April 15, 2018 at 3:53:08 PM UTC-7, Dima Pasechnik wrote:
>> It would be nice to have better simplification rules for QQ (and more
>> generally fraction fields).
>
> I suppose it's only OK to have this as an option, since in general computing
> such a canonical form would be slow, no?

For fraction fields of Euclidean domains it's not so bad (as ZZ['x'].fraction_field() shows).  Furthermore, if you consistently don't clear common content from your numerator/denominator pairs, you can end up with quite bad coefficient blow-up.

Of course, the work-around is to use ZZ['x'].fraction_field().
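
Concretely, the workaround looks like this (a small illustration, the same computation as above but through an explicit fraction field):

sage: F = ZZ['x'].fraction_field()
sage: x = F.gen()
sage: (2*x + 2)/(2*x)
(x + 1)/x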

Dima Pasechnik

Apr 16, 2018, 4:24:39 AM
to sage-devel
In the multivariate case things like GCD are certainly very expensive.

John Cremona

Apr 16, 2018, 4:43:59 AM
to Sage devel
In this case (one variable, over QQ) it would be simple to extract the leading coefficients of the numerator and denominator, say a and b, reduce the rational number a/b to a1/b1, and scale the rational function so that the new leading coefficients are a1 and b1, which will be coprime integers.  This is simpler than writing the numerator and denominator as a rational times a primitive integral polynomial, though that is probably what users would prefer.
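
A minimal sketch of that recipe in a Sage session (normalize_lc is just an illustrative helper name, not an existing Sage function):

sage: def normalize_lc(num, den):
....:     a, b = num.leading_coefficient(), den.leading_coefficient()
....:     c = a/b                     # reduced automatically as an element of QQ
....:     s = c.numerator()/a         # equals c.denominator()/b
....:     return num*s, den*s
sage: R.<t> = QQ[]
sage: normalize_lc(2*t + 2, 2*t)
(t + 1, t)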

John
 


Matthias Koeppe

Apr 16, 2018, 5:06:26 AM
to sage-devel

This is discussed in https://trac.sagemath.org/ticket/16993

Marc Mezzarobba

Apr 23, 2018, 1:31:55 PM
to sage-...@googlegroups.com
Matthias Koeppe wrote:
> This is discussed in https://trac.sagemath.org/ticket/16993

...and https://trac.sagemath.org/ticket/16268 (needs review!).

--
Marc

Marc Mezzarobba

Apr 23, 2018, 1:37:19 PM
to sage-...@googlegroups.com
John Cremona wrote:
> This is simpler than writing numerator and denominator as a rational
> times a primitive integral polynomial, though that is probably what
> users would prefer.

And (at least in my limited experience), for rational fractions over
general fraction fields (things like ℚ(x)(y), say), normalizing by
extracting the contents can be very slow.

--
Marc

Vincent Delecroix

Apr 27, 2018, 2:19:24 AM
to sage-...@googlegroups.com
That does not prevent us from normalizing in, let's say, __repr__(),
numerator() and denominator(). Is there any reasonable rule for choosing
when to normalize and when not to? For QQ itself, GMP does normalize all the
time.
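
For instance, a rational number is always stored in lowest terms:

sage: QQ(6)/QQ(4)
3/2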

Marc Mezzarobba

Apr 27, 2018, 4:01:31 AM
to sage-...@googlegroups.com
Vincent Delecroix wrote:
> That does not prevent us from normalizing in, let's say, __repr__(),
> numerator() and denominator(). Is there any reasonable rule for
> choosing when to normalize and when not to? For QQ itself, GMP does
> normalize all the time.

Just to be certain that we are talking about the same thing: Currently,
elements of fraction fields are always reduced (except over inexact
rings) in the sense of dividing out by the gcd of the numerator and the
denominator. But no further normalization is done, and in particular,
when the numerator and denominator are polynomials, their contents or
leading coefficients are not normalized.
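
In the example from the start of the thread that gcd is trivial (polynomial gcds over a field are taken monic, and 2 is a unit), which is why nothing gets cancelled:

sage: R.<t> = QQ[]
sage: gcd(2*t + 2, 2*t)
1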

There is a ticket needing review that proposes to normalize the leading
coefficient of the denominator to one:

https://trac.sagemath.org/ticket/16268
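
As a rough sketch of the proposed rule (not the ticket's actual code), rescaling by the leading coefficient of the denominator would give:

sage: R.<t> = QQ[]
sage: num, den = 2*t + 2, 2*t
sage: lc = den.leading_coefficient()
sage: num/lc, den/lc
(t + 1, t)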

Actually, I first tried implementing a version that would systematically
clear denominators inside the numerator and denominator of the fraction,
but I didn't manage to make it reasonably fast, even with a dedicated
class for polynomials over fraction fields (with a single denominator,
similar to the representation of polynomials over ℚ). While I'm not
saying it can't be done, in the short term, the version at #16268 is a
lot simpler and already fixes a number of issues!

Regarding your question: Having a variant of numerator()/denominator()
that works recursively would definitely be useful, and yes, it may be
nice to use it in __repr__(). At the same time, I think we do want to
keep programmatic access to the “true” numerator and denominator as
defined in the data structure. In case you want to experiment with the
idea, here is a (perhaps a bit naive, improvements welcome!)
implementation of something similar that I needed for something I'm
working on. It is intended to be used with #16268 and #23909.

from sage.rings.fraction_field import FractionField_generic  # for the isinstance checks below

def _my_lcm(elts):  # non monic
    l = ZZ.one()
    for p in elts:
        l *= (p//p.gcd(l))
    return l

def _clear_denominators_1(elt, dom):
    # Split one element of Frac(dom) into (numnum, numden, dennum, denden)
    # so that elt == (numnum/numden) / (dennum/denden), recursing into the
    # base ring when it is itself a fraction field.
    num, den = elt.numerator(), elt.denominator()
    base = dom.base_ring()
    if isinstance(base, FractionField_generic):
        numnum, numden = clear_denominators(num, base)
        dennum, denden = clear_denominators(den, base)
        newdom = dom.change_ring(base.ring())
        numnum = newdom(numnum)
        dennum = newdom(dennum)
        gnum = numnum.gcd(dennum)
        numnum, dennum = numnum//gnum, dennum//gnum
        gden = numden.gcd(denden)
        numden, denden = numden//gden, denden//gden
        return (numnum, numden, dennum, denden)
    else:
        return (num, dom.one(), den, dom.one())

def clear_denominators(elts, dom=None):
    r"""
    Recursively clear denominators in a list (or other iterable) of
    elements.

    Typically intended for elements of fields like QQ(x)(y).
    """
    if not elts:
        return elts, dom.one()
    if dom is None:
        dom = elts[0].parent()
    if isinstance(dom, FractionField_generic):
        ring = dom.ring()
        split = [_clear_denominators_1(elt, ring) for elt in elts]
        lcmnum = _my_lcm((dennum for _, _, dennum, _ in split))
        lcmden = _my_lcm((numden for _, numden, _, _ in split))
        num = [(denden*(lcmden//numden))*(numnum*(lcmnum//dennum))
               for (numnum, numden, dennum, denden) in split]
        den = lcmden*lcmnum
        g = gcd(num + [den])  # XXX: can we be more specific here?
        num = [p//g for p in num]
        den = den//g
    else:
        num, den = elts, dom.one()
    # assert all(b/den == a for a, b in zip(elts, num))
    # assert gcd(num + [den]).is_one()
    return num, den

--
Marc

rjf

Apr 28, 2018, 1:07:41 PM
to sage-devel
A few comments.
1. Giving someone a surprising or undesirable result because
a GCD computation is considered expensive sounds like a really
bad idea. (I believe I'm overstating the situation here, but that's
sort of what it sounds like.)
2. (Judging from Macsyma/Maxima) it is OK, workable, and
only occasionally surprising, to allow almost anything in the
"general representation" used in the top-level command language.
Maybe not division by an explicit zero, or wrong-number-of-args
to a famous function, but unreduced fractions are OK if the
reduction is not obvious.

3. On the other hand, a select set of canonical representations
is available in Maxima. NOT everything that might be made canonical, but
ones that seemed useful.

Thus canonical rational expressions are ratios of polynomials with
INTEGER coefficients. The nice thing about this is that GCD is well defined.
If you have a ratio of polynomials with RATIONAL coefficients, you
could canonicalize by making the denominator monic, but frankly,
you can continue down that route and have any number of peculiar
edge cases -- puzzling even to someone who knows some
modern algebra, and just outright mysterious to someone who
is a physicist (etc.) who thinks a field is where you play baseball,
and a ring is worn on a finger.

It's your design decision, so you live with the consequences.
Or change your design.  I recall that Axiom had a mathematical
category called  "integration result"  to use as a container for
results from Risch or other procedures.

RJF


