Function roots() in package Polynomial


Hans W Borchers

May 5, 2014, 4:14:44 PM
to julia...@googlegroups.com
The following is the so-called Wilkinson polynomial, which has zeros at 1, 2, ..., 20:

    julia> using Polynomial

    julia> pW = poly([1.0:20.0])

    julia> roots(pW)
    20-element Array{Complex{Float64},1}:
         1.0-0.0im
         2.0-0.0im
         3.0-0.0im
     3.99999-0.0im
     5.00006-0.0im
     6.00181-0.0im
     6.95031-0.0im
     8.09172-0.579086im
     8.09172+0.579086im
     9.70891-1.64679im
     9.70891+1.64679im
     11.7999-2.59604im
     11.7999+2.59604im
     14.3615-3.12883im
     14.3615+3.12883im
     17.1543-2.89387im
     17.1543+2.89387im
     19.5758-1.70785im
     19.5758+1.70785im
     20.6635-0.0im


Compare this with the same computation in Matlab:

    >> pw = poly(1:20);

    >> roots(pw)

    ans =

       20.0003
       18.9972
       18.0112
       16.9711
       16.0483
       14.9354
       14.0653
       12.9491
       12.0334
       10.9840
       10.0061
        8.9984
        8.0003
        7.0000
        6.0000
        5.0000
        4.0000
        3.0000
        2.0000
        1.0000


The roots of this polynomial are known to be difficult to determine, but the results in Julia's Polynomial package seem really far off.

Tony Kelman

May 5, 2014, 5:24:47 PM
to julia...@googlegroups.com
Good catch. This looks almost entirely due to the order in which Polynomial.jl is creating the companion matrix. The modified version at https://github.com/loladiro/Polynomials.jl which stores the coefficients in the mathematical order instead of the Matlab order does a little better, but worryingly I'm getting noticeably different results between 32 bit and 64 bit.

Just transposing the companion matrix gives much better-behaved results for this polynomial, not exactly the same as Matlab but pretty close.
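For anyone who wants to reproduce the experiment, here is a rough sketch in Python/NumPy (the top-row companion layout below matches Matlab's compan and numpy.roots; it is an illustration, not the package's actual code):

```python
import numpy as np

# Wilkinson polynomial coefficients, highest power first
# (np.poly mirrors Matlab's poly)
p = np.poly(np.arange(1, 21))

# Companion matrix: normalized coefficients across the top row,
# ones on the first subdiagonal
n = len(p) - 1
C = np.zeros((n, n))
C[0, :] = -p[1:] / p[0]
C[1:, :-1] = np.eye(n - 1)

# C and C.T have identical eigenvalues in exact arithmetic, but the
# floating-point eigensolver can behave differently on the two layouts
for M in (C, C.T):
    r = np.sort(np.linalg.eigvals(M).real)
    print(np.max(np.abs(r - np.arange(1, 21))))
```

Both layouts should stay close to the true roots here; the badly wrong result quoted above came from yet another construction.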

j verzani

May 6, 2014, 5:42:12 PM
to julia...@googlegroups.com
Not just 32 and 64 bit: these coefficients, as produced through poly([1:20]) or poly([1.0:20.0]), aren't accurate as integers or floating point at 64 bit. You can see this by comparing `poly(big(1):big(20)).a` with `poly(1.0:20.0).a`, say. Perhaps interestingly, the coefficients found here (http://blogs.mathworks.com/cleve/2013/03/04/wilkinsons-polynomials/) differ in one term (the x^16 coefficient) from those Julia gives on my MacBook Pro. This small perturbation might be part of the difference you see as well.

Now, just for fun: the bisection method can get this right using big integers, as you can see with `using Roots; fzeros(x->polyval(poly(big(1):big(20)),x), 0, 21)`. (It kinda gets lucky; fzeros just splits the interval into a few hundred pieces and looks for bracketing intervals.)
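The same trick carries over to any language with exact big-number arithmetic. Below is a rough Python sketch (Python ints are arbitrary precision, playing the role of BigInt; the grid and bisection details are illustrative, and fzeros differs in its particulars):

```python
from fractions import Fraction

# Exact integer coefficients of prod_{k=1}^{20} (x - k),
# ascending order: coeffs[i] is the coefficient of x^i
coeffs = [1]
for k in range(1, 21):
    new = [0] * (len(coeffs) + 1)
    for i, c in enumerate(coeffs):
        new[i] += -k * c     # constant part of (x - k) * c*x^i
        new[i + 1] += c      # x part
    coeffs = new

def p(x):
    """Horner evaluation -- exact when x is a Fraction."""
    v = Fraction(0)
    for c in reversed(coeffs):
        v = v * x + c
    return v

def bisect(lo, hi, iters=40):
    """Plain bisection on a sign-change bracket, in exact arithmetic."""
    for _ in range(iters):
        mid = (lo + hi) / 2
        if (p(lo) < 0) == (p(mid) < 0):
            lo = mid
        else:
            hi = mid
    return float((lo + hi) / 2)

# Split [1/8, 21.125] into steps of 1/4 (offset so no grid point is itself
# a root) and bisect every bracketing interval, as fzeros does in spirit
grid = [Fraction(2 * i + 1, 8) for i in range(85)]
roots = [bisect(a, b) for a, b in zip(grid, grid[1:])
         if (p(a) < 0) != (p(b) < 0)]
# roots comes back as 1.0, 2.0, ..., 20.0 to about 12 digits
```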

Hans W Borchers

May 6, 2014, 11:16:08 PM
to julia...@googlegroups.com
So where is the problem/bug?
I did construct the companion matrix manually, starting from the vector of
coefficients of the Wilkinson polynomial:

    julia> pW = [            1.0,                  -210,                  20615,
                      -1256850,               53327946,           -1672280820,
                   40171771630,         -756111184500,         11310276995381,
              -135585182899530,      1307535010540395,     -10142299865511450,
              63030812099294896,   -311333643161390656,    1206647803780373248,
           -3599979517947607040,   8037811822645052416,  -12870931245150988288,
           13803759753640704000,  -8752948036761600000,    2432902008176640000];

    julia> pw = pW[2:21] / pW[1];                     # normalize 1st coefficient

    julia> A = vcat(-pw', hcat(eye(19), zeros(19))); # companion matrix

    julia> eigvals(A)
    20-element Array{Float64,1}:
     20.0003
     18.9979
     18.0072
     16.9854
     16.0183
     14.9856
     14.0067
     12.9987
     12.0002
     10.9992
     10.0009
      8.99944
      8.0002
      6.99995
      6.00001
      5.0    
      4.0    
      3.0    
      2.0    
      1.0    

This is almost the same result as with Matlab above. Does this mean Polynomial is constructing a different or wrong companion matrix?

Tony Kelman

May 7, 2014, 9:42:03 PM
to julia...@googlegroups.com
Yes, Polynomial is using a different convention than Matlab or what you used above in how it constructs the companion matrix. Polynomials.jl uses yet another convention. Both produce more accurate (comparable to Matlab) results for the roots of the Wilkinson polynomial if you just switch the indices during construction of the companion matrix. See https://github.com/vtjnash/Polynomial.jl/blob/master/src/Polynomial.jl#L350-L353 for Polynomial, or https://github.com/loladiro/Polynomials.jl/blob/master/src/Polynomials.jl#L324-L325 for Polynomials.

I think the intent (see https://github.com/vtjnash/Polynomial.jl/issues/5) is to deprecate Polynomial and switch the coefficient order by developing under the Polynomials name going forward, but Keno's probably been too busy to register the new package, turn on issues, etc. I'm doing some work on piecewise stuff in a branch of Polynomials, I might just adopt the package. I know David de Laat has put together packages for sparse multivariate polynomials https://github.com/daviddelaat/MultiPoly.jl and orthogonal polynomials https://github.com/daviddelaat/Orthopolys.jl, don't think they're registered but it might make sense to eventually unify all of these into the same package.

Hans W Borchers

May 8, 2014, 8:23:09 AM
to julia...@googlegroups.com
Because I was a (tiny) bit unsatisfied with the Polynomial package,
I wrote my own polynomial functions, like 

  - polyval() to be applied to vectors as well as polynomial types
  - roots() that uses the Matlab order in constructing the companion
      matrix and finding all roots
  - horner() that utilizes the Horner scheme to compute the value
      and the derivative of the polynomial at the same time
      (useful for a specialized version of Newton's algorithm)
  - a deflated Horner function to return q(x) with p(x) = (x - x0)*q(x)
      when x0 is a root of polynomial p
  - polyfit() for fitting polynomials to data, etc.
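For reference, the combined value-and-derivative Horner pass and the deflation step are each only a few lines. A Python sketch (hypothetical names; coefficients highest power first, as in Matlab):

```python
def horner(coeffs, x):
    """One pass over descending coefficients gives p(x) and p'(x)."""
    p, dp = 0.0, 0.0
    for c in coeffs:
        dp = dp * x + p      # derivative accumulates the previous value
        p = p * x + c
    return p, dp

def deflate(coeffs, x0):
    """Synthetic division by (x - x0): returns q with
    p(x) = (x - x0) * q(x) + r; r is ~0 when x0 is a root."""
    q, acc = [], 0.0
    for c in coeffs[:-1]:
        acc = acc * x0 + c
        q.append(acc)
    r = acc * x0 + coeffs[-1]
    return q, r

# p(x) = x^2 - 3x + 2
print(horner([1, -3, 2], 3.0))    # (2.0, 3.0): value and derivative at x = 3
print(deflate([1, -3, 2], 1.0))   # ([1.0, -2.0], 0.0): quotient x - 2, remainder 0
```

A specialized Newton iteration then steps with `x -= p / dp`, one `horner` call per iteration.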

I think a polyfit() function should in any case be a part of a polynomial
package. (Is such a function contained in any other package?)

Besides that, an implementation of Muller's algorithm for computing zeros of polynomials might be helpful, or the calculation of the number of real roots of a polynomial in an interval (Descartes' and Sturm's rules). There is more interesting numerical stuff that could be part of such a polynomial package.

Andreas Noack Jensen

May 8, 2014, 8:41:59 AM
to julia...@googlegroups.com
I'd suggest fit(Polynomial, data) instead of polyfit(data). The generic fit function is defined in StatsBase.
--
Kind regards

Andreas Noack Jensen

Hans W Borchers

May 8, 2014, 9:07:29 AM
to julia...@googlegroups.com
Actually, I called it pfit(); is that name already taken?
I don't understand your signature, because a polynomial will not be provided, only data and a degree.

How can I see a list of the function names in Julia Base and in the packages on the METADATA list?
[As long as I really don't understand the namespace concept in Julia.]

Andreas Noack Jensen

May 8, 2014, 9:27:55 AM
to julia...@googlegroups.com
Sorry for being too brief; my notation was not clear. The first argument should be a type, not an instance. This is becoming the standard way of fitting models deriving from StatsBase and Distributions. Hence, provided that your polynomial type is parametrised by its degree, the definition could be something like fit(::Type{Polynomial{3}}, Formula, data) or fit(::Type{Polynomial{3}}, y::Vector, x::Vector).

Hans W Borchers

May 8, 2014, 2:36:40 PM
to julia...@googlegroups.com
Thanks; I really appreciate all your efforts. But no, as far as I understand, the Poly type in Polynomial is not parametrized by a degree. I think there was a discussion lately, "How can Julia recognise the degree of a polynomial", that might be relevant.

You see, I am a hard-core numerical analyst, and I try to bother with the type system only as much as is absolutely necessary for getting the speed out of Julia. I hope the Julia developers are not too displeased with this attitude. I do admire their work.

Jameson Nash

May 9, 2014, 12:30:06 AM
to julia...@googlegroups.com
As the author of Polynomial.jl, I'll say that being "a bit
unsatisfied" is a good reason to make pull requests for any and all
improvements :)

While loladiro is now the official maintainer of Polynomials.jl (since
he volunteered to do the badly-needed work to switch the coefficient
order), if I had access, I would accept a pull request for additional
roots() methods (parameterized by an enum type, for overloading, and
possibly also a realroots function), horner method functions, polyfit,
etc.

I would not accept a pull request for allowing a vector instead of a
Polynomial in any method, however. IMHO, this is a completely
unnecessary "optimization", which encourages the user to conflate the
concept of a Vector and a Polynomial without benefit. It could even
potentially lead to subtle bugs (since indexing a polynomial is
different from indexing a vector), or passing in the roots instead of
the polynomial.

I think merging your proposal for a polyfit function with
StatsBase.fit makes sense. You could use a tuple parameter to combine
the Polynomial parameter with the degree information:

    function fit(T::(Type{Polynomial}, Int), data)
        P, deg = T
        # pfit represents the calculation of the polynomial of best fit,
        # and may or may not be a separate function
        return Poly(pfit(deg, data))
    end

    fit((Polynomial, 3), data)
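For comparison, the core of such a pfit is essentially a Vandermonde least-squares solve. A NumPy sketch (the name pfit and the ascending coefficient order are assumptions for illustration, not any package's API):

```python
import numpy as np

def pfit(deg, x, y):
    """Least-squares fit of a degree-`deg` polynomial to (x, y);
    returns coefficients in ascending order (constant term first)."""
    V = np.vander(np.asarray(x, float), deg + 1, increasing=True)
    coeffs, *_ = np.linalg.lstsq(V, np.asarray(y, float), rcond=None)
    return coeffs

x = np.linspace(0.0, 1.0, 50)
y = 2.0 + 3.0 * x - 5.0 * x**2
print(pfit(2, x, y))          # recovers coefficients close to [2, 3, -5]
```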

David de Laat put together a pull request to add his content to
Polynomial: https://github.com/vtjnash/Polynomial.jl/pull/25. He also
indicated he would update it for Polynomials.jl so that it could be
merged.

Hans W Borchers

May 9, 2014, 1:53:30 AM
to julia...@googlegroups.com
Thanks a lot. Just a few minutes ago I saw here on the list an announcement
of the "Least-squares curve fitting package" with poly_fit, among others.
I think this is good enough for me at the moment.

I will come back to your suggestion concerning polynomials when I have a
better command of the type system. For polynomials there is surprisingly
much more interesting functionality than is usually implemented.

Hans W Borchers

May 9, 2014, 12:39:56 PM
to julia...@googlegroups.com
@Jameson
I am writing a small report on scientific programming with Julia. I changed the section on polynomials by now basing it on the newer(?) Polynomials.jl. This works quite fine, and roots() computes the zeros of the Wilkinson polynomial to quite satisfying accuracy.

It's a bit irritating that the README file still documents the old coefficient order while the code already stores the coefficients in increasing order of exponents. I see there is a pull request for an updated README, but it is almost 4 weeks old.

Testing one of my examples,

    julia> using Polynomials

    julia> p4 = poly([1.0, 1im, -1.0, -1im])
    Poly(--1.0 + 1.0x^4)


which appears to indicate a bug in printing the polynomial. The stored coefficient really is the correct -1.0, as can be seen from

    julia> p4[0]
    -1.0 + 0.0im
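The double minus looks like the printer emitting its +/- separator and then the coefficient's own sign. A sketch of sign-aware formatting (illustrative Python, not the package's actual show method):

```python
def poly_str(coeffs):
    """Format ascending coefficients, folding each coefficient's sign
    into the separator so '-1.0' never prints as '--1.0' or '+ -1.0'."""
    parts = []
    for i, c in enumerate(coeffs):
        if c == 0:
            continue
        # leading term gets a bare '-' (or nothing); later terms get ' + '/' - '
        sep = ("- " if c < 0 else "+ ") if parts else ("-" if c < 0 else "")
        mono = "" if i == 0 else ("x" if i == 1 else f"x^{i}")
        parts.append(f"{sep}{abs(c)}{mono}")
    return " ".join(parts) if parts else "0"

print(poly_str([-1.0, 0, 0, 0, 1.0]))   # -1.0 + 1.0x^4
```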


I wanted to report that as an issue on the project page, but I did not find a button for starting the issue tracker. Does this mean the Polynomial.jl project is still 'private' in some sense?

I know there have been long discussions about which is the right order for the coefficients of a polynomial. But it makes me uneasy that the defining order used in MATLAB and other numerical computing systems has been changed so drastically. Well, we have to live with it.

Tony Kelman

May 9, 2014, 11:21:11 PM
to julia...@googlegroups.com
By default GitHub doesn't enable issue tracking in forked repositories, the person who makes the fork has to manually go do that under settings.

Alan Edelman

Jun 17, 2014, 10:08:11 AM
to julia...@googlegroups.com
I just tried roots in the Polynomial package

Here's what happened:

    @time roots(Poly([randn(100)]));

    LAPACKException(99)
    while loading In[10], in expression starting on line 44
     in geevx! at linalg/lapack.jl:1225
     in eigfact! at linalg/factorization.jl:531
     in eigfact at linalg/factorization.jl:554
     in roots at /Users/julia/.julia/v0.3/Polynomial/src/Polynomial.jl:358

My first question would be: why are we calling geevx for a matrix known to be Hessenberg?

I'd be happy to have a time comparable to Matlab's, though I'm sure there are faster algorithms out there as well.


Tony Kelman

Jun 17, 2014, 10:46:55 AM
to julia...@googlegroups.com
The implementation in https://github.com/Keno/Polynomials.jl looks a bit better-scaled (not taking reciprocals of the eigenvalues of the companion matrix), though it still might be better off on the original Wilkinson example if the companion matrix were transposed.

Doesn't look like Julia has a nicely usable Hessenberg type yet (https://github.com/JuliaLang/julia/issues/6434 - there is hessfact and a Hessenberg factorization object, but those don't look designed to be user-constructed), and I don't see any sign of ccall's into the Hessenberg routine {sdcz}hseqr either.

Iain Dunning

Jun 17, 2014, 10:53:14 AM
to julia...@googlegroups.com
I see both Polynomial and Polynomials in METADATA - is Polynomials a replacement for Polynomial?

Andreas Noack Jensen

Jun 17, 2014, 10:57:48 AM
to julia...@googlegroups.com
I can't make roots fail with the example you gave.

It is right that there aren't yet eigfact methods for the Hessenberg type, and the LAPACK routines are not wrapped. The last part wouldn't be difficult, but we need to think about the Hessenberg type. Right now it is a Factorization, and it stores the elementary reflectors for the transformation to Hessenberg form. We might want a Hessenberg<:AbstractMatrix, similarly to e.g. Triangular.
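The structural point is easy to check numerically: a companion matrix is already upper Hessenberg, so the reduction phase of a general eigensolver does no useful work on it. A NumPy sketch (the top-row layout is one common convention, assumed here for illustration):

```python
import numpy as np

# Companion matrix built from np.poly's coefficients: the normalized
# coefficients across the top row, ones on the first subdiagonal
p = np.poly(np.arange(1, 11))        # degree-10 example
n = len(p) - 1
C = np.zeros((n, n))
C[0, :] = -p[1:] / p[0]
C[1:, :-1] = np.eye(n - 1)

# Upper Hessenberg: everything below the first subdiagonal is exactly
# zero, so a solver could skip straight to Hessenberg QR (LAPACK hseqr)
print(np.all(C[np.tril_indices(n, -2)] == 0))   # True
```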

Stefan Karpinski

Jun 17, 2014, 11:30:04 AM
to Julia Users
On Tue, Jun 17, 2014 at 10:53 AM, Iain Dunning <iaind...@gmail.com> wrote:
I see both Polynomial and Polynomials in METADATA - is Polynomials a replacement for Polynomial?

Yes, Polynomials is the newer version with good indexing order – i.e. p[0] is the constant term. We should probably get this in better order. It may make sense to break the connection with the old repo and put it under some organization so that more people can work on it. What org would be most appropriate? 

Alan Edelman

Oct 10, 2014, 11:42:09 AM
to julia...@googlegroups.com
Related topic: 
I'd like to propose that roots and polyval be part of Base.
I can promise firsthand that they are among the first things a 12-year-old user
of Julia would just want to be there.

Tony Kelman

Oct 10, 2014, 1:23:29 PM
to julia...@googlegroups.com
Polynomials is a good candidate for “default packages” however that ends up being implemented. Installed by default for the vast majority of users who aren’t trying to do something with a “minimal Julia,” but not strictly necessary for the rest of the language to function.

Steven G. Johnson

Mar 10, 2015, 12:44:18 PM
to julia...@googlegroups.com
Just a quick follow-up on this thread: for my numerics class at MIT, I put together a little Julia notebook on the Wilkinson polynomial, along with a little tutorial on how to define a basic polynomial type (although pointing to Polynomials.jl for a more serious version) and some fun with root-finding:


Hopefully it is helpful to someone.

I agree that the Polynomials vs. Polynomial etc. situation needs a bit of loving care.

Sheehan Olver

Mar 11, 2015, 6:39:29 AM
to julia...@googlegroups.com
Thought I'd do a self-plug and mention that you can do this and other polynomial manipulations in ApproxFun.jl (which uses a colleague matrix and gets 10 digits of accuracy for this example):

    using ApproxFun

    f = Fun(x -> prod([1.0:20.0] - x), [0, 20], 21)
    roots(f)