[SciPy-User] scipy.sparse.linalg.cg statistics?

Nico Schlömer

Sep 28, 2010, 9:34:21 AM
to SciPy Users List
Hi all,

I was just running some test problems with scipy.sparse.linalg.cg and
things seem to work out fine.
To compare preconditioners, I'd like to get more detailed information
about the convergence history, but I haven't been able to figure out
how to, e.g., get the number of iterations performed or the (relative)
residual at each of those steps. Compare this with
http://www.mathworks.com/help/techdoc/ref/pcg.html.

Any hints?

Cheers,
Nico
_______________________________________________
SciPy-User mailing list
SciPy...@scipy.org
http://mail.scipy.org/mailman/listinfo/scipy-user

Pauli Virtanen

Sep 28, 2010, 11:19:50 AM
to scipy...@scipy.org
Tue, 28 Sep 2010 15:34:21 +0200, Nico Schlömer wrote:

> Hi all,
>
> I was just running some test problems with scipy.sparse.linalg.cg and
> things seem to work out fine.
> To compare preconditioners, I'd like to get more detailed information
> about the convergence history, but I haven't been able to figure out how
> to, e.g., get the number of iterations performed or the (relative)
> residual at each of those steps. Compare this with
> http://www.mathworks.com/help/techdoc/ref/pcg.html.
>
> Any hints?

Use the 'callback' argument.

>>> help(scipy.sparse.linalg.cg)
...
    callback : function
        User-supplied function to call after each iteration. It is called
        as callback(xk), where xk is the current solution vector.
...
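
For example, a rough sketch of collecting the iteration count and the
relative residuals that way (the 1-D Laplacian below is only a stand-in
test problem; substitute your own matrix and right-hand side):

import numpy as np
import scipy.sparse
import scipy.sparse.linalg

n = 100
main = 2.0 * np.ones(n)
off = -1.0 * np.ones(n)
A = scipy.sparse.spdiags([off, main, off], [-1, 0, 1], n, n).tocsr()
b = np.ones(n)

rel_res = []

def callback(xk):
    # note: this costs one extra matrix-vector product per iteration
    rel_res.append(np.linalg.norm(b - A.dot(xk)) / np.linalg.norm(b))

x, info = scipy.sparse.linalg.cg(A, b, callback=callback)
print(len(rel_res))   # number of iterations performed
print(rel_res[-1])    # final relative residual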

Nico Schlömer

Oct 1, 2010, 6:06:30 AM
to SciPy Users List
> Use the 'callback' argument.

That works alright, I guess.
What I do right now is create a *global* array that's filled up as
the callback function is called, after which I go ahead and plot it.
Using a global variable here seems somewhat ugly to me -- might there
be a more elegant solution?

Cheers,
Nico

Sebastian Walter

Oct 1, 2010, 6:44:07 AM
to SciPy Users List
I don't know if that helps you, but you can probably provide any
callable object.

E.g., you can make a class like

In [1]: class Foo:
   ...:     def __call__(self, args):
   ...:         print 'Bar'
   ...:
   ...:

In [2]: f = Foo()

In [4]: f(1)
Bar
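
Such a callable can carry its own state, so it also avoids the global
variable -- a rough sketch along those lines (the test matrix is only a
stand-in):

import numpy as np
import scipy.sparse
import scipy.sparse.linalg

class ResidualMonitor(object):
    # records ||b - A xk|| at every CG iteration
    def __init__(self, A, b):
        self.A = A
        self.b = b
        self.history = []

    def __call__(self, xk):
        self.history.append(np.linalg.norm(self.b - self.A.dot(xk)))

n = 100
A = scipy.sparse.spdiags(
    [-np.ones(n), 2.0 * np.ones(n), -np.ones(n)], [-1, 0, 1], n, n).tocsr()
b = np.ones(n)

monitor = ResidualMonitor(A, b)
x, info = scipy.sparse.linalg.cg(A, b, callback=monitor)
print(monitor.history)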

regards,
Sebastian

Pauli Virtanen

Oct 1, 2010, 8:56:31 AM
to scipy...@scipy.org
Fri, 01 Oct 2010 12:06:30 +0200, Nico Schlömer wrote:
>> Use the 'callback' argument.
>
> That works alright, I guess.
> What I do right now is create a *global* array that's filled up as the
> callback function is called, after which I go ahead and plot it. Using a
> global variable here seems somewhat ugly to me -- might there be a more
> elegant solution?

Use variables from the outer scope:

def doit(M, b):
    residuals = []

    def callback(xk):
        residuals.append(M*xk - b)

    sol, info = scipy.sparse.linalg.cg(M, b, callback=callback)
    return residuals
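
The residual history can then be plotted directly, e.g. (assuming a
sparse matrix M and right-hand side b are at hand, and matplotlib is
available):

import numpy as np
import matplotlib.pyplot as plt

residuals = doit(M, b)   # doit as defined above
plt.semilogy([np.linalg.norm(r) for r in residuals])
plt.xlabel('CG iteration')
plt.ylabel('residual 2-norm')
plt.show()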


http://docs.python.org/tutorial/classes.html#python-scopes-and-namespaces

http://www.saltycrane.com/blog/2008/01/python-variable-scope-notes/

Nico Schlömer

Oct 2, 2010, 6:19:56 AM
to SciPy Users List
Good idea, thanks for the hint!

Nico Schlömer

Oct 8, 2010, 11:42:10 AM
to SciPy Users List
Hello again!

> def callback(xk):
>     residuals.append(M*xk - b)

This approach works nicely from a code-beautification point of view,
thanks again for the hint.
There's a major drawback, though, which may not be easily fixed: To
calculate the residual, one has to compute an extra matrix-vector
multiplication *per step*, which effectively increases the runtime of
the CG algorithm by a factor of two (!). Ideally -- and classically --
the residual is retrieved directly from the CG method itself.
Now obviously, the callback method does not provide for residual
vector input, but I'm not quite sure what the reason for that is. In
the numerical computation world, this would be considered a major
deficiency.
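
For illustration, one way to see that overhead is to count the
matrix-vector products by routing them through a LinearOperator wrapper
(a rough sketch; the tridiagonal matrix is only a stand-in):

import numpy as np
import scipy.sparse
import scipy.sparse.linalg

n = 200
A = scipy.sparse.spdiags(
    [-np.ones(n), 2.0 * np.ones(n), -np.ones(n)], [-1, 0, 1], n, n).tocsr()
b = np.ones(n)

n_matvec = [0]  # mutable counter shared with the wrapper below

def counted_matvec(v):
    n_matvec[0] += 1
    return A.dot(v)

A_op = scipy.sparse.linalg.LinearOperator((n, n), matvec=counted_matvec,
                                          dtype=A.dtype)

res_norms = []

def callback(xk):
    # the monitoring itself triggers one additional product per iteration
    res_norms.append(np.linalg.norm(b - A_op.matvec(xk)))

x, info = scipy.sparse.linalg.cg(A_op, b, callback=callback)
print(len(res_norms))  # number of iterations
print(n_matvec[0])     # roughly twice that, because of the callback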

Cheers,
Nico

Pauli Virtanen

Oct 8, 2010, 12:30:10 PM
to scipy...@scipy.org
Fri, 08 Oct 2010 17:42:10 +0200, Nico Schlömer wrote:
[clip]

> There's a major drawback, though, which may not be easily fixed: To
> calculate the residual, one has to compute an extra matrix-vector
> multiplication *per step*, which effectively increases the runtime of
> the CG algorithm by a factor of two (!). Ideally -- and classically --
> the residual is retrieved directly from the CG method itself. Now
> obviously, the callback method does not provide for residual vector
> input, but I'm not quite sure what the reason for that is.

That can quite likely be easily fixed by making a small change to Scipy.
An interesting point here is that the `gmres` routine on the other hand
only passes the residual norm to the callback, so there would also be
some other things to harmonize here.
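
(For reference, a rough sketch of what the gmres side looks like -- the
callback only ever sees a scalar norm; the nonsymmetric tridiagonal
system is just a stand-in:)

import numpy as np
import scipy.sparse
import scipy.sparse.linalg

n = 100
A = scipy.sparse.spdiags(
    [-np.ones(n), 3.0 * np.ones(n), -2.0 * np.ones(n)], [-1, 0, 1], n, n).tocsr()
b = np.ones(n)

norms = []

def callback(resnorm):
    # by default gmres hands the callback only the (preconditioned)
    # residual norm; newer SciPy versions also offer a callback_type
    # argument to choose what gets passed
    norms.append(resnorm)

x, info = scipy.sparse.linalg.gmres(A, b, callback=callback)
print(len(norms))
print(norms[-1])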

Is there some other information that would be useful in the callback? I
can think of: residual vector, residual 2-norm, and current iterate --
what else?

If you have time, please file a bug report,

http://projects.scipy.org/scipy/newticket

so that we won't forget this issue.

--
Pauli Virtanen

Nico Schlömer

Oct 8, 2010, 3:14:42 PM
to SciPy Users List
> If you have time, please file a bug report,

Done: http://projects.scipy.org/scipy/ticket/1297.

Fernando Perez

Oct 8, 2010, 5:04:37 PM
to SciPy Users List
On Fri, Oct 8, 2010 at 9:30 AM, Pauli Virtanen <p...@iki.fi> wrote:
>
> That can quite likely be easily fixed by making a small change to Scipy.
> An interesting point here is that the `gmres` routine on the other hand
> only passes the residual norm to the callback, so there would also be
> some other things to harmonize here.

Quick question, why is cg marked as deprecated?

Class Docstring:
scipy.linalg.cg is DEPRECATED!! -- use scipy.sparse.linalg.cg instead

Use Conjugate Gradient iteration to solve A x = b


While I perfectly understand that in production work, the typical use
cases require sparse matrices, there's still plenty of value for a
dense implementation of CG in scipy, I would think (especially given
how sparse matrices are a little harder to use for a beginner than
plain dense numpy arrays).

Am I missing something obvious?

Thanks,

f

Pauli Virtanen

Oct 8, 2010, 5:42:47 PM
to scipy...@scipy.org
Fri, 08 Oct 2010 14:04:37 -0700, Fernando Perez wrote:
[clip]

> While I perfectly understand that in production work, the typical use
> cases require sparse matrices, there's still plenty of value for a dense
> implementation of CG in scipy, I would think (especially given how
> sparse matrices are a little harder to use for a beginner than plain
> dense numpy arrays).
>
> Am I missing something obvious?

There is no dense implementation. The one that was under scipy.linalg is
the same as what is in scipy.sparse.linalg -- I guess the misleading
location was a reason for moving it.

Indeed, :)


__all__ = ['bicg','bicgstab','cg','cgs','gmres','qmr']

# Deprecated on January 26, 2008

from scipy.sparse.linalg import isolve
from numpy import deprecate

for name in __all__:
    oldfn = getattr(isolve, name)
    oldname='scipy.linalg.' + name
    newname='scipy.sparse.linalg.' + name
    newfn = deprecate(oldfn, oldname=oldname, newname=newname)
    exec(name + ' = newfn')

Fernando Perez

Oct 8, 2010, 5:56:35 PM
to SciPy Users List
On Fri, Oct 8, 2010 at 2:42 PM, Pauli Virtanen <p...@iki.fi> wrote:
> There is no dense implementation. The one that was under scipy.linalg is
> the same as what is in scipy.sparse.linalg -- I guess the misleading
> location was a reason for moving it.
>
> Indeed, :)

Ah, thanks :) In that case it might be worth changing the message and
warning from:

/usr/lib/python2.6/dist-packages/numpy/lib/utils.py:108:
DeprecationWarning: scipy.linalg.cg is deprecated, use
scipy.sparse.linalg.cg
warnings.warn(str1, DeprecationWarning)

to simply indicating that the *name* is deprecated, but the actual
function lives in the sparse library. Since the function already
accepts dense matrices just fine, indicating this should be enough.
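
For instance, something like this already works with a plain dense
array (quick sketch):

import numpy as np
import scipy.sparse.linalg

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])   # ordinary dense SPD array
b = np.array([1.0, 2.0])

x, info = scipy.sparse.linalg.cg(A, b)
print(x)
print(info)   # 0 means it converged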

I'm happy to make the changes if you agree.

Cheers,

f

Pauli Virtanen

Oct 8, 2010, 6:36:35 PM
to scipy...@scipy.org
Fri, 08 Oct 2010 14:56:35 -0700, Fernando Perez wrote:
[clip]
> /usr/lib/python2.6/dist-packages/numpy/lib/utils.py:108:
> DeprecationWarning: scipy.linalg.cg is deprecated, use
> scipy.sparse.linalg.cg
> warnings.warn(str1, DeprecationWarning)
>
> to simply indicating that the *name* is deprecated, but the actual
> function lives in the sparse library. Since the function already
> accepts dense matrices just fine, indicating this should be enough.
>
> I'm happy to make the changes if you agree.

Those functions are removed in 0.9, so I guess we're OK without changes.

Pauli

Fernando Perez

Oct 8, 2010, 6:54:51 PM
to SciPy Users List
On Fri, Oct 8, 2010 at 3:36 PM, Pauli Virtanen <p...@iki.fi> wrote:
>
>
> Those functions are removed in 0.9, so I guess we're OK without changes.
>

Ah, even better, nothing to do :)

Cheers,

f

josef...@gmail.com

Oct 9, 2010, 11:05:58 PM
to SciPy Users List
On Fri, Oct 8, 2010 at 12:30 PM, Pauli Virtanen <p...@iki.fi> wrote:
> Fri, 08 Oct 2010 17:42:10 +0200, Nico Schlömer wrote:
> [clip]
>> There's a major drawback, though, which may not be easily fixed: To
>> calculate the residual, one has to compute an extra matrix-vector
>> multiplication *per step*, which effectively increases the runtime of
>> the CG algorithm by a factor of two (!). Ideally -- and classically --
>> the residual is retrieved directly from the CG method itself. Now
>> obviously, the callback method does not provide for residual vector
>> input, but I'm not quite sure what the reason for that is.
>
> That can quite likely be easily fixed by making a small change to Scipy.
> An interesting point here is that the `gmres` routine on the other hand
> only passes the residual norm to the callback, so there would also be
> some other things to harmonize here.
>
> Is there some other information that would be useful in the callback? I
> can think of: residual vector, residual 2-norm, and current iterate --
> what else?
>
> If you have time, please file a bug report,
>
>    http://projects.scipy.org/scipy/newticket
>
> so that we won't forget this issue.

A related idea:

Is it possible, or is there a trick, to tell an optimizer from a
callback function to stop?

I've never used callback functions, so I don't know what's possible.

I have two use cases in mind:
Stata, for example, switches optimizers during the estimation of a
statistical model. It's possible to limit the number of iterations
and restart with a different optimizer, but it might be useful to
switch depending on the state of the optimization.
fmin_bfgs doesn't look very robust; quite often it goes off into
neverland, and I have to wait quite some time until I can restart with
a different optimizer. It would be useful in this case if we could
inspect what's going on, kill it automatically, and switch to something
more robust.

Josef

Pauli Virtanen

Oct 10, 2010, 7:34:23 AM
to scipy...@scipy.org
Sat, 09 Oct 2010 23:05:58 -0400, josef.pktd wrote:
[clip]

> A related idea:
>
> Is it possible, or is there a trick, to tell an optimizer from a
> callback function to stop?
>
> I've never used callback functions, so I don't know what's possible.

Raising an exception in the callback will at least bail out of the optimizer.
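
Something along these lines, for example (a rough sketch; the
five-iteration cutoff is just an arbitrary stand-in for a real
switching rule):

import numpy as np
import scipy.optimize

class StopOptimization(Exception):
    pass

iterates = []

def callback(xk):
    iterates.append(np.array(xk, copy=True))
    if len(iterates) >= 5:
        # give up on BFGS and hand over to something else
        raise StopOptimization

def f(x):
    # 2-D Rosenbrock test function; any objective works here
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

x0 = np.array([-1.2, 1.0])
try:
    x_best = scipy.optimize.fmin_bfgs(f, x0, callback=callback, disp=False)
except StopOptimization:
    # restart from the last recorded iterate with a more robust method
    x_best = scipy.optimize.fmin(f, iterates[-1], disp=False)

print(x_best)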
