
[ANN] constraint-0.2


Alexandre

Jun 7, 2002, 12:50:52 PM
Logilab has released constraint-0.2.

The constraint package is an extensible Constraint Satisfaction Problem
(CSP) solver using constraint propagation algorithms written in 100%
pure Python (tested with Python 2.1)

You can download it from ftp://ftp.logilab.org/pub/constraint/, read a
bit more about it on
http://www.logilab.org/python-logic/constraint.html,
and discuss it on the logic-sig mailing list (see
http://lists.logilab.org/mailman/listinfo/python-logic to subscribe)

The package is still young, but we find it useful enough right now, so I
thought I might as well publish it.

What can it do?
 * solve some finite-domain CSPs
 * handle n-ary constraints (see the small illustration after these lists)

How can you extend it?
* by implementing new Domain classes
* by implementing new constraint classes
* by implementing new Distributor classes

How well does it perform?
 * 8-queens problem solved in 15 seconds on my 1GHz PC
* SEND+MORE=MONEY problem solved in 8 seconds on the same machine

What has changed?
 * a BasicConstraint class was added (these constraints affect only a single
variable, and can be entailed in one pass), with several example
implementations
* The Repository class was refactored, and a Solver class was added as
a result, which gives a better separation of the solving logic and
storage logic
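
To illustrate what "finite-domain CSP" and "n-ary constraint" mean, here is
a tiny example in plain Python (an illustration of the concepts only, not
the package's API):

    from itertools import product

    # A finite-domain CSP: each variable takes its value from a finite set.
    domains = {"x": [1, 2, 3], "y": [1, 2, 3], "z": [1, 2, 3]}

    # An n-ary constraint relates several variables at once (here a ternary one).
    def constraint(values):
        return values["x"] + values["y"] == values["z"]

    # Naive enumeration of the whole search space, for illustration only;
    # a real solver prunes domains instead of trying every combination.
    names = sorted(domains)
    for combo in product(*(domains[n] for n in names)):
        assignment = dict(zip(names, combo))
        if constraint(assignment):
            print(assignment)   # {'x': 1, 'y': 1, 'z': 2}, ...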

If you're interested in constraint propagation programming in Python,
please join the logic-sig mailing list.

--
Alexandre Fayolle
--
LOGILAB, Paris (France).
http://www.logilab.com http://www.logilab.fr http://www.logilab.org
Narval, the first software agent available as free software (GPL).

Magnus Lie Hetland

Jun 8, 2002, 6:22:05 PM
In article <mailman.102346873...@python.org>, Alexandre wrote:
>Logilab has released constraint-0.2.
>
>The constraint package is an extensible Constraint Satisfaction Problem
>(CSP) solver using constraint propagation algorithms written in 100%
>pure Python (tested with Python 2.1)
[snip]

>How well does it perform?
> * 8-queens problem solved in 15 seconds on my 1GHz PC

Interesting... The package looks really cool, but the performance in
this example is truly abysmal. The simple queens.py found in the
standard Python distribution solves the 8-queens problem (finds all
solutions) in 0.11 seconds on an 800MHz PC. I guess maybe that's the
price you have to pay for generality... On the other hand, a speedup
factor of well over 100 seems to indicate a potential for
streamlining... Maybe?

- Magnus (who hasn't looked at the internals, and who looks forward to
playing around with the package :)

--
Magnus Lie Hetland The Anygui Project
http://hetland.org http://anygui.org

Alexandre Fayolle

Jun 10, 2002, 5:53:09 AM
Magnus Lie Hetland wrote:

> In article <mailman.102346873...@python.org>, Alexandre wrote:
>>How well does it perform?
>> * 8-queens problem solved in 15 seconds on my 1GHz PC
>
> Interesting... The package looks really cool, but the performance in
> this example is truly abysmal. The simple queens.py found in the
> standard Python distribution solves the 8-queens problem (finds all
> solutions) in 0.11 seconds on an 800MHz PC. I guess maybe that's the
> price you have to pay for generality... On the other hand, a speedup
> factor of well over 100 seems to indicate a potential for
> streamlining... Maybe?

I agree with all of this.

As you said, your comparison is not fair at all ;o), because
Demo/scripts/queens.py is targeted specifically at the n-queens
problem.

<tongue-in-cheek mode="on">
Please note that if your main interest is solving the N-queens
problem, researchers have found an O(N) algorithm that will certainly
perform much better than queens.py in the Python distribution.
</tongue-in-cheek>

Now, seriously.

This script uses a simple backtracking approach and shows that
it can be very efficient with some problems. The constraint package uses
a different approach (constraint propagation), which is supposed to be
*at least* as efficient as backtracking, in terms of algorithmic
complexity.
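
To make "constraint propagation" concrete, here is a minimal sketch of the
idea in plain Python: an AC-3-style loop that prunes values which cannot
take part in any solution of a binary constraint. This is only an
illustration of the technique, not the package's code.

    from collections import deque

    def revise(domains, constraints, x, y):
        """Drop values of x that have no supporting value in y's domain.
        Returns True if x's domain shrank."""
        check = constraints[(x, y)]      # predicate over (value of x, value of y)
        kept = [vx for vx in domains[x]
                if any(check(vx, vy) for vy in domains[y])]
        shrank = len(kept) < len(domains[x])
        domains[x] = kept
        return shrank

    def propagate(domains, constraints):
        """Prune domains until a fixed point is reached (or a domain empties)."""
        queue = deque(constraints)       # all directed arcs (x, y)
        while queue:
            x, y = queue.popleft()
            if revise(domains, constraints, x, y):
                if not domains[x]:
                    return False         # wiped out a domain: no solution
                # x changed, so arcs pointing at x must be re-examined
                queue.extend((z, w) for (z, w) in constraints if w == x and z != y)
        return True

    # Tiny example: x < y with both domains {1, 2, 3}
    domains = {"x": [1, 2, 3], "y": [1, 2, 3]}
    constraints = {("x", "y"): lambda a, b: a < b,
                   ("y", "x"): lambda a, b: b < a}
    propagate(domains, constraints)
    print(domains)   # {'x': [1, 2], 'y': [2, 3]}

Propagation alone only narrows the domains; when it stalls, a solver picks
a variable, splits its domain, and propagates again, whereas a plain
backtracking program guesses complete values one variable at a time and
undoes them on failure.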

So how come the constraint package doesn't toast the backtracking
approach?
* the example solution in the constraint package uses the constraint
class MathematicConstraint, which deals with arbitrary N-ary mathematical
constraints. I can write a subclass optimized for binary constraints,
which are the only thing we need for n-queens
* the constraint package adds some overhead in processing constraint
queues, and it is possible that some of the inner structures are not
efficient. Blame this on the young age of the package.
* maybe there is some horribly inefficient code in my package that I
haven't seen yet. Be assured that as soon as I spot it, it will get
refactored.

It's nice to have a comparison anyway. It helps to get an idea of what
our target is. Just as a side note, the very first (unreleased) version
of the package took 35 minutes to solve the 8-queens problem, so I
already got a 140x speed increase. Achieving another 140x speed increase
will certainly be harder, but it's not out of reach.

And then, once constraint-1.0 is considered stable, maybe we will go for a
rewrite in C.

> - Magnus (who hasn't looked at the internals, and who looks forward to
> playing around with the package :)

Please do so, and join us on the python-logic mailing list.
http://lists.logilab.org/mailman/listinfo/python-logic

Terry Reedy

Jun 10, 2002, 12:20:25 PM
To paraphrase Magnus Lie Hetland, the general constraint-propagation
system constraint-0.2 takes over 100 times as long to do the 8-queens
problem as the specific backtracking queens.py found in the standard
Python distribution: 15 versus 0.1 sec on comparable machines.

This is about the same comparison, for some computations, as between
generic/dynamic Python and type-specific C. In both cases, there are
two more questions of practical interest: which platform/method is
faster for the *programmer*? and, is the slow method fast enough?

So, what is the relative programming speed of the general constraint
propagation system versus specific backtracking? I suspect that some
people could whip up something like queens.py as fast as or faster than
they could write the (unpublished?) constraint input for the same problem.
However, I also suspect the reverse would be true for the more
realistic and useful scheduling example that Fayolle used to
illustrate and motivate constraint-0.2.
http://www.logilab.org/python-logic/documentation.html

Real world problems may have no solution as given. On the other hand,
constraints may not be absolute. In the scheduling example, two
workshops *can* be scheduled simultaneously, if necessary, even though
many people want to take both. So a stepwise procedure may be needed.
(Given multiple solutions, one might also want to add 'wishes' as
pseudoconstraints to help select just one.) So, is it equally easy,
with the two approaches, to add or drop constraints and rerun? This
would be fairly simple with c-0.2 input but probably harder for
backtracking code unless planned for from the beginning. Is it
equally easy to give constraints priorities and have the system
automatically either add them in order from the highest or drop them
from lowest? Constraint-0.2 evaluates constraints in the order
listed, but I do not know if it properly backtracks from failure to
report the most satisfying solutions.

Terry J. Reedy

Alexandre Fayolle

Jun 11, 2002, 8:20:00 AM
Terry Reedy wrote:

> To paraphrase Magnus Lie Hetland, the general constraint-propagation
> system constraint-0.2 takes 100 times as long to do the 8-queens
> problem as the specific backtracking queens.py found in the standard
> Python distribution: 15 versus .1 sec on comparable machines.

Yup. But we do mean to make it faster.

> This is about the same comparison, for some computations, as between
> generic/dynamic Python and type-specific C. In both cases, there are
> two more questions of practical interest: which platform/method is
> faster for the *programmer*? and, is the slow method fast enough?

This is a good point.
Using the constraint package for 8-queens is very easy, since all you
have to do is describe the problem in terms of variables, domains, and
constraints, and call Solver().solve(Repository(variables, domains, constraints)).
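
For the curious, the 8-queens description amounts to something like the
sketch below. The data is shown as plain Python lists and functions; the
package wraps these in its own Domain and Constraint classes, so the exact
spelling there will differ.

    # One variable per column; the value of a variable is the row of that queen.
    variables = ["q%d" % col for col in range(8)]

    # Finite domains: every queen may stand on any of the 8 rows.
    # (Plain lists here; the package uses its own Domain objects.)
    domains = dict((v, list(range(8))) for v in variables)

    # One binary constraint per pair of columns: different rows, different diagonals.
    def no_attack(col_i, col_j):
        return lambda row_i, row_j: (row_i != row_j and
                                     abs(row_i - row_j) != abs(col_i - col_j))

    constraints = [(("q%d" % i, "q%d" % j), no_attack(i, j))
                   for i in range(8) for j in range(i + 1, 8)]

    # With the package, this description is then handed over roughly as
    #   Solver().solve(Repository(variables, domains, constraints))
    # as quoted above; the exact constraint objects may be built differently.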

Is it fast enough? Well, it really depends. The good point in the
situation at hand is that I have not yet started thinking about
optimizing the code. There are other things to deal with first (adding
other kinds of domains, such as finite sets, and the basic constraints
that go with these domains), so my personal answer is that, for now, it
is fast enough as far as I'm concerned. Which does not mean it's ready
for production use. Hey, this is version 0.2!

> So, what is the relative programming speed of the general constraint
> propagation system versus specific backtracking? I suspect that some
> people could whip up something like queens.py as fast as or faster than
> they could write the (unpublished?) constraint input for the same problem.

An O(N) algorithm for the N-queens problem has been published and proven
to be of minimal complexity, and it will always be faster than a
constraint propagation approach, which needs to consider N*(N-1)
constraints and therefore always has a complexity of at least O(N^2).

> However, I also suspect the reverse would be true for the more
> realistic and useful scheduling example that Fayolle used to
> illustrate and motivate constraint-0.2.
> http://www.logilab.org/python-logic/documentation.html

N-Queens is a well known problem, and it can be used as a benchmark for
solvers. Same for SEND+MORE=MONEY, which is also in the examples
directory of the constraint module. Since these problems are both well
known and simple in their expression, they can give an idea of the
relative performance of solvers.
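
(For reference, the puzzle asks for distinct digits, with S and M nonzero,
such that SEND + MORE = MONEY. A brute-force version in plain Python,
independent of the constraint package, is short enough to quote:)

    from itertools import permutations

    # Assign distinct digits to S, E, N, D, M, O, R, Y with no leading zeros
    # so that SEND + MORE = MONEY holds. Unique solution: 9567 + 1085 = 10652.
    for s, e, n, d, m, o, r, y in permutations(range(10), 8):
        if s == 0 or m == 0:
            continue
        send = 1000*s + 100*e + 10*n + d
        more = 1000*m + 100*o + 10*r + e
        money = 10000*m + 1000*o + 100*n + 10*e + y
        if send + more == money:
            print("%d + %d = %d" % (send, more, money))
            break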

The scheduling example on the web page is completely made up, and its
purpose is to give a more "real-life" flavour to the tutorial. There are
many other constraint satisfaction problems that can be solved with the
module, or that the module will be able to solve in the near
future (how to select tunes from a CD to best fill up a tape, how to
arrange your furniture in the flat you're moving into, finding
a new password that you'll be able to easily memorize while
passing through the devious filters set up by your sysadmin, choosing
which fruits and vegetables to buy in the market in order not to eat the
same thing every week...)

> Real world problems may have no solution as given. On the other hand,
> constraints may not be absolute. In the scheduling example, two
> workshops *can* be scheduled simultaneously, if necessary, even though
> many people want to take both. So a stepwise procedure may be needed.
> (Given multiple solutions, one might also want to add 'wishes' as
> pseudoconstraints to help select just one.) So, is it equally easy,
> with the two approaches, to add or drop constraints and rerun? This
> would be fairly simple with c-0.2 input but probably harder for
> backtracking code unless planned for from the beginning. Is it
> equally easy to give constraints priorities and have the system
> automatically either add them in order from the highest or drop them
> from lowest? Constraint-0.2 evaluates constraints in the order
> listed, but I do not know if it properly backtracks from failure to
> report the most satisfying solutions.

The short answer is no, because solving overconstrained problems is a
tricky thing.

The longer answer is that you could quite easily build on top of
the constraint package an engine which would try to relax constraints
when no solution can be found. When finite set domains are implemented
in the solver, building this engine will be possible with the constraint
package itself. But this approach will not be as efficient as a
dedicated solver for overconstrained problems.
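
A rough sketch of such a relaxation engine, written against a generic
solve(constraints) callable rather than the package's actual classes (all
the names below are placeholders):

    def solve_with_relaxation(solve, hard, soft):
        """Try the full problem first; if it turns out to be overconstrained,
        drop the lowest-priority "wishes" one by one until a solution appears.

        solve -- any callable taking a list of constraints and returning a
                 (possibly empty) list of solutions; stands in for the real solver
        hard  -- constraints that must always hold
        soft  -- list of (priority, constraint) pairs; higher priority = kept longer
        """
        soft = sorted(soft, key=lambda pair: pair[0])    # weakest wishes first
        dropped = []
        while True:
            solutions = solve(hard + [c for _, c in soft])
            if solutions:
                return solutions, dropped
            if not soft:
                return [], dropped                       # even the hard core fails
            dropped.append(soft.pop(0))                  # relax the weakest wish

Your priority scheme then comes for free: each "wish" carries a priority
and the weakest ones are the first to go.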

Now, a little anecdote: when writing the scheduling example, the
first set of constraints I had come up with gave 1152 possible solutions,
which I found was too many. So I did exactly what you suggested, I added
more constraints (this was really easy because adding a constraint is a
one-liner), twiddled with the choice of the non-simultaneous
conferences, until I found a combination which gave "only" 64 possible
solutions. It took me about 20 runs to get this, which was fairly quick
because the conference.py script runs in about 2 seconds on my machine.
