I don't want to be stuck using libraries that may not be updated for the
next generation of Python. How are others handling this issue?
From my POV, your question would be precisely identical if you had
started your project when Python 2.3 was just released and wanted to
know if the libraries you selected would be available for Python 2.6.
I didn't realize 2.6 broke libraries that had worked in 2.3, at least on
any scale. Did I miss something?
Lots of tiny things change. Any of these can break a library. With the
3 releases between 2.3 and 2.6, there are lots of opportunities for
these changes. I don't know what you mean by "any scale", but I know
that I've seen lots of things break on 2.6 that worked on 2.3, 2.4, or
2.5. I certainly had to update several modules I use (C extensions) to work
with the new memory management in a recent release (changing PyMem_Del
to Py_DECREF being a pretty common alteration); I can't remember
whether that was for 2.6 or 2.5.
* When Python 3.x has something compelling to offer:
o Users will start asking for Python 3.x with a *reason* to
justify the cost
o Libraries will begin to see the pain of porting, and most
importantly, *maintaining*, as a reasonable trade-off
o Libraries will port (and maintain)
o Applications should *then* port
* Or when everyone is already on a (basically) compatible 2.x
version (i.e. 2.6+); say in 3 years:
o Conversion and maintenance costs will lower to the point
where we can justify them for libraries
Python 3.x offers very little in the way of a compelling use-case (versus 2.x)
for most people I've talked to. My paying customers aren't screaming
"we want to spend a week of your time to rewrite, re-test and re-deploy
our working code into Python 3.x so that it can do exactly the same
thing with no visible benefit and lots of likely subtle failures".
Unicode-as-default doesn't really make customers say "wow" when all
their Unicode-needing code is already Unicode-using. A few syntax
changes here and there... well, no, they certainly don't care (can't say
I *really* do either). The initial marked slowdown for 3.0 (which I
gather should be mostly fixed in 3.1) didn't do much of a sales-job either.
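To make the Unicode point concrete: code that already handles Unicode on 2.x typically carries a small shim like the one below (a minimal sketch; `text_type` and `ensure_text` are illustrative names, not from any particular library), and 3.x's unicode-by-default strings don't make that shim go away until 2.x support is dropped.

```python
# Minimal sketch of a 2.x/3.x text-handling shim (illustrative names).
# Code structured like this already "uses Unicode" on 2.x, so 3.x's
# unicode-by-default strings add no visible benefit for its users.
import sys

if sys.version_info[0] >= 3:
    text_type = str
else:
    text_type = unicode  # noqa -- only defined on Python 2

def ensure_text(value, encoding="utf-8"):
    """Return `value` as the native text type, decoding bytes if needed."""
    if isinstance(value, bytes):
        return value.decode(encoding)
    return text_type(value)
```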
Possible compelling arguments that would get people to convert (with the
projects working on them):
* 5x (or more) speedup (PyPy, Unladen Swallow)
* adaptive parallelization/execution-kernel mechanism as a
first-class primitive (Apple's C extensions for OpenCL)
* continuation-like mechanisms, anonymous code blocks a-la Ruby
* (free) threading, GIL removal (IronPython, Jython)
* compilation-to-native (PyPy)
* compilation to mobile-platform native (PyPy?)
None of those is in Python 3.x, though there's talk of merging Unladen
Swallow into CPython to provide a carrot for conversions (though AFAIK
it's not yet a 5x improvement across the board). To compound the
problem, Python 3.x doesn't include any of the syntax changes you'd want
to see to support e.g. anonymous blocks, continuations, OpenCL
integration, etceteras... so if we want those, we're going to have to
either introduce new syntax (which is blocked) or hack it in... which we
could do on Python 2.x too.
I don't know about other maintainers, but I've started running PyOpenGL
tests with -3 on Python 2.6 to get a feel for how many issues are going
to come up. Most have been minimal. But when I sit down and actually
consider *maintaining* a 3.x release when I'm already *barely* able to
keep up with the 2.x maintenance in my tiny amounts of free time...
well... I do *that* stuff for *fun* after all, and double the headaches
for no noticeable benefit doesn't really sound like fun... oh, and numpy
isn't ported, so screw it ;) ...
Interestingly, at PyGTA Myles mentioned that he'd found his first-ever
Python 3.x-only library! Which he then converted to Python 2.x, because
he actually wanted to use it in real code :) .
Projects which haven't ported to Python 3.x aren't necessarily dead,
they're just not nailed to that particular perch (yet).
Maybe by 2015 or so, that might be feasible. Wait until Python 3.x
ships as standard with major Linux distros. Right now, 2.4 or 2.5 is
the production version of Python.
For a windows user who depends on pre-built binaries, every new release
breaks *every* library that is not pure Python and needs to be compiled.
> For a windows user who depends on pre-built binaries, every new release
> breaks *every* library that is not pure Python and needs to be compiled.
That's not windows specific - most packages which distribute binary
packages need to package binaries for every minor version (2.4, 2.5,
etc...). That's certainly the case for numpy and scipy. Python does
not have a stable ABI across minor releases, only micro releases.
I doubt that's what Paul was referring to, though - he seemed more
concerned with API/language changes than with ABI issues.
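The reason one binary per minor version is needed is simply that the interpreter's (major, minor) version gets baked into the distributed file. A rough sketch (the naming pattern below is a simplified, made-up stand-in for real egg/installer names):

```python
# Illustrative sketch: binary distributions carry an interpreter
# version tag, so a single built package can only target one minor
# release. The file-name pattern here is made up for illustration.
import sys

def binary_name(pkg, version, platform):
    pytag = "py%d.%d" % sys.version_info[:2]
    return "%s-%s-%s-%s.egg" % (pkg, version, pytag, platform)

# e.g. binary_name("numpy", "1.3.0", "win32") yields a name tagged
# with whichever interpreter built it, such as py2.5 or py2.6.
```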
I didn't realize the ABI situation was that unstable. I thought you
could just package up a .so or .dll and people could keep using it. I
tend to not want to use extension modules that are not in the stdlib,
and I guess this is another reason to keep staying away from them.
Unstable may be strong - every minor version of python has a lifespan
of several years. But yes, that's a hindrance for packagers: you need
to package binaries for every minor version of python, although I
guess for trivial extensions, you may get away with it on some
platforms. That's why, as far as I am concerned, something like PEP 384
is worth more than any feature in py3k I am aware of. I think it will
have more impact than py3k's features for the scientific Python
community, if the stable ABI is rich enough. It would certainly give me
more incentive to work on porting packages to py3k than just doing it
because we will have to at some point.
> Unstable may be strong - every minor version of python has a lifespan
> of several years. But yes, that's a hindrance for packagers: you need
> to package binaries for every minor version of python
It's important to note that this is mitigated, ironically enough, by
intentionally targeting a minimum Python minor version because the code
base makes use of Python features not available in older versions.
That is, any minor version of Python that doesn't have the features your
code base uses can be ignored (given the set of supported versions is
explicitly declared) — and hence one doesn't need to package binaries
for every minor version.
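In practice that "declare a floor, ignore everything older" approach is often just a run-time guard at import time. A minimal sketch (the version floor here is hypothetical):

```python
# Sketch of a minimum-version guard: a package that relies on
# 2.6-only features refuses to import on anything earlier, so no
# binaries ever need to exist for the older minor versions.
import sys

MINIMUM = (2, 6)  # oldest interpreter this hypothetical package supports

if sys.version_info[:2] < MINIMUM:
    raise ImportError(
        "this package requires Python %d.%d or newer" % MINIMUM)
```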
That doesn't completely match my experience. It's true that there is no
guarantee that the ABI will stay compatible, but when you compile lxml
against Py2.4 on a 32bit machine, it will continue to import in Py2.5 and
(IIRC) Py2.6. It won't be as fast and it won't use some newer features, but
it will work. Don't remember my experience with 2.3, though.
It obviously can't work the other way round, i.e. when compiling against
2.6, it will never work in 2.5 or earlier. But there is definitely a
certain degree of ABI compatibility available.
> It's important to note that this is mitigated, ironically enough, by
> intentionally targeting a minimum Python minor version because the code
> base makes use of Python features not available in older versions.
> That is, any minor version of Python that doesn't have the features your
> code base uses can be ignored (given the set of supported versions is
> explicitly declared) — and hence one doesn't need to package binaries
> for every minor version.
This has nothing whatsoever to do with features, since we are talking
about ABI issues.
> That doesn't completely match my experience. It's true that there is no
> guarantee that the ABI will stay compatible, but when you compile lxml
> against Py2.4 on a 32bit machine, it will continue to import in Py2.5 and
> (IIRC) Py2.6. It won't be as fast and it won't use some newer features, but
> it will work. Don't remember my experience with 2.3, though.
Importing fine is a very low expectation for ABI compatibility :)
Since python does not make ABI guarantees between minor releases, you
don't know whether some structure layouts have changed between
versions, and in general, tracking down crashes due to those is no fun. It
really depends on how much you depend on the C API, but for something
extensive like NumPy, I don't think it would ever work.
So yes, you could say "just try and if it crashes, check that it is
not ABI-related". In practice, this is very poor engineering in my opinion.
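The struct-layout hazard can be sketched in pure Python with the `struct` module (the field layouts below are made up for illustration, not actual CPython object headers):

```python
# Toy illustration of the ABI hazard described above: an extension
# compiled against one struct layout silently reads the wrong field
# when a later release inserts a new member. Layouts here are made up.
import struct

# "v1" object header: (refcount: long, size: long)
v1 = struct.pack("ll", 1, 42)
# "v2" inserts a flags field: (refcount: long, flags: long, size: long)
v2 = struct.pack("lll", 1, 0, 42)

# An extension built against v1 unpacks two longs; handed a v2 object,
# it quietly reads the flags field where it expects the size.
refcount, size = struct.unpack_from("ll", v2)
# size comes back as 0, not 42 -- exactly the kind of quiet corruption
# that is "no fun" to track down.
```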
Ok, I should have said "imports and runs its test suite successfully". I
just wanted to add a "works for me" to counter your rather pessimistic view.
> Since python does not make ABI guarantees between minor releases, you
> don't know whether some structure layouts have changed between
> versions, and in general, tracking down crashes due to those is no fun. It
> really depends on how much you depend on the C API, but for something
> extensive like NumPy, I don't think it would ever work.
I wouldn't be so sure that NumPy is so much more "extensive" in C-API usage
than lxml is.
> So yes, you could say "just try and if it crashes, check that it is
> not ABI-related". In practice, this is very poor engineering in my opinion.
Well, I don't know if there is any 'official' core developer statement
regarding ABI compatibility, but I know that Guido doesn't take breaking
it lightly for a release. He tried pretty hard to maintain it for
2.5->2.6, at least, even if he was aware that it would be futile for
2.x->3.x.
I just looked at PEP 384 and I don't see anything in it about version
numbers in the interfaces. I certainly think something like that should
be added if it's not too late. Basically any extension module should
check that the CPython loading it is new enough, and CPython should
(when feasible) continue to support old interfaces when changes are
made. This is pretty standard stuff as done in COM, Java, and
presumably .NET, along with many communications protocols.
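The COM-style handshake being proposed can be sketched as follows. This is illustrative only, not an actual CPython or PEP 384 mechanism; all names and version tuples are made up:

```python
# Illustrative sketch of a versioned-interface check: an extension
# records the stable-ABI version it was built against, and refuses to
# load unless the interpreter's ABI is the same major version and at
# least as new a minor version.
def extension_loads(host_abi, built_against):
    """Load only if the host ABI is compatible with the extension's."""
    return (host_abi[0] == built_against[0]
            and host_abi[1] >= built_against[1])

# A host exporting ABI (3, 2) can run modules built against (3, 1) or
# (3, 2), but refuses (3, 3) and anything with a different major version.
```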
You keep repeating this nonsense even though it has been pointed out
to you in a neighbouring thread that many (most?) of the main linux
distros ship python 2.6 and not 2.5 or 2.4. For example Fedora 11 and
12 both ship python 2.6, others mentioned lots of other examples in
said other thread, anyone can look them up.
My impression is that there is something 'special' about Windows (msvc)
such that binaries compiled against x.y automatically do not work for
x.y+1, even if the ABI is unchanged from Python's viewpoint.
The point of my post that David responded to is that most Windows users
have always been effectively dependent on 3rd party module/package
developers to produce a new binary for each new version, whereas many
*nix users could download the source and compile, or at least give it a go.