|numba2||Mark Florisson||7/29/13 4:19 PM|
In my quest to create a more expressive, more easily understood, and more performant numba, I've created a proposal that amounts to a redesign of the numba "language". It will focus far less on the CPython side of things and instead mostly live in its own world. Objects will still be allowed, but it should be possible to write many useful numeric and other codes without them. A more sustainable approach to NumPy support will be taken, where a subset of the NumPy API will be provided (maybe under a different namespace), making it clear what is and isn't supported efficiently. It will support data-parallel operators like map/reduce, etc.
The goal is ultimately to allow (near) zero-cost abstraction and to write and implement nearly every component in a runtime (including complex numbers, range, etc.). It will further bring optional static typing, Rust-like traits, and actual parallelism that runs unconstrained by the GIL, without sacrificing any language semantics. It's actually not as ambitious as it sounds: with a few core abstractions, numba can be much more powerful and easier to implement at the same time. You can read about it here:
By default objects will not be used, and it will be easily understandable which things can run on a GPU. It combines the best features of several languages, most notably Rust, Terra, RPython and Julia. It's also influenced by parakeet, mypy and copperhead.
A short overview: optional static typing, generic functions, Rust-like traits, compile-time overloading, control over object representation, over unrolling, specialization and inlining, over type promotion, coercion and conversion, and over mutability. Different threads have different garbage collectors, allowing them to run in parallel without any locking. Immutable data can be safely shared; mutable data may be borrowed. Communication happens explicitly over typed channels, or through higher-level abstractions like parallel maps. Finally, different exception models may be supported (costly as well as zero-cost).
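The two communication styles described above can be sketched with today's Python standard library. This is only an illustration of the idea, not numba2's actual API: `queue.Queue` stands in for an explicit typed channel, and `concurrent.futures` stands in for the higher-level parallel map.

```python
# Sketch of the two communication styles, using stdlib stand-ins:
# a queue.Queue as an explicit "typed channel", and a ThreadPoolExecutor
# map as the higher-level data-parallel abstraction.
import queue
import threading
from concurrent.futures import ThreadPoolExecutor


def worker(inbox: "queue.Queue[int]", outbox: "queue.Queue[int]") -> None:
    # Receive immutable work items from one channel, send results on another.
    while True:
        item = inbox.get()
        if item is None:  # sentinel: channel closed
            break
        outbox.put(item * item)


inbox: "queue.Queue[int]" = queue.Queue()
outbox: "queue.Queue[int]" = queue.Queue()
t = threading.Thread(target=worker, args=(inbox, outbox))
t.start()
for i in range(4):
    inbox.put(i)
inbox.put(None)  # close the channel
t.join()
squares = sorted(outbox.get() for _ in range(4))
print(squares)  # [0, 1, 4, 9]

# Higher-level abstraction: a parallel map over the same immutable data.
with ThreadPoolExecutor() as pool:
    par = list(pool.map(lambda x: x * x, range(4)))
print(par)  # [0, 1, 4, 9]
```

In CPython the threads still contend for the GIL; the proposal's point is that per-thread garbage collectors and immutable sharing would let such workers run truly in parallel.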
There are also ideas to extract native functions from C extension modules, allowing faster calls into native (recompiled) CPython code.
Feedback and suggestions are more than welcome!
|Re: numba2||Uwe Fechner||8/18/13 10:12 AM|
> By default objects will not be used, ...
Does that mean it is not recommended to write @jit classes?
For me, functions are not always sufficient, and I need to use jit-compiled classes (or globals, but that is ugly).
|Re: numba2||Mark Florisson||8/19/13 4:24 AM|
Classes will most definitely be supported, but it means type instantiation is not tied to Python's memory allocator or the GIL.
|Re: numba2||Ville Tuulos||8/28/13 2:19 PM|
|Re: numba2||Mark Florisson||9/4/13 7:39 AM|
Yes indeed, sorry about that. Here is the new link:
The repo is here if you want to build it offline with Sphinx: