Announcing Theano 0.8.0rc1


Pascal Lamblin

Feb 29, 2016, 8:23:42 PM
to Theano Announce, Theano Developers, Theano Users
===========================
Announcing Theano 0.8.0rc1
===========================

This is a release candidate for a major version, with lots of new
features, bug fixes, and some interface changes (deprecated or
potentially misleading features were removed).

The upgrade is recommended for developers who want to help test and
report bugs, or want to use new features now.

For those using the bleeding-edge version in the
git repository, we encourage you to update to the `0.8.0rc1` tag.


What's New
----------

Highlights:
- Python 2 and 3 support with the same code base
- Faster graph optimization
- New GPU back-end
- Float16 support in the new back-end (requires CUDA 7.5)
- Multiple dtype support in the new back-end
- Multi-GPU support in the same process
- Integration of CuDNN for better GPU performance
- Many Scan improvements (execution speed-up, ...)
- optimizer=fast_compile now moves computation to the GPU (a short sketch follows this list).
- Better convolution on CPU and GPU (CorrMM, cuDNN, 3D convolution, more parameters)
- Interactive visualization of graphs with d3viz
- CNMeM integration (better memory management on GPU)
- BreakpointOp
- Multi-GPU data parallelism via Platoon (https://github.com/mila-udem/platoon/)
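
As a rough illustration of a few of the items above (the new GPU back-end,
optimizer=fast_compile, and d3viz), here is a minimal sketch. The flag
values, the example expression, and the output file name are illustrative
assumptions, not part of this release; see the documentation for the exact
options::

    # Minimal sketch, assuming Theano 0.8.0rc1 with libgpuarray and
    # pydot/graphviz installed. Run with, for example:
    #   THEANO_FLAGS=device=cuda0,floatX=float32,optimizer=fast_compile python example.py
    # (device=cuda* selects the new GPU back-end.)
    import numpy as np
    import theano
    import theano.tensor as T
    import theano.d3viz as d3v

    x = T.matrix('x')                  # symbolic input matrix
    y = T.nnet.sigmoid(T.dot(x, x.T))  # a small expression graph

    f = theano.function([x], y)        # compiled; placed on the GPU if the flags allow it
    print(f(np.random.rand(3, 3).astype(theano.config.floatX)))

    # Interactive HTML visualization of the compiled graph (new in 0.8).
    d3v.d3viz(f, 'example_graph.html')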


A total of 135 people contributed to this release; please see the end of
NEWS.txt for the complete list. If you are among the authors and would
like to update the information, please let us know.


Download and Install
--------------------

You can download Theano from http://pypi.python.org/pypi/Theano

Installation instructions are available at
http://deeplearning.net/software/theano/install.html
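
For a quick start, installing the release candidate with pip should look
roughly as follows (this assumes the 0.8.0rc1 pre-release is published on
PyPI; the version must be pinned explicitly because pip does not select
pre-releases by default)::

    pip install Theano==0.8.0rc1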

Description
-----------

Theano is a Python library that allows you to define, optimize, and
efficiently evaluate mathematical expressions involving
multi-dimensional arrays. It is built on top of NumPy. Theano
features:

* tight integration with NumPy: Theano's interface is similar to NumPy's,
and numpy.ndarray objects are used internally in Theano-compiled functions.
* transparent use of a GPU: perform data-intensive computations up to
140x faster than on a CPU (support for float32 only).
* efficient symbolic differentiation: Theano can compute derivatives
for functions of one or many inputs.
* speed and stability optimizations: avoid nasty bugs when computing
expressions such as log(1 + exp(x)) for large values of x (see the short
example after this list).
* dynamic C code generation: evaluate expressions faster.
* extensive unit-testing and self-verification: includes tools for
detecting and diagnosing bugs and/or potential problems.
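
To make the symbolic differentiation and stability points above concrete,
here is a minimal sketch (the expression and the printed values are
illustrative)::

    import theano
    import theano.tensor as T

    x = T.dscalar('x')
    y = T.log(1 + T.exp(x))   # naively overflows for large x
    dy = T.grad(y, x)         # symbolic derivative of y with respect to x

    f = theano.function([x], [y, dy])
    # Theano's stability optimizations rewrite log(1 + exp(x)) into a
    # numerically safe form, so large inputs do not produce inf or nan.
    print(f(1000.0))          # approximately [1000.0, 1.0]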

Theano has been powering large-scale computationally intensive
scientific research since 2007, but it is also approachable
enough to be used in deep learning classes.


All questions and comments are welcome on the Theano
mailing lists ( http://deeplearning.net/software/theano/#community ).

--
Pascal