* Speed-ups.
* Crash fixes.
* A few small interface changes.
* GPU memory leak fix.
* Fixes for a few corner cases with no user-visible impact.
* More deterministic behavior in Theano.
* tensor.{dot,tensordot} are more complete, faster, and more GPU-friendly.
* tensor.tensordot now supports Rop/Lop.
* tensor.dot supports n-dimensional inputs, as in NumPy.
* More NumPy syntax supported:
  * Added theano.tensor.take()
  * Added a_tensor_variable.{sort,dot,std,argmin,argmax,argsort,clip,conj,conjugate,repeat,round,trace,real,imag,take}
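Since these additions are meant to match NumPy's behavior, a quick NumPy sketch shows the semantics Theano's symbolic counterparts now mirror (the array values here are arbitrary illustration data):

```python
import numpy as np

# n-dimensional dot: contracts the last axis of `a` with the
# second-to-last axis of `b` -- the NumPy behavior that
# tensor.dot now follows for n-dimensional inputs.
a = np.arange(24).reshape(2, 3, 4)
b = np.arange(20).reshape(4, 5)
c = np.dot(a, b)
assert c.shape == (2, 3, 5)

# take(): gather entries along an axis, like theano.tensor.take().
v = np.array([10, 20, 30, 40])
assert np.take(v, [0, 2]).tolist() == [10, 30]

# ndarray methods now mirrored on tensor variables: clip, argmax, ...
x = np.array([3.0, -1.0, 2.0])
assert x.clip(0, 2).tolist() == [2.0, 0.0, 2.0]
assert int(x.argmax()) == 0
```

The same calls written against symbolic Theano variables build a graph instead of computing immediately, but produce these results once compiled and evaluated.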
Download and Install
--------------------
You can download Theano from
http://pypi.python.org/pypi/Theano
Installation instructions are available at
http://deeplearning.net/software/theano/install.html

Description
-----------
Theano is a Python library that allows you to define, optimize, and
efficiently evaluate mathematical expressions involving
multi-dimensional arrays. It is built on top of NumPy. Theano
features:
* tight integration with NumPy: a similar interface to NumPy's.
numpy.ndarrays are also used internally in Theano-compiled functions.
* transparent use of a GPU: perform data-intensive computations up to
140x faster than on a CPU (support for float32 only).
* efficient symbolic differentiation: Theano can compute derivatives
for functions of one or many inputs.
* speed and stability optimizations: avoid nasty bugs when computing
expressions such as log(1+ exp(x)) for large values of x.
* dynamic C code generation: evaluate expressions faster.
* extensive unit-testing and self-verification: includes tools for
detecting and diagnosing bugs and/or potential problems.
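The stability point can be made concrete in plain NumPy. Evaluating log(1 + exp(x)) literally overflows for large x, while the algebraically equivalent form x + log1p(exp(-x)) stays finite; a rewrite along these lines is the kind of substitution Theano's optimizer applies automatically (the exact rewritten form is an internal detail, so this is an illustrative sketch, not Theano's code):

```python
import numpy as np

x = 1000.0

# Naive evaluation: exp(1000) overflows to inf, so the whole
# expression becomes inf even though the true value is ~1000.
with np.errstate(over='ignore'):
    naive = np.log1p(np.exp(x))
assert np.isinf(naive)

# Numerically stable equivalent for large x:
# log(1 + exp(x)) == x + log1p(exp(-x))
stable = x + np.log1p(np.exp(-x))
assert stable == 1000.0
```

With Theano, a user writes the naive expression symbolically and the compiled function computes the stable result.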
Theano has been powering large-scale computationally intensive
scientific research since 2007, but it is also approachable
enough to be used in the classroom (IFT6266 at the University of Montreal).
Resources
---------
About Theano:
http://deeplearning.net/software/theano/

Theano-related projects:
http://github.com/Theano/Theano/wiki/Related-projects
About NumPy:
http://numpy.scipy.org/

About SciPy:
http://www.scipy.org/

Machine Learning Tutorial with Theano on Deep Architectures:
http://deeplearning.net/tutorial/

Acknowledgments
---------------
I would like to thank all contributors of Theano. For this particular
release, many people have helped, notably (in alphabetical order):
abalkin
Rami Al-Rfou'
Frederic Bastien
James Bergstra
Olivier Delalleau
Guillaume Desjardins
Amir Elaguizy
Ian Goodfellow
Eric Hunsberger
Vivek Kulkarni
Pascal Lamblin
Jeremiah Lowin
Razvan Pascanu
David Warde-Farley
I would also like to thank the users who submitted bug reports and suggestions.
Also, thanks to all NumPy and SciPy developers, as Theano builds on
their strengths.
All questions and comments are always welcome on the Theano
mailing lists (
http://deeplearning.net/software/theano/#community )