Keras VS Lasagne


Jim Fan

Sep 30, 2015, 12:15:09 AM
to cu-neu...@googlegroups.com
Hi,

Has anyone used both Keras and Lasagne? I wonder how they compare, based on your personal experience. They seem to be designed for very similar purposes.

My first impression is that Lasagne exposes Theano more, which is a good thing because it allows more flexibility and transparency. 

I appreciate any comment or advice. 

Best,

Jim

Colin Raffel

Sep 30, 2015, 11:22:19 AM
to Jim Fan, cu-neu...@googlegroups.com
Hi Jim, most of my experience is with Lasagne, of course.  

Your impression is correct.  I would think of Lasagne as a Python module that provides convenience classes and functions for using Theano.  In other words, the common things you'd want to do with Theano for neural networks (construct a computational graph corresponding to a neural network model and compute parameter updates for it) are made very easy.  The result of using Lasagne is a set of Theano symbolic expressions, which you can then compile into functions yourself.  Apart from convenience, one reason to use Lasagne instead of implementing everything yourself in Theano is that Lasagne is written to be highly efficient (many of the devs have also contributed to Theano) and bug-free (lots of people are using it).
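For concreteness, here is a framework-free sketch in plain numpy (hypothetical layer sizes, not Lasagne's actual API) of the forward pass that a stack of dense layers expresses — the part Lasagne would instead build as a Theano symbolic expression for you to compile:

```python
import numpy as np

def dense(x, W, b, nonlinearity):
    # One "dense layer": affine transform followed by a nonlinearity
    return nonlinearity(x @ W + b)

relu = lambda z: np.maximum(z, 0.0)

def softmax(z):
    # Numerically stable softmax over each row
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 784))                      # a batch of 4 inputs
W1, b1 = rng.standard_normal((784, 100)) * 0.01, np.zeros(100)
W2, b2 = rng.standard_normal((100, 10)) * 0.01, np.zeros(10)

hidden = dense(x, W1, b1, relu)
probs = dense(hidden, W2, b2, softmax)                 # each row sums to 1
```

In Lasagne the same structure would be declared layer by layer, and the library would hand back the symbolic expression rather than computing the values eagerly.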

Keras, in contrast, allows you to use Theano without writing any Theano code.  This arguably makes it less flexible and harder to extend, but also potentially easier to use for people not familiar with Theano.  For some time, Keras didn't support non-sequential models (e.g. tree-structured networks with arbitrary connections), but I think that's possible now.
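To illustrate the contrast, here is a toy stand-in for a Keras-style sequential container (hypothetical names, not the real Keras API): the user just declares layers and never touches the underlying graph or compilation step:

```python
import numpy as np

class Sequential:
    """Toy stand-in for a Keras-style sequential model container."""
    def __init__(self):
        self.layers = []

    def add(self, layer):
        self.layers.append(layer)

    def predict(self, x):
        # Feed the input through each layer in order
        for layer in self.layers:
            x = layer(x)
        return x

def dense(n_in, n_out, activation):
    """Build a dense layer with small randomly initialized weights."""
    rng = np.random.default_rng(0)
    W = rng.standard_normal((n_in, n_out)) * 0.01
    b = np.zeros(n_out)
    return lambda x: activation(x @ W + b)

relu = lambda z: np.maximum(z, 0.0)
identity = lambda z: z

model = Sequential()
model.add(dense(784, 100, relu))
model.add(dense(100, 10, identity))

out = model.predict(np.zeros((4, 784)))  # shape (4, 10)
```

The trade-off Colin describes falls out of this design: the container hides the computation entirely, which is convenient, but wiring up anything non-sequential requires the library to expose more of what it hides.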



--
Schedule and info: http://labrosa.ee.columbia.edu/cuneuralnet/

Colin Raffel

Sep 30, 2015, 10:17:12 PM
to Jim Fan, cu-neu...@googlegroups.com
Blocks seems great.  Bart talked about it a bit a week ago.  It seems you can go lower-level (write more pure Theano code) than in Lasagne, but it also has some higher-level capabilities.  Combined with Fuel, I think it's very powerful.  It's probably the one in the earliest stages of development, though; I'd guess there are places in the library that aren't as efficient as they could be.  One nice thing about it is that it is being developed by MILA, so closely in tandem with Theano.


On Wed, Sep 30, 2015 at 8:14 PM, Jim Fan <jimfa...@gmail.com> wrote:
Hi Colin,

Thank you so much for your detailed explanation! I guess I'll go ahead with Lasagne as my primary toolkit. 

Have you ever used Blocks? What do you think of this library? 

Best,

Jim