Symbolic differentiation similar to TensorFlow / Theano


Andrei Zh

Jul 8, 2016, 8:02:55 PM7/8/16
to julia-users
In Python, libraries like TensorFlow or Theano provide the ability to perform automatic differentiation over their computational graphs. E.g. in TensorFlow (example from SO):

data = tf.placeholder(tf.float32)
var = tf.Variable(...)
loss = some_function_of(var, data)

var_grad = tf.gradients(loss, [var])[0]

What is the closest thing in Julia at the moment? 

Here's what I've checked so far: 

 * ForwardDiff.jl - computes derivatives using forward-mode automatic differentiation (AD). Although AD has particular advantages, I found this package quite slow: e.g., for a vector of 1000 elements the gradient takes ~100x longer than the function itself. Another potential issue is that ForwardDiff.jl doesn't output a symbolic version of the gradient, so it's hardly usable for computation on a GPU, for example. 
 * Calculus.jl - among other things, this package provides symbolic differentiation. However, it seems to treat all symbols as scalars and doesn't support matrices or vectors. 

I have pretty shallow knowledge of both of these packages, so please correct me if I'm wrong somewhere in my conclusions. And if not, is there any other package or project I should consider? 

Chris Rackauckas

Jul 8, 2016, 11:47:14 PM7/8/16
to julia-users
Have you checked out using the wrappers for TensorFlow, https://github.com/benmoran/TensorFlow.jl ? Or directly using PyCall?

Gabriel Goh

Jul 9, 2016, 3:27:33 AM7/9/16
to julia-users
Forward-mode differentiation has poor complexity for functions of the form R^n -> R: the cost of a full gradient scales with n, since each forward pass propagates only one directional derivative. Try using ReverseDiffSource.jl instead.
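To illustrate why the cost scales with n, here is a minimal dual-number sketch of forward-mode AD in Python (a conceptual toy, not ForwardDiff.jl's actual implementation): each pass seeds one coordinate direction, so the gradient of f: R^n -> R needs n passes.

```python
# Minimal forward-mode AD with dual numbers. Each forward pass carries one
# directional derivative in `eps`, so a full gradient needs n passes.

class Dual:
    def __init__(self, val, eps=0.0):
        self.val, self.eps = val, eps

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.eps + other.eps)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.val * other.eps + self.eps * other.val)

    __rmul__ = __mul__

def grad(f, x):
    # one forward pass per input coordinate -> O(n) evaluations of f
    return [f([Dual(v, 1.0 if i == j else 0.0)
               for j, v in enumerate(x)]).eps
            for i in range(len(x))]

f = lambda x: x[0] * x[0] + 3.0 * x[1]   # df/dx0 = 2*x0, df/dx1 = 3
print(grad(f, [2.0, 5.0]))               # -> [4.0, 3.0]
```

Reverse mode avoids this by propagating sensitivities backwards from the single scalar output, which is why it wins for R^n -> R.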

This blog post describes positive results using ReverseDiffSource.jl on an autoencoder.


Since back-propagation is reverse-mode differentiation, this should in theory be equivalent to TensorFlow's automatic differentiation.
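For intuition, here is a tiny reverse-mode AD sketch in Python (again a conceptual toy, not how ReverseDiffSource.jl or TensorFlow are implemented): one forward pass records the graph, and one backward pass yields the derivative of the output with respect to every input, so the cost does not grow with n.

```python
# Tiny reverse-mode AD (back-propagation). Each Var remembers its parents
# and the local partial derivative of itself w.r.t. each parent; one
# backward sweep accumulates d(output)/d(every input).

class Var:
    def __init__(self, val, parents=()):
        self.val, self.parents, self.grad = val, parents, 0.0

    def __add__(self, other):
        return Var(self.val + other.val, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.val * other.val,
                   [(self, other.val), (other, self.val)])

def backward(out):
    # chain rule: parent.grad += (d node / d parent) * node.grad
    out.grad = 1.0
    stack = [out]
    while stack:
        node = stack.pop()
        for parent, local in node.parents:
            parent.grad += local * node.grad
            stack.append(parent)

x, y = Var(2.0), Var(5.0)
loss = x * x + y * x          # d/dx = 2x + y = 9, d/dy = x = 2
backward(loss)
print(x.grad, y.grad)         # -> 9.0 2.0
```

(This naive stack traversal is only correct for expression trees like the one above; a real implementation processes nodes in reverse topological order.)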



Gabriel Goh

Jul 9, 2016, 3:37:11 AM7/9/16
to julia-users

There is also


https://github.com/mlubin/ReverseDiffSparse.jl


I've never used it myself, but I thought I'd throw it out there.

Andrei Zh

Jul 9, 2016, 6:10:12 AM7/9/16
to julia-users
Thanks for all your answers! Just to make it clear, at the moment I'm not really interested in TensorFlow itself, but specifically in its automatic differentiation capabilities. 

ReverseDiffSource.jl looks very promising and is indeed quite fast for `R^n -> R` in a few experiments I've made. Thanks again!