[GSoC 2018 Tensorflex: Tensorflow bindings for Elixir] Week #1 update

Anshuman Chhabra

May 20, 2018, 3:03:10 AM5/20/18
to BEAM Community
Hello everyone!

I'm Anshuman and for this year's Google Summer of Code I'm working on writing Tensorflow bindings for Elixir. The project is present here: https://github.com/anshuman23/tensorflex/

This is an update to let anyone interested know what functionality has been added so far and what is coming next week:

Functionality added this week:

- Added support for reading pretrained graph definition files:
It is now possible to read pretrained graph definition files into Tensorflex. You can give this a try with the Inception graph definitions present here. The example code is given below:
iex(1)> graph = Tensorflex.read_graph("classify_image_graph_def.pb")
2018-05-17 23:36:16.488469: I tensorflow/core/platform/cpu_feature_guard.cc:137] Your CPU supports instructions that this TensorFlow binary was not compiled to use: SSE4.1 SSE4.2 AVX AVX2 FMA
2018-05-17 23:36:16.774442: W tensorflow/core/framework/op_def_util.cc:334] Op BatchNormWithGlobalNormalization is deprecated. It will cease to work in GraphDef version 9. Use tf.nn.batch_normalization().
Successfully imported graph
#Reference<0.1610607974.1988231169.250293>

- Added support for getting all operations present in a graph:
It is also possible to inspect all the operations that populate the graph. The function returns the operation names in a list:
iex(2)> op_list = Tensorflex.get_graph_ops graph
["softmax/biases", "softmax/weights", "pool_3/_reshape/shape",
 "mixed_10/join/concat_dim", "mixed_10/tower_2/conv/batchnorm/moving_variance",
 "mixed_10/tower_2/conv/batchnorm/moving_mean",
 "mixed_10/tower_2/conv/batchnorm/gamma",
 "mixed_10/tower_2/conv/batchnorm/beta", "mixed_10/tower_2/conv/conv2d_params",
 "mixed_10/tower_1/mixed/conv_1/batchnorm/moving_variance",
 "mixed_10/tower_1/mixed/conv_1/batchnorm/moving_mean",
 "mixed_10/tower_1/mixed/conv_1/batchnorm/gamma",
 "mixed_10/tower_1/mixed/conv_1/batchnorm/beta",
 "mixed_10/tower_1/mixed/conv_1/conv2d_params",
 "mixed_10/tower_1/mixed/conv/batchnorm/moving_variance",
 "mixed_10/tower_1/mixed/conv/batchnorm/moving_mean",
 "mixed_10/tower_1/mixed/conv/batchnorm/gamma",
 "mixed_10/tower_1/mixed/conv/batchnorm/beta",
 "mixed_10/tower_1/mixed/conv/conv2d_params",
 "mixed_10/tower_1/conv_1/batchnorm/moving_variance",
 "mixed_10/tower_1/conv_1/batchnorm/moving_mean",
 "mixed_10/tower_1/conv_1/batchnorm/gamma",
 "mixed_10/tower_1/conv_1/batchnorm/beta",
 "mixed_10/tower_1/conv_1/conv2d_params",
 "mixed_10/tower_1/conv/batchnorm/moving_variance",
 "mixed_10/tower_1/conv/batchnorm/moving_mean",
 "mixed_10/tower_1/conv/batchnorm/gamma",
 "mixed_10/tower_1/conv/batchnorm/beta", "mixed_10/tower_1/conv/conv2d_params",
 "mixed_10/tower/mixed/conv_1/batchnorm/moving_variance",
 "mixed_10/tower/mixed/conv_1/batchnorm/moving_mean",
 "mixed_10/tower/mixed/conv_1/batchnorm/gamma",
 "mixed_10/tower/mixed/conv_1/batchnorm/beta",
 "mixed_10/tower/mixed/conv_1/conv2d_params",
 "mixed_10/tower/mixed/conv/batchnorm/moving_variance",
 "mixed_10/tower/mixed/conv/batchnorm/moving_mean",
 "mixed_10/tower/mixed/conv/batchnorm/gamma",
 "mixed_10/tower/mixed/conv/batchnorm/beta",
 "mixed_10/tower/mixed/conv/conv2d_params",
 "mixed_10/tower/conv/batchnorm/moving_variance",
 "mixed_10/tower/conv/batchnorm/moving_mean",
 "mixed_10/tower/conv/batchnorm/gamma", "mixed_10/tower/conv/batchnorm/beta",
 "mixed_10/tower/conv/conv2d_params", "mixed_10/conv/batchnorm/moving_variance",
 "mixed_10/conv/batchnorm/moving_mean", "mixed_10/conv/batchnorm/gamma",
 "mixed_10/conv/batchnorm/beta", "mixed_10/conv/conv2d_params",
 "mixed_9/join/concat_dim", ...]

Plans for coming weeks:
The logical progression is to reach the point where we can run a loaded graph on our own inputs and generate prediction outputs. For this to work, a lot of functions need to be added first: particularly ones that will allow us to read Tensors supplied by the user. These TF_Tensor functions are very important, and they will have to cover a wide variety of input types. More graph-based functions will also be imported from Tensorflow as and when required (like the operation list function get_graph_ops). After that, further graph-based functions will be needed that combine the tensor functionality with the graph functions. Finally, we will need to add support for creating and running Sessions.
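To make the direction concrete, the end-state might look something like the sketch below. Everything here other than read_graph is hypothetical pseudocode; the actual function names and signatures will be decided as the bindings are written:

```
graph = Tensorflex.read_graph("classify_image_graph_def.pb")

# Hypothetical: wrap user-supplied data in a TF_Tensor
input_tensor = Tensorflex.create_tensor(user_data)

# Hypothetical: run a Session over the graph with that input
output_tensor = Tensorflex.run_session(graph, input_tensor)
```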

