TensorFlow Tutorial on 3D Tensor and Reduce_Mean

r poon

Dec 30, 2018

TensorFlow Tutorial on 3D Tensor and Reduce_Mean


Background on TensorFlow:
For a concise introduction to TensorFlow, you may refer to this tutorial: https://www.guru99.com/tensor-tensorflow.html. This blog will also publish two introductory posts:

a. TensorFlow Up and Running
b. TensorFlow Tensors and Dimensionality Basics


A basic review of dimensionality / axes in TensorFlow:

In a nutshell, an N-dimensional tensor is an N-dimensional array of numbers that can be combined with other tensors through various operators. This tutorial gives a concise introduction to TensorFlow: https://www.guru99.com/tensor-tensorflow.html.

Understanding dimensionality is one of the most important pieces of knowledge for working with machine learning algorithms.

Please note that

i) a zero-dimensional tensor is a scalar, i.e. a single real number,
ii) a one-dimensional tensor is a vector; for example, [1, 2, 3] is a vector that represents a point (or direction) in three-dimensional space,
iii) a two-dimensional tensor is a matrix, a rectangular array of numbers, symbols, or expressions arranged in rows and columns.

We tend to use the term axis for a specific dimension, particularly when that dimension carries a sense of "direction".

(Note: watch out for the two uses of the word "dimension": the tensor dimension, which is one for the vector above, and the dimensionality of the vector itself, which is 3.)
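
To make the rank/shape distinction concrete, here is a small sketch using NumPy (used here purely for illustration; the rank and shape concepts carry over directly to TensorFlow tensors):

```python
import numpy as np

scalar = np.array(5.0)                   # rank 0: a single number
vector = np.array([1., 2., 3.])          # rank 1: shape (3,)
matrix = np.array([[1., 2.], [3., 4.]])  # rank 2: shape (2, 2)

# ndim is the tensor dimension (rank); shape gives the size along each axis
print(scalar.ndim, scalar.shape)  # 0 ()
print(vector.ndim, vector.shape)  # 1 (3,)
print(matrix.ndim, matrix.shape)  # 2 (2, 2)
```

Note how the vector has tensor dimension 1 but dimensionality (length) 3 — exactly the distinction flagged above.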


Objective of this tutorial on dimensionality


This short tutorial illustrates how to access the contents of a higher-dimensional tensor, and how to use tf.reduce_mean, which calculates the mean of a tensor along one dimension (i.e. along one of its axes).


You will need a Jupyter Notebook with a Python kernel and TensorFlow installed.


We start with the following structure, a 3-dimensional tensor of shape (2, 3, 3). You can think of it
as an ordered list of two 3 x 3 matrices, but really it is a single higher-dimensional tensor:

[[[1.,1.,1.],[2.,2.,2.],[3.,3.,3.]], [[4.,4.,4.],[5.,5.,5.],[6.,6.,6.]]]
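
As a quick sanity check before the TensorFlow session, the same nested list can be loaded into NumPy (a stand-in used here only because its shape semantics match TensorFlow's) to confirm the (2, 3, 3) shape:

```python
import numpy as np

data = [[[1., 1., 1.], [2., 2., 2.], [3., 3., 3.]],
        [[4., 4., 4.], [5., 5., 5.], [6., 6., 6.]]]

arr = np.array(data)
# two matrices, each with 3 rows and 3 columns
print(arr.shape)  # (2, 3, 3)
print(arr.ndim)   # 3
```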
 

The TensorFlow code


In [1]:
import tensorflow as tf
print(tf.__version__)
1.12.0

In [3]:
# make a 3 dimensional tensor
constant_float_ext = tf.constant([[[1.,1.,1.],[2.,2.,2.],[3.,3.,3.]],[[4.,4.,4.],[5.,5.,5.],[6.,6.,6.]]])

In [4]:
session = tf.Session()

In [6]:
session.run(tf.global_variables_initializer())


A second look at dimensionality:

Please note the dimensionality of the 3D tensor is:


[dim 1:
    [dim 2:
        [dim 3: 1, 1, 1], [dim 3: 2, 2, 2], [dim 3: 3, 3, 3]
    ],
    [dim 2:
        [dim 3: 4, 4, 4], [dim 3: 5, 5, 5], [dim 3: 6, 6, 6]
    ]
]


Note: please be reminded that Python is zero-indexed, so [0] refers to the first element, [1] refers to the second element, and so on.
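
The same zero-indexing applies to the plain nested Python list before it ever becomes a tensor; a minimal sketch:

```python
data = [[[1., 1., 1.], [2., 2., 2.], [3., 3., 3.]],
        [[4., 4., 4.], [5., 5., 5.], [6., 6., 6.]]]

# index [0] selects the first element along dimension 1: the first 3 x 3 block
print(data[0])
# chaining [1][0] selects the first row of the second block
print(data[1][0])  # [4.0, 4.0, 4.0]
```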



In [18]:
print(session.run(constant_float_ext))
[[[1. 1. 1.]
  [2. 2. 2.]
  [3. 3. 3.]]

 [[4. 4. 4.]
  [5. 5. 5.]
  [6. 6. 6.]]]

In [9]:
# this is a tensor of dimension (rank) 3
print(session.run(tf.rank(constant_float_ext)))
3

In [14]:
constant_float_ext[0]
Out[14]:
<tf.Tensor 'strided_slice_6:0' shape=(3, 3) dtype=float32>

In [16]:
# first element, indexed along dimension 1
print(session.run(constant_float_ext[0]))
[[1. 1. 1.]
 [2. 2. 2.]
 [3. 3. 3.]]

In [15]:
# second element, indexed along dim 1
print(session.run(constant_float_ext[1]))
[[4. 4. 4.]
 [5. 5. 5.]
 [6. 6. 6.]]

In [17]:
# first element along dimension 2, within the first element of dimension 1
print(session.run(constant_float_ext[0][0]))
[1. 1. 1.]

In [20]:
# second element along dimension 2, within the first element of dimension 1
print(session.run(constant_float_ext[0][1]))
[2. 2. 2.]

In [19]:
# same effect: comma indexing is equivalent to chained indexing
print(session.run(constant_float_ext[0, 1]))
[2. 2. 2.]
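
This mirrors NumPy indexing, where chained indexing a[0][1] and tuple indexing a[0, 1] select the same element (a NumPy sketch for comparison; TensorFlow's strided slicing behaves the same way here):

```python
import numpy as np

a = np.array([[[1., 1., 1.], [2., 2., 2.], [3., 3., 3.]],
              [[4., 4., 4.], [5., 5., 5.], [6., 6., 6.]]])

# a[0][1] first slices out the first 3 x 3 matrix, then takes its second row;
# a[0, 1] performs both selections in a single indexing operation
print(a[0][1])  # [2. 2. 2.]
print(a[0, 1])  # [2. 2. 2.]
print(np.array_equal(a[0][1], a[0, 1]))  # True
```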


Reduce Mean

tf.reduce_mean computes the mean of elements across dimensions of a tensor, reducing input_tensor along the dimensions given in axis. There are a number of similar functions (e.g. tf.reduce_sum, tf.reduce_max)
that you can quickly understand once you grasp the key concept behind this one.
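
For readers who know NumPy, tf.reduce_mean with an axis argument behaves like np.mean with axis; the following sketch (for comparison only, outside the TensorFlow session) reproduces every result shown below:

```python
import numpy as np

arr = np.array([[[1., 1., 1.], [2., 2., 2.], [3., 3., 3.]],
                [[4., 4., 4.], [5., 5., 5.], [6., 6., 6.]]])

print(arr.mean())        # 3.5 -- mean over all 18 elements
print(arr.mean(axis=0))  # averages the two 3 x 3 matrices element-wise
print(arr.mean(axis=1))  # [[2. 2. 2.] [5. 5. 5.]]
print(arr.mean(axis=2))  # [[1. 2. 3.] [4. 5. 6.]]
```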


In [8]:
# the mean over all elements: the sum of all 18 entries divided by 18
print(session.run(tf.reduce_mean(constant_float_ext)))
3.5

In [22]:
# mean along dimension 1 (axis=0, since Python is zero-indexed), which has two elements
# this adds the two 3 x 3 matrices element-wise and divides by two:
# (1+4)/2 = 2.5, (2+5)/2 = 3.5, (3+6)/2 = 4.5

print(session.run(tf.reduce_mean(constant_float_ext, 0)))
[[2.5 2.5 2.5]
 [3.5 3.5 3.5]
 [4.5 4.5 4.5]]

In [21]:
# mean along dimension 2 (axis=1): (1+2+3)/3 = 2 and (4+5+6)/3 = 5
print(session.run(tf.reduce_mean(constant_float_ext, 1)))
[[2. 2. 2.]
 [5. 5. 5.]]

In [7]:
# mean along dimension 3 (axis=2); note that reduce_mean also reduces the dimensionality:
# the result below no longer has dimension 3
print(session.run(tf.reduce_mean(constant_float_ext, 2)))
[[1. 2. 3.]
 [4. 5. 6.]]
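
One related detail worth knowing: tf.reduce_mean also accepts a keepdims argument, which retains the reduced axis with length 1 instead of dropping it. The NumPy analogue, as a sketch:

```python
import numpy as np

arr = np.array([[[1., 1., 1.], [2., 2., 2.], [3., 3., 3.]],
                [[4., 4., 4.], [5., 5., 5.], [6., 6., 6.]]])

reduced = arr.mean(axis=2)              # shape (2, 3): axis 2 is dropped
kept = arr.mean(axis=2, keepdims=True)  # shape (2, 3, 1): axis 2 kept as length 1

print(reduced.shape)  # (2, 3)
print(kept.shape)     # (2, 3, 1)
```

Keeping the axis is handy when the result needs to broadcast back against the original tensor.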

