Objective of this tutorial on dimensionality
This short tutorial illustrates how to access the contents of high-dimensional tensors, and how tf.reduce_mean computes the mean of a tensor along one dimension (i.e. along one axis).
import tensorflow as tf
print(tf.__version__)
1.12.0
# make a 3-dimensional tensor
constant_float_ext = tf.constant([[[1., 1., 1.], [2., 2., 2.], [3., 3., 3.]],
                                  [[4., 4., 4.], [5., 5., 5.], [6., 6., 6.]]])
session = tf.Session()
session.run(tf.global_variables_initializer())

A second look at dimensionality:
Note how the three dimensions of the 3D tensor are nested:
[dim 1:
[dim 2:
[dim 3: 1, 1, 1], [dim 3: 2, 2, 2], [dim 3: 3, 3, 3]
]
[dim 2:
[dim 3: 4, 4, 4], [dim 3: 5, 5, 5], [dim 3: 6, 6, 6]
]
]
print(session.run(constant_float_ext))
[[[1. 1. 1.]
  [2. 2. 2.]
  [3. 3. 3.]]

 [[4. 4. 4.]
  [5. 5. 5.]
  [6. 6. 6.]]]
This is a rank-3 tensor (it has 3 dimensions):
print(session.run(tf.rank(constant_float_ext)))
3
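TensorFlow follows NumPy's axis conventions, so the same rank and shape checks can be sketched with a plain NumPy array (this is an illustrative aside, not part of the original session):

```python
import numpy as np

# Same nested structure as constant_float_ext, as a NumPy array
a = np.array([[[1., 1., 1.], [2., 2., 2.], [3., 3., 3.]],
              [[4., 4., 4.], [5., 5., 5.], [6., 6., 6.]]])

print(a.ndim)   # rank: number of axes -> 3
print(a.shape)  # size along each axis -> (2, 3, 3)
```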
constant_float_ext[0]
<tf.Tensor 'strided_slice_6:0' shape=(3, 3) dtype=float32>
# first element, indexed along dimension 1
print(session.run(constant_float_ext[0]))
[[1. 1. 1.]
 [2. 2. 2.]
 [3. 3. 3.]]
# second element, indexed along dim 1
print(session.run(constant_float_ext[1]))
[[4. 4. 4.]
 [5. 5. 5.]
 [6. 6. 6.]]
# first element along dimension 2 (within the first dim-1 slice)
print(session.run(constant_float_ext[0][0]))
[1. 1. 1.]
# second element along dimension 2
print(session.run(constant_float_ext[0][1]))
[2. 2. 2.]
# same effect with a single multi-axis index
print(session.run(constant_float_ext[0, 1]))
[2. 2. 2.]
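The equivalence of chained indexing and multi-axis indexing can be checked with NumPy, whose indexing semantics TensorFlow mirrors (a small sketch with a hypothetical array `a` mirroring constant_float_ext):

```python
import numpy as np

a = np.array([[[1., 1., 1.], [2., 2., 2.], [3., 3., 3.]],
              [[4., 4., 4.], [5., 5., 5.], [6., 6., 6.]]])

# a[0][1] first selects the dim-1 slice a[0], then indexes it along its
# first axis; a[0, 1] does both selections in one step
print(np.array_equal(a[0][1], a[0, 1]))  # True
print(a[0, 1])                           # [2. 2. 2.]
```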
# this is the mean over all elements (sum of all 18 elements divided by 18)
print(session.run(tf.reduce_mean(constant_float_ext)))
3.5
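The arithmetic behind that 3.5 is easy to verify by hand, or with NumPy's equivalent reduction (again using a hypothetical array `a` with the same contents):

```python
import numpy as np

a = np.array([[[1., 1., 1.], [2., 2., 2.], [3., 3., 3.]],
              [[4., 4., 4.], [5., 5., 5.], [6., 6., 6.]]])

# 18 elements summing to 63, so the overall mean is 63 / 18 = 3.5
print(a.sum(), a.size, a.mean())  # 63.0 18 3.5
```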
# mean along dim 1 (axis 0; Python axes are zero-indexed), which has two
# elements (each a 3 x 3 matrix): reduce_mean sums the first and second
# 3 x 3 matrices element-wise and divides by two
# (note: (1+4)/2 = 2.5, (2+5)/2 = 3.5, (3+6)/2 = 4.5)
print(session.run(tf.reduce_mean(constant_float_ext, 0)))
[[2.5 2.5 2.5]
 [3.5 3.5 3.5]
 [4.5 4.5 4.5]]
# mean along dim 2 (axis 1): (1+2+3)/3 and (4+5+6)/3
print(session.run(tf.reduce_mean(constant_float_ext, 1)))
[[2. 2. 2.]
 [5. 5. 5.]]
# mean along dim 3 (axis 2); note that reduce_mean also reduces the rank:
# the result below no longer has dim 3
print(session.run(tf.reduce_mean(constant_float_ext, 2)))
[[1. 2. 3.]
 [4. 5. 6.]]
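All three axis reductions above can be reproduced with NumPy's mean, which uses the same axis numbering (a sketch with a hypothetical array `a` holding the same values):

```python
import numpy as np

a = np.array([[[1., 1., 1.], [2., 2., 2.], [3., 3., 3.]],
              [[4., 4., 4.], [5., 5., 5.], [6., 6., 6.]]])

print(a.mean(axis=0))  # shape (3, 3): averages the two 3x3 matrices
print(a.mean(axis=1))  # shape (2, 3): averages down each matrix's rows
print(a.mean(axis=2))  # shape (2, 3): the reduced axis disappears
```

In each case the reduced axis is dropped from the result's shape, matching the rank reduction noted above.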
Reference:
tf.reduce_mean:
https://www.tensorflow.org/api_docs/python/tf/math/reduce_mean
https://www.aiworkbox.com/lessons/calculate-mean-of-a-tensor-along-an-axis-using-tensorflow
tf.constant
Tensors in TensorFlow