Caveat: my suggestion that FBP'ers should think about TensorFlow is a "question" not a "conclusion". This might turn out to be a blind alley.
Tensors were invented in physics, back when FORTRAN was still hot and OO was in its infancy. A tensor is a multi-dimensional array of objects, where the objects can be ints, complex numbers, 3-D vectors, etc.
Fields are not uniform in space. Imagine two waves in water colliding. The interference patterns form hills and valleys on the surface of the water. Think of describing such a complicated set of field interactions in 3D space, using vectors. At one point in the field, a 3D vector may describe the direction of the field (energy), say (3,4,5). At the next point over, the energy interactions may result in a 3D vector (1,5,4).
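The picture above can be sketched in a few lines of NumPy: a field sampled on a grid of points, where each point carries a 3D vector. The grid size and the second row of vectors are made up for illustration; only the two vectors (3,4,5) and (1,5,4) come from the text.

```python
import numpy as np

# A field sampled on a 2x2 grid, one 3-D vector per grid point.
# The whole field is then a single tensor of shape (2, 2, 3).
field = np.array([
    [[3, 4, 5], [1, 5, 4]],   # the two adjacent vectors from the text
    [[0, 2, 1], [2, 2, 3]],   # two more sample points (made up)
])

print(field.shape)   # (2, 2, 3)
print(field[0, 1])   # vector at grid point (0, 1) -> [1 5 4]
```

The point is that "a multi-dimensional array of vectors" is itself just a higher-rank tensor, which is the data structure TensorFlow moves through its graphs.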
This video (12 min) was most helpful in reminding me what tensors are.
https://www.youtube.com/watch?v=f5liqUk0ZTw
I found this lecture, albeit long, to give me a good sense of what TensorFlow can do:
https://www.youtube.com/watch?v=vq2nnJ4g6N0
The slides seem to be here:
https://cloud.google.com/blog/big-data/2017/01/learn-tensorflow-and-deep-learning-without-a-phd