Moshe Looks (originator of the MOSES subsystem of OpenCog) and some
colleagues at Google have a new release,
https://research.googleblog.com/2017/02/announcing-tensorflow-fold-deep.html
It's fairly subtle, but the crux, as I currently understand it, is that
it provides a way to train TensorFlow models efficiently over
variable-structure graphs rather than fixed-size vectors...
This could have some uses in the OpenCog universe, e.g. if we wanted
to do clustering of little trees or sub-hypergraphs or whatever, we
could perhaps try LSTM-based unsupervised classification in
TensorFlow...
As one example, this could be used to cluster words into "part of
speech" categories based on various (syntactic and semantic) data
associated with the word, in the simplest case using TreeLSTM....
Whether this would work better than EM clustering, GP clustering, or
other methods, I have no idea though...
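To make the tree-clustering idea concrete, here's a minimal sketch in plain Python. It does not use TensorFlow Fold or a real TreeLSTM; the `compose` function is a toy stand-in for a learned TreeLSTM cell, `leaf_vec` is a stand-in for learned word embeddings, and the trees, dimension, and k-means routine are all illustrative assumptions. The point is just the pipeline: recursively embed each tree into a fixed-size vector, then cluster the vectors without supervision.

```python
import math
import random

DIM = 8  # embedding dimension (arbitrary choice for this sketch)

def leaf_vec(token):
    # Toy deterministic leaf embedding (stand-in for learned word vectors).
    rng = random.Random(token)
    return [rng.uniform(-1, 1) for _ in range(DIM)]

def compose(child_vecs):
    # Stand-in for a TreeLSTM cell: average the children, then squash.
    n = len(child_vecs)
    return [math.tanh(sum(v[i] for v in child_vecs) / n) for i in range(DIM)]

def embed(tree):
    # A tree is either a token (str) or a tuple of subtrees.
    if isinstance(tree, str):
        return leaf_vec(tree)
    return compose([embed(t) for t in tree])

def kmeans(points, k, iters=20, seed=0):
    # Plain k-means on the tree embeddings (unsupervised classification).
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[j].append(p)
        centers = [
            [sum(p[i] for p in cl) / len(cl) for i in range(DIM)] if cl else centers[j]
            for j, cl in enumerate(clusters)
        ]
    return [min(range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            for p in points]

# Hypothetical little parse trees to cluster.
trees = [("the", "cat"),
         ("the", "dog"),
         ("runs", ("very", "fast")),
         ("eats", ("the", "fish"))]
labels = kmeans([embed(t) for t in trees], k=2)
print(labels)
```

In a real version the composition function would be a trained TreeLSTM (e.g. built with TensorFlow Fold's batching over dynamic graph structures), and the clustering could equally be EM or something else, but the overall embed-then-cluster shape would stay the same.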
-- Ben
--
Ben Goertzel, PhD
http://goertzel.org
“I tell my students, when you go to these meetings, see what direction
everyone is headed, so you can go in the opposite direction. Don’t
polish the brass on the bandwagon.” – V. S. Ramachandran