AI Dungeon / GPT-2

Tom Harrison

Feb 21, 2020, 11:53:20 AM
to opencog
https://github.com/openai/gpt-2

This is a link to an AI model called GPT-2, which has impressive natural language understanding. AIDungeon is an implementation of this model, and it demonstrates an almost perfect ability to understand language. The only flaw in AIDungeon is remembering story elements and the surrounding environment, but I believe OpenCog already has methods for handling this. I think this could lead to a breakthrough in OpenCog development.

Lake Watkins

Feb 21, 2020, 12:09:08 PM
to ope...@googlegroups.com
Hey Tom,

Thanks for the post.  I've been pushing for something like this for years, and I've been following AIDungeon since its release, but I don't think it's as sophisticated as you might think.

Lacking procedural or episodic knowledge is a major flaw, and one that fundamentally prevents these sorts of systems from performing meaningful reasoning.  You're right that OpenCog has methods for handling this, but really reasoning is the whole value-add of OpenCog to other cognitive systems.  It's not so much that AIDungeon would lead to a breakthrough in OpenCog, it's that OpenCog's methods could lead to a breakthrough in new AIDungeons.

My own startup is aiming to do just that, but we're still a long way from rivaling human dungeon masters.  I do think OpenCog has that potential, but if I were designing an AIDungeon I would do it very differently.  If you have thoughts on the matter I'd love to hear them.

Regards,
-Lake

--
You received this message because you are subscribed to the Google Groups "opencog" group.
To unsubscribe from this group and stop receiving emails from it, send an email to opencog+u...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/opencog/8b044da0-4f84-4550-b866-8b2734242d01%40googlegroups.com.

Tom Harrison

Feb 21, 2020, 1:30:38 PM
to ope...@googlegroups.com
The natural language understanding in OpenCog at the moment isn't very good, and even though GPT-2 has problems with several things, perhaps it could be integrated with other OpenCog modules that can handle those things.

Lake Watkins

Feb 21, 2020, 1:57:51 PM
to ope...@googlegroups.com
The integration that you're talking about is something that we've been working on for years.  It's part of our neuro-symbolic architecture.  Basically, if you can get something like GPT-2 to learn natural language, then you can project those lessons into the Atomspace and reason on them with the rest of OpenCog.  But that's far from trivial, and it's not clear that GPT-2 is the best framework for that kind of integration.  I'm not personally working on the integration myself, but I know that there are a lot of considerations beyond which machine learning approach seems to be performing the best right now.  Deborah could tell you more.
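The "project those lessons into the Atomspace" idea can be made concrete with a toy sketch. This is my own illustration, not OpenCog's actual API: the `AtomStore` class and naive triple extraction below are invented stand-ins for an Atomspace-style hypergraph store, showing how facts extracted from generated text could land somewhere that symbolic rules can query.

```python
# Toy sketch (invented names, NOT the OpenCog API): project simple
# subject-verb-object facts from generated text into a minimal
# Atomspace-like store that symbolic rules could later reason over.

class AtomStore:
    """A minimal stand-in for a hypergraph knowledge store."""
    def __init__(self):
        self.links = set()

    def add_evaluation(self, predicate, subject, obj):
        # Roughly analogous to an EvaluationLink over two ConceptNodes.
        self.links.add((predicate, subject, obj))

    def query(self, predicate):
        return [(s, o) for (p, s, o) in self.links if p == predicate]


def project_triples(text, store):
    # Deliberately naive extraction: assume each sentence is "S V O".
    # A real pipeline would use an actual parser here.
    for sentence in text.split("."):
        words = sentence.split()
        if len(words) == 3:
            subj, verb, obj = words
            store.add_evaluation(verb, subj, obj)


store = AtomStore()
project_triples("dragon guards treasure. hero enters cave", store)
print(store.query("guards"))  # -> [('dragon', 'treasure')]
```

The hard part, of course, is everything this sketch elides: real extraction from free text, and mapping the results onto atoms with truth values that PLN can actually use.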

Ben Goertzel

Feb 22, 2020, 2:02:40 AM
to opencog
GPT-2 doesn't do natural language "understanding"; I would say its NL
understanding is even worse than OpenCog's right now...

We have used transformer NNs together with OpenCog for NL dialogue
systems, e.g. for controlling the Philip K. Dick robot that we showed
off at the Web Summit last year. However, this was more a software
integration than a deep conceptual integration (basically if the
OpenCog rulebase couldn't come up w/ a good response to something
someone said to the robot, it would fall back to a transformer-NN
model we trained on a corpus of PKD writing)
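The fallback pattern described above can be sketched in a few lines. All names here are mine, not the actual PKD-robot code: try the symbolic rulebase first, and only when no rule produces a response, fall back to sampling from a generative model (stubbed out below).

```python
# Sketch of the rulebase-first, transformer-fallback dialogue pattern
# (illustrative names, not the actual PKD robot implementation).

def rulebase_respond(utterance, rules):
    """Return a rule-derived response, or None if no rule matches."""
    for trigger, response in rules:
        if trigger in utterance.lower():
            return response
    return None

def transformer_respond(utterance):
    # Stand-in for sampling from a transformer model trained on a corpus.
    return "Hmm... " + utterance + " Tell me more."

def respond(utterance, rules):
    # Fall back to the generative model only when the rulebase is silent.
    return rulebase_respond(utterance, rules) or transformer_respond(utterance)

rules = [("your name", "I am a Philip K. Dick android."),
         ("dream", "Do androids dream? I wonder about that too.")]

print(respond("What is your name?", rules))
print(respond("What's the weather like?", rules))
```

This is "software integration" in exactly the sense Ben describes: the two components never share representations, they just hand control back and forth.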

Andres Suarez and I are also working on a deeper integration where we
use a transformer-NN model as a "sentence probability oracle" or
"paragraph probability oracle" and then use the probabilities
estimated by the model to help rank and prioritize different
rule-inductions and category-formations proposed within OpenCog's
symbolic learning
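The "sentence probability oracle" idea can be sketched self-contained. A real system would score candidates with a transformer LM; here a toy bigram model stands in so the example runs without external dependencies, and the corpus and candidate sentences are invented.

```python
# Sketch of ranking candidate rule-inductions by a sentence-probability
# oracle. A toy bigram model (with add-one smoothing) stands in for the
# transformer LM; corpus and candidates are invented examples.
import math
from collections import Counter

corpus = "the knight guards the gate the dragon guards the treasure".split()
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus)

def sentence_logprob(sentence):
    """Toy bigram log-probability, the stand-in oracle."""
    words = sentence.split()
    vocab = len(unigrams)
    lp = 0.0
    for a, b in zip(words, words[1:]):
        lp += math.log((bigrams[(a, b)] + 1) / (unigrams[a] + vocab))
    return lp

# Candidate verbalizations of proposed inductions, ranked by the oracle.
candidates = ["the dragon guards the treasure",
              "the treasure guards the dragon"]
ranked = sorted(candidates, key=sentence_logprob, reverse=True)
print(ranked[0])  # the corpus-consistent candidate wins
```

The point is only the interface: the symbolic learner proposes, the language model scores, and the scores prioritize which inductions to pursue.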

We have also been experimenting with DeepWalk-type methods for
embedding OpenCog Atomspace hypergraphs into vector spaces (in the
context of the bio-Atomspace containing ensembles of
genomic-data-analysis MOSES classification trees, and lots of data
from numerous bio-ontologies) ... with decent results ... and have been
thinking about how to create new forms of graph transformer networks
that are appropriate for generative modeling of specific sorts of
Atomspace hypergraphs (e.g. hypergraphs that record PLN inference
histories) ...
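For readers unfamiliar with DeepWalk: the core move is generating truncated random walks over a graph and treating each walk as a "sentence" for a skip-gram embedder. The sketch below shows only the walk-generation half (the word2vec-style training step is omitted), and the graph and node names are illustrative, not the bio-Atomspace.

```python
# Sketch of DeepWalk-style walk generation over a toy graph. Each walk
# becomes a "sentence" that a skip-gram model would embed; that training
# step is omitted here. Graph and names are illustrative only.
import random

graph = {
    "GeneA":    ["PathwayX", "GeneB"],
    "GeneB":    ["GeneA", "PathwayX"],
    "PathwayX": ["GeneA", "GeneB", "DiseaseY"],
    "DiseaseY": ["PathwayX"],
}

def random_walk(graph, start, length, rng):
    """Truncated random walk: step to a uniformly random neighbor."""
    walk = [start]
    while len(walk) < length:
        walk.append(rng.choice(graph[walk[-1]]))
    return walk

rng = random.Random(42)  # seeded for reproducibility
walks = [random_walk(graph, node, 5, rng)
         for node in graph for _ in range(3)]
print(walks[0])
```

Hypergraphs complicate this picture, since a walk must decide how to traverse links of arity greater than two, which is presumably where the Atomspace-specific design work lives.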

So Tom, I think your instinct is correct but the details get hairy...

ben

--
Ben Goertzel, PhD
http://goertzel.org

“The only people for me are the mad ones, the ones who are mad to
live, mad to talk, mad to be saved, desirous of everything at the same
time, the ones who never yawn or say a commonplace thing, but burn,
burn, burn like fabulous yellow roman candles exploding like spiders
across the stars.” -- Jack Kerouac