Evolution of NELL Architecture

Fernando Ortega

Apr 25, 2016, 10:07:38 AM
To: NELL: Never-Ending Language Learner
Hi NELL team,

I'm just starting to study how NELL works because my thesis focuses on new approaches to attribute-based sentiment analysis, and I'm considering some possibilities for attribute extraction.

Your system provides a very interesting way to approach this problem, and I want to understand the decisions that have been taken since it started around 5-6 years ago.

How did the architecture change during this time? I read in "Toward an Architecture for Never-Ending Language Learning" that there were four learners (CPL, CSEAL, CMC, and RL), and now NELL uses six learners (CPL, CML, SEAL, OpenEval, PRA, NEIL). I need to read more of NELL's publications, but I don't know why CSEAL is not used anymore, or which new method is an evolution of which initial learner.

Thanks in advance.

Bryan Kisiel

Apr 25, 2016, 10:49:14 AM
To: NELL: Never-Ending Language Learner
Hi Fernando,

We'd be happy to discuss how and why NELL has changed since its inception,
but there are so many ways in which it has changed that it's difficult to
try to choose and organize the interesting parts without knowing more
about what you're interested in. So, let me start with an overview of the
learners we've been using.

In some sense, the real origin of NELL is CBL (now called CPL), which
couples learning over a space of category instances and learning over a
space of cooccurring textual patterns, and does this in the context of
coupling a number of independent learning problems (one for each
predicate) by constraining them through an ontology. In other words, CBL
alone is a basic form of never-ending multi-view coupled semi-supervised
learning that forms one of the premises for NELL.
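
To make that coupling concrete, here is a toy Python sketch (my own
simplification, with made-up thresholds and data structures, not NELL's
actual code) of the CPL-style loop: instances and patterns promote each
other within each category, while mutual-exclusion constraints from the
ontology tie the per-category learning problems together.

    from collections import defaultdict

    def cpl_sketch(cooccurrences, seeds, mutually_exclusive, iterations=10):
        """cooccurrences: dict mapping (pattern, noun_phrase) -> count.
        seeds: dict mapping category -> set of seed noun phrases.
        mutually_exclusive: dict mapping category -> set of disjoint categories.
        """
        instances = {cat: set(s) for cat, s in seeds.items()}
        patterns = defaultdict(set)

        for _ in range(iterations):
            # Promote textual patterns that co-occur with trusted instances.
            for (pat, np), count in cooccurrences.items():
                for cat, insts in instances.items():
                    if np in insts and count >= 2:
                        patterns[cat].add(pat)

            # Promote instances extracted by trusted patterns, unless the
            # ontology says the noun phrase is already in a disjoint category.
            for (pat, np), count in cooccurrences.items():
                for cat in instances:
                    if pat in patterns[cat]:
                        conflict = any(np in instances[other]
                                       for other in mutually_exclusive.get(cat, ()))
                        if not conflict:
                            instances[cat].add(np)

        return instances, patterns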

By the time our AAAI-10 paper came around, we had built on this to
arrive at the four learners you mentioned (CPL, CSEAL, CMC, RL), along
with a master "knowledge integrator": components that add more views
and more coupling. The choice of the additional three was driven, as
always, by some degree of convenience and mutual interest with other
researchers, but also by a desire to introduce learning via
fundamentally different views of the world, and to advance particular
directions (e.g., RL does not learn from the outside world but rather
advances learning through introspection, another of our major
long-term research interests).

Now let's jump all the way to the present day. In technical terms, the
set of learners NELL runs on every iteration is CPL, CPL2, SEAL2, OE,
CMC3, LatLongTT, SemParse, and PRA. CPL2 is a work-in-progress
next-generation CPL that aims to use a scoring methodology that is more
sensitive to human feedback and negative training examples, and that
will accept feedback in pattern-space as well as instance-space. SEAL2
is a newer version of CSEAL tweaked for improved performance. CMC3 is a
logistic-regression version of CMC (which originally used an SVM); it
solves some scalability problems, and other minor details were tweaked
to improve the signal-to-noise ratio and sensitivity to human feedback.
LatLongTT assigns geolocation information (although it technically does
not learn at present). OE and PRA I assume you already know about.
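
To give one concrete (if cartoonish) picture of what a CMC-style
classifier does, here is a small sketch using scikit-learn's
LogisticRegression over a few made-up morphological features; NELL's
real feature set, training data, and thresholds are of course
different.

    from sklearn.feature_extraction import DictVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    def morphological_features(noun_phrase):
        # Shallow, surface-level cues only: no context, just the phrase.
        words = noun_phrase.split()
        return {
            "last_word_suffix": words[-1][-3:].lower(),
            "first_char_upper": noun_phrase[0].isupper(),
            "num_words": len(words),
            "contains_digit": any(ch.isdigit() for ch in noun_phrase),
        }

    # Hypothetical (noun phrase, category) training pairs.
    train = [
        ("Pittsburgh", "city"), ("Boston", "city"),
        ("Carnegie Mellon University", "university"),
        ("Stanford University", "university"),
    ]

    model = make_pipeline(DictVectorizer(), LogisticRegression(max_iter=1000))
    model.fit([morphological_features(np) for np, _ in train],
              [cat for _, cat in train])

    print(model.predict([morphological_features("Cornell University")]))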

So, while CSEAL is effectively still being run, RL and NEIL are not
integrated on a per-iteration basis. RL was suspended when NELL gained
polysemy resolution, because that fundamentally changed the internal
structure of the KB, and we've never had the manpower to reimplement RL.
PRA effectively served as RL's replacement, even though the two algorithms
have complementary properties (meaning that it would be interesting to
have both and to examine their complementary strengths). NEIL, along with
a number of other learners not mentioned here, falls into the category of
things that have so far only ever been run manually, whether once or
several times, because full-scale integration has not yet been fully
explored.

As you can imagine, there is a lot more to say about all that, and this is
only on one topic. But I hope it sheds some light on the situation for
you, and do feel free to ask more about this or anything else.

bki...@cs.cmu.edu

Fernando Ortega

Apr 25, 2016, 11:37:04 AM
To: NELL: Never-Ending Language Learner
Wow! Thanks for your complete answer. Now I have a better idea of how the learners have been used and the reasons for the changes.

I'm going to continue studying this amazing system.

Thank you.