Next EOTM


Grant Ingersoll

Aug 7, 2009, 9:22:50 PM
to Natural Language Processing Virtual Reading Group
Shall we start looking into the next paper? People can of course
still ask questions on the current one, but it seems like discussion has
died down a bit on the first one. I forget who volunteered to be EOTM
for this month.

-Grant

Scott Frye

Aug 8, 2009, 7:37:21 AM
to Natural Language Processing Virtual Reading Group
Sure. I offered to be the next EOTM.

I've been looking at a mix of more POS and Parsing papers so the group
can decide if they want to dig deeper into this topic or switch to
parsing.

I'll put a list of papers to select from up by Monday morning.

Scott Frye

Sep 11, 2009, 3:21:57 PM
to Natural Language Processing Virtual Reading Group
We should probably start thinking about who the next EOTM should be so
they have time to get papers ready. Looking back in the discussions,
I think Paul Kalmar offered to be next, and after him Joan.

A few points:
-> Time frames
During the last rotation I used the following schedule:
- Post papers to vote on for 2 weeks
- Read and discuss for 2 weeks
However, a number of people felt that the voting period should be cut
down to 3 days.

-> Paper complexity
There was some discussion again as to whether we should be looking at
foundational or current papers.

My feeling is that each EOTM should decide on these as their turn
comes up.

-Scott Frye

Paul Kalmar

Sep 11, 2009, 5:05:47 PM
to Scott Frye, Natural Language Processing Virtual Reading Group
Sounds great.  I'll be the next EOTM.  Any preference for general topic, or should I choose that?

--Paul

Scott Frye

Oct 26, 2009, 8:59:01 AM
to Natural Language Processing Virtual Reading Group
Anyone up for being the next EOTM? I think Joan offered once.
-Scott Frye

Tanton Gibbs

Oct 29, 2009, 3:05:55 PM
to Scott Frye, Natural Language Processing Virtual Reading Group
Did we find someone for this? I would volunteer but I know next to
nothing about NLP.

Scott Frye

Oct 29, 2009, 4:01:15 PM
to Natural Language Processing Virtual Reading Group
NO EXPERIENCE REQUIRED!!! :)

To me it's all about learning new things we haven't seen before anyway.

This is the general format we agreed on in the past (from the Reading
Group Format thread):

1) Editor of the Month (EOTM) will spend 2-3 days selecting 2-5 papers
for consideration and post a brief description of each.
2) Everyone votes on papers for 3 days.
3) Everyone reads papers for 2 weeks.
4) At end of 2 weeks EOTM posts 5-10 seed questions.
5) Next EOTM starts process at #1 again.

There is more info in the READING GROUP FORMAT thread, and various
papers have been recommended in that and other threads that you
might want to recycle.

If no one else volunteers by Monday, I'll take another stab at it, but
I think it is better if we get a wider variety of EOTMs contributing.
(IMHO)

-Scott Frye

Tanton Gibbs

Nov 6, 2009, 11:04:28 AM
to Natural Language Processing Virtual Reading Group
I had one request to look at papers about CFG/LFG/DG, so I picked two
papers around that. In addition, I chose two papers on the practical
use of co-occurrences. Please let me know if there are additional
papers we should consider in either of these areas.

Dan Klein and Christopher D. Manning. A Generative Constituent-Context
Model for Improved Grammar Induction.
2002 - 39 citations
http://nlp.stanford.edu/~manning/papers/KleinManningACL2002.pdf

Structural Disambiguation With Constraint Propagation
1990 - 85 citations
http://www.aclweb.org/anthology-new/P/P90/P90-1005.pdf

Corpus-Based Stemming using co-occurrence of word variants
1998 - 58 citations
http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.35.604

Similarity-Based Models of Word Cooccurrence Probabilities
1999 - 52 citations
http://springerlink.metapress.com/content/t1t876515pqg5457/fulltext.pdfP

Please vote for your favorite over the next week or suggest alternatives.

Thanks!
Tanton

Scott Frye

Nov 6, 2009, 2:08:03 PM
to Natural Language Processing Virtual Reading Group
The link to the last one seems to be broken, but a Google search
turned it up here:

Similarity-Based Models of Word Cooccurrence Probabilities
1999 - 52 citations
http://www.cis.upenn.edu/~pereira/papers/sim-mlj.pdf



Tanton Gibbs

Nov 6, 2009, 2:28:17 PM
to Natural Language Processing Virtual Reading Group
Thanks Scott,

If you take the P off the end of the original link, it works; that was a typo on my part.
http://springerlink.metapress.com/content/t1t876515pqg5457/fulltext.pdf

Jason Adams

Nov 10, 2009, 8:20:33 AM
to Natural Language Processing Virtual Reading Group
I vote for the Klein & Manning grammar induction one.

-- Jason


--

You received this message because you are subscribed to the Google Groups "Natural Language Processing Virtual Reading Group" group.
To post to this group, send email to nlp-r...@googlegroups.com.
To unsubscribe from this group, send email to nlp-reading...@googlegroups.com.
For more options, visit this group at http://groups.google.com/group/nlp-reading?hl=en.



Elmer Garduno

Nov 10, 2009, 11:29:55 AM
to Natural Language Processing Virtual Reading Group
Similarity-Based Models of Word Cooccurrence Probabilities +1

Scott Frye

Nov 11, 2009, 8:45:07 AM
to Natural Language Processing Virtual Reading Group
+1 for
Dan Klein and Christopher D. Manning. A Generative Constituent-Context
Model for Improved Grammar Induction.
2002 - 39 citations
http://nlp.stanford.edu/~manning/papers/KleinManningACL2002.pdf


Structural Disambiguation With Constraint Propagation



Christoph

Nov 11, 2009, 9:47:54 AM
to Natural Language Processing Virtual Reading Group
+1 for Klein and Manning

Joan

Nov 11, 2009, 3:34:25 PM
to Natural Language Processing Virtual Reading Group
Similarity-Based Models of Word Cooccurrence Probabilities +1


Ronald Hobbs

Nov 18, 2009, 9:58:41 AM
to Natural Language Processing Virtual Reading Group
+1 for Klein & Manning.


Tanton Gibbs

Nov 18, 2009, 11:06:17 AM
to Natural Language Processing Virtual Reading Group
That gives a two-vote majority to Klein and Manning, so that will be
our paper for this month.

http://nlp.stanford.edu/~manning/papers/KleinManningACL2002.pdf

>> Dan Klein and Christopher D. Manning. A Generative Constituent-Context
>> Model for Improved Grammar Induction.
>> 2002 - 39 citations

Thanks,
Tanton