Voluntary Evaluation / Annotation Cycles for June 2018


Laura Dietz

Jun 5, 2018, 10:22:14 AM
to trec...@googlegroups.com
Here is your monthly reminder. You can submit runs to me until June 20.
(please send as web links, not attachments)

Best,
Laura

----
Dear TREC CAR participants,

We are hard at work and will (hopefully soon) release the new training
and test topics for the TREC CAR submission in summer.

In the meantime, I recommend training and tuning your systems on the
benchmarkY1train collection, available on the TREC CAR website [1].

You may notice that we switched from v1.5 (used last year) to v2.0 a
while back. We still use the Wikipedia dump from December 2016, but
you may notice that the paragraphCorpus contains significantly more
paragraphs. Because we fixed several issues in our Wikipedia parsing
pipeline, the paragraph ids have changed. We will release a mapping
between the old and new paragraph ids, along with translated qrels, in
the coming days.


Last year's evaluation revealed that the automatic qrels files are as
good as manual annotations at distinguishing what works and what
doesn't. Nevertheless, I agree that having more manual annotations is
probably a good thing, especially for the entity task.


To facilitate this, I offer voluntary evaluation and annotation cycles.
But since I don't have any annotators, you will need to help out. How
does this work?

1) Every 20th of a month (starting April 20th), I accept passage and
entity run files for the benchmarkY1test collection.

2) I will use submitted runs to seed the annotation system, and provide
access credentials to everyone who submitted runs.

3) Then it is your turn to assess evaluation topics in the system -- the
more runs you submit, the more topics I expect you to assess.

4) Every 30th of a month, I will send back collected annotations
(provided you have completed your annotation duties).


Be aware that by sharing runs with me, other participants may reverse
engineer your ranking function. Yes, everything is out in the open. I
can't prevent teams from purposefully mis-annotating, so you will have
to conduct your own inter-annotator agreement and cleaning. Assessments
will include annotator ids.
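Since the assessments carry annotator ids, agreement can be checked with a standard chance-corrected statistic. An illustrative sketch (not part of any official tooling) computing Cohen's kappa over the items two annotators both judged:

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two parallel lists of categorical labels,
    e.g. the relevance grades two annotators gave the same items."""
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    # Observed agreement: fraction of items labeled identically.
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected agreement under chance, from each annotator's label marginals.
    counts_a = Counter(labels_a)
    counts_b = Counter(labels_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    if expected == 1.0:  # both annotators constant and identical
        return 1.0
    return (observed - expected) / (1.0 - expected)
```

Pairing annotators on shared (topic, paragraph) items and flagging pairs with low kappa is one simple way to spot careless or adversarial annotation before cleaning.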

Note that this is a voluntary service on my part. Participation in
this voluntary cycle is independent of your participation at TREC,
i.e., you can participate in either, both, or none.


When you send me runs, please do **not** send them as email
attachments; send a download link instead (e.g., a web server,
Dropbox, or Google Drive).

Best,

Laura



[1] http://trec-car.cs.unh.edu



