CFP: SIGIR 2010 Workshop on Crowdsourcing for Search Evaluation

Matt Lease

May 20, 2010, 1:05:13 PM
to NAACL 2010 Mechanical Turk Workshop
Another upcoming venue for publishing AMT research, with the added
incentive of Most Innovative Awards provided by Microsoft Bing.

SIGIR 2010 Workshop on Crowdsourcing for Search Evaluation
Workshop site: http://www.ischool.utexas.edu/~cse2010

================
The SIGIR 2010 Workshop on Crowdsourcing for Search Evaluation (CSE2010)
solicits submissions on topics including, but not limited to, the
following areas:

* Novel applications of crowdsourcing for evaluating search systems
(see examples below)

* Novel theoretical, experimental, and/or methodological
developments advancing state-of-the-art knowledge of crowdsourcing for
search evaluation

* Tutorials on how the different forms of crowdsourcing might be
best suited to or best executed in evaluating different search tasks

* New software packages which simplify or otherwise improve general
support for crowdsourcing, or particular support for crowdsourced
search evaluation

* Reflective or forward-looking visions on the use of crowdsourcing
in search evaluation, as informed by prior and/or ongoing studies

* How crowdsourcing technology or processes can be adapted to
encourage and facilitate greater participation from outside the USA

The workshop especially calls for innovative solutions in the area of
search evaluation involving significant use of a crowdsourcing platform
such as Amazon's Mechanical Turk, Crowdflower, LiveWork, etc. Novel
applications of crowdsourcing are of particular interest. This includes
but is not restricted to the following tasks:

* cross-vertical search (video, image, blog, etc.) evaluation

* local search evaluation

* mobile search evaluation

* realtime/news search evaluation

* entity search evaluation

* discovering representative groups of rare queries, documents, and
events in the long-tail of search

* detecting/evaluating query alterations

For example, does the inherent geographic dispersal of crowdsourcing
enable better assessment of a query's local intent, its local-specific
facets, or the diversity of returned results? Could crowdsourcing be
employed in near real-time to better assess query intent for breaking
news and relevant information?
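
As a concrete (if simplified) illustration of the kind of pipeline such
work typically involves, here is a minimal sketch in Python. All data,
the query, and the ranking below are hypothetical, and majority voting
is just one simple aggregation baseline: redundant worker votes are
collapsed into relevance judgments, which then score a system ranking
with NDCG.

    from math import log2

    # Hypothetical redundant worker votes, keyed by (query, doc):
    # binary relevance labels as they might come back from a platform
    # like Mechanical Turk with three assignments per judging task.
    judgments = {
        ("jaguar", "d1"): [1, 1, 0],
        ("jaguar", "d2"): [0, 0, 1],
        ("jaguar", "d3"): [1, 1, 1],
    }

    def majority(votes):
        """Aggregate redundant binary labels; ties fall to non-relevant."""
        return 1 if 2 * sum(votes) > len(votes) else 0

    def dcg(gains):
        """Discounted cumulative gain of a list of per-rank gains."""
        return sum(g / log2(rank + 2) for rank, g in enumerate(gains))

    def ndcg(ranking, qrels, k):
        """NDCG@k of a ranked doc list against aggregated judgments."""
        gains = [qrels.get(doc, 0) for doc in ranking[:k]]
        ideal = dcg(sorted(qrels.values(), reverse=True)[:k])
        return dcg(gains) / ideal if ideal > 0 else 0.0

    # Build qrels for one query from the crowd votes, then score a
    # hypothetical system ranking for that query.
    qrels = {d: majority(v) for (q, d), v in judgments.items() if q == "jaguar"}
    print("NDCG@3 = %.3f" % ndcg(["d3", "d2", "d1"], qrels, k=3))

Submissions in this space might, for instance, replace the majority
vote with worker-quality modeling or calibration against expert judges.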

Most Innovative Awards --- Sponsored by Microsoft Bing

As further incentive to participation, authors of the most novel and
innovative crowdsourcing-based search evaluation techniques (e.g. using
Amazon's Mechanical Turk, LiveWork, Crowdflower, etc.) will be
recognized with "Most Innovative Awards" as judged by the workshop
organizers. Selection will be based on the creativity, originality, and
potential impact of the described proposal, and we expect the winners
to describe risky, ground-breaking, and unexpected ideas. The provision
of awards is thanks to generous support from Microsoft Bing, and the
number and nature of the awards will depend on the quality of the
submissions and the overall availability of funds. All valid
submissions to the workshop will be considered for the awards.

Submission Instructions

Submissions should report new (unpublished) research results or ongoing
research. Long paper submissions (up to 8 pages) will primarily target
oral presentations. Short paper submissions can be up to 4 pages long
and will primarily target poster presentations. Papers should be
formatted in double-column ACM SIG proceedings format
(http://www.acm.org/sigs/publications/proceedings-templates). Papers
must be submitted as PDF files. Submissions should not be anonymized.

Important Dates

Submissions due: June 7, 2010 (11:59PM US Eastern Standard Time)
Notification of acceptance: June 21, 2010
Camera-ready submission: June 28, 2010
Workshop date: July 23, 2010

Questions?

Email the organizers at cse...@ischool.utexas.edu

----
Organizers

Vitor Carvalho, Microsoft
Matthew Lease, University of Texas at Austin
Emine Yilmaz, Microsoft

Program Committee

Eugene Agichtein, Emory University
Ben Carterette, University of Delaware
Charlie Clarke, University of Waterloo
Gareth Jones, Dublin City University
Michael Kaisser, University of Edinburgh
Jaap Kamps, University of Amsterdam
Gabriella Kazai, Microsoft Research
Mounia Lalmas, University of Glasgow
Winter Mason, Yahoo! Research
Don Metzler, University of Southern California
Stefano Mizzaro, University of Udine
Gheorghe Muresan, Microsoft Bing
Iadh Ounis, University of Glasgow
Mark Sanderson, University of Sheffield
Mark Smucker, University of Waterloo
Siddharth Suri, Yahoo! Research
Fang Xu, Saarland University

--
Matt Lease
Assistant Professor
School of Information and Department of Computer Science
University of Texas at Austin
Voice: (512) 471-9350 · Fax: (512) 471-3971 · Office: UTA 5.450
http://www.ischool.utexas.edu/~ml