CSCW 21 Workshop - Investigating and Mitigating Biases in Crowdsourced Data


Matt Lease

Aug 23, 2021, 2:41:55 PM
to Crowdsourcing and Human Computation

The workshop will be held at ACM CSCW 2021, virtually on the 23rd of October 2021 from 3 PM to 8 PM EDT.

The workshop will explore how specific crowdsourcing workflows, worker attributes, and work practices contribute to biases in data. We also plan to discuss research directions to mitigate labelling biases, particularly in a crowdsourced context, and the implications of such methods for the workers.

We invite participants to take part in the workshop challenge and/or submit a position paper.
  • Submit a short position paper by 10 September 2021
  • Register for the Crowd Bias Challenge by 23 September 2021
Workshop Themes

  • Understanding how annotator attributes contribute to biases

Research on crowd work has often focused on task accuracy, whereas other factors such as biases in data have received limited attention. We are interested in reviewing existing approaches and discussing ongoing work that helps us better understand which annotator attributes contribute to biases.

  • Quantifying bias in annotated data 

An important step towards bias mitigation is detecting such biases and measuring their extent in data. We seek to discuss different methods, metrics, and challenges in quantifying biases, particularly in crowdsourced data. Further, we are interested in ways of comparing biases across different samples and in investigating whether specific biases are task-specific or task-independent.

  • Novel approaches to mitigate crowd bias 

We plan to explore novel methods that specifically aim to reduce biases in crowd annotation. Current approaches range from worker pre-selection and improved task presentation to dynamic task assignment. We seek to discuss the shortcomings and limitations of existing and ongoing approaches and to ideate future directions.

  • Impact on crowd workers 

We want to explore how bias identification and mitigation strategies can impact the workers themselves, positively or negatively. For example, workers in certain groups may face increased competition and reduced task availability, and collecting worker attributes or profiling workers could raise ethical concerns.



More details at https://sites.google.com/view/biases-in-crowdsourced-data

--
Matt Lease
Associate Professor
School of Information 
University of Texas at Austin
Voice: (512) 471-9350 · Fax: (512) 471-3971 · Office: UTA 5.536
http://www.ischool.utexas.edu/~ml