Participate in the CrowdBias Challenge, organized as part of the CSCW 2021 workshop "Investigating and Mitigating Biases in Crowdsourced Data". Register here to take part in teams of up to 4 members.
The goal of the task is to collect correct, unbiased labels for an annotation task in which each item to be annotated has a true answer known to the organizers. Teams are free to pursue whatever research designs they see fit, such as interface design, annotation workflows, assignment methods, and aggregation approaches. Teams will be evaluated on a combination of accuracy and fairness metrics. Successful teams will also be invited to present their approach during the workshop.
Participants may use any crowdsourcing platform of their choice to collect human annotations. For teams interested in using Amazon SageMaker Ground Truth (GT) or Amazon Augmented AI (A2I) to collect human annotations, Amazon will provide $250 in AWS credits to the first 15 teams who commit to participating in the challenge and to using GT as part of their participation. Note that AWS credits cannot be used directly with Amazon Mechanical Turk.
Participants are also invited to attend our workshop at CSCW 2021; however, workshop attendance is not required to take part in the challenge.
Take part in the CrowdBias Challenge through Kaggle - https://www.kaggle.com/c/crowd-bias-challenge
Submission Deadline - 15 October 2021
Please contact the workshop organizers with any questions.