MMSys'21 Grand Challenge on Detecting Cheapfakes - Final Reminders

Cise Midoglu

Jul 14, 2021, 6:55:42 PM
to MMSys'21 Grand Challenge on Detecting Cheapfakes
Dear participants,

This is a kind reminder that the submission deadline for the MMSys'21 Grand Challenge on Detecting Cheapfakes is next Monday, 19 July, at 23:59 AoE. Please note that the submission deadline for all corresponding software components is the same as the manuscript submission deadline.


Challenge Details

We have compiled a challenge overview paper (https://arxiv.org/abs/2107.05297) and updated the challenge website (https://2021.acmmmsys.org/cheapfake_challenge.php) as well as the official GitHub repository of the challenge (https://github.com/acmmmsys/2021-grandchallenge-cheapfakes/) to provide you with more details. Please refer to Sections 2 and 3 of the overview paper for any clarifications you might need about the dataset and the task.


Q&A Session

We will hold a short Q&A session on Monday, 19 July at 10:00 GMT+2 for technical questions regarding submissions. You are welcome to join if you have any remaining questions. Please contact me privately at ci...@simula.no for the online meeting details if you are interested.


Multiple Submissions

Please note that if you have multiple proposals for addressing the challenge, i.e., more than one proposed model, you should submit each of them as a separate manuscript (one paper = one idea). Each manuscript fulfilling the submission requirements below will be considered a separate entry in the competition. If you would like to register an additional submission, please contact me privately at ci...@simula.no.


Submission Requirements

You can refer to Section 5.2 of the overview paper and the "Submission Guidelines" section on the challenge website for the full list of requirements.

Please remember that each submission should include explicit references to the source code and the Docker image.
  1. You can host your source code anywhere you like, as long as you provide a public link in your manuscript that is valid as of the submission deadline. Note that GitHub is preferred.
  2. For the Docker image, you can either host it as a publicly accessible file under (1), or push it to a cloud service such as Docker Hub. Note that Docker Hub is preferred.
You can find sample Docker commands in the official GitHub repository.
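
For illustration, pushing your image to Docker Hub typically amounts to a sequence like the following (the image name and account are placeholders, not challenge-specific values; the authoritative commands are the ones in the repository):

  # build the image from your Dockerfile, then tag and push it to Docker Hub
  docker build -t cheapfakes-submission .
  docker tag cheapfakes-submission your-dockerhub-username/cheapfakes-submission:latest
  docker login
  docker push your-dockerhub-username/cheapfakes-submission:latest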


Evaluation

Each manuscript should include preliminary results, i.e., the values of the evaluation metrics obtained by running the proposed model on the public test split, as exemplified in Section 4 of the overview paper (Table 2). You can use any hardware of your choice, but please make sure to report all hardware details alongside the results. This is especially important for putting the Latency metric into context.
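
If you want a simple wall-clock figure to accompany your hardware description, one possible sketch is below (the image name is a placeholder, the actual invocation is the one from the repository README, and the precise definition of the Latency metric is the one in the overview paper):

  # example only: wall-clock time for a full run over the public test split
  time docker run --rm your-dockerhub-username/cheapfakes-submission:latest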

During the evaluation, your Docker image will be run using the commands provided in the official GitHub repository README, on at least two different machines:
- A machine with an NVIDIA GeForce RTX 2080 Ti GPU
- A CPU-only machine, without a GPU (2 or more cores)
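
For illustration, the two configurations roughly correspond to the following invocations (the image name is a placeholder, and the GPU machine is assumed to have the NVIDIA Container Toolkit installed):

  # GPU machine: expose the GPU to the container
  docker run --rm --gpus all your-dockerhub-username/cheapfakes-submission:latest
  # CPU-only machine: same image, no GPU flag
  docker run --rm your-dockerhub-username/cheapfakes-submission:latest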


Best regards,
--
Cise Midoglu
Simula Research Laboratory
https://www.simula.no