Call for Participation -
1M-Deepfakes Detection Challenge at ACM Multimedia 2024, Melbourne
Tremendous progress in generative AI has made the generation and manipulation of synthetic data easier and faster than ever, and many use cases benefit from it. The negative side of this progress and wide adoption is deepfakes: audio, images, or videos of individuals manipulated with generative methods without their permission, making them appear to say or do things they never did. These unethically manipulated videos, popularly known as deepfakes, have wide repercussions and negative effects on society through their potential for spreading disinformation and misinformation. Deepfakes are unfortunately also used for online trolling. Authentication systems such as video KYC (Know Your Customer) are not resilient either, as face recognition and verification systems are often deceived by high-quality deepfakes. It is therefore important for platforms and systems to be able to identify whether a piece of media has been manipulated. Systems that detect and analyse deepfakes are referred to as deepfake detectors.
The 1M-Deepfakes Detection Challenge comprises two sub-tasks:
a. Deepfake Detection – Given an audio-visual sample containing a single subject, the task is to identify whether the sample is a deepfake or real.
b. Deepfakes Temporal Localisation – Given an audio-visual sample containing a single subject, the task is to identify the frames (timestamps) in which the manipulation was performed. The assumption is that, from the perspective of spreading misinformation, editing a few vital parts of a video may be enough to change the meaning of the original video, and at the same time the quality of such a deepfake will be closer to the original than that of a deepfake in which the entire video is manipulated.
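To make the two sub-tasks concrete, the sketch below shows metrics commonly used for such tasks: ROC-AUC for video-level detection and temporal IoU for localisation. This is an illustration only, not the challenge's official evaluation protocol; the `(start, end)`-in-seconds segment format and the function names are assumptions.

```python
# Illustrative metrics for the two sub-tasks (not the official protocol).

def roc_auc(labels, scores):
    """ROC-AUC via the rank-sum (Mann-Whitney U) statistic.
    labels: 1 = deepfake, 0 = real; scores: higher = more likely fake."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    if not pos or not neg:
        raise ValueError("need at least one sample of each class")
    # Fraction of (fake, real) pairs ranked correctly; ties count half.
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def segment_iou(pred, gt):
    """Temporal IoU between two (start, end) segments in seconds
    (sub-task b: how well a predicted manipulated span matches ground truth)."""
    inter = max(0.0, min(pred[1], gt[1]) - max(pred[0], gt[0]))
    union = (pred[1] - pred[0]) + (gt[1] - gt[0]) - inter
    return inter / union if union > 0 else 0.0
```

For example, a detector that scores all fakes above all reals attains an AUC of 1.0, and a predicted span (1.0 s, 3.0 s) against a ground-truth span (2.0 s, 4.0 s) has an IoU of 1/3.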
Timeline
Training and Validation Data - available now
Test Data - mid-May
Paper submission deadline - June 14
Thanks,
Abhinav Dhall