EXTENDED DEADLINE: Workshop on Movement Analytics and Gesture Recognition for Human-Machine Collaboration in Industry 4.0


Sotiris Manitsaris

Jun 21, 2019, 8:26:29 AM
to gesturechallenge

The Workshop on Movement Analytics and Gesture Recognition for Human-Machine Collaboration in Industry 4.0 will be hosted by the 12th International Conference on Computer Vision Systems (ICVS 2019), which will be held in Thessaloniki, Greece, on 23-25 September 2019.

Context and general objectives

A collaborative robot is an autonomous machine that can share a workspace with a worker without physical barriers, in compliance with health and safety standards. Collaborative robotics has created the conditions for designing Human-Robot Collaboration (HRC) that combines human intelligence with the power of the robot, following a simple criterion: the complementarity of skills. Nevertheless, in industry, "we always start with manual work", as the Executive Vice-President of Toyota put it. Today, even though significant progress has been made in training robots by demonstration, the simple automation of tasks within mixed workspaces remains a priority. But mixed workspaces are not necessarily collaborative. For example, if a robot is able to anticipate the professional gestures of the worker, it can dynamically adapt its motion in space and time, allowing the worker to adopt more ergonomic "green" postures.

Computer Vision Systems, together with recent progress in Deep/Machine Learning, open a broad potential for innovation by re-thinking collaborative robots as real partners. The robot must be able not only to detect human presence (e.g. a worker or maintenance engineer), but also to recognise and predict the specific actions and/or gestures the worker performs (e.g. screwing, assembling, etc.). To achieve this goal, human pose estimation, object detection and scene understanding in general are beneficial for augmenting the perception level of the robot. Beyond humanoid robots, Automated Guided Vehicles in factories should also be able to detect human intentions (e.g. stop when a human is about to cross the motion trajectory, detect and identify collaborative workspaces, etc.) as well as understand human commands (e.g. whether or not to load a pallet, return to the starting point, etc.).


This workshop will focus on the most recent advances in pose estimation, gesture recognition and movement analytics for Human-Machine Collaboration in Industry 4.0. It aims to bring together researchers from different disciplines, such as robotics, computer vision, data analysis, intelligent systems, ergonomics and intelligent vehicles, to share their experiences on these aspects and on how they can benefit Human-Machine Collaboration.

Papers are solicited on all areas, including but not limited to the following research topics:

  • Deep learning for pose estimation
  • Human modelling
  • Professional gesture recognition
  • Scene understanding for smart workspaces
  • Vision-based automatic ergonomic assessments
  • Extraction and visualisation of movement analytics
  • Vision-based gestural interaction with automated guided vehicles or drones
  • Human-robot rhythmic interaction
  • Internet of things and computer vision in industry 4.0
  • Contactless robot learning through gestures
  • Human style learning for robotics
  • Benchmarks, methods and datasets for professional gestures
  • Gestures and bio-inspired systems and robots
  • Machine learning for human data
  • Augmented capabilities for workers and machines


Prof. Patrick Hénaff, LORIA UMR 7503, Université de Lorraine – CNRS – INRIA

'Vision based Human-Robot motor coordination using adaptive central pattern generators'

The talk concerns the design of adaptive neural controllers for humanoid robots that enable them to learn to interact appropriately with humans. These controllers are inspired by biological structures located in the spinal cord, called central pattern generators (CPGs), which are dedicated to the genesis of rhythmic movements. The CPGs, and the plasticity mechanisms they incorporate, allow interlimb motor coordination and interpersonal synchronization in human interactions.
The main difficulty of this approach to controlling humanoids concerns the model of the CPGs, which must behave like chaotic oscillators in order to synchronize with an external signal, and the determination of efficient proprioceptive and/or exteroceptive feedback to create this signal. The second difficulty concerns the plasticity mechanisms that can be incorporated into a CPG, allowing the robot to learn motor coordination when it interacts with humans, particularly through rhythmic tasks.
The presentation will focus on these issues. We will show how to use optic flow and plastic CPGs to enable humanoid robots to learn motor coordination with a human partner performing various rhythmic movements, consequently triggering the emergence of synchrony.
Several videos of simulations and experiments will illustrate the presentation, followed by conclusions and perspectives.

Keywords: Humanoid robotics, neural control, Central Pattern Generator, sensorimotor coordination, synchronization, motor coordination.
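The rhythmic-synchronization mechanism the abstract describes can be sketched with a minimal model. The snippet below is an illustrative assumption rather than the controller presented in the talk: a single phase oscillator (a common CPG abstraction) with Kuramoto-style coupling to an external rhythm and a slow frequency-adaptation rule standing in for plasticity. The function name, gains and frequencies are all hypothetical.

```python
import math

def adaptive_phase_cpg(omega0, omega_ext, K=15.0, eta=10.0, T=60.0, dt=1e-3):
    """Euler-integrate one phase-oscillator CPG coupled to an external rhythm.

    phi   : oscillator phase
    omega : intrinsic frequency, slowly adapted toward the external frequency
    K     : phase-coupling gain (entrainment), eta : adaptation gain (plasticity)
    """
    phi, omega = 0.0, omega0
    psi = 0.0  # phase of the external rhythmic signal (e.g. a partner's movement)
    for _ in range(int(T / dt)):
        delta = psi - phi                           # phase error to the partner
        phi += (omega + K * math.sin(delta)) * dt   # phase coupling: entrainment
        omega += eta * math.sin(delta) * dt         # plasticity: frequency adaptation
        psi += omega_ext * dt
    return omega

# Oscillator starts at 20 rad/s; the external "partner" rhythm is 30 rad/s.
learned = adaptive_phase_cpg(omega0=20.0, omega_ext=30.0)
```

While phase-locked, the adaptation term drives the intrinsic frequency toward the external one, so the oscillator ends up oscillating at the partner's rhythm even after the coupling error has vanished, which is the basic sense in which a plastic CPG "learns" a rhythmic interaction.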

Important dates

EXTENDED: Paper Submission: 30 June 2019
EXTENDED: Notification of acceptance: 10 July 2019
Camera-Ready papers: 15 July 2019
Conference: 23-25 September 2019


Please email your submission before 30th June 2019 23:59 CET following the instructions below:

  • Submission to: in...@aimove.eu
  • Subject: “Movement analytics and gesture recognition for Human-Machine Collaboration in Industry 4.0”
  • Include authors' names, affiliations and contact information
  • Attached file: PDF
  • Additional links or illustrations are welcome.

At least one author of an accepted submission is required to attend the workshop and must register for the main ICVS conference. Accepted papers will be published in the adjunct conference proceedings. 

Workshop Organizers

  • Sotiris Manitsaris, Senior Researcher, S/T Project Leader, Centre for Robotics, MINES ParisTech, PSL Université Paris
  • Alina Glushkova, Postdoctoral Fellow, Centre for Robotics, MINES ParisTech, PSL Université Paris
  • Dimitrios Menychtas, Postdoctoral Fellow, Centre for Robotics, MINES ParisTech, PSL Université Paris


We acknowledge support from the CoLLaboratE project (H2020-FoF, grant agreement No. 820767), which is funded by the EU’s Horizon 2020 Research and Innovation Programme.

Cordialement | Regards | Με εκτίμηση,
Dr. Sotiris Manitsaris

Senior Researcher | Research Project Leader
Centre for Robotics | MINES ParisTech | PSL Université Paris
A: 60, boulevard Saint Michel | 75272 Paris cedex 06 | France
T: +33 01 40 51 91 69 | M : sotiris.m...@mines-paristech.fr
W: sotirismanitsaris.eu | LinkedIn Page

Director of the Post-Master AIMove
"AI & MOVΕment in industries and creation"