Dear Colleagues,
Workshop Overview
For robots to safely interact with people and the real world, they must not only perceive their surroundings but also understand them in a semantically meaningful way (i.e., grasp the implications or pertinent properties of the objects in the scene). Advanced perception methods coupled with learning algorithms have made significant progress toward enabling such semantic understanding, and recent breakthroughs in foundation models have opened further opportunities for robots to reason contextually about their operating environments. Semantics is ingrained in every aspect of robotics, from perception to action; reliably exploiting semantic information in embodied systems requires tightly coupled perception, learning, and control algorithm design (e.g., a robot in a warehouse must recognize objects on the floor and reason about whether it is safe to run over them). By organizing this workshop, we hope to foster discussions on innovative approaches that harness semantic understanding for the design and deployment of intelligent embodied systems. We aim to facilitate an interdisciplinary exchange among researchers in robot learning, perception, mapping, and control to identify the opportunities and pressing challenges in incorporating semantics into robotic applications.