[CfP] CoRL 2024 Locolearn Workshop - Call for paper and demos


Chien Erh Lin

Sep 11, 2024, 3:53:52 PM
to Women in Machine Learning
Dear Women in ML,

We are happy to announce that we are organizing the workshop "LocoLearn: From Bioinspired Gait Generation to Active Perception" at CoRL 2024 in Munich. The workshop will take place on Saturday, November 9th, 2024.
 
We will have an exciting group of speakers discussing different aspects of locomotion learning, including new challenges in perception, bioinspiration, and navigation. The full list and schedule are available on our website: https://www.locolearn.robot-learning.net/

 
We have also opened the Call for Papers (CfP): we will be accepting short papers (4 pages; submitted or ongoing work) as well as demo proposals (1 page).
You can find all the details on our CfP page: https://www.locolearn.robot-learning.net/call-for-papers
Submissions are handled via OpenReview: https://openreview.net/group?id=robot-learning.org/CoRL/2024/Workshop/Locolearn

In this workshop we aim to answer the following research questions:
  • Can we go beyond imitating animal locomotion and use other biological insights, such as the action-perception loop, to develop better locomotion learning frameworks?
  • Can multimodal active perception improve the robot's agility, learning performance, or robustness?
  • How important is contact sensing for locomotion? Should we exploit contacts rather than avoid them?
  • How important is it to perceive terrain properties during locomotion? Can we adapt locomotion to deal with different terrains? How can we simulate terrain?
  • Can we learn directly on real platforms? Do we need safety techniques to learn in the real world?
  • How can we learn to switch between different gaits using perception, e.g., from walking in the mud to swimming?
  • How can we exploit these complex locomotion skills and advanced perception to solve long-term or high-level navigation tasks?
  • How can we leverage foundation models to improve multimodal and active perception for locomotion? Are foundation models an answer to all the questions above?

This workshop is organized by Davide Tateo (TU Darmstadt), Matias Mattamala (University of Oxford), Piotr Kicki (IDEAS), Lu Gan (Georgia Tech), and Amir Patel (UCL). We also acknowledge the support of Calogero Maria Oddo, Auke J. Ijspeert, Jan Peters, and Krzysztof Walas.

================
Important dates
----------------------------

Paper submission deadline:  15 October 2024
Author notification:  1 November 2024
Camera-ready version:  6 November 2024
Workshop:  9 November 2024

================
Useful links
----------------------------

Workshop website: https://www.locolearn.robot-learning.net/
Submission site: https://openreview.net/group?id=robot-learning.org/CoRL/2024/Workshop/Locolearn
CoRL Conference: https://www.corl.org/

Kind regards,

The organizing committee