[COMPARE] Ecosystem Digest, June 29 - July 12, 2025
COMPARE Ecosystem
Jul 7, 2025, 9:11:39 AM
to compare-...@googlegroups.com
Here’s what you may have missed last week and what’s coming up soon!
🤖 We're building a new repository for Motion Planning Datasets!
It’s become apparent that we should start organizing a repository of datasets for benchmarking motion planning. For example, MotionBenchMaker includes 40 different datasets for manipulation. Have suggestions on datasets or tools we should include? Use this Google Form to let us know! https://forms.gle/LHrtmDpm82X4qrDk6
🤖 Recent discussions in the COMPARE Slack!
#benchmarking_datasets: Looking for insight regarding synthetic datasets for training: [1] what specific robotics problems suffer from a lack of good vision training data? [2] what new training datasets do you think would be most useful to the robotics community in general? [3] what new training datasets would be useful for your specific research? [link]
AgiBot World: A full-stack, large-scale robot learning platform curated to advance bimanual manipulation in scalable and intelligent embodied systems. It is accompanied by foundation models, benchmarks, and an ecosystem that democratizes access to high-quality robot data for academia and industry, paving the way toward the "ImageNet Moment" for Embodied AI.
RoboManipBaselines: Software that integrates various imitation learning methods and benchmark task environments to provide baselines for robot manipulation.
Many more additions are on their way as we continue to digest the material presented at ICRA, CVPR, and RSS. Come across any new open-source products or benchmarking assets that should be added? Use this Google Form to let us know! https://forms.gle/LHrtmDpm82X4qrDk6
RObotic MAnipulation of Deformable Objects (ROMADO 2025): October 20, 2025 at IROS 2025 in Hangzhou, China
From folding clothes to manipulating food or surgical tissues, many objects robots need to interact with are deformable. The ROMADO 2025 workshop will bring together leading researchers to address challenges in modeling, perceiving, and manipulating deformable objects in the real world. The focus is on enabling robots to handle non-rigid materials through an interplay of perception, learning, and control, pushing toward intelligent robotic assistance in everyday and assistive settings.
Contact and Impact-Aware Manipulation (CIM25): October 24, 2025 at IROS 2025 in Hangzhou, China
This workshop marks the first gathering of experts and researchers to discuss the topic of Contact and Impact-aware Manipulation (CIM). The proposed half-day event will explore recent advances in robotic manipulation strategies for contact-rich environments and review the outcomes of the special issue on impact-aware robotics published in the IEEE Transactions on Robotics. The workshop will also feature presentations on advanced tactile sensors and their application to contact-aware operations on humanoid robots, showcased by a leading robotics company. By bringing together academic researchers and industrial engineers under one roof, the workshop promotes deeper integration between haptic perception and robotic manipulation and fosters innovative research on CIM.
AgiBot World Challenge: October 19, 2025 at IROS 2025 in Hangzhou, China
Most existing robot learning benchmarks fall short when it comes to addressing real-world challenges, particularly those arising from low-quality data and limited sensing capabilities. These benchmarks often focus on short-horizon tasks in controlled environments. In contrast, AgiBot World is the first large-scale multi-agent robotic dataset designed to advance multi-purpose humanoid robots. In this challenge, participants work either with data from the real-world domain or with data collected interactively in a simulator. In both settings, participants must demonstrate an accurate understanding of their surroundings and make informed decisions based on the given context.