Priberam Machine Learning Lunch Seminars (T10) - 2 - "Robust Object Recognition Through Symbiotic Deep Learning In Mobile Robots", João Cartucho, (ISR)

Zita Marinho

Feb 11, 2019, 3:40:14 PM

Hello all,

Next week João Cartucho from ISR will present his work on "Robust Object Recognition Through Symbiotic Deep Learning in Mobile Robots" on Thursday, 21 February at 13:00 (room PA2, Pav. Matemática).

A reminder that this year the seminars will take place on the first and third Thursdays of each month.

If you are interested in presenting your work to the IST / general community, please let me know.

Please register for the seminar in the link below, where you can inform us of your food preference:

Please take into account that attendance is limited by the room capacity.

Best regards,

Zita Marinho

Priberam Labs

PRIBERAM SEMINARS -- Room PA2

Priberam Machine Learning Lunch Seminar
Speaker:  João Cartucho (ISR)
Venue: IST Alameda, Room PA2 (Pavilhão de Matemática)
Date: Thursday, February 21st, 2019
Time: 13:00
Lunch will be provided

Robust Object Recognition Through Symbiotic Deep Learning in Mobile Robots

Despite the recent success of state-of-the-art deep learning algorithms in object recognition, we observed that, when deployed as-is on a mobile service robot, they fail to recognize many objects in real human environments. In this paper, we introduce a learning algorithm in which the robot addresses this flaw by asking humans for help, an approach known as symbiotic autonomy. In particular, we bootstrap YOLOv2, a state-of-the-art deep neural network, and train a new neural network, which we call HHELP, using only data collected from human help. Using an RGB camera and an on-board tablet, the robot proactively seeks human input to assist it in labeling surrounding objects. We validated the proposed approach on two service robots: Pepper, located at CMU, and the Monarch Mbot, located at ISR-Lisbon. We conducted a study in a realistic domestic environment over the course of 20 days with 6 research participants. To improve object detection, we ran the two neural networks, YOLOv2 + HHELP, in parallel. Following this methodology, the robot detected twice as many objects as the initial YOLOv2 network alone, and achieved a higher mAP (mean Average Precision) score. Using the learning algorithm, the robot also collected data, by asking humans, about where an object was located and to whom it belonged. This enabled us to explore a future use case in which robots search for a specific person's object. We view the contribution of this work as relevant to service robots in general, beyond Pepper and Mbot.
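As an illustration of the "two networks in parallel" idea in the abstract, here is a minimal sketch (not the authors' code) of how detections from two object detectors such as YOLOv2 and HHELP could be combined: take the union of both outputs and, when both detectors report the same class with heavily overlapping boxes, keep the higher-confidence one. The detection tuple format, the IoU-based duplicate test, and the 0.5 threshold are all assumptions made for this example.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def merge_detections(dets_a, dets_b, iou_threshold=0.5):
    """Union of two detectors' outputs; overlapping same-class boxes
    are de-duplicated, keeping the higher-confidence detection.
    Each detection is (label, confidence, (x1, y1, x2, y2))."""
    merged = list(dets_a)
    for det in dets_b:
        label, conf, box = det
        duplicate = None
        for i, (m_label, m_conf, m_box) in enumerate(merged):
            if m_label == label and iou(box, m_box) >= iou_threshold:
                duplicate = i
                break
        if duplicate is None:
            merged.append(det)          # object only the second detector found
        elif conf > merged[duplicate][1]:
            merged[duplicate] = det     # same object, keep higher confidence
    return merged
```

With a merge step like this, a network trained only on human-labeled data can add detections the base network misses, without discarding anything the base network already finds.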
João Cartucho is a Master's alumnus of IST, based at ISR. His master's thesis was co-supervised by Manuela Veloso from CMU and Rodrigo Ventura from IST. He was a finalist for the Best Paper Award at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2018).



General info about our seminars:
Please forward this link to whoever might be interested in attending the seminars. 
