TY - GEN
T1 - Finding Things in the Unknown: Semantic Object-Centric Exploration with an MAV
T2 - 2023 IEEE International Conference on Robotics and Automation, ICRA 2023
AU - Papatheodorou, Sotiris
AU - Funk, Nils
AU - Tzoumanikas, Dimos
AU - Choi, Christopher
AU - Xu, Binbin
AU - Leutenegger, Stefan
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - Exploration of unknown space with an autonomous mobile robot is a well-studied problem. In this work we broaden the scope of exploration, moving beyond the pure geometric goal of uncovering as much free space as possible. We believe that for many practical applications, exploration should be contextualised with semantic and object-level understanding of the environment for task-specific exploration. Here, we study the task of both finding specific objects in unknown space and reconstructing them to a target level of detail. We therefore extend our environment reconstruction to consist not only of a background map, but also of object-level and semantically fused submaps. Importantly, we adapt our previous objective function of uncovering as much free space as possible in as little time as possible with two additional elements: first, we require a maximum observation distance of background surfaces to ensure target objects are not missed by image-based detectors because they are too small to be detected. Second, we require an even smaller maximum distance to the found objects in order to reconstruct them with the desired accuracy. We further created a Micro Aerial Vehicle (MAV) semantic exploration simulator based on Habitat in order to quantitatively demonstrate how our framework can be used to efficiently find specific objects as part of exploration. Finally, we showcase that this capability can be deployed in real-world scenes using our drone equipped with an Intel RealSense D455 RGB-D camera.
AB - Exploration of unknown space with an autonomous mobile robot is a well-studied problem. In this work we broaden the scope of exploration, moving beyond the pure geometric goal of uncovering as much free space as possible. We believe that for many practical applications, exploration should be contextualised with semantic and object-level understanding of the environment for task-specific exploration. Here, we study the task of both finding specific objects in unknown space and reconstructing them to a target level of detail. We therefore extend our environment reconstruction to consist not only of a background map, but also of object-level and semantically fused submaps. Importantly, we adapt our previous objective function of uncovering as much free space as possible in as little time as possible with two additional elements: first, we require a maximum observation distance of background surfaces to ensure target objects are not missed by image-based detectors because they are too small to be detected. Second, we require an even smaller maximum distance to the found objects in order to reconstruct them with the desired accuracy. We further created a Micro Aerial Vehicle (MAV) semantic exploration simulator based on Habitat in order to quantitatively demonstrate how our framework can be used to efficiently find specific objects as part of exploration. Finally, we showcase that this capability can be deployed in real-world scenes using our drone equipped with an Intel RealSense D455 RGB-D camera.
KW - Aerial Systems: Perception and Autonomy
KW - Visual-Based Navigation
UR - http://www.scopus.com/inward/record.url?scp=85168708141&partnerID=8YFLogxK
U2 - 10.1109/ICRA48891.2023.10160490
DO - 10.1109/ICRA48891.2023.10160490
M3 - Conference contribution
AN - SCOPUS:85168708141
T3 - Proceedings - IEEE International Conference on Robotics and Automation
SP - 3339
EP - 3345
BT - Proceedings - ICRA 2023
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 29 May 2023 through 2 June 2023
ER -