TY - GEN
T1 - A Temporal Perspective n-Point Problem with Model Uncertainties for Cooperative Pose Estimation in a Heterogeneous Robot Team
AU - Steidle, Florian
AU - Boche, Simon
AU - Stürzl, Wolfgang
AU - Triebel, Rudolph
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - Many solutions exist for estimating the pose of an object with respect to a camera, assuming perfect knowledge of the object. In this work, we lift the assumption of a perfectly known model and introduce uncertainties for the 3D points, which are retrieved from a dynamically created model. The positions of the model points can be either uncorrelated or correlated; the latter is typically the case for mobile robots navigating by visual-inertial pose estimation in unknown, GNSS-denied environments. In our approach, a selection of poses estimated by one robot serves as a dynamic 3D model and is combined with 2D points obtained by tracking that robot over time with the camera of another robot. In addition, criteria for adding and deleting 3D model points in an optimal way are proposed. Weighted residuals in the tangent space are used in a generalized least-squares problem to compute the transformation between the tracking camera and the object; measurement errors are projected into tangential planes of the unit sphere. The proposed method estimates the relative pose of members of a robotic team with high accuracy. The benefits of our approach are demonstrated in simulation and in real-world experiments using visual odometry measurements from a multicopter tracked by the camera of a rover.
AB - Many solutions exist for estimating the pose of an object with respect to a camera, assuming perfect knowledge of the object. In this work, we lift the assumption of a perfectly known model and introduce uncertainties for the 3D points, which are retrieved from a dynamically created model. The positions of the model points can be either uncorrelated or correlated; the latter is typically the case for mobile robots navigating by visual-inertial pose estimation in unknown, GNSS-denied environments. In our approach, a selection of poses estimated by one robot serves as a dynamic 3D model and is combined with 2D points obtained by tracking that robot over time with the camera of another robot. In addition, criteria for adding and deleting 3D model points in an optimal way are proposed. Weighted residuals in the tangent space are used in a generalized least-squares problem to compute the transformation between the tracking camera and the object; measurement errors are projected into tangential planes of the unit sphere. The proposed method estimates the relative pose of members of a robotic team with high accuracy. The benefits of our approach are demonstrated in simulation and in real-world experiments using visual odometry measurements from a multicopter tracked by the camera of a rover.
UR - http://www.scopus.com/inward/record.url?scp=85174419195&partnerID=8YFLogxK
U2 - 10.1109/ECMR59166.2023.10256287
DO - 10.1109/ECMR59166.2023.10256287
M3 - Conference contribution
AN - SCOPUS:85174419195
T3 - Proceedings of the 11th European Conference on Mobile Robots, ECMR 2023
BT - Proceedings of the 11th European Conference on Mobile Robots, ECMR 2023
A2 - Marques, Lino
A2 - Marković, Ivan
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 11th European Conference on Mobile Robots, ECMR 2023
Y2 - 4 September 2023 through 7 September 2023
ER -