Abstract
Both robot and hand-eye calibration have been objects of research for decades. While current approaches manage to precisely and robustly identify the parameters of a robot's kinematic model, they still rely on external devices such as calibration objects, markers, and/or external sensors. Instead of trying to fit recorded measurements to a model of a known object, this paper treats robot calibration as an offline SLAM problem, where scanning poses are linked to a fixed point in space via a moving kinematic chain. As such, we enable robot calibration using nothing but an arbitrary eye-in-hand depth sensor. To the authors' best knowledge, the presented framework is the first solution to three-dimensional (3D) sensor-based robot calibration that requires neither external sensors nor reference objects. Our novel approach utilizes a modified version of the Iterative Corresponding Point algorithm to run bundle adjustment on multiple 3D recordings, estimating the optimal parameters of the kinematic model. A detailed evaluation of the system is presented on a real robot with various attached 3D sensors. The presented results show that the system reaches precision comparable to a dedicated external tracking system at a fraction of its cost.
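The abstract builds on iterative corresponding-point registration. As a purely illustrative sketch (not the paper's modified algorithm, whose details are not given here), a minimal point-to-point ICP loop alternates nearest-neighbour matching with a closed-form rigid-transform estimate (Kabsch/SVD); all function names below are hypothetical:

```python
# Minimal illustrative ICP sketch, assuming clean point clouds with good
# overlap. This is NOT the paper's modified bundle-adjustment variant.
import numpy as np

def best_rigid_transform(src, dst):
    """Closed-form (Kabsch/SVD) rigid transform mapping src onto dst."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    # Guard against reflections: force det(R) = +1.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

def icp(src, dst, iters=20):
    """Brute-force nearest-neighbour ICP; returns accumulated (R, t)."""
    R_total, t_total = np.eye(3), np.zeros(3)
    cur = src.copy()
    for _ in range(iters):
        # Match each current point to its nearest neighbour in dst.
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(axis=-1)
        matches = dst[d2.argmin(axis=1)]
        R, t = best_rigid_transform(cur, matches)
        cur = cur @ R.T + t
        # Compose the incremental transform into the running estimate.
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```

In a calibration setting along the lines the abstract describes, residuals from such an alignment would feed a larger optimization over the kinematic-model parameters rather than a single rigid transform.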
Original language | English |
---|---|
Pages (from-to) | 327-346 |
Number of pages | 20 |
Journal | Journal of Field Robotics |
Volume | 41 |
Issue number | 2 |
DOIs | |
Publication status | Published - March 2024 |