TY - GEN
T1 - 2D positioning of ground vehicles using stereo vision and a single ranging link
AU - Zhu, Chen
AU - Giorgi, Gabriele
AU - Lee, Young Hee
AU - Günther, Christoph
N1 - Publisher Copyright:
© 2019 Institute of Navigation. All Rights Reserved.
PY - 2019
Y1 - 2019
N2 - In this work we propose a positioning method for ground vehicles in planar motion, based on the sensor fusion of stereo cameras and sparse ranging measurements obtained from a wireless network. The proposed method is an alternative localization solution when Global Navigation Satellite System (GNSS) signals are unavailable, with notably low infrastructure requirements. It does not require a database of landmarks, and it works in single-link scenarios, i.e., with at most one station reachable at any time. In theory, at least three ranging anchors are required to estimate a two-dimensional position without ambiguity. However, in GNSS-denied environments it is often difficult to achieve simultaneous connectivity to three wireless stations. We propose to apply a visual odometry technique to estimate the relative motion of the vehicle using stereo cameras, and to fuse the vision system with a single ranging link. The sensor fusion method can resolve the absolute position unambiguously if the vehicle sequentially connects to two stations with known coordinates. Furthermore, the accuracy of the estimated trajectory is improved by fusing both ranging and visual measurements.
AB - In this work we propose a positioning method for ground vehicles in planar motion, based on the sensor fusion of stereo cameras and sparse ranging measurements obtained from a wireless network. The proposed method is an alternative localization solution when Global Navigation Satellite System (GNSS) signals are unavailable, with notably low infrastructure requirements. It does not require a database of landmarks, and it works in single-link scenarios, i.e., with at most one station reachable at any time. In theory, at least three ranging anchors are required to estimate a two-dimensional position without ambiguity. However, in GNSS-denied environments it is often difficult to achieve simultaneous connectivity to three wireless stations. We propose to apply a visual odometry technique to estimate the relative motion of the vehicle using stereo cameras, and to fuse the vision system with a single ranging link. The sensor fusion method can resolve the absolute position unambiguously if the vehicle sequentially connects to two stations with known coordinates. Furthermore, the accuracy of the estimated trajectory is improved by fusing both ranging and visual measurements.
UR - http://www.scopus.com/inward/record.url?scp=85068319259&partnerID=8YFLogxK
U2 - 10.33012/2019.16728
DO - 10.33012/2019.16728
M3 - Conference contribution
AN - SCOPUS:85068319259
T3 - ION 2019 International Technical Meeting Proceedings
SP - 843
EP - 855
BT - ION 2019 International Technical Meeting Proceedings
PB - Institute of Navigation
T2 - Institute of Navigation International Technical Meeting 2019, ITM 2019
Y2 - 28 January 2019 through 31 January 2019
ER -