TY - JOUR
T1 - Delay Compensation for a Telepresence System with 3D 360 Degree Vision Based on Deep Head Motion Prediction and Dynamic FoV Adaptation
AU - Aykut, Tamay
AU - Karimi, Mojtaba
AU - Burgmair, Christoph
AU - Finkenzeller, Andreas
AU - Bachhuber, Christoph
AU - Steinbach, Eckehard
N1 - Publisher Copyright:
© 2018 IEEE.
PY - 2018/10
Y1 - 2018/10
N2 - The usability of telepresence applications is strongly affected by the communication delay between the user and the remote system. Special attention needs to be paid when the distant scene is experienced through a head-mounted display. A high motion-to-photon latency, which describes the time needed to fully reflect the user's motion on the display, results in a poor feeling of presence. Further consequences include unbearable motion sickness, indisposition, and, in the worst case, termination of the telepresence session. In this letter, we present our low-cost MAVI telepresence system, which is equipped with a stereoscopic 360° vision system and high-payload manipulation capabilities. Special emphasis is placed on the stereoscopic vision system and its delay compensation. More specifically, we propose velocity-based dynamic field-of-view adaptation techniques to reduce the occurrence of simulator sickness and to improve the achievable level of delay compensation. The proposed delay compensation approach relies on deep learning to predict the prospective head motion. We use our previously described head motion dataset for training, validation, and testing. To demonstrate the general validity of our approach, we perform cross-validation with an independent dataset. We use both qualitative measures and subjective experiments for evaluation. Our results show that the proposed approach achieves mean compensation rates of around 99.9% for latencies between 0.1 and 0.5 s.
KW - 3D vision
KW - omnidirectional vision
KW - remote reality
KW - telepresence
KW - virtual reality
UR - http://www.scopus.com/inward/record.url?scp=85062935999&partnerID=8YFLogxK
U2 - 10.1109/LRA.2018.2864359
DO - 10.1109/LRA.2018.2864359
M3 - Article
AN - SCOPUS:85062935999
SN - 2377-3766
VL - 3
SP - 4343
EP - 4350
JO - IEEE Robotics and Automation Letters
JF - IEEE Robotics and Automation Letters
IS - 4
M1 - 8429079
ER -