TY - JOUR
T1 - Device- and system-independent personal touchless user interface for operating rooms
T2 - One personal UI to control all displays in an operating room
AU - Ma, Meng
AU - Fallavollita, Pascal
AU - Habert, Séverine
AU - Weidert, Simon
AU - Navab, Nassir
N1 - Publisher Copyright:
© 2016, CARS.
PY - 2016/6/1
Y1 - 2016/6/1
N2 - Introduction: In the modern-day operating room, the surgeon performs surgery with the support of different medical systems that display patient information, physiological data, and medical images. Numerous interactions must typically be performed by the surgical team to control the corresponding medical system and retrieve the desired information. Joysticks and physical keys remain present in the operating room because of the drawbacks of the computer mouse, and surgeons often communicate instructions to the surgical team when they require information from a specific medical system. In this paper, a novel user interface is developed that allows the surgeon to personally perform touchless interaction with the various medical systems and to switch effortlessly among them, all without modifying the systems’ software or hardware. Methods: To achieve this, a wearable RGB-D sensor is mounted on the surgeon’s head for inside-out tracking of his/her finger relative to any of the medical systems’ displays. Android devices running a dedicated application are connected to the computers on which the medical systems run, each emulating a standard USB mouse and keyboard. When the surgeon interacts using pointing gestures, the desired cursor position on the targeted display and the gestures themselves are translated into generic events and sent to the corresponding Android device. Finally, the application running on that Android device generates the appropriate mouse or keyboard events for the targeted medical system. Results and conclusion: In a simulated operating room setting, our user interface was tested by seven medical participants, who performed several interactions with visualizations of CT, MRI, and fluoroscopy images at varying distances from the displays. Results from the System Usability Scale and the NASA-TLX workload index indicated strong acceptance of the proposed user interface.
AB - Introduction: In the modern-day operating room, the surgeon performs surgery with the support of different medical systems that display patient information, physiological data, and medical images. Numerous interactions must typically be performed by the surgical team to control the corresponding medical system and retrieve the desired information. Joysticks and physical keys remain present in the operating room because of the drawbacks of the computer mouse, and surgeons often communicate instructions to the surgical team when they require information from a specific medical system. In this paper, a novel user interface is developed that allows the surgeon to personally perform touchless interaction with the various medical systems and to switch effortlessly among them, all without modifying the systems’ software or hardware. Methods: To achieve this, a wearable RGB-D sensor is mounted on the surgeon’s head for inside-out tracking of his/her finger relative to any of the medical systems’ displays. Android devices running a dedicated application are connected to the computers on which the medical systems run, each emulating a standard USB mouse and keyboard. When the surgeon interacts using pointing gestures, the desired cursor position on the targeted display and the gestures themselves are translated into generic events and sent to the corresponding Android device. Finally, the application running on that Android device generates the appropriate mouse or keyboard events for the targeted medical system. Results and conclusion: In a simulated operating room setting, our user interface was tested by seven medical participants, who performed several interactions with visualizations of CT, MRI, and fluoroscopy images at varying distances from the displays. Results from the System Usability Scale and the NASA-TLX workload index indicated strong acceptance of the proposed user interface.
KW - Finger pointing gesture
KW - Multimodal interaction
KW - Operating room
KW - User interface
UR - http://www.scopus.com/inward/record.url?scp=84961199193&partnerID=8YFLogxK
U2 - 10.1007/s11548-016-1375-6
DO - 10.1007/s11548-016-1375-6
M3 - Article
C2 - 26984551
AN - SCOPUS:84961199193
SN - 1861-6410
VL - 11
SP - 853
EP - 861
JO - International Journal of Computer Assisted Radiology and Surgery
JF - International Journal of Computer Assisted Radiology and Surgery
IS - 6
ER -