TY - JOUR
T1 - On-the-fly augmented reality for orthopedic surgery using a multimodal fiducial
AU - Andress, Sebastian
AU - Johnson, Alex
AU - Unberath, Mathias
AU - Winkler, Alexander Felix
AU - Yu, Kevin
AU - Fotouhi, Javad
AU - Weidert, Simon
AU - Osgood, Greg
AU - Navab, Nassir
N1 - Publisher Copyright:
© 2018 Society of Photo-Optical Instrumentation Engineers (SPIE).
PY - 2018/4/1
Y1 - 2018/4/1
N2 - Fluoroscopic X-ray guidance is a cornerstone of percutaneous orthopedic surgical procedures. However, two-dimensional (2-D) observations of the three-dimensional (3-D) anatomy suffer from the effects of projective simplification. Consequently, many X-ray images from various orientations must be acquired for the surgeon to accurately assess the spatial relations between the patient's anatomy and the surgical tools. We present an on-the-fly surgical support system that provides guidance using augmented reality and can be used in quasi-unprepared operating rooms. The proposed system builds upon a multimodality marker and simultaneous localization and mapping technique to co-calibrate an optical see-through head-mounted display to a C-arm fluoroscopy system. Annotations on the 2-D X-ray images can then be rendered as virtual objects in 3-D, providing surgical guidance. We quantitatively evaluate the components of the proposed system and, finally, design a feasibility study on a semi-anthropomorphic phantom. The accuracy of our system was comparable to that of the traditional image-guided technique while substantially reducing both the number of acquired X-ray images and the procedure time. Our promising results encourage further research on the interaction between virtual and real objects, which we believe will directly benefit the proposed method. Further, we would like to explore the capabilities of our on-the-fly augmented reality support system in a larger study directed toward common orthopedic interventions.
KW - fluoroscopy
KW - interventional imaging
KW - registration
KW - surgical guidance
UR - http://www.scopus.com/inward/record.url?scp=85041193218&partnerID=8YFLogxK
U2 - 10.1117/1.JMI.5.2.021209
DO - 10.1117/1.JMI.5.2.021209
M3 - Article
AN - SCOPUS:85041193218
SN - 2329-4302
VL - 5
JO - Journal of Medical Imaging
JF - Journal of Medical Imaging
IS - 2
M1 - 021209
ER -