TY - JOUR
T1 - PAROS: Multi-Component Robotic System and an Image-Guided Patient Alignment for Robot-Assisted Ophthalmic Surgery
AU - Alikhani, Alireza
AU - Nguyen, Van Dai
AU - Inagaki, Satoshi
AU - Busam, Benjamin
AU - Faridpooya, Koorosh
AU - Maier, Mathias
AU - Gehlbach, Peter
AU - Iordachita, Iulian
AU - Navab, Nassir
AU - Nasseri, M. Ali
AU - Zapp, Daniel
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
AB - Advancements in robot-assisted eye surgery have significantly enhanced precision in delicate procedures by enabling controlled movement of surgical instruments through a small entry point (trocar) placed on the outer layer of the eye (sclera). However, before these procedures can begin, the robot must first be prepared, draped, and calibrated. It is then positioned near the patient, where precise patient alignment is performed. Finally, the system ensures accurate alignment between the surgical instrument and the trocar, a critical step for safe and effective instrument insertion and surgical execution. Despite these advancements, integrating robotic systems into clinical practice remains challenging. The current manual preparation and alignment process is time-consuming and error-prone, emphasizing the need for a more automated and adaptive solution. Patient alignment time has been identified as a key factor differentiating robot-assisted and manual surgeries in overall procedure duration, with robot preparation and patient alignment extending surgery time by an average of 51 ± 6 minutes compared to manual methods. To address these challenges, this paper introduces a multi-component robotic system combined with an image-guided patient alignment methodology designed to reduce preparation and alignment time in robotic eye surgery while maintaining surgical efficiency. The robotic system consists of a 2D-Cart carrying a 4-degree-of-freedom (DOF) frame robot that holds a 5-DOF micro-manipulator. It uses an image-guided, multi-stage alignment approach that allows the system to be efficiently localized and stabilized in an optimal position beside the surgical bed, ensuring precise and adaptive positioning relative to the patient under surgically draped conditions. The proposed end-to-end robot preparation and patient alignment workflow is validated through a user study involving eight medical experts in a simulated surgical environment using a draped phantom eye. The results demonstrate successful robot-patient alignment in all cases, with the additional setup time reduced to 272 ± 84 seconds, highlighting the system's precision and clinical feasibility. To further assess the effectiveness of the image-guided alignment in real surgical settings, eye detection experiments are conducted on five patients under surgically draped conditions, achieving 90% eye detection accuracy and 75% iris detection accuracy. This study introduces a robust, semi-automated approach to robot-patient alignment, offering a more efficient and adaptable alternative to current manual methods by reducing setup time and enhancing surgical accuracy.
KW - Medical robots and systems
KW - clinical integration in medical robotics
KW - planning
KW - surgical robotics
UR - http://www.scopus.com/inward/record.url?scp=105005311216&partnerID=8YFLogxK
U2 - 10.1109/ACCESS.2025.3564944
DO - 10.1109/ACCESS.2025.3564944
M3 - Article
AN - SCOPUS:105005311216
SN - 2169-3536
VL - 13
SP - 85056
EP - 85071
JO - IEEE Access
JF - IEEE Access
ER -