Know your sensors — a modality study for surgical action classification

Lennart Bastian, Tobias Czempiel, Christian Heiliger, Konrad Karcz, Ulrich Eck, Benjamin Busam, Nassir Navab

Research output: Contribution to journal › Article › peer-review

2 Scopus citations


The surgical operating room (OR) presents many opportunities for automation and optimisation. Videos from various sources in the OR are becoming increasingly available. The medical community seeks to leverage this wealth of data to develop automated methods to advance interventional care, lower costs, and improve overall patient outcomes. Existing datasets from OR cameras are thus far limited in size or in the modalities acquired, leaving it unclear which sensor modalities are best suited for tasks such as recognising surgical actions from video. This study demonstrates that the task of surgical workflow classification is highly dependent on the sensor modalities used. We perform a systematic analysis of several commonly available sensor modalities, evaluating two widely used fusion approaches that can improve classification performance. Our findings are consistent across model architectures as well as separate camera views. The analyses are carried out on a set of multi-view RGB-D video recordings of 16 laparoscopic interventions.
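The two fusion approaches evaluated in multi-modal classification studies of this kind are typically early (feature-level) and late (score-level) fusion. The following is a minimal sketch of both strategies in plain Python; the feature extractors, classifier, and example numbers are illustrative stand-ins, not the models or data used in the paper.

```python
# Hedged sketch: early vs. late fusion for multi-modal classification.
# Assumes each modality (e.g. RGB, depth) has its own feature extractor
# and/or classifier; these are hypothetical placeholders.

def late_fusion(scores_per_modality):
    """Average per-class scores produced independently for each modality."""
    n = len(scores_per_modality)
    num_classes = len(scores_per_modality[0])
    return [sum(s[c] for s in scores_per_modality) / n
            for c in range(num_classes)]

def early_fusion(features_per_modality, classifier):
    """Concatenate per-modality feature vectors, then classify jointly."""
    joint = [x for feats in features_per_modality for x in feats]
    return classifier(joint)

# Toy example: two modalities, three action classes (numbers invented).
rgb_scores = [0.7, 0.2, 0.1]
depth_scores = [0.5, 0.4, 0.1]
fused = late_fusion([rgb_scores, depth_scores])
predicted_class = max(range(len(fused)), key=fused.__getitem__)
```

Late fusion keeps per-modality models independent and merges only their outputs, while early fusion lets a single classifier learn cross-modal correlations at the cost of a larger joint input.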

Original language: English
Pages (from-to): 1113-1121
Number of pages: 9
Journal: Computer Methods in Biomechanics and Biomedical Engineering: Imaging and Visualization
Issue number: 4
State: Published - 2023


Keywords:
  • Surgical workflow analysis
  • aware operating room
  • video action recognition


