Segmentation-guided Medical Image Registration: Quality Awareness using Label Noise Correction

Varsha Raveendran, Veronika Spieker, Rickmer F. Braren, Dimitrios C. Karampinos, Veronika A. Zimmer, Julia A. Schnabel

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Scopus citations

Abstract

Medical image registration methods can strongly benefit from anatomical labels, which can be provided by segmentation networks at reduced labeling effort. Yet, label noise may adversely affect registration performance. In this work, we propose a quality-aware segmentation-guided registration method that handles such noisy, i.e., low-quality, labels by self-correcting them using Confident Learning. Utilizing NLST and in-house acquired abdominal MR images, we show that our proposed quality-aware method effectively addresses the drop in registration performance observed in quality-unaware methods. Our findings demonstrate that incorporating an appropriate label-correction strategy during training can reduce labeling efforts, consequently enhancing the practicality of segmentation-guided registration.
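The abstract names Confident Learning as the self-correction mechanism for noisy segmentation labels. As a rough illustration only (not the paper's implementation, which operates on anatomical segmentation maps during registration training), a simplified confident-learning step over per-example predicted probabilities might look like the sketch below; the function name and the relabeling rule are assumptions:

```python
import numpy as np

def correct_noisy_labels(labels, pred_probs):
    """Flag and self-correct likely label errors with a simplified
    Confident Learning rule: an example is suspect if its given label
    scores below that class's confidence threshold while some other
    class scores above its own threshold."""
    labels = np.asarray(labels)
    n_classes = pred_probs.shape[1]
    # Per-class threshold: mean predicted probability of class j
    # over all examples currently labeled j.
    thresholds = np.array([
        pred_probs[labels == j, j].mean() for j in range(n_classes)
    ])
    corrected = labels.copy()
    noisy = []
    for i, y in enumerate(labels):
        above = pred_probs[i] >= thresholds  # classes passing their threshold
        if not above[y] and above.any():
            noisy.append(i)
            # Relabel to the most confident class that passes its threshold.
            candidates = np.where(above)[0]
            corrected[i] = candidates[np.argmax(pred_probs[i, candidates])]
    return corrected, noisy
```

In the segmentation setting, the same idea would be applied voxel-wise, with `pred_probs` coming from the segmentation network's softmax outputs.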

Original language: English
Title of host publication: Bildverarbeitung für die Medizin 2024 - Proceedings, German Conference on Medical Image Computing, 2024
Editors: Andreas Maier, Thomas M. Deserno, Heinz Handels, Klaus Maier-Hein, Christoph Palm, Thomas Tolxdorff
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 33-38
Number of pages: 6
ISBN (Print): 9783658440367
State: Published - 2024
Event: German Conference on Medical Image Computing, BVM 2024 - Erlangen, Germany
Duration: 10 Mar 2024 - 12 Mar 2024

Publication series

Name: Informatik aktuell
ISSN (Print): 1431-472X

Conference

Conference: German Conference on Medical Image Computing, BVM 2024
Country/Territory: Germany
City: Erlangen
Period: 10/03/24 - 12/03/24
