Sickle Cell Disease Severity Prediction from Percoll Gradient Images Using Graph Convolutional Networks

Ario Sadafi, Asya Makhro, Leonid Livshits, Nassir Navab, Anna Bogdanova, Shadi Albarqouni, Carsten Marr

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Sickle cell disease (SCD) is a severe genetic hemoglobin disorder that results in premature destruction of red blood cells. Assessing the severity of the disease is challenging in clinical routine, since the causes of the broad variance in SCD manifestation, despite the common genetic cause, remain unclear. Identification of biomarkers that predict the severity grade is important for prognosis and for assessing patients' responsiveness to therapy. Detection of changes in red blood cell (RBC) density by means of separation on Percoll density gradients could serve as such a marker, as it resolves intercellular differences and follows the most damaged, dense cells prone to destruction and vaso-occlusion. Quantification and interpretation of the images obtained from the distribution of RBCs in Percoll gradients are an important prerequisite for establishing this approach. Here, we propose a novel approach combining a graph convolutional network, a convolutional neural network, the fast Fourier transform, and recursive feature elimination to predict the severity of SCD directly from a Percoll image. Two important but expensive laboratory blood test parameters are used for training the graph convolutional network. To make the model independent of such tests during prediction, these two parameters are estimated directly from the Percoll image by a neural network. On a cohort of 216 subjects, we achieve a prediction performance only slightly below that of an approach using the ground-truth laboratory measurements. Our proposed method is the first computational approach for the difficult task of SCD severity prediction. The two-step approach relies solely on inexpensive and simple blood analysis tools and can have a significant impact on patient survival in low-resource regions where access to medical instruments and physicians is limited.
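
As a rough illustration of the two-step idea described in the abstract, the PyTorch sketch below first regresses two surrogate laboratory parameters from the Percoll image with a small CNN, then feeds them, together with per-band features, into a plain graph convolutional network for a graph-level severity prediction. All names (ParamEstimator, GCNLayer, SeverityGCN), layer sizes, the number of density bands, the chain-shaped band graph, and the feature dimensions are illustrative assumptions, not the authors' published architecture; the FFT-based feature extraction and the recursive feature elimination step are omitted.

# Minimal sketch (PyTorch) of the two-step idea; all sizes and names are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ParamEstimator(nn.Module):
    """Step 1: regress two surrogate laboratory parameters from the Percoll image."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 2)  # two estimated lab values

    def forward(self, img):
        return self.head(self.features(img).flatten(1))


class GCNLayer(nn.Module):
    """Plain graph convolution: H' = ReLU(D^-1/2 (A+I) D^-1/2 H W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        a_hat = adj + torch.eye(adj.size(0))          # add self-loops
        d_inv_sqrt = torch.diag(a_hat.sum(1).pow(-0.5))
        return F.relu(d_inv_sqrt @ a_hat @ d_inv_sqrt @ self.lin(x))


class SeverityGCN(nn.Module):
    """Step 2: severity prediction from band features plus the estimated lab values."""
    def __init__(self, n_feat, n_classes=2):
        super().__init__()
        self.gc1 = GCNLayer(n_feat + 2, 32)   # band features + 2 estimated parameters
        self.gc2 = GCNLayer(32, 32)
        self.cls = nn.Linear(32, n_classes)

    def forward(self, band_feats, adj, lab_params):
        x = torch.cat([band_feats, lab_params.expand(band_feats.size(0), -1)], dim=1)
        x = self.gc2(self.gc1(x, adj), adj)
        return self.cls(x.mean(0, keepdim=True))  # graph-level severity logits


# Toy usage: one Percoll image, 8 density bands as graph nodes connected in a chain.
img = torch.randn(1, 3, 128, 64)
band_feats = torch.randn(8, 10)                   # e.g. frequency-derived features per band
adj = torch.diag(torch.ones(7), 1) + torch.diag(torch.ones(7), -1)
lab_params = ParamEstimator()(img)                # replaces the expensive lab tests
logits = SeverityGCN(n_feat=10)(band_feats, adj, lab_params)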

Original language: English
Title of host publication: Domain Adaptation and Representation Transfer, and Affordable Healthcare and AI for Resource Diverse Global Health - 3rd MICCAI Workshop, DART 2021, and 1st MICCAI Workshop, FAIR 2021, Held in Conjunction with MICCAI 2021, Proceedings
Editors: Shadi Albarqouni, M. Jorge Cardoso, Qi Dou, Konstantinos Kamnitsas, Bishesh Khanal, Islem Rekik, Nicola Rieke, Debdoot Sheet, Sotirios Tsaftaris, Daguang Xu, Ziyue Xu
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 216-225
Number of pages: 10
ISBN (Print): 9783030877217
DOIs
State: Published - 2021
Event: 3rd MICCAI Workshop on Domain Adaptation and Representation Transfer, DART 2021, and the 1st MICCAI Workshop on Affordable Healthcare and AI for Resource Diverse Global Health, FAIR 2021, held in conjunction with the 24th International Conference on Medical Image Computing and Computer Assisted Intervention, MICCAI 2021 - Virtual, Online
Duration: 27 Sep 2021 - 1 Oct 2021

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 12968 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 3rd MICCAI Workshop on Domain Adaptation and Representation Transfer, DART 2021, and the 1st MICCAI Workshop on Affordable Healthcare and AI for Resource Diverse Global Health, FAIR 2021, held in conjunction with the 24th International Conference on Medical Image Computing and Computer Assisted Intervention, MICCAI 2021
City: Virtual, Online
Period: 27/09/21 - 01/10/21

Keywords

  • Graph convolutional networks
  • Percoll gradients
  • Severity prediction
  • Sickle cell disease
