TY - GEN
T1 - Intelligent image synthesis to attack a segmentation CNN using adversarial learning
AU - Chen, Liang
AU - Bentley, Paul
AU - Mori, Kensaku
AU - Misawa, Kazunari
AU - Fujiwara, Michitaka
AU - Rueckert, Daniel
N1 - Publisher Copyright:
© Springer Nature Switzerland AG 2019.
PY - 2019
Y1 - 2019
N2 - Deep learning approaches based on convolutional neural networks (CNNs) have been successful in solving a number of problems in medical imaging, including image segmentation. In recent years, it has been shown that CNNs are vulnerable to attacks in which the input image is perturbed by relatively small amounts of noise so that the CNN is no longer able to segment the perturbed image with sufficient accuracy. Exploring how to attack CNN-based models, as well as how to defend them against attacks, has therefore become a popular topic, as it also provides insights into the performance and generalization abilities of CNNs. However, most existing work assumes unrealistic attack models, i.e. the attacks are specified in advance. In this paper, we propose a novel approach for generating adversarial examples to attack CNN-based segmentation models for medical images. Our approach has three key features: (1) the generated adversarial examples exhibit anatomical variations (in the form of deformations) as well as appearance perturbations; (2) the adversarial examples attack segmentation models so that the Dice scores decrease by a pre-specified amount; (3) the attack is not required to be specified beforehand. We have evaluated our approach on CNN-based models for the multi-organ segmentation problem in 2D CT images and show that the proposed approach can be used to attack different CNN-based segmentation models.
AB - Deep learning approaches based on convolutional neural networks (CNNs) have been successful in solving a number of problems in medical imaging, including image segmentation. In recent years, it has been shown that CNNs are vulnerable to attacks in which the input image is perturbed by relatively small amounts of noise so that the CNN is no longer able to segment the perturbed image with sufficient accuracy. Exploring how to attack CNN-based models, as well as how to defend them against attacks, has therefore become a popular topic, as it also provides insights into the performance and generalization abilities of CNNs. However, most existing work assumes unrealistic attack models, i.e. the attacks are specified in advance. In this paper, we propose a novel approach for generating adversarial examples to attack CNN-based segmentation models for medical images. Our approach has three key features: (1) the generated adversarial examples exhibit anatomical variations (in the form of deformations) as well as appearance perturbations; (2) the adversarial examples attack segmentation models so that the Dice scores decrease by a pre-specified amount; (3) the attack is not required to be specified beforehand. We have evaluated our approach on CNN-based models for the multi-organ segmentation problem in 2D CT images and show that the proposed approach can be used to attack different CNN-based segmentation models.
UR - http://www.scopus.com/inward/record.url?scp=85075668578&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-32778-1_10
DO - 10.1007/978-3-030-32778-1_10
M3 - Conference contribution
AN - SCOPUS:85075668578
SN - 9783030327774
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 90
EP - 99
BT - Simulation and Synthesis in Medical Imaging - 4th International Workshop, SASHIMI 2019, Held in Conjunction with MICCAI 2019, Proceedings
A2 - Burgos, Ninon
A2 - Gooya, Ali
A2 - Svoboda, David
PB - Springer
T2 - 4th International Workshop on Simulation and Synthesis in Medical Imaging, SASHIMI 2019, held in conjunction with the 22nd International Conference on Medical Image Computing and Computer Assisted Intervention, MICCAI 2019
Y2 - 13 October 2019 through 13 October 2019
ER -