What Identifies A Whale by Its Fluke? On the Benefit of Interpretable Machine Learning for Whale Identification

J. Kierdorf, J. Garcke, J. Behley, T. Cheeseman, R. Roscher

Research output: Contribution to journal › Conference article › peer-review


Abstract

Interpretable and explainable machine learning have proven to be promising approaches to verify the quality of a data-driven model in general as well as to obtain more information about the quality of certain observations in practice. In this paper, we use these approaches for an application in the marine sciences to support the monitoring of whales. Whale population monitoring is an important element of whale conservation, and the identification of individual whales plays a key role in this process, for example to trace the migration of whales over time and space. Classical approaches use photographs and manual matching, with special focus on the shape of the whale flukes and their unique pigmentation. However, this is not feasible for comprehensive monitoring. Machine learning methods, especially deep neural networks, have shown that they can efficiently automate the observation of large numbers of whales. Despite their success on many tasks such as identification, further potential such as interpretability and its benefits has not yet been exploited. Our main contribution is an analysis of interpretation tools, especially occlusion sensitivity maps, and the question of how the gained insights can help a whale researcher. For our analysis, we use images of humpback whale flukes provided by the Kaggle Challenge "Humpback Whale Identification". By means of spectral cluster analysis of heatmaps, which indicate which parts of the image are important for a decision, we can show that they can be grouped in a meaningful way. Moreover, it appears that characteristics automatically determined by a neural network correspond to those that are considered important by a whale expert.
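The abstract names two ingredients of the analysis: occlusion sensitivity maps as the interpretation tool and spectral cluster analysis of the resulting heatmaps. The sketch below illustrates how such a pipeline is commonly set up; it is not the authors' code, and the model, patch size, stride, and number of clusters are illustrative assumptions.

```python
import numpy as np
import torch
from sklearn.cluster import SpectralClustering


def occlusion_sensitivity_map(model, image, target_class, patch=16, stride=8, fill=0.0):
    """Slide an occluding patch over the image and record the drop in the
    target-class score; large drops mark regions the model relies on."""
    model.eval()
    _, h, w = image.shape
    with torch.no_grad():
        base = torch.softmax(model(image.unsqueeze(0)), dim=1)[0, target_class].item()
    heat = np.zeros(((h - patch) // stride + 1, (w - patch) // stride + 1))
    for i, y in enumerate(range(0, h - patch + 1, stride)):
        for j, x in enumerate(range(0, w - patch + 1, stride)):
            occluded = image.clone()
            occluded[:, y:y + patch, x:x + patch] = fill
            with torch.no_grad():
                score = torch.softmax(model(occluded.unsqueeze(0)), dim=1)[0, target_class].item()
            heat[i, j] = base - score  # importance = loss of confidence when this region is hidden
    return heat


def cluster_heatmaps(heatmaps, n_clusters=5):
    """Group flattened occlusion maps with spectral clustering so that flukes
    whose decision-relevant regions look alike fall into the same cluster."""
    X = np.stack([h.ravel() for h in heatmaps])
    return SpectralClustering(n_clusters=n_clusters,
                              affinity="nearest_neighbors",
                              random_state=0).fit_predict(X)
```

In this reading, each fluke image yields one heatmap, and the cluster labels can then be compared against the characteristics (e.g. trailing-edge shape, pigmentation patterns) that a whale expert would use.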

Original language: English
Pages (from-to): 1005-1012
Number of pages: 8
Journal: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Volume: 5
Issue number: 2
DOIs
State: Published - 3 Aug 2020
Externally published: Yes
Event: 2020 24th ISPRS Congress on Technical Commission II - Nice, Virtual, France
Duration: 31 Aug 2020 - 2 Sep 2020

Keywords

  • Deep Learning
  • Humpback Whales
  • Interpretability
  • Machine Learning
  • Neural Networks
  • Visualization
