Landscape of neural architecture search across sensors: How much do they differ?

K. R. Traoré, A. Camero, X. X. Zhu

Research output: Contribution to journal › Conference article › peer-review


With the rapid rise of neural architecture search, the ability to understand its complexity from the perspective of a search algorithm is desirable. Recently, Traoré et al. proposed the Fitness Landscape Footprint framework to help describe and compare neural architecture search problems. It attempts to describe why a search strategy might succeed, struggle, or fail on a target task. Our study leverages this methodology in the context of searching across sensors, including sensor data fusion. In particular, we apply the Fitness Landscape Footprint to the real-world image classification problem of So2Sat LCZ42, in order to identify the most beneficial sensor for our neural network hyper-parameter optimization problem. From the perspective of fitness distributions, our findings indicate similar behaviour of the CNN search space across all sensors: the longer the training time, the higher the overall fitness, and the flatter the landscape (less ruggedness and deviation). Regarding sensors, the better the fitness they enable (Sentinel-2), the better the search trajectories (smoother, with higher persistence). Results also indicate very similar search behaviour for sensors that can be decently fitted by the search space (Sentinel-2 and fusion).
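The ruggedness notion used in the abstract is commonly estimated from the autocorrelation of fitness values along a random walk over neighbouring configurations. The following is a minimal illustrative sketch of that idea only; the toy one-dimensional landscape, the neighbourhood (step ±1), and the walk length are hypothetical stand-ins, not the setup used in the paper.

```python
import random

def fitness(x):
    # Toy 1-D "landscape": a smooth trend plus small deterministic bumps.
    # Stands in for, e.g., validation accuracy of a configuration (hypothetical).
    return -(x - 50) ** 2 / 2500.0 + 0.05 * ((x * 7919) % 13) / 13.0

def random_walk(steps=1000, lo=0, hi=100, seed=0):
    # Random walk over neighbouring configurations (move one step left/right),
    # recording the fitness visited at each step.
    rng = random.Random(seed)
    x = rng.randint(lo, hi)
    trace = []
    for _ in range(steps):
        trace.append(fitness(x))
        x = min(hi, max(lo, x + rng.choice([-1, 1])))
    return trace

def autocorrelation(trace, lag=1):
    # Lag-1 autocorrelation of fitness along the walk:
    # values near 1 suggest a smooth landscape, values near 0 a rugged one.
    n = len(trace)
    mean = sum(trace) / n
    var = sum((f - mean) ** 2 for f in trace) / n
    cov = sum((trace[i] - mean) * (trace[i + lag] - mean)
              for i in range(n - lag)) / (n - lag)
    return cov / var if var > 0 else 0.0

rho = autocorrelation(random_walk())
```

On this toy landscape a higher `rho` corresponds to the "flatter, less rugged" regime the abstract associates with longer training times; comparing `rho` across per-sensor landscapes is one way such footprints can be contrasted.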

Original language: English
Pages (from-to): 217-224
Number of pages: 8
Journal: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Issue number: 3
State: Published - 17 May 2022
Event: 2022 24th ISPRS Congress on Imaging Today, Foreseeing Tomorrow, Commission III - Nice, France
Duration: 6 Jun 2022 - 11 Jun 2022


  • AutoML
  • Fitness Landscape Analysis
  • Neural Architecture Search
  • Remote Sensing
  • Sensor Fusion


