ResoNet: Robust and Explainable ENSO Forecasts with Hybrid Convolution and Transformer Networks

Pumeng Lyu, Tao Tang, Fenghua Ling, Jing-Jia Luo, Niklas Boers, Wanli Ouyang, Lei Bai

Research output: Contribution to journal › Article › peer-review

Abstract

Recent studies have shown that deep learning (DL) models can skillfully forecast El Niño–Southern Oscillation (ENSO) events more than 1.5 years in advance. However, concerns persist regarding the reliability of DL predictions, including potential overfitting and a lack of interpretability. Here, we propose ResoNet, a DL model that combines convolutional neural network (CNN) and transformer architectures. This hybrid architecture enables our model to capture both local sea surface temperature anomalies and long-range inter-basin interactions across oceans. We show that ResoNet can robustly predict ENSO at lead times of up to 19 months, thus outperforming existing approaches in terms of the forecast horizon. Applying an explainability method to ResoNet predictions of El Niño and La Niña at 1- to 18-month leads, we find that the model predicts the Niño-3.4 index through multiple physically reasonable mechanisms, such as the recharge oscillator concept, the seasonal footprinting mechanism, and the Indian Ocean capacitor effect. Moreover, we demonstrate for the first time that the asymmetry between El Niño and La Niña development can be captured by ResoNet. Our results could help to alleviate skepticism about applying DL models to ENSO prediction and encourage further attempts to discover and predict climate phenomena using AI methods.
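To make the hybrid design concrete, the following PyTorch sketch illustrates the general CNN-plus-transformer pattern the abstract describes: a convolutional stem extracts local SST-anomaly features, and a self-attention encoder relates distant ocean basins. This is an assumed, minimal example, not the published ResoNet; all layer sizes, the 24 × 72 input grid, and the three input months are illustrative choices.

    # Minimal sketch (assumption, not the published ResoNet): a hybrid
    # CNN + transformer regressor for gridded SST-anomaly maps.
    import torch
    import torch.nn as nn

    class HybridCnnTransformer(nn.Module):
        def __init__(self, in_channels=3, embed_dim=64, n_heads=4, n_layers=2):
            super().__init__()
            # CNN stem: captures local SST-anomaly patterns.
            self.cnn = nn.Sequential(
                nn.Conv2d(in_channels, embed_dim, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.Conv2d(embed_dim, embed_dim, kernel_size=3, stride=2, padding=1),
                nn.ReLU(),
            )
            # Transformer encoder: self-attention over the flattened grid
            # models long-range inter-basin interactions.
            layer = nn.TransformerEncoderLayer(d_model=embed_dim, nhead=n_heads,
                                               batch_first=True)
            self.transformer = nn.TransformerEncoder(layer, num_layers=n_layers)
            self.head = nn.Linear(embed_dim, 1)  # scalar Niño-3.4 index

        def forward(self, x):  # x: (batch, months, lat, lon)
            feats = self.cnn(x)                        # (batch, C, h, w)
            tokens = feats.flatten(2).transpose(1, 2)  # (batch, h*w, C)
            tokens = self.transformer(tokens)
            return self.head(tokens.mean(dim=1))       # pool tokens, regress

    model = HybridCnnTransformer()
    sst = torch.randn(8, 3, 24, 72)  # e.g., 3 monthly SSTA maps on a 24x72 grid
    print(model(sst).shape)          # torch.Size([8, 1])

On the explainability side, one generic way to probe such a model is input-gradient saliency; the paper's specific attribution method is not stated on this page, so the lines below are only an assumed illustration of the idea:

    sst.requires_grad_(True)
    model(sst).sum().backward()
    saliency = sst.grad.abs()  # per-gridpoint influence on the predicted index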

Original language: English
Pages (from-to): 1289-1298
Number of pages: 10
Journal: Advances in Atmospheric Sciences
Volume: 41
Issue number: 7
DOIs
State: Published - Jul 2024

Keywords

  • CNN
  • ENSO
  • deep learning
  • transformer
