Assessing Quality Metrics for Neural Reality Gap Input Mitigation in Autonomous Driving Testing

Stefano Carlo Lambertenghi, Andrea Stocco

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

Simulation-based testing of automated driving systems (ADS) is the industry standard, being a controlled, safe, and cost-effective alternative to real-world testing. Despite these advantages, virtual simulations often fail to accurately replicate real-world conditions like image fidelity, texture representation, and environmental accuracy. This can lead to significant differences in ADS behavior between simulated and real-world domains, a phenomenon known as the sim2real gap. Researchers have used Image-to-Image (I2I) neural translation to mitigate the sim2real gap, enhancing the realism of simulated environments by transforming synthetic data into more authentic representations of real-world conditions. However, while promising, these techniques may introduce artifacts, distortions, or inconsistencies in the generated data that can affect the effectiveness of ADS testing. In our empirical study, we investigated how the quality of I2I techniques influences the mitigation of the sim2real gap, using a set of established metrics from the literature. We evaluated two popular generative I2I architectures, pix2pix and CycleGAN, across two ADS perception tasks at the model level, namely vehicle detection and end-to-end lane keeping, using paired simulated and real-world datasets. Our findings reveal that the effectiveness of I2I architectures varies across ADS tasks and that existing evaluation metrics do not consistently align with ADS behavior. We therefore conducted task-specific fine-tuning of perception metrics, which yielded a stronger correlation. Our findings indicate that a perception metric that incorporates semantic elements, tailored to each task, can facilitate selecting the most appropriate I2I technique for a reliable assessment of sim2real gap mitigation.

Original language: English
Title of host publication: Proceedings - 2024 IEEE Conference on Software Testing, Verification and Validation, ICST 2024
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 173-184
Number of pages: 12
ISBN (Electronic): 9798350308181
State: Published - 2024
Event: 17th IEEE Conference on Software Testing, Verification and Validation, ICST 2024 - Toronto, Canada
Duration: 27 May 2024 - 31 May 2024

Publication series

Name: Proceedings - 2024 IEEE Conference on Software Testing, Verification and Validation, ICST 2024

Conference

Conference: 17th IEEE Conference on Software Testing, Verification and Validation, ICST 2024
Country/Territory: Canada
City: Toronto
Period: 27/05/24 - 31/05/24

Keywords

  • autonomous vehicles testing
  • generative adversarial networks
  • reality gap
  • sim2real
