Calibration of Controlled Markov Chains for Predicting Pedestrian Crossing Behavior Using Multi-objective Genetic Algorithms

Jingyuan Wu, Johannes Ruenz, Matthias Althoff

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citation

Abstract

Pedestrian motion prediction is a core challenge in assisted and automated driving. In this work, controlled Markov chains are used to predict pedestrian crossing behavior in urban environments with and without crosswalks. Intentions, such as crossing a road, are estimated by incorporating the probability of colliding with other traffic participants. We calibrate the model parameters on a public dataset by formulating the calibration as a multi-objective optimization problem and solving it with genetic algorithms. Rather than only minimizing the position deviation of the prediction, we also consider the classification performance for pedestrians' crossing intentions. The conducted evaluation shows the benefits of our approach: it achieves intention recognition performance comparable to that of a support vector machine while additionally providing accurate spatiotemporal predictions.
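The general idea of the abstract — calibrating model parameters against two objectives (position deviation and intention classification) with a genetic algorithm that keeps a Pareto front — can be sketched as follows. This is a minimal illustrative toy, not the paper's actual method: the two-parameter "model" (a crossing probability and a walking speed), the synthetic data, and all function names are assumptions made for the example.

```python
# Hypothetical sketch: a simple Pareto-based multi-objective GA calibrating
# two free parameters of a toy pedestrian-crossing model. Objectives:
# (1) mean position error, (2) intention misclassification rate.
# All parameter names and data are illustrative, not from the paper.
import random

random.seed(0)

# Toy "ground truth" used to generate synthetic observations:
TRUE_P_CROSS, TRUE_SPEED = 0.7, 1.4  # crossing probability, speed (m/s)

def simulate(p_cross, speed, n=50):
    """Simulate n pedestrians: (crossed?, distance covered in 1 s)."""
    rng = random.Random(42)  # fixed seed so objectives are deterministic
    data = []
    for _ in range(n):
        crossed = rng.random() < p_cross
        dist = speed * (1.0 if crossed else 0.2) + rng.gauss(0, 0.05)
        data.append((crossed, dist))
    return data

OBS = simulate(TRUE_P_CROSS, TRUE_SPEED)

def objectives(params):
    """Return (position error, intention error) for a candidate."""
    p_cross, speed = params
    pred = simulate(p_cross, speed)
    pos_err = sum(abs(d - od) for (_, d), (_, od) in zip(pred, OBS)) / len(OBS)
    int_err = sum(c != oc for (c, _), (oc, _) in zip(pred, OBS)) / len(OBS)
    return pos_err, int_err

def dominates(a, b):
    """True if objective vector a is at least as good everywhere, better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def evolve(pop_size=40, gens=30):
    pop = [(random.random(), random.uniform(0.5, 2.5)) for _ in range(pop_size)]
    for _ in range(gens):
        scored = [(p, objectives(p)) for p in pop]
        # Pareto front: candidates not dominated by any other candidate
        front = [p for p, f in scored
                 if not any(dominates(g, f) for _, g in scored)]
        # Refill the population by mutating random Pareto-front parents
        pop = front + [
            (min(1.0, max(0.0, random.choice(front)[0] + random.gauss(0, 0.1))),
             max(0.1, random.choice(front)[1] + random.gauss(0, 0.1)))
            for _ in range(pop_size - len(front))
        ]
    return front

front = evolve()
# Pick one compromise solution from the front (equal weighting, for display)
best = min(front, key=lambda p: sum(objectives(p)))
print("calibrated (p_cross, speed):", best)
```

The key design point mirrored from the abstract is that no single scalar loss is optimized: the selection step keeps all non-dominated trade-offs between spatial accuracy and intention classification, and a compromise is chosen only afterwards. Production-grade variants would typically use an established algorithm such as NSGA-II instead of this bare Pareto filter.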

Original language: English
Title of host publication: 2019 IEEE Intelligent Transportation Systems Conference, ITSC 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1032-1038
Number of pages: 7
ISBN (Electronic): 9781538670248
State: Published - Oct 2019
Event: 2019 IEEE Intelligent Transportation Systems Conference, ITSC 2019 - Auckland, New Zealand
Duration: 27 Oct 2019 - 30 Oct 2019

Publication series

Name: 2019 IEEE Intelligent Transportation Systems Conference, ITSC 2019

Conference

Conference: 2019 IEEE Intelligent Transportation Systems Conference, ITSC 2019
Country/Territory: New Zealand
City: Auckland
Period: 27/10/19 - 30/10/19
