Detecting Aware and Unaware Mind Wandering During Lecture Viewing: A Multimodal Machine Learning Approach Using Eye Tracking, Facial Videos and Physiological Data

Babette Bühler, Efe Bozkir, Hannah Deininger, Patricia Goldberg, Peter Gerjets, Ulrich Trautwein, Enkelejda Kasneci

Publication: Contribution to book/report/conference proceedings › Conference contribution › Peer-reviewed

Abstract

Learners often experience aware and unaware mind wandering during educational tasks, both of which negatively impact learning outcomes. Differentiating these types of task-unrelated thoughts is crucial, as they stem from different cognitive processes and warrant tailored support that addresses the specific nature of mind wandering. Automated detection of these episodes could help mitigate their adverse effects, for example, by enabling adaptive, attention-aware learning environments. In this study (N = 87), we explored a novel multimodal approach, combining eye tracking, facial videos, and physiological wristbands (i.e., electrodermal activity and heart rate), to predict aware and unaware mind wandering during lecture video watching. In addition, to allow comparison to previous research, we also predicted an integrated mind-wandering category. Mind wandering was assessed using 15 two-stage thought probes to determine task-unrelated thoughts and the participants’ awareness of their mind wandering. Our findings indicate that a multimodal approach utilizing the top 100 features from the fused data outperforms unimodal methods. Specifically, aware mind wandering was detected at 20% above chance (AUC-PR = 0.396), unaware mind wandering at 14% above chance (AUC-PR = 0.267), and the combined category at 40% above chance (AUC-PR = 0.637). Eye-tracking and video features proved more predictive than physiological measures when used as standalone modalities. SHAP analysis, employed to explain the results, highlighted the significance of integrating features from all three modalities for effective detection, particularly emphasizing the role of video-based facial expressions in identifying unaware mind wandering. Going beyond the current state of the art, this study demonstrates the potential of leveraging multimodal data to enhance the precision of aware and unaware mind-wandering detection and differentiation, setting a foundation for advancing educational technologies that respond dynamically to learners’ cognitive states.
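The abstract reports performance as AUC-PR relative to the chance level, which for AUC-PR corresponds to the positive-class prevalence. The sketch below illustrates that evaluation logic under assumed choices: the classifier, the feature-selection criterion, and the cross-validation scheme are placeholders for illustration only and are not taken from the paper.

```python
# Minimal sketch, assuming: column-wise fusion of the three modalities, mutual-information
# top-k feature selection, a random-forest classifier, and stratified 5-fold CV.
# None of these specific choices are stated in the abstract.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.metrics import average_precision_score
from sklearn.model_selection import StratifiedKFold
from sklearn.pipeline import make_pipeline

def evaluate_fused_features(X_gaze, X_video, X_physio, y, k=100, seed=0):
    """Fuse modality features, keep the top k, and score AUC-PR against chance."""
    X = np.hstack([X_gaze, X_video, X_physio])          # late feature fusion
    model = make_pipeline(
        SelectKBest(mutual_info_classif, k=min(k, X.shape[1])),
        RandomForestClassifier(n_estimators=300, random_state=seed),
    )
    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=seed)
    fold_scores = []
    for train_idx, test_idx in cv.split(X, y):
        model.fit(X[train_idx], y[train_idx])
        proba = model.predict_proba(X[test_idx])[:, 1]
        fold_scores.append(average_precision_score(y[test_idx], proba))
    auc_pr = float(np.mean(fold_scores))
    chance = float(np.mean(y))                           # AUC-PR of a random classifier
    return auc_pr, chance, auc_pr - chance               # margin "above chance"
```

With this framing, the reported results correspond to margins of roughly 0.20 (aware), 0.14 (unaware), and 0.40 (combined) over the respective class prevalences; a probe-level, participant-independent split (e.g., grouping folds by participant) would be the stricter evaluation choice, but the abstract does not specify the scheme used.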

Original language: English
Title: ICMI 2024 - Proceedings of the 26th International Conference on Multimodal Interaction
Publisher: Association for Computing Machinery
Pages: 244-253
Number of pages: 10
ISBN (electronic): 9798400704628
DOIs
Publication status: Published - 4 Nov 2024
Event: 26th International Conference on Multimodal Interaction, ICMI 2024 - San Jose, Costa Rica
Duration: 4 Nov 2024 - 8 Nov 2024

Publication series

Name: ACM International Conference Proceeding Series

Conference

Conference: 26th International Conference on Multimodal Interaction, ICMI 2024
Country/Territory: Costa Rica
City: San Jose
Period: 4/11/24 - 8/11/24
