TY - GEN
T1 - Detecting Aware and Unaware Mind Wandering During Lecture Viewing
T2 - 26th International Conference on Multimodal Interaction, ICMI 2024
AU - Bühler, Babette
AU - Bozkir, Efe
AU - Deininger, Hannah
AU - Goldberg, Patricia
AU - Gerjets, Peter
AU - Trautwein, Ulrich
AU - Kasneci, Enkelejda
N1 - Publisher Copyright:
© 2024 Copyright held by the owner/author(s).
PY - 2024/11/4
Y1 - 2024/11/4
N2 - Learners often experience aware and unaware mind wandering during educational tasks, both negatively impacting learning outcomes. Differentiating these types of task-unrelated thoughts is crucial, as they stem from different cognitive processes and warrant tailored support that addresses the specific nature of mind wandering. Automated detection of these episodes could help mitigate their adverse effects, for example, by developing adaptive, attention-aware learning environments. In this study (N = 87), we explored a novel multimodal approach, combining eye tracking, facial videos, and physiological wristbands (i.e., electrodermal activity and heart rate), to predict aware and unaware mind wandering during lecture video watching. In addition, to allow comparison to previous research, we also predicted an integrated mind-wandering category. Mind wandering was assessed using 15 two-stage thought probes to determine task-unrelated thoughts and the participants’ awareness of their mind wandering. Our findings indicate that a multimodal approach outperforms unimodal methods, utilizing the top 100 features from the fused data. Specifically, aware mind wandering was detected at 20% above chance (AUC-PR = 0.396), unaware mind wandering at 14% above chance (AUC-PR = 0.267), and the combined category at 40% above chance (AUC-PR = 0.637). Eye tracking and video features proved more predictive than physiological measures when used as standalone modalities. SHAP analysis, employed to explain the results, highlighted the significance of integrating features from all three modalities for effective detection, particularly emphasizing the role of video-based facial expressions in identifying unaware mind wandering. Going beyond the current state of the art, this study demonstrates the potential of leveraging multimodal data to enhance the precision of aware and unaware mind-wandering detection and differentiation, setting a foundation for advancing educational technologies that respond dynamically to learners’ cognitive states.
AB - Learners often experience aware and unaware mind wandering during educational tasks, both negatively impacting learning outcomes. Differentiating these types of task-unrelated thoughts is crucial, as they stem from different cognitive processes and warrant tailored support that addresses the specific nature of mind wandering. Automated detection of these episodes could help mitigate their adverse effects, for example, by developing adaptive, attention-aware learning environments. In this study (N = 87), we explored a novel multimodal approach, combining eye tracking, facial videos, and physiological wristbands (i.e., electrodermal activity and heart rate), to predict aware and unaware mind wandering during lecture video watching. In addition, to allow comparison to previous research, we also predicted an integrated mind-wandering category. Mind wandering was assessed using 15 two-stage thought probes to determine task-unrelated thoughts and the participants’ awareness of their mind wandering. Our findings indicate that a multimodal approach outperforms unimodal methods, utilizing the top 100 features from the fused data. Specifically, aware mind wandering was detected at 20% above chance (AUC-PR = 0.396), unaware mind wandering at 14% above chance (AUC-PR = 0.267), and the combined category at 40% above chance (AUC-PR = 0.637). Eye tracking and video features proved more predictive than physiological measures when used as standalone modalities. SHAP analysis, employed to explain the results, highlighted the significance of integrating features from all three modalities for effective detection, particularly emphasizing the role of video-based facial expressions in identifying unaware mind wandering. Going beyond the current state of the art, this study demonstrates the potential of leveraging multimodal data to enhance the precision of aware and unaware mind-wandering detection and differentiation, setting a foundation for advancing educational technologies that respond dynamically to learners’ cognitive states.
KW - Education
KW - Meta-Awareness
KW - Mind Wandering Detection
KW - Task-Unrelated Thought
UR - http://www.scopus.com/inward/record.url?scp=85212589030&partnerID=8YFLogxK
U2 - 10.1145/3678957.3685710
DO - 10.1145/3678957.3685710
M3 - Conference contribution
AN - SCOPUS:85212589030
T3 - ACM International Conference Proceeding Series
SP - 244
EP - 253
BT - ICMI 2024 - Proceedings of the 26th International Conference on Multimodal Interaction
PB - Association for Computing Machinery
Y2 - 4 November 2024 through 8 November 2024
ER -