TY - JOUR
T1 - Brain activation patterns in normal hearing adults
T2 - An fNIRS study using an adapted clinical speech comprehension task
AU - Bálint, András
AU - Wimmer, Wilhelm
AU - Caversaccio, Marco
AU - Rummel, Christian
AU - Weder, Stefan
N1 - Publisher Copyright:
© 2024
PY - 2025/1
Y1 - 2025/1
AB - Objectives: Understanding brain processing of auditory and visual speech is essential for advancing speech perception research and improving clinical interventions for individuals with hearing impairment. Functional near-infrared spectroscopy (fNIRS) is considered highly suitable for measuring brain activity during language tasks. However, accurate data interpretation also requires validated stimuli and behavioral measures. Design: Twenty-six adults with normal hearing listened to sentences from the Oldenburg Sentence Test (OLSA), and brain activation in the temporal, occipital, and prefrontal areas was measured by fNIRS. The sentences were presented in one of four modalities: speech-in-quiet, speech-in-noise, audiovisual speech, or visual speech (i.e., lipreading). To support the interpretation of our fNIRS data and to obtain a more comprehensive understanding of the study population, we performed hearing tests (pure-tone and speech audiometry) and collected behavioral data using validated questionnaires, in-task comprehension questions, and listening effort ratings. Results: In the auditory conditions (i.e., speech-in-quiet and speech-in-noise), we observed cortical activity bilaterally in the temporal regions. During the visual speech condition, we measured significant activation in the occipital area. Following the audiovisual condition, cortical activation was observed in both regions. Furthermore, we established a baseline for how individuals with normal hearing process visual cues during lipreading, and we found higher activity in the prefrontal cortex in noise conditions than in quiet conditions, linked to higher listening effort. Conclusions: We demonstrated the applicability of a clinically inspired audiovisual speech comprehension task in participants with normal hearing. The measured brain activation patterns were supported and complemented by objective and behavioral parameters.
KW - Audiovisual speech
KW - Functional near-infrared spectroscopy
KW - Normal hearing
KW - Speech understanding
UR - http://www.scopus.com/inward/record.url?scp=85210621394&partnerID=8YFLogxK
U2 - 10.1016/j.heares.2024.109155
DO - 10.1016/j.heares.2024.109155
M3 - Article
AN - SCOPUS:85210621394
SN - 0378-5955
VL - 455
JO - Hearing Research
JF - Hearing Research
M1 - 109155
ER -