High-frequency neural activity predicts word parsing in ambiguous speech streams
Document type :
Journal article
DOI :
PMID :
Permalink :
Title :
High-frequency neural activity predicts word parsing in ambiguous speech streams
Author(s) :
Kösem, Anne [Author]
Max Planck Institute for Psycholinguistics
Radboud University [Nijmegen]
Neuroimagerie cognitive - Psychologie cognitive expérimentale [UNICOG-U992]
Basirat, Anahita [Author]
Laboratoire Sciences Cognitives et Sciences Affectives - UMR 9193 [SCALab]
Neuroimagerie cognitive - Psychologie cognitive expérimentale [UNICOG-U992]
Azizi, Leila [Author]
Neuroimagerie cognitive - Psychologie cognitive expérimentale [UNICOG-U992]
van Wassenhove, Virginie [Author]
Neuroimagerie cognitive - Psychologie cognitive expérimentale [UNICOG-U992]
Journal title :
Journal of Neurophysiology
Abbreviated title :
J. Neurophysiol.
Volume number :
116
Pages :
2497-2512
Publication date :
2016-12-01
ISSN :
1522-1598
HAL domain(s) :
Cognitive Sciences
English abstract : [en]
During speech listening, the brain parses a continuous acoustic stream of information into computational units (e.g., syllables or words) necessary for speech comprehension. Recent neuroscientific hypotheses have proposed that neural oscillations contribute to speech parsing, but whether they do so on the basis of acoustic cues (bottom-up acoustic parsing) or as a function of available linguistic representations (top-down linguistic parsing) is unknown. In this magnetoencephalography study, we contrasted acoustic and linguistic parsing using bistable speech sequences. While listening to the speech sequences, participants were asked to maintain one of the two possible speech percepts through volitional control. We predicted that the tracking of speech dynamics by neural oscillations would not only follow the acoustic properties but also shift in time according to the participant's conscious speech percept. Our results show that the latency of high-frequency activity (specifically, beta and gamma bands) varied as a function of the perceptual report. In contrast, the phase of low-frequency oscillations was not strongly affected by top-down control. Whereas changes in low-frequency neural oscillations were compatible with the encoding of prelexical segmentation cues, high-frequency activity specifically informed on an individual's conscious speech percept.
Language :
English
Audience :
Not specified
Administrative institution(s) :
Université de Lille
CNRS
CHU Lille
Research team(s) :
Équipe Langage
Submission date :
2019-02-13T14:48:19Z
2020-02-11T16:34:13Z
2021-04-09T07:50:25Z