A rapid, objective and implicit measure ...
Document type :
Journal article
Permalink :
Title :
A rapid, objective and implicit measure of visual quantity discrimination
Author(s) :
Guillaume, Mathieu [Author]
Mejias, Sandrine [Author]
Laboratoire Sciences Cognitives et Sciences Affectives (SCALab) - UMR 9193
Rossion, Bruno [Author]
Service de neurologie [CHRU Nancy]
Dzhelyova, Milena [Author]
Université Catholique de Louvain [UCL]
Schiltz, Christine [Author]
Journal title :
Neuropsychologia
Abbreviated title :
Neuropsychologia
Pages :
180-189
Publisher :
Elsevier BV
Publication date :
2018-03
ISSN :
0028-3932
HAL domain(s) :
Sciences cognitives
English abstract : [en]
There is evidence that accurate and rapid judgments of visual quantities form an essential component of human mathematical ability. However, explicit behavioural discrimination measures of visual quantities are readily contaminated both by variations in low-level physical parameters and higher order cognitive factors, while implicit measures often lack objectivity and sensitivity at the individual participant level. Here, with electrophysiological frequency tagging, we show discrimination differences between briefly presented visual quantities as low as a ratio of 1.4 (i.e., 14 vs. 10 elements). From this threshold, the neural discrimination response increases with parametrically increasing differences in ratio between visual quantities. Inter-individual variability in magnitude of the EEG response at this population threshold ratio predicts behavioural performance at an independent number comparison task. Overall, these findings indicate that visual quantities are perceptually discriminated automatically and rapidly (i.e., at a glance) within the occipital cortex. Given its high sensitivity, this paradigm could provide an implicit diagnostic neural marker of this process suitable for a wide range of fundamental and clinical applications.
Language :
English
Audience :
Not specified
Popular science :
No
Administrative institution(s) :
Université de Lille
CNRS
CHU Lille
Submission date :
2020-10-02T14:16:42Z
2020-10-15T08:39:57Z
2020-10-15T08:42:43Z
2021-04-14T12:14:38Z
2024-03-06T09:01:27Z
2024-03-06T10:43:53Z
2024-03-06T16:38:00Z
Files
- preprint Guillaume et al.pdf
- Submitted version (preprint)
- Open access
- Access the document