Brain responses and looking behavior during audiovisual speech integration in infants predict auditory speech comprehension in the second year of life
Kushnerenko, E., Tomalski, P., Ballieux, H., Potton, A., Birtles, D., Frostick, C. and Moore, D. 2013. Brain responses and looking behavior during audiovisual speech integration in infants predict auditory speech comprehension in the second year of life. Frontiers in Psychology. 4 (432).
|Authors||Kushnerenko, E., Tomalski, P., Ballieux, H., Potton, A., Birtles, D., Frostick, C. and Moore, D.|
The use of visual cues during the processing of audiovisual (AV) speech is known to be less efficient in children and adults with language difficulties, and such difficulties are known to be more prevalent in children from low-income populations. In the present study, we followed an economically diverse group of thirty-seven infants longitudinally from 6–9 months to 14–16 months of age. We used eye-tracking to examine whether individual differences in visual attention during AV speech processing in 6- to 9-month-old infants, particularly when processing congruent and incongruent auditory and visual speech cues, might be indicative of their later language development. Twenty-two of these 6- to 9-month-old infants also participated in an event-related potential (ERP) AV task within the same experimental session. Language development was then followed up at the age of 14–16 months, using two measures: the Preschool Language Scale and the Oxford Communicative Development Inventory. The results show that those infants who were less efficient in auditory speech processing at the age of 6–9 months had lower receptive language scores at 14–16 months. A correlational analysis revealed that the pattern of face scanning and ERP responses to audiovisually incongruent stimuli at 6–9 months were both significantly associated with language development at 14–16 months. These findings add to the understanding of individual differences in neural signatures of AV processing and associated looking behavior in infants.
|Keywords||audiovisual speech integration; ERPs; eye-tracking|
|Journal||Frontiers in Psychology|
|Journal citation||4 (432)|
Kushnerenko_Brain responses and looking behavior during audiovisual speech integration in infants predict auditory speech comprehension in the second year of life.pdf
|Publication process dates|
|Published||16 Jul 2013|
|Deposited||28 Oct 2013|
|Copyright information||This article was submitted to Frontiers in Language Sciences, a specialty section of Frontiers in Psychology. Copyright © 2013 Kushnerenko, Tomalski, Ballieux, Potton, Birtles, Frostick and Moore. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in other forums, provided the original authors and source are credited, subject to any copyright notices concerning any third-party graphics etc.|