Semantic processing of actions at 9 months is linked to language proficiency at 9 and 18 months

https://doi.org/10.1016/j.jecp.2016.02.003

Highlights

  • Infants at 9 months were tested for producing an N400 for action semantics.

  • Infants with higher language comprehension at 9 and 18 months showed an N400 at 9 months.

  • Infants with lower language comprehension did not evidence an N400 at 9 months.

  • The results suggest shared processing across both cognitive domains.

Abstract

The current study uses event-related potential (ERP) methodologies to investigate how social–cognitive processes in preverbal infants relate to language performance. We assessed 9-month-olds’ understanding of the semantic structure of actions via an N400 ERP response to action sequences that contained expected and unexpected outcomes. At 9 and 18 months of age, infants’ language abilities were measured using the Swedish Early Communicative Development Inventory (SECDI). Here we show that 9-month-olds’ understanding of the semantic structure of actions, evidenced in an N400 ERP response to action sequences with unexpected outcomes, is related to language comprehension scores at 9 months and to language production scores at 18 months of age. Infants who showed a selective N400 response to unexpected action outcomes were those classed as above the mean in language proficiency. The results provide evidence that language performance is related to the ability to detect and interpret human actions at 9 months of age. This study suggests that some basic cognitive mechanisms involved in the processing of sequential events are shared between two conceptually different cognitive domains and that pre-linguistic social understanding skills and language proficiency are linked to one another.

Introduction

Young infants show sophisticated abilities across an array of cognitive domains during early development (Mandler, 2006). One example is the ability to determine the outcomes of actions, which is linked to the capacity to process other people’s goals and intentions (Baldwin, Baird, Saylor, & Clark, 2001). Human action can be conceptually divided into two primary forms. On the one hand, actions can be communicative in nature, designed to convey information directly to others. Within this framework, infants’ preverbal social–communicative capacities, such as the ability to follow others’ eye gaze (e.g., Brooks and Meltzoff, 2005, Brooks and Meltzoff, 2008, Brooks and Meltzoff, 2015) and to process pointing (e.g., Brooks and Meltzoff, 2008, Butterworth and Morissette, 1996, Tomasello et al., 2007) and gestures (e.g., Kraljević et al., 2014, Rowe and Goldin-Meadow, 2009a, Rowe and Goldin-Meadow, 2009b), are well studied and have been related to various aspects of later language abilities.

A second form of actions incorporates all of those actions that are non-communicative in nature but still convey goal directedness. Such goal-directed actions can be observed repeatedly by infants in typical environments during early development, ranging from parents cleaning the home to the preparation and consumption of food. In such scenarios, adults as well as young children readily construct action representations that are organized with respect to this ultimate goal (Baldwin et al., 2001, Zacks et al., 2001). For example, infants from 6 months of age can accurately predict that a cup (Hunnius & Bekkering, 2010) or a spoon (Kochukhova and Gredebäck, 2010, Reid et al., 2009) should go toward a person’s mouth rather than toward a person’s ear. These representations allow for the prediction of the consequences of actions, including the ability to interpret and describe actions and to categorize action sequences (Sommerville & Woodward, 2005). Prior research has shown that the structure of actions parallels that of linguistic utterances and that both actions and language show comparable hierarchical structures (Baldwin et al., 2001, Zacks et al., 2001). It has been argued that human language emerged from the hierarchical structure of instrumental actions: the same neural circuits that control the hierarchy of these instrumental actions served as the basis on which the newly acquired function of language syntax emerged (Gallese, 2007). The parallels between the semantic organization of non-communicative actions and the semantics within linguistic structures raise the possibility that processing of this action type during early development may be related to language. The context present within the unfolding of sentences and actions conveys information that facilitates the prediction of future events. Situational knowledge thereby provides infants with a mechanism to use semantic rules during action observation (Ni Choisdealbha & Reid, 2014).
Among other cognitive and social advantages, this enables infants to reenact the final goal of a modeled action (Gergely, Bekkering, & Kiraly, 2002) and to infer the goal of an uncompleted action without seeing the goal achieved (Daum, Prinz, & Aschersleben, 2008). Given these parallels in the organization of non-communicative actions and the structure of language, it has been conjectured that the ability to process hierarchically structured actions during early development may pave the way for language acquisition (Reid et al., 2009). From an evolutionary perspective, it has been suggested that language has its origin in the ability to interpret others’ gestures and actions (e.g., Corballis, 2003, Rizzolatti and Arbib, 1998). Together, these findings indicate that, beyond the well-studied relations between infants’ preverbal social–communicative capacities and language, understanding of non-communicative action may also be linked to the development of language or may share similar domain-general processes. To date, there is no empirical evidence that such links exist during early development. Consequently, the current study investigated the relation between the semantic processing of non-communicative instrumental actions during infancy (that is, the ability to detect and interpret others’ action end states as either expected or unexpected) and language abilities during the first and second postnatal years.

In language research, the N400 component of the event-related potential (ERP) has been identified as a neural signature related to the formation of a semantic representation because the N400 is elicited when a word does not fit an expected context (e.g., Kutas & Hillyard, 1980). In the action domain, N400 effects are observable in infants and adults when action outcomes are violated (e.g., Parise and Csibra, 2012, Reid and Striano, 2008, Reid et al., 2009, Wu and Coulson, 2005). In adults, the N400 has also been found to be sensitive to the relation between gesture and speech (Holle and Gunter, 2007, Kelly et al., 2007). The similarity in the electrophysiological responses to semantic violations in the action and language domains suggests that language processing may derive from understanding action during early development or, to some extent, share similar cognitive mechanisms. Should infants first come to understand the parameters of semantic structures within the action domain, this capacity may well bootstrap learning about language. Pre-linguistic semantic processing of action sequences involves hierarchically aligned structures similar to those of sentences. As such, the capacity to semantically process action may be related to language capacities.

A key question that has to date remained unanswered is whether language capacities are based on structures initially detected and interpreted within the action domain. For this purpose, we presented 28 9-month-old infants with a sequence of images with expected and unexpected action outcomes known to reliably induce an N400-like ERP component over parietal regions (Reid et al., 2009). In addition to the N400 effect, Reid and colleagues (2009) found a negative central (Nc) component, which was larger in amplitude for the expected condition over frontal and central areas when contrasted with the unexpected condition. The Nc is thought to reflect attentional processes, with greater amplitude for stimuli that elicit a higher allocation of attentional resources (Reynolds & Richards, 2005), and is typically found over frontal and central sites, peaking at around 300 to 700 ms after stimulus onset (Webb, Long, & Nelson, 2005). Previously, the Nc has been found to be enhanced for familiar stimuli, for instance, familiar versus novel faces (e.g., deHaan & Nelson, 1997). Reid and colleagues (2009) argued that their observed Nc effect was driven by infants’ higher interest in stimuli depicting food consumption. An alternative, and equally plausible, explanation offered by the authors is that 9-month-olds are capable of judging where food should be placed when people hold and direct food toward the head area because these are actions that infants are more likely to be exposed to in everyday life. A familiarity effect for the expected action conclusion could therefore drive the observed Nc effect. Our study employed the same stimuli and age group as Reid and colleagues’ study. We therefore expected to replicate the morphology of the Nc and N400 reported in that work.

To investigate expected and unexpected goal-directed action processing at 9 months of age and infants’ language abilities, we assessed language skills at 9 months and again when the same children reached 18 months of age using two forms of the Swedish Early Communicative Development Inventory (SECDI; Eriksson, Westerlund, & Berglund, 2002). To assess infants’ action processing abilities and relate these to language proficiency, percentile scores from the SECDI words and gestures (w&g) at 9 months and the SECDI words and sentences (w&s) at 18 months were used to split our sample at the 50th percentile (for a similar paradigm, see Torkildsen et al., 2009). We then examined infants’ semantic action processing abilities at 9 months of age (via the obtained ERP data) separately for each of the two language proficiency groups at each time point (here termed low and high language comprehension at 9 months and low and high language production at 18 months). We hypothesized that if the rules relating to the structure of action were also used when processing language structures, only the group of infants with above mean performance in language comprehension and production at the two language assessments would display evidence for semantic processing within the action domain, as indexed by the N400 ERP component.
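The median-split grouping described above can be sketched in a few lines. This is a minimal illustration only: the percentile scores below are invented for the example, not the study’s data, and the assignment of infants scoring exactly at the 50th percentile to the low group is an assumption (the paper does not specify the tie-breaking rule).

```python
import numpy as np

# Hypothetical SECDI percentile scores, one per infant (illustration only).
secdi_percentiles = np.array([12, 35, 48, 50, 55, 62, 71, 80, 91, 97])

# Split the sample at the 50th percentile: infants at or below the cutoff
# form the "low" proficiency group, those above form the "high" group.
# (Tie handling at exactly 50 is an assumption here.)
low_group = np.where(secdi_percentiles <= 50)[0]
high_group = np.where(secdi_percentiles > 50)[0]

print(low_group, high_group)
```

The same split would be computed once per instrument (SECDI w&g at 9 months, SECDI w&s at 18 months), so an infant can fall into different groups at the two time points.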

Participants

The final sample comprised 28 9-month-old monolingual Swedish infants (14 female) with a mean age of 277 days (range = 262–287). When split at the 50th percentile on the SECDI at 18 months of age, the low language production group (n = 14, 6 female) had an age range at 9 months of age between 263 and 284 days (M = 272) and the high language production group (n = 14, 8 female) had an age range between 262 and 287 days (M = 276). The average age during the follow-up language production assessment at 18 months

Nc ERP component

Visual inspection of the Nc effect led to the identification of 10 channels representing the left central scalp (29, 30-C1, 35, 36-C3, 37, 40, 41-C5, 42, 46, and 47) and 10 channels representing the right central scalp (87, 93, 98, 102, 103-C6, 104-C4, 105-C2, 109, 110, and 111), which were in line with topographical locations for the Nc present in prior infant research (e.g., Parise, Reid, Stets, & Striano, 2008). A time window was chosen from 350 to 600 ms post-stimulus presentation in order
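The dependent measure implied by this analysis, the mean amplitude over a channel cluster within the 350–600 ms window, can be sketched as follows. The sampling rate, epoch length, and amplitude values below are invented placeholders; only the window boundaries come from the text.

```python
import numpy as np

# Toy ERP epoch: 10 central-cluster channels x 200 samples, with a
# hypothetical 250 Hz sampling rate and time zero at stimulus onset.
# Amplitudes (microvolts) are random placeholders for illustration.
sfreq = 250.0
rng = np.random.default_rng(0)
erp = rng.normal(0.0, 1.0, size=(10, 200))

# Convert the 350-600 ms analysis window into sample indices.
start = int(0.350 * sfreq)
stop = int(0.600 * sfreq)

# Mean amplitude across the window and across the channel cluster is the
# usual summary statistic entered into the Nc condition comparison.
nc_amplitude = erp[:, start:stop].mean()
print(nc_amplitude)
```

In practice this value would be computed per infant and per condition (expected vs. unexpected outcome) before the statistical comparison.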

Discussion

The current study used event-related potential techniques to investigate how the understanding of non-communicative actions during early development may relate to language capacities later in life. Our results successfully replicated both the N400 and Nc components in related regions of the scalp and time windows as those reported in Reid and colleagues (2009), with the ERPs following the same morphology despite the use of a high-density electrode array, whereas the original study had less than

Conclusions

These results provide evidence for a developmental account that focuses on how language function emerges from pre-linguistic social understanding skills over the first postnatal year. Following these results, experiments can now be performed whereby the nature of this relation can be further examined, with the N400 being used as a primary tool to determine the associations between action and language processing during early development. This work highlights the parallels in the organization of

Acknowledgments

This work was supported by FP7 Marie Curie ITN “ACT” (Grant 289404). V.M.R. is a reader and G.W. is a professor in the International Centre for Language and Communicative Development (LuCiD) at Lancaster University. The support of the Economic and Social Research Council (ES/L008955/1) is gratefully acknowledged. We express our gratitude to all of the families who participated in this research. The authors declare no competing interests.

References (52)

  • K. Snyder et al.

    Theoretical and methodological implications of variability in infant brain response during a recognition memory paradigm

    Infant Behavior and Development

    (2002)
  • J. Sommerville et al.

    Pulling out the intentional structure of action: The relation between action processing and action production in infancy

    Cognition

    (2005)
  • J. Torkildsen et al.

    Brain dynamics of word familiarization in 20-month-olds: Effects of productive vocabulary size

    Brain and Language

    (2009)
  • J. Torkildsen et al.

    Semantic organization of basic-level words in 20-month-olds: An ERP study

    Journal of Neurolinguistics

    (2006)
  • D.A. Baldwin et al.

    Infants parse dynamic action

    Child Development

    (2001)
  • E. Berglund et al.

    Communicative development in Swedish children 16–28 months old: The Swedish Early Communicative Development Inventory—words and sentences

    Scandinavian Journal of Psychology

    (2000)
  • M.H. Bornstein et al.

    Stability of language in childhood: A multiage, multidomain, multimeasure, and multisource study

    Developmental Psychology

    (2012)
  • R. Brooks et al.

    The development of gaze following and its relation to language

    Developmental Science

    (2005)
  • R. Brooks et al.

    Infant gaze following and pointing predict accelerated vocabulary growth through two years of age: A longitudinal growth curve modeling study

    Journal of Child Language

    (2008)
  • G.E. Butterworth et al.

    Onset of pointing and the acquisition of language in infancy

    Journal of Reproductive and Infant Psychology

    (1996)
  • M.C. Corballis

    From hand to mouth: The gestural origins of language

  • M. Daum et al.

    Encoding the goal of an object directed but uncompleted reaching action in 6- and 9-month-old infants

    Developmental Science

    (2008)
  • M. deHaan et al.

    Recognition of the mother’s face by six-month-old infants: A neurobehavioral study

    Child Development

    (1997)
  • E. Domínguez-Martínez et al.

    The fixation distance to the stimulus influences ERP quality: An EEG and eye tracking N400 study

    PLoS ONE

    (2015)
  • M. Eriksson et al.

    A screening version of the Swedish Communicative Development Inventories designed for use with 18-month-old children

    Journal of Speech Language and Hearing Research

    (2002)
  • L. Fenson et al.

    Variability in early communicative development

    Monographs of the Society for Research in Child Development

    (1994)