Semantics boosts syntax in artificial grammar learning tasks with recursion

J Exp Psychol Learn Mem Cogn. 2012 May;38(3):776-82. doi: 10.1037/a0026986. Epub 2012 Jan 23.

Abstract

Center-embedded recursion (CER) in natural language is exemplified by sentences such as "The malt that the rat ate lay in the house." Parsing center-embedded structures has attracted considerable attention because it could be one of the cognitive capacities that distinguish humans from all other animals. The ability to parse CER is usually tested by means of artificial grammar learning (AGL) tasks, in which participants have to infer the underlying rule from a set of artificial sentences. One surprising result of previous AGL experiments is that learning CER is not as easy as had been thought. We hypothesized that, because artificial sentences lack semantic content, semantics could help humans learn the syntax of center-embedded sentences. To test this, we composed sentences from 4 vocabularies that differed in degree of semantic content along 3 factors: familiarity, meaningfulness of the words, and the semantic relationship between words. According to our results, these factors have no effect individually, but combined they make learning significantly faster. This suggests that different mechanisms are at work when CER is parsed in natural and in artificial languages. This finding calls into question the suitability of AGL tasks with artificial vocabularies for studying the learning and processing of linguistic CER.
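For readers unfamiliar with the stimulus structure, the following is a minimal sketch (not the authors' actual materials) of how center-embedded strings are typically constructed in AGL tasks: words from category A are arbitrarily paired with words from category B, and the pairs are nested so that dependencies cross (A1 A2 B2 B1) rather than being adjacent (A1 B1 A2 B2). The vocabularies and the pairing below are hypothetical placeholders.

```python
import random

# Hypothetical artificial vocabularies; the study manipulated familiarity,
# meaningfulness, and semantic relatedness of the word pairs.
A_WORDS = ["bif", "dak", "lun"]
B_WORDS = ["pel", "tog", "mer"]

# Each A word is arbitrarily linked to one B word (the "dependency").
PAIRS = dict(zip(A_WORDS, B_WORDS))

def center_embedded(depth: int) -> str:
    """Build a string with `depth` nested A-B dependencies: A1 .. An Bn .. B1."""
    chosen = random.sample(A_WORDS, depth)
    # The B words appear in reverse order of their A partners,
    # producing crossed (nested) rather than adjacent dependencies.
    return " ".join(chosen + [PAIRS[a] for a in reversed(chosen)])

if __name__ == "__main__":
    random.seed(0)
    # Prints a two-level center-embedded string, e.g. of the form A1 A2 B2 B1.
    print(center_embedded(2))
```

In the semantic conditions of such experiments, the arbitrary A-B pairing is replaced by word pairs that are familiar, meaningful, or semantically related, which is the manipulation the abstract describes.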

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adult
  • Association Learning / physiology*
  • Comprehension
  • Female
  • Humans
  • Male
  • Psycholinguistics*
  • Random Allocation
  • Recognition, Psychology
  • Semantics*
  • Statistics, Nonparametric
  • Verbal Learning*
  • Vocabulary
  • Young Adult