Simple neural models of classical conditioning

Biol Cybern. 1986;55(2-3):187-200. doi: 10.1007/BF00341933.

Abstract

A systematic study of the necessary and sufficient ingredients of a successful model of classical conditioning is presented. Models are constructed along the lines proposed by Gelperin, Hopfield, and Tank, who showed that many conditioning phenomena could be reproduced in a model using non-trivial distributed representations of the sensory stimuli. The additional phenomena of extinction and blocking are found to be obtainable by generalizing the Hebbian learning algorithm, rather than by additional complications in the hardware. The most successful algorithms have a minimal number of adjustable parameters, and require only local-time information about the level of postsynaptic activity. The proper behavior of these algorithms is verified both by simple analytic arguments and by direct numerical simulation. Certain detailed assumptions concerning the distributed sensory representations are also found to have a surprising degree of importance.
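The paper's own generalized Hebbian rules are not reproduced in this abstract, so the following is only a minimal sketch of the general setting it describes: distributed stimulus representations feeding a single response unit, with a learning rule that uses only locally available pre- and postsynaptic quantities. A Rescorla-Wagner-style delta rule is used here purely as an illustrative stand-in; the stimulus patterns, parameter values, and function names (`cs1`, `train`, `eta`) are assumptions, not taken from the paper.

```python
# Illustrative sketch (not the paper's algorithm): acquisition, extinction,
# and blocking with distributed stimulus vectors and a delta-rule update.
import numpy as np

rng = np.random.default_rng(0)
N = 50            # number of sensory units in the distributed representation
eta = 0.1         # learning rate (assumed value)
p_active = 0.2    # sparseness of each stimulus pattern (assumed value)

# Distributed, possibly overlapping representations of two conditioned stimuli.
cs1 = (rng.random(N) < p_active).astype(float)
cs2 = (rng.random(N) < p_active).astype(float)

w = np.zeros(N)   # synaptic weights onto the response unit


def response(x):
    """Postsynaptic activity of the response unit (simple linear sum)."""
    return w @ x


def train(x, us, trials):
    """Delta-rule update: the weight change depends only on the local
    presynaptic activity x and the mismatch between the reinforcement
    signal (US) and the current postsynaptic response."""
    global w
    for _ in range(trials):
        w += eta * x * (us - response(x))


# Acquisition: pair CS1 with the US; the conditioned response grows.
train(cs1, us=1.0, trials=50)
print("after acquisition, response to CS1:", round(response(cs1), 3))

# Extinction: present CS1 alone (no US); the conditioned response decays.
train(cs1, us=0.0, trials=50)
print("after extinction, response to CS1:", round(response(cs1), 3))

# Blocking: first train CS1 -> US, then pair the compound CS1+CS2 with the US.
w[:] = 0.0
train(cs1, us=1.0, trials=50)
train(cs1 + cs2, us=1.0, trials=50)
print("after compound training, response to CS2 alone:",
      round(response(cs2), 3))   # remains weak: CS1 already predicts the US
```

Because the stimulus vectors overlap, the exact numbers depend on the random patterns, but qualitatively the response to CS2 after compound training stays far below the fully conditioned level, which is the blocking effect the abstract refers to.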

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms
  • Animals
  • Conditioning, Classical*
  • Learning
  • Mathematics
  • Models, Neurological*
  • Models, Psychological
  • Neurons / physiology*