Abstract
In this hypothesis paper we argue that a simple Hebbian learning mechanism, together with reinforcement through well-known neuromodulatory systems, can form the basis of a computational theory of learning that supports both low-level learning and the development of human-level intelligence. We show that, when driven by example behavior, Hebbian learning rules can support procedural, episodic, and semantic memory. For humans, we hypothesize that the ability to manipulate an off-line world model and the ability to abstract using language allow for the generation and the communication of rich example behavior, respectively, and thereby support human learning and a gradual increase in collective human intelligence across generations. We also compare the properties of Example-Based Hebbian (EBH) learning with those of backpropagation-based learning and argue that the EBH mechanism is more consistent with observed characteristics of human learning.