Abstract
Categorization refers to the process of mapping continuous sensory inputs onto discrete concepts. Humans, nonhuman primates, and rodents can readily learn arbitrary categories defined by low-level visual features such as hue and orientation, and behavioral studies indicate that such learning distorts perceptual sensitivity for category-defining features: discrimination of physically similar exemplars from different categories is enhanced, while discrimination of equally similar exemplars from the same category is reduced. These distortions may result from systematic biases in neural representations of discriminanda that begin at the earliest stages of visual processing. We tested this hypothesis in two experiments in which human observers learned to classify a set of oriented stimuli into two discrete groups. After behavioral training, we used multivoxel pattern analysis and an inverted encoding model to visualize and quantify population-level neural representations of stimulus orientation from noninvasive measurements of human brain activity (fMRI and EEG) in early retinotopic visual cortical areas. These analyses revealed that, during category discrimination, neural representations of oriented stimuli in early visual areas were systematically biased toward the center of the appropriate category. These representational shifts were strongest for orientations near the category boundary, predicted participants' overt category judgments, and emerged rapidly after stimulus onset. Collectively, these results suggest that category information can bias processing at very early stages of the visual hierarchy.
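The inverted encoding model mentioned above can be illustrated with a minimal numerical sketch. The idea is to (1) model each measurement unit (e.g., voxel) as a weighted sum of idealized orientation-tuned channels, (2) estimate those weights from training data by least squares, and (3) invert the fitted model on held-out data to reconstruct a population-level channel response profile whose peak tracks the represented orientation. All specifics below (8 channels, 50 voxels, rectified-cosine tuning, noise levels) are illustrative assumptions, not the parameters used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (illustrative numbers, not the study's):
# 8 orientation channels tiling 0-180 deg, 50 simulated voxels.
n_chan, n_vox, n_train = 8, 50, 200
centers = np.arange(n_chan) * 180.0 / n_chan      # 0, 22.5, ..., 157.5 deg

def basis(oris):
    """Idealized channel tuning curves: rectified cosines, 180-deg periodic."""
    d = (np.asarray(oris, float)[:, None] - centers[None, :] + 90) % 180 - 90
    return np.cos(np.pi * d / 180) ** 6

# Simulate training data: voxel responses = channel responses x weights + noise.
W_true = rng.random((n_chan, n_vox))              # each voxel mixes channels
train_oris = rng.uniform(0, 180, n_train)
C_train = basis(train_oris)                       # trials x channels
B_train = C_train @ W_true + rng.normal(0, 0.05, (n_train, n_vox))

# Step 1 (encoding): estimate channel-to-voxel weights by least squares.
W_hat, *_ = np.linalg.lstsq(C_train, B_train, rcond=None)

# Step 2 (inversion): recover channel responses for a held-out trial.
test_ori = 45.0
B_test = basis([test_ori]) @ W_true + rng.normal(0, 0.05, (1, n_vox))
C_hat, *_ = np.linalg.lstsq(W_hat.T, B_test.T, rcond=None)
recon = C_hat.ravel()                             # reconstructed channel profile

print("reconstruction peaks at channel centered on",
      centers[np.argmax(recon)], "deg")
```

In an analysis like the one the abstract describes, a systematic category bias would appear as the peak (or circular mean) of `recon` being displaced from the true stimulus orientation toward the center of the stimulus's learned category.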