Abstract
Humans can rapidly extract information from faces even under challenging viewing conditions, yet the neural representations supporting this ability remain poorly understood. Here, we manipulated the presentation duration of backward-masked facial expressions and used magnetoencephalography (MEG) to investigate the computations underpinning rapid face processing. Multivariate analyses revealed two stages in face perception, with the ventral visual stream encoding facial features prior to facial configuration. When presentation time was reduced, the emergence of sustained featural and configural representations was delayed. Importantly, these representations explained behaviour during an expression recognition task. Together, these results characterize an adaptable system linking visual features, brain activity and behaviour during face perception.