Abstract
Our capacity to interact with our dynamic world in a timely manner (e.g., catching a ball) suggests that our brain generates predictions of unfolding external dynamics. While theories assume such neural predictions, empirical evidence typically captures only a snapshot or indirect consequence of prediction, and relies on simple static stimuli whose predictability is manipulated. As a result, the rich dynamics of predictive representations remain largely unexplored. We present a novel dynamic extension to representational similarity analysis (RSA) that uses temporally variable models to capture naturalistic dynamic stimuli, and we demonstrate both lagged and predictive neural representations in source-reconstructed MEG data. Interestingly, the predictive representations follow a hierarchical pattern: higher-level stimulus features are predicted further ahead in time, while lower-level features are predicted closer in time to the actual sensory input. This promising new approach opens the door to addressing important outstanding questions about how our brain represents and predicts our dynamic world.
Competing Interest Statement
The authors have declared no competing interests.