PT - JOURNAL ARTICLE
AU - Roger Koenig-Robert
AU - Genevieve Quek
AU - Tijl Grootswagers
AU - Manuel Varlet
TI - Movement trajectories as a window into the dynamics of emerging neural representations
AID - 10.1101/2023.03.15.532848
DP - 2023 Jan 01
TA - bioRxiv
PG - 2023.03.15.532848
4099 - http://biorxiv.org/content/early/2023/03/16/2023.03.15.532848.short
4100 - http://biorxiv.org/content/early/2023/03/16/2023.03.15.532848.full
AB - Transforming sensory inputs into meaningful neural representations is critical to adaptive behaviour in everyday environments. While non-invasive neuroimaging methods are the de facto standard for investigating neural representations, they remain expensive, not widely available, and time-consuming, and they restrict the experimental conditions and participant populations they can be used with. Here we show that movement trajectories collected in online behavioural experiments can be used to measure the emergence and dynamics of neural representations with fine temporal resolution. By combining online computer mouse-tracking and publicly available neuroimaging (MEG and fMRI) data via Representational Similarity Analysis (RSA), we show that movement trajectories track the evolution of visual representations over time. We used a time-constrained face/object categorization task on a previously published set of images containing human faces, illusory faces and objects to demonstrate that time-resolved representational structures derived from movement trajectories correlate with those derived from MEG, revealing the unfolding of category representations in temporal detail comparable to MEG (albeit delayed). Furthermore, we show that movement-derived representational structures correlate with those derived from fMRI in most task-relevant brain areas, namely face- and object-selective areas in this proof of concept. Our results highlight the richness of movement trajectories and the power of the RSA framework to reveal and compare their information content, opening new avenues to better understand human perception.
Competing Interest Statement: The authors have declared no competing interest.
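
The abstract's core method, correlating time-resolved representational structures from mouse trajectories with those from neural data, can be illustrated with a minimal RSA sketch. This is not the authors' pipeline: the data shapes, the use of pairwise Euclidean distances over cursor positions to build trajectory RDMs, and the Spearman rank correlation between RDMs are all assumptions chosen as common RSA defaults.

import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

# Hypothetical data shapes (not from the paper):
# trajectories: (n_stimuli, n_timepoints, 2) mean cursor x/y per stimulus
# meg_rdms:     (n_timepoints, n_pairs) condensed RDM per MEG timepoint
rng = np.random.default_rng(0)
n_stimuli, n_time = 96, 120
trajectories = rng.normal(size=(n_stimuli, n_time, 2))
meg_rdms = rng.normal(size=(n_time, n_stimuli * (n_stimuli - 1) // 2))

def trajectory_rdm(traj_t):
    # Pairwise Euclidean distances between stimuli's mean cursor positions
    # at a single timepoint -> condensed RDM (lower triangle as a vector).
    return pdist(traj_t, metric="euclidean")

# Time-resolved RSA: correlate the trajectory-derived RDM at each timepoint
# with the MEG-derived RDM at the same timepoint (rank correlation is a
# standard choice in RSA because it ignores the distance scale).
rsa_timecourse = np.empty(n_time)
for t in range(n_time):
    rho, _ = spearmanr(trajectory_rdm(trajectories[:, t, :]), meg_rdms[t])
    rsa_timecourse[t] = rho

print(rsa_timecourse.shape)  # (120,): one RDM-to-RDM correlation per timepoint

In practice one would also compare trajectory RDMs against MEG RDMs at lagged timepoints, since the abstract notes that the behavioural signal trails the neural one.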