PT - JOURNAL ARTICLE
AU - Fountas, Zafeirios
AU - Sylaidi, Anastasia
AU - Nikiforou, Kyriacos
AU - Seth, Anil K.
AU - Shanahan, Murray
AU - Roseboom, Warrick
TI - A predictive processing model of episodic memory and time perception
AID - 10.1101/2020.02.17.953133
DP - 2021 Jan 01
TA - bioRxiv
PG - 2020.02.17.953133
4099 - http://biorxiv.org/content/early/2021/07/09/2020.02.17.953133.short
4100 - http://biorxiv.org/content/early/2021/07/09/2020.02.17.953133.full
AB - Human perception and experience of time is strongly influenced by ongoing stimulation, memory of past experiences, and the task context at hand. When paying attention to time, time experience seems to expand; when distracted, it seems to contract. When considering time based on memory, the experience may differ from how it seemed in the moment, exemplified by sayings like “time flies when you’re having fun”. Experience of time also depends on the content of perceptual experience – rapidly changing or complex perceptual scenes seem longer in duration than less dynamic ones. The complexity of interactions between attention, memory, and perceptual stimulation is a likely reason that an overarching theory of time perception has been difficult to achieve. Here, we introduce a model of perceptual processing and episodic memory that makes use of hierarchical predictive coding, short-term plasticity, spatio-temporal attention, and episodic memory formation and recall, and apply this model to the problem of human time perception. In an experiment with ~13,000 human participants we investigated the effects of memory, cognitive load, and stimulus content on duration reports of dynamic natural scenes up to ~1 minute long. Using our model to generate duration estimates, we compared human and model performance. Model-based estimates replicated key qualitative biases, including differences by cognitive load (attention), scene type (stimulation), and whether the judgement was made based on current or remembered experience (memory). Our work provides a comprehensive model of human time perception and a foundation for exploring the computational basis of episodic memory within a hierarchical predictive coding framework.

Author summary: Experience of the duration of present or past events is a central aspect of human experience, the underlying mechanisms of which are not yet fully understood. In this work, we combine insights from machine learning and neuroscience to propose a combination of mathematical models that replicates human perceptual processing, long-term memory, attention, and duration perception. Our computational implementation of this framework can process information from video clips of ordinary life scenes, record and recall important events, and report the duration of these clips. To assess the validity of our proposal, we conducted an experiment with ~13,000 human participants. Each was shown a video between 1 and 64 seconds long and reported how long they believed it was. Reports of duration by our computational model qualitatively matched these human reports, made about the exact same videos. This was true regardless of the video content, whether time was actively judged or based on memory of the video, and whether the participants focused on a single task or were distracted, all factors known to influence human time perception.
Our work provides the first model of human duration perception to incorporate these diverse and complex factors, and offers a basis to probe the deep links between memory and time in human experience.

Competing Interest Statement: The authors have declared no competing interest.