PT - JOURNAL ARTICLE
AU - Christopher J. Cueva
AU - Encarni Marcos
AU - Alex Saez
AU - Aldo Genovesio
AU - Mehrdad Jazayeri
AU - Ranulfo Romo
AU - C. Daniel Salzman
AU - Michael N. Shadlen
AU - Stefano Fusi
TI - Delay activity dynamics: task dependent time encoding and low dimensional trajectories
AID - 10.1101/504936
DP - 2018 Jan 01
TA - bioRxiv
PG - 504936
4099 - http://biorxiv.org/content/early/2018/12/29/504936.short
4100 - http://biorxiv.org/content/early/2018/12/29/504936.full
AB - Our decisions often depend on multiple sensory experiences separated by time delays. The brain can remember these experiences (working memory) and, at the same time, estimate the timing between events, which plays a fundamental role in anticipating stimuli and planning future actions. To better understand the neural mechanisms underlying working memory and time encoding, we analyze neural activity recorded during delays in four different experiments on non-human primates, and we consider three classes of neural network models to explain the data: attractor neural networks, chaotic reservoir networks, and recurrent neural networks trained with backpropagation through time. To disambiguate these models, we propose two analyses: 1) decoding the passage of time from neural data, and 2) computing the cumulative dimensionality of the neural trajectory as it evolves over time. Our analyses reveal that time can be decoded with high precision in tasks where timing information is relevant, and with lower precision where it is irrelevant to task performance, suggesting that working memory need not rely on constant rates around a fixed activity pattern. In addition, our results further constrain the mechanisms underlying time encoding: the dimensionality of the trajectories is low for all datasets. Consistent with this, we find that the linear “ramping” component of each neuron’s firing rate strongly contributes to the slow-timescale variations that make decoding time possible. We show that these low-dimensional ramping trajectories are beneficial because they allow computations learned at one point in time to generalize across time. Our observations constrain the possible models that explain the data, ruling out simple attractor models and randomly connected recurrent networks (chaotic reservoir networks) that vary on relatively fast timescales, but they agree with recurrent neural network models trained with backpropagation through time. Our results demonstrate a powerful new tool for studying the interplay of temporal processing and working memory through objective classification of electrophysiological activity.
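
The first analysis named in the abstract, decoding the passage of time from neural data, can be illustrated with a minimal sketch. This is not the authors' code: the synthetic Poisson rates, the bin count, and all variable names (rates, n_trials, n_bins, n_neurons) are assumptions made for illustration, and the abstract does not specify the decoder, so a generic cross-validated linear classifier stands in here.

    # Hypothetical sketch of analysis (1): decode elapsed time within the
    # delay from population firing rates. Synthetic data only.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_trials, n_bins, n_neurons = 200, 10, 50   # delay split into 10 time bins

    # Toy rates whose means ramp in time, so elapsed time is decodable.
    slopes = rng.uniform(0, 0.5, n_neurons)
    lam = 5.0 + np.arange(n_bins)[:, None] * slopes          # (n_bins, n_neurons)
    rates = rng.poisson(lam, (n_trials, n_bins, n_neurons)).astype(float)

    # One sample per (trial, time bin); the label is the time-bin index.
    X = rates.reshape(n_trials * n_bins, n_neurons)
    y = np.tile(np.arange(n_bins), n_trials)

    # Cross-validated accuracy well above chance (1/n_bins) indicates
    # that the passage of time is encoded in the population activity.
    clf = LogisticRegression(max_iter=1000)
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"time-decoding accuracy: {acc:.2f} (chance = {1/n_bins:.2f})")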
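The second analysis, cumulative dimensionality of the neural trajectory, can be sketched the same way: for each time t, count how many principal components are needed to capture most of the variance of the trajectory up to t. The 90% variance threshold, the toy ramping trajectory, and the function name cumulative_dim are assumptions for illustration, not the paper's exact definition or parameters.

    # Hypothetical sketch of analysis (2): cumulative dimensionality.
    import numpy as np

    rng = np.random.default_rng(1)
    n_bins, n_neurons = 40, 50
    slopes = rng.normal(size=n_neurons)
    t = np.arange(n_bins)[:, None]
    # Low-dimensional "ramping" trajectory plus small noise: each neuron's
    # rate drifts linearly in time, so the trajectory is nearly 1-D.
    traj = t * slopes[None, :] + 0.1 * rng.normal(size=(n_bins, n_neurons))

    def cumulative_dim(traj, var_explained=0.90):
        dims = []
        for end in range(2, len(traj) + 1):
            segment = traj[:end] - traj[:end].mean(axis=0)
            # Squared singular values give the variance along each PC.
            var = np.linalg.svd(segment, compute_uv=False) ** 2
            frac = np.cumsum(var) / var.sum()
            dims.append(int(np.searchsorted(frac, var_explained) + 1))
        return dims

    print(cumulative_dim(traj))   # stays near 1 for a ramping trajectory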
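The abstract's claim that low-dimensional ramping trajectories let computations learned at one time generalize across time can also be made concrete with a toy cross-temporal test: train a linear readout of a remembered stimulus at one delay time and test it at every other time. The construction below, in which the ramping direction is kept orthogonal to the stimulus-coding direction, is an illustrative assumption, not the authors' model.

    # Hypothetical sketch: cross-temporal generalization of a memory readout.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)
    n_trials, n_bins, n_neurons = 200, 8, 50
    stim = rng.integers(0, 2, n_trials)            # remembered binary stimulus
    coding = rng.normal(size=n_neurons)            # stimulus-coding direction
    ramp = rng.normal(size=n_neurons)
    ramp -= ramp @ coding / (coding @ coding) * coding   # ramp orthogonal to coding

    # rates = stimulus coding + a common ramp shared by all trials + noise.
    rates = (stim[:, None, None] * coding
             + 0.5 * np.arange(n_bins)[None, :, None] * ramp / np.linalg.norm(ramp)
             + rng.normal(size=(n_trials, n_bins, n_neurons)))

    # Train the readout at the first time bin, test at every bin; accuracy
    # staying high across the delay is the generalization in question.
    clf = LogisticRegression(max_iter=1000).fit(rates[:, 0], stim)
    for k in range(n_bins):
        print(f"t={k}: accuracy = {clf.score(rates[:, k], stim):.2f}")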