PT  - JOURNAL ARTICLE
AU  - Ulises Pereira
AU  - Nicolas Brunel
TI  - Attractor dynamics in networks with learning rules inferred from in vivo data
AID - 10.1101/199521
DP  - 2017 Jan 01
TA  - bioRxiv
PG  - 199521
4099 - http://biorxiv.org/content/early/2017/10/06/199521.short
4100 - http://biorxiv.org/content/early/2017/10/06/199521.full
AB  - The attractor neural network scenario is a popular framework for memory storage in association cortex, but there is still a large gap between models based on this scenario and experimental data. We study a recurrent network model in which both the learning rules and the distribution of stored patterns are inferred from distributions of visual responses to novel and familiar images in inferior temporal cortex (ITC). Unlike classical attractor neural network models, our model exhibits graded activity in retrieval states, with distributions of firing rates that are close to lognormal. The inferred learning rules are close to maximizing the number of stored patterns within a family of unsupervised Hebbian learning rules, suggesting that learning rules in ITC are optimized to store a large number of attractor states. Finally, we show that there exist two types of retrieval states: one in which firing rates are constant in time, and another in which firing rates fluctuate chaotically.