PT  - JOURNAL ARTICLE
AU  - Gaia Tavoni
AU  - Takahiro Doi
AU  - Chris Pizzica
AU  - Vijay Balasubramanian
AU  - Joshua I. Gold
TI  - The complexity dividend: when sophisticated inference matters
AID - 10.1101/563346
DP  - 2019 Jan 01
TA  - bioRxiv
PG  - 563346
4099 - http://biorxiv.org/content/early/2019/10/08/563346.short
4100 - http://biorxiv.org/content/early/2019/10/08/563346.full
AB  - Animals infer latent properties of the world from noisy and changing observations. Complex, probabilistic approaches to this challenge, such as Bayesian inference, are accurate but cognitively demanding, relying on extensive working memory and adaptive learning. Simple heuristics are easy to implement but may be less accurate. What is the appropriate balance between complexity and accuracy? We construct a hierarchy of strategies of variable complexity and find a power law of diminishing returns: increasing complexity gives progressively smaller gains in accuracy. The rate of diminishing returns depends systematically on the statistical uncertainty in the world, such that complex strategies do not provide substantial benefits over simple ones when uncertainty is too high or too low. In between, there is a complexity dividend. We translate these theoretical insights into specific predictions about how working memory and adaptivity should be modulated by uncertainty, and we corroborate these predictions in a psychophysical experiment.