PT - JOURNAL ARTICLE
AU - Gabrielle J. Gutierrez
AU - Fred Rieke
AU - Eric Shea-Brown
TI - Nonlinear convergence preserves information
AID - 10.1101/811539
DP - 2019 Jan 01
TA - bioRxiv
PG - 811539
4099 - http://biorxiv.org/content/early/2019/10/21/811539.short
4100 - http://biorxiv.org/content/early/2019/10/21/811539.full
AB - Neural circuits are structured in layers of converging and diverging nonlinear neurons with distinct selectivities and preferences. These components have the potential to hamper an efficient encoding of the circuit inputs. Past computational studies have optimized the nonlinearities of single neurons, or the weight matrices of networks, to maximize encoded information, yet none has grappled with simultaneously optimizing circuit structure and neuron response functions for efficient coding. Rather than performing an explicit optimization of that kind, our approach is to compare circuit configurations built from different combinations of these individually suboptimal components, to discover how their interactions affect the efficient coding of the neural circuit. We construct computational model circuits with different configurations, then compute and compare their response entropies. We find that the circuit configuration with divergence, convergence, and nonlinear subunits preserves the most information, despite the compressive loss induced by both the convergence and the nonlinearities individually. These results show that the combination of selective nonlinearities and a compressive architecture, both elements that induce lossy compression, can promote efficient coding in tandem.
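The abstract describes comparing response entropies across model-circuit configurations but gives no implementation details. Below is a minimal sketch of that kind of comparison, assuming rectified-linear subunit nonlinearities, Gaussian inputs, and a simple binned entropy estimate; every nonlinearity, weight, and parameter here is an illustrative assumption, not the authors' actual model.

import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Rectifying subunit nonlinearity (an assumed stand-in for the
    # paper's unspecified response functions).
    return np.maximum(x, 0.0)

def response_entropy(responses, n_bins=32):
    # Estimate the discrete entropy (in bits) of circuit responses by
    # binning each output channel and counting joint response states.
    binned = np.stack(
        [np.digitize(r, np.linspace(r.min(), r.max(), n_bins))
         for r in responses.T],
        axis=1,
    )
    _, counts = np.unique(binned, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

# Gaussian stimulus samples driving two parallel input channels.
stim = rng.normal(size=(100_000, 2))

# Configuration A: purely linear convergence -- both inputs are summed
# onto a single output.
linear_out = stim @ np.ones((2, 1))

# Configuration B: divergence onto ON- and OFF-type rectifying subunits,
# then convergence onto two outputs (weights are illustrative choices).
on_off = np.concatenate([relu(stim), relu(-stim)], axis=1)  # 4 subunits
W = np.array([[1.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 1.0]]).T                      # converge 4 -> 2
nonlinear_out = on_off @ W

print(f"linear convergence:  {response_entropy(linear_out):.2f} bits")
print(f"nonlinear subunits:  {response_entropy(nonlinear_out):.2f} bits")

Under these assumptions, the divergent/convergent configuration with nonlinear subunits retains complementary ON and OFF output channels, so its joint response entropy can exceed that of a single linear sum; this is the flavor of comparison the abstract reports, not a reproduction of the paper's result.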