PT  - JOURNAL ARTICLE
AU  - Antje Ihlefeld
AU  - Nima Alamatsaz
AU  - Robert M Shapley
TI  - Human Sound Localization Depends on Sound Intensity: Implications for Sensory Coding
AID - 10.1101/378505
DP  - 2018 Jan 01
TA  - bioRxiv
PG  - 378505
4099 - http://biorxiv.org/content/early/2018/07/30/378505.short
4100 - http://biorxiv.org/content/early/2018/07/30/378505.full
AB  - A fundamental question of human perception is how we perceive target locations in space. Through our eyes and skin, the activation patterns of sensory organs provide rich spatial cues. For other sensory dimensions, however, including sound localization and visual depth perception, spatial locations must be computed by the brain. For instance, interaural time differences (ITDs) of the sounds reaching the ears allow listeners to localize sound in the horizontal plane. Our experiments tested two prevalent theories of how ITDs affect human sound localization: 1) the labelled-line model, which encodes space through tuned representations of spatial location; versus 2) the hemispheric-difference model, which represents space through spike-rate distances relative to a perceptual anchor. Unlike the labelled-line model, the hemispheric-difference model predicts that with decreasing intensity, sound localization should collapse toward a midline reference, and this is what we observed behaviorally. These findings cast doubt on models of human sound localization that rely on a spatially tuned map. Moreover, analogous experimental results in vision indicate that perceived depth depends upon the contrast of the target. Based on our findings, we propose that the brain uses a canonical computation of location across sensory modalities: perceived location is encoded through population spike rate relative to baseline.