Abstract
Divisive normalization is a canonical computation that explains contextual modulation of visual perception and neural responses in the visual system. Conceivably, normalization also underlies contextual modulation of bimanual touch, a perceptual process that likely requires combining what is felt on the hands with where the hands are located in space. We found that touch experienced on one hand systematically modulates how touch is perceived on the other hand. Notably, bimanual interaction patterns and their sensitivity to hand locations differed depending on whether participants directed attention to the frequency or intensity of the cues, which were always mechanical vibrations. These idiosyncratic perceptual patterns were well explained by distinct cue combination models that each incorporate divisive normalization. Our findings indicate that, while feature-specific rules govern bimanual touch, normalization underlies contextual modulation between the hands.
Significance Statement
How we perceive sensory cues depends on the context in which we experience them. Contextual modulation of vision results from divisive normalization, a canonical computation that adjusts the activity of visual neurons according to the pooled activity over the neural population. We tested the hypothesis that contextual interactions between cues felt on the two hands are also consistent with normalization. We found that touch on one hand systematically influenced perception on the other hand. Moreover, we observed distinct contextual modulation patterns when participants attended to the frequency or intensity of the cues, which were always mechanical vibrations. Despite these differences, normalization models accounted for both perceptual patterns. Our results support the notion that normalization underlies contextual modulation between the hands.
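For readers unfamiliar with the computation, divisive normalization is commonly written in the standard canonical form, in which a unit's driven input is divided by the pooled activity of a normalization pool plus a constant. This is a generic sketch of that standard form, not the specific cue combination models fit in this study:

```latex
R_i \;=\; \frac{\gamma \, d_i^{\,n}}{\sigma^{n} + \sum_{j \in \mathrm{pool}} d_j^{\,n}}
```

Here $d_i$ is the driven input to unit $i$, the sum runs over the normalization pool (e.g., responses evoked by cues on both hands), $n$ is an exponent, $\sigma$ is a semi-saturation constant, and $\gamma$ scales the maximal response. Contextual modulation arises because activity elsewhere in the pool (e.g., the other hand) increases the denominator and suppresses $R_i$.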