OPINION article

Front. Psychol., 01 August 2012
Sec. Cognitive Science
This article is part of the Research Topic "The role of body and environment in cognition."

Embodied Space in Early Blind Individuals

  • 1 Centre de Neuroscience Système et Cognition, Institut de Recherche en Sciences Psychologiques, Université Catholique de Louvain, Louvain, Belgium
  • 2 Centre de Recherche en Neuropsychologie et Cognition, Université de Montréal, Montréal, QC, Canada
  • 3 Centre for Mind/Brain Science, University of Trento, Trento, Italy

The study of visually deprived individuals offers a unique opportunity to investigate the role that vision plays in shaping how we process our surrounding space. The visual system typically provides the most accurate and reliable spatial information about our surroundings and is therefore usually considered the primary sense when spatial processing is at play. One of the best examples of such visual dominance in space perception comes from experiments showing that when a sound is accompanied by a visual stimulus at a different location, people tend to mislocalize the sound toward the position of the visual stimulus (Pick et al., 1969). This “ventriloquist” effect occurs because the brain assigns more weight to visual information when localizing the audiovisual event, thus inducing a “visual capture” of acoustic space (Alais and Burr, 2004).

It was first thought that visual deprivation might be detrimental to the development of spatial abilities in the remaining modalities, since vision may be required to calibrate the other sensory systems (Axelrod, 1959; Rock and Halper, 1969; Warren and Cleaves, 1971). Interestingly, this does not appear to be the case: several studies have shown that blind people are usually as good as, and often better than, normally sighted controls (SCs) at processing non-visual spatial inputs (Lessard et al., 1998; see Collignon et al., 2009a for a review). The recurrent hypothesis to explain such findings is that vision loss is partly offset by an increased use of the remaining senses (Wong et al., 2011), which enhances their efficiency alongside compensatory brain reorganization processes (Gougoux et al., 2005; Collignon et al., 2011). While this may be part of the story, another possibility that we address in the present paper is that, aside from quantitative differences between sighted and blind people in their perceptual skills, visual deprivation may result in qualitatively different ways of processing non-visual information (Eimer, 2004). Whereas sighted people may automatically process spatial information in an external spatial frame of reference,1 early blind (EB) participants may preferentially use an internal coordinate system.2

Bradshaw et al. (1986) were the first to suggest a qualitative difference in the way EB individuals process touch. In this study, a rod was placed within a shorter pipe, and EB and SC participants were asked to slide the rod within the pipe until the rod extremities were judged equidistant from the ends of the pipe. Results demonstrated that SC placed the center of the rod slightly to the left of the pipe’s true midpoint (leftward bias or pseudo-neglect; see Jewell and McCourt, 2000 for a review), whether the hands were placed in parallel or crossed over the body midline. EB participants, in contrast, showed a leftward bias with hands in parallel (see also Sampaio et al., 1995) but a rightward bias with the arms crossed. Although this effect was only briefly discussed, the authors nonetheless interpreted it as reflecting a more internal representation of space in EB. According to the view that the right hemisphere plays a dominant role in attentional control, the leftward bias shown by sighted individuals may be due to the fact that the right hemisphere biases attention toward the left side of space, so that rods appear longer in the contralateral left hemifield. The reversed pseudo-neglect effect shown by EB in the crossed posture may, in contrast, indicate that their right hemisphere allocates more attention to the contralateral tactile space (i.e., to the anatomically left hand), thereby leading to an overestimation of the side of space where the left hand is placed.

More recently, Röder et al. (2004) provided new and more compelling evidence in support of this idea. In their temporal order judgment (TOJ) task, participants were asked which of their two hands received a tactile stimulus first. As expected, SC were less accurate with crossed than with uncrossed hands (Yamamoto and Kitazawa, 2001; Shore et al., 2002). This is accounted for by the fact that tactile stimuli are not only represented in an anatomical reference frame but are also automatically remapped into external spatial coordinates, inducing a conflict between somatotopic and external spatial codes when the hands are crossed over the body midline (Pavani et al., 2000; Kitazawa, 2002; Shore et al., 2002; Azañón and Soto-Faraco, 2008; Azañón et al., 2010a). By contrast, crossing the hands did not lead to a general decrement in EB tactile discrimination performance (Röder et al., 2004), suggesting that the automatic external remapping of touch is not innate but rather depends on early visual experience (see also Bremner et al., 2008). This idea was later supported by an electroencephalographic study: while the detection of deviant tactile stimuli on the hand elicited event-related potentials that differed between the crossed and uncrossed conditions in SC, changing the hand posture had no influence on brain activity in EB (Röder et al., 2008).
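
The logic of this somatotopic-versus-external conflict can be sketched in a few lines of code. The following toy model is our own illustration, not part of the cited studies; it simply makes explicit how a touch delivered to one hand receives conflicting anatomical and external codes once the hands are crossed.

```python
# Toy model (our illustration, not from Röder et al., 2004): a touch on one hand
# can be coded anatomically (which hand was touched) and externally (where that
# hand currently lies in space). Crossing the hands puts the two codes in conflict.

def tactile_codes(stimulated_hand, posture):
    """Return the anatomical and external codes of a touch.

    stimulated_hand: 'left' or 'right' (anatomical side of the body)
    posture: 'uncrossed' or 'crossed' (hands crossed over the body midline)
    """
    anatomical = stimulated_hand
    if posture == "uncrossed":
        external = stimulated_hand  # the hand lies on its own side of space
    else:
        external = "right" if stimulated_hand == "left" else "left"
    return anatomical, external

for posture in ("uncrossed", "crossed"):
    anatomical, external = tactile_codes("left", posture)
    print(posture, "anatomical:", anatomical, "external:", external,
          "conflict:", anatomical != external)
# uncrossed -> both codes are 'left': no conflict.
# crossed   -> anatomical 'left' vs external 'right': the conflict that slows SC,
#              but not EB, who are thought not to compute the external code
#              automatically.
```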

The reduced reliance on an external reference frame in EB individuals has also been observed in tasks investigating the multisensory control of action (Simon effect: Röder et al., 2007), the processing of numbers (SNARC effect: Crollen et al., 2011), and spatial navigation (Vecchi et al., 2004; Noordzij et al., 2006). When required to press a left or right response key depending on the bandwidth of a sound presented from a left or right loudspeaker, EB responded like late blind (LB) and SC participants in the uncrossed hand posture: they performed better when the spatial location of the sound was compatible with the spatial location of the response key (i.e., the Simon effect). In contrast, when participants performed the task with crossed hands, EB responded more rapidly than their sighted peers and, interestingly, showed a reversal of the Simon effect, whereas LB and SC still showed a classic Simon effect (Röder et al., 2007). In SC and LB, the presentation of a sensory stimulus therefore primes the response key compatible with the location of the stimulus in external space, regardless of which anatomical hand is used to press it. In EB, in contrast, the sensory stimulus primes the anatomical hand congruent with the location of the stimulus, regardless of where in space that hand is placed. SC, LB, and EB also showed a similar behavioral pattern when performing a numerical comparison task in an uncrossed hand posture: they responded faster when a left response was required for numbers smaller than five and when a right response was required for numbers larger than five (i.e., the SNARC effect). As in the Simon task, however, crossing the hands resulted in a reversal of the SNARC effect in EB participants only (Crollen et al., 2011). The fact that LB and SC participants were similarly affected by crossing the hands indicates that once an external frame of reference has been acquired, it continues to be used even when visual information is no longer available (Röder et al., 2004, 2007; Crollen et al., 2011). Finally, differences between blind and sighted subjects have also been highlighted in spatial navigation tasks: while tasks requiring the use of an egocentric reference frame (i.e., route knowledge) are performed equally well by SC and EB, tasks requiring the use of an allocentric reference frame (i.e., survey knowledge) are performed less well by EB than by SC (Vecchi et al., 2004; Noordzij et al., 2006).
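
The reversal of the Simon and SNARC effects under crossed hands follows directly from which reference frame is used to code the response. The sketch below is our own hypothetical illustration (the function names and structure are not from the cited papers); it shows that compatibility is posture-invariant under external coding but flips with posture under anatomical coding.

```python
# Toy model (our illustration): response compatibility in a Simon- or SNARC-like
# task under external versus anatomical response coding. 'target_side' is the
# external location of the sound (or the side implied by a small/large number);
# 'key_side' is the fixed external location of the required response key.

def compatible(target_side, key_side, posture, frame):
    """Is the required response compatible with the target under a given frame?"""
    if frame == "external":
        effective_side = key_side  # what matters is where the key is in space
    else:  # anatomical coding: what matters is which hand presses the key
        if posture == "uncrossed":
            effective_side = key_side
        else:  # crossed hands: the opposite hand now rests on that key
            effective_side = "right" if key_side == "left" else "left"
    return effective_side == target_side

# A left-side target requiring the left response key:
for frame in ("external", "anatomical"):
    for posture in ("uncrossed", "crossed"):
        print(frame, posture, compatible("left", "left", posture, frame))
# External coding (SC, LB): the left key remains compatible in both postures,
# so the classic Simon/SNARC effects survive hand crossing.
# Anatomical coding (EB): compatibility flips when the hands are crossed,
# mirroring the reversed effects reported above.
```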

At this stage, one may wonder why sighted individuals automatically remap touch into external coordinates, since this can create confusion and slow their reaction times (RTs) when discriminating tactile information. This automatic remapping from somatotopic to external space is actually very effective in providing a common framework for coordinating and integrating spatial information obtained through touch with that obtained through other sensory modalities, such as vision or audition, which are coded by default in external spatial coordinates. This is particularly critical since the hands move constantly within peri-personal space as different postures are adopted. The default use of an anatomically anchored reference system in EB may therefore actually prevent the effective integration of different sensory modalities in multisensory integration tasks.

In a recent study, EB, LB, and SC groups were required to lateralize auditory, tactile, and audio-tactile stimuli either with the hands uncrossed or crossed over the body midline (Collignon et al., 2009b). While performance in the tactile condition replicated the pattern of results found in previous studies (a greater detrimental effect of the crossed posture in LB and SC relative to EB), the results of the auditory and audio-tactile conditions showed a greater detrimental effect of the crossed posture in EB. As mentioned earlier, when EB lateralize tactile stimuli in the crossed posture they do not remap the proprioceptive information onto an external spatial frame of reference and therefore do not experience the conflict between body-centered and external coordinates that is present in SC or even in LB (Röder et al., 2004). EB therefore process spatial tactile information faster than their sighted peers. In contrast, the absence of automatic external remapping of touch in EB actually prevents these participants from efficiently matching the external sound location with the anatomical coordinate of the responding (auditory condition) or stimulated (audio-tactile condition) hand. The conflict created by crossing the hands is therefore more disruptive in EB than in SC or LB in the auditory and audio-tactile conditions (see also Röder et al., 2007, Experiment 2). In other words, the absence of automatic activation of an external reference frame for perception and action in EB may impair multisensory integration and action control when there is a conflict between anatomical and external reference frames, for instance when a sound has to be integrated with a touch in a hands-crossed posture (Collignon et al., 2009b).

In sum, developmental vision appears to trigger the automatic recoding of sensory perception and motor control into external space. Our opinion is that some of the advantages/deficits observed in EB (e.g., faster/slower RTs to non-visual events) might be explained, at least in part, by such qualitative changes in the way they process non-visual spatial information. For example, one of the most recurrent findings in the blindness literature is the observation of faster RTs to non-visual spatial targets in EB when compared to SC (e.g., Kujala et al., 1997; Hötting et al., 2004; Collignon et al., 2006, 2009b; Collignon and De Volder, 2009). Since the automatic external remapping process appears to occur between 100 and 360 ms (Azañón and Soto-Faraco, 2008; Heed and Röder, 2010; Overvliet et al., 2011), blind participants who do not automatically remap tactile/spatial information into external space may not only be more resistant to the conflict created by a crossed-hand posture but may also process spatial information some hundreds of milliseconds faster than sighted individuals. Indeed, in the TOJ (Röder et al., 2004), Simon (Röder et al., 2007), and SNARC (Crollen et al., 2011) experiments described above, EB participants consistently showed faster RTs to non-visual targets in the uncrossed posture. Indirect support for the idea that EB may somehow “skip” the external remapping computational step also comes from our observation that transcranial magnetic stimulation over the right intra-parietal sulcus (where the external remapping seems to occur: Makin et al., 2007; Azañón et al., 2010b) disrupted the spatial processing of sounds in SC but not in EB (Collignon et al., 2009c).
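
To make the timing argument concrete, here is a deliberately simplified serial-stage caricature (our own illustration; the only figure taken from the text is the 100-360 ms window in which external remapping is thought to occur, and all other durations are arbitrary assumptions):

```python
# Toy serial-stage caricature (our illustration, not a model from the cited work).
# Assumption: a response based on the external code cannot be initiated before the
# remapping is complete (the 100-360 ms window cited in the text), whereas a response
# based on the anatomical code can be initiated as soon as that code is available.

ANATOMICAL_CODE_MS = 80.0          # arbitrary assumption: when the anatomical code is ready
REMAP_WINDOW_MS = (100.0, 360.0)   # window in which external remapping is thought to occur
MOTOR_MS = 150.0                   # arbitrary assumption: response selection and execution

def simulated_rt(uses_external_code, remap_ready_ms):
    """RT = time until the relevant spatial code is ready + motor time."""
    code_ready = remap_ready_ms if uses_external_code else ANATOMICAL_CODE_MS
    return code_ready + MOTOR_MS

for remap_ready in REMAP_WINDOW_MS:
    sc = simulated_rt(True, remap_ready)    # sighted: waits for the external code
    eb = simulated_rt(False, remap_ready)   # early blind: relies on the anatomical code
    print(f"remapping ready at {remap_ready:.0f} ms -> SC {sc:.0f} ms, EB {eb:.0f} ms")
# Under these (made-up) numbers, responses based on the anatomical code alone come out
# faster whenever remapping completes later than the anatomical code becomes available,
# by up to a few hundred milliseconds at the late end of the window, which is consistent
# in direction with the faster uncrossed-posture RTs reported for EB participants.
```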

Interestingly, using an internal or an external frame of reference appears to facilitate performance on different tasks. While the default use of an internal reference frame leads to better performance in tactile lateralization tasks in EB, the use of an external frame of reference is better suited to spatial navigation (Noordzij et al., 2006), multisensory integration, and the control of action toward external auditory sources in peri-personal space (Röder et al., 2007; Collignon et al., 2009b). It is, however, important to note that the spontaneous tendency of EB to organize the environment through internal coordinates does not mean that they are incapable of constructing an external coordinate system (see Eardley and van Velzen, 2011), but rather that this form of encoding is less automatic than the anatomical one in this population.

Footnotes

  1. ^In an external reference frame, locations are represented within a framework external to the individual’s body and are therefore independent of the position of the limbs.
  2. ^In an internal reference frame, locations are represented with respect to the position of the observer’s body and are therefore dependent on the position of the limbs.

References

Alais, D., and Burr, D. (2004). The ventriloquist effect results from near-optimal bimodal integration. Curr. Biol. 14, 257–262.

Axelrod, S. (1959). Effect of Early Blindness: Performance of Blind and Sighted Children on Tactile and Auditory Tasks. New York: American Foundation for the Blind.

Azañón, E., Camacho, K., and Soto-Faraco, S. (2010a). Tactile remapping beyond space. Eur. J. Neurosci. 31, 1858–1867.

Azañón, E., Longo, M. R., Soto-Faraco, S., and Haggard, P. (2010b). The posterior parietal cortex remaps touch into external space. Curr. Biol. 20, 1304–1309.

Azañón, E., and Soto-Faraco, S. (2008). Changing reference frames during the encoding of tactile events. Curr. Biol. 18, 1044–1049.

Bradshaw, J. L., Nettleton, N. C., Nathan, G., and Wilson, L. (1986). Tactual-kinesthetic matching of horizontal extents by the long-term blind: absence or reversal of normal left-side underestimation. Neuropsychologia 24, 261–264.

Bremner, A. J., Holmes, N. P., and Spence, C. (2008). Infants lost in (peripersonal) space? Trends Cogn. Sci. 12, 298–305.

Collignon, O., and De Volder, A. G. (2009). Further evidence that congenitally blind participants react faster to auditory and tactile spatial targets. Can. J. Exp. Psychol. 63, 287–293.

Collignon, O., Renier, L., Bruyer, R., Tranduy, D., and Veraart, C. (2006). Improved selective and divided spatial attention in early blind subjects. Brain Res. 1075, 175–182.

Collignon, O., Vandewalle, G., Voss, P., Albouy, G., Charbonneau, G., Lassonde, M., and Lepore, F. (2011). Functional specialization for auditory-spatial processing in the occipital cortex of congenitally blind humans. Proc. Natl. Acad. Sci. U.S.A. 108, 4435–4440.

Collignon, O., Voss, P., Lassonde, M., and Lepore, F. (2009a). Cross-modal plasticity for the spatial processing of sounds in visually deprived subjects. Exp. Brain Res. 192, 343–358.

Collignon, O., Charbonneau, G., Lassonde, M., and Lepore, F. (2009b). Early visual deprivation alters multisensory processing in peripersonal space. Neuropsychologia 47, 3236–3243.

Collignon, O., Davare, M., Olivier, E., and De Volder, A. (2009c). Reorganisation of the right occipito-parietal stream for auditory spatial processing in early blind humans. A transcranial magnetic stimulation study. Brain Topogr. 21, 232–240.

Crollen, V., Dormal, G., Seron, X., Lepore, F., and Collignon, O. (2011). Embodied numbers: the role of vision in the development of number-space interactions. Cortex. doi: 10.1016/j.cortex.2011.11.006

Eardley, A. F., and van Velzen, J. (2011). Event-related potential evidence for the use of external coordinates in the preparation of tactile attention by the early blind. Eur. J. Neurosci. 33, 1897–1907.

Eimer, M. (2004). Multisensory integration: how visual experience shapes spatial perception. Curr. Biol. 14, 115–117.

Gougoux, F., Zatorre, R. J., Lassonde, M., Voss, P., and Lepore, F. (2005). A functional neuroimaging study of sound localization: visual cortex activity predicts performance in early-blind individuals. PLoS Biol. 3, e27. doi: 10.1371/journal.pbio.0030027

Heed, T., and Röder, B. (2010). Common anatomical and external coding for hands and feet in tactile attention: evidence from event-related potentials. J. Cogn. Neurosci. 22, 184–202.

Hötting, K., Rösler, F., and Röder, B. (2004). Altered auditory–tactile interactions in congenitally blind humans: an event-related potential study. Exp. Brain Res. 159, 370–381.

Jewell, G., and McCourt, M. E. (2000). Pseudoneglect: a review and meta-analysis of performance factors in line bisection tasks. Neuropsychologia 38, 93–110.

Kitazawa, S. (2002). Where conscious sensation takes place. Conscious. Cogn. 11, 475–477.

Kujala, T., Lehtokoski, A., Alho, K., Kekoni, J., and Näätänen, R. (1997). Faster reaction times in the blind than sighted during bimodal divided attention. Acta Psychol. (Amst.) 96, 75–82.

Lessard, N., Pare, M., Lepore, F., and Lassonde, M. (1998). Early-blind human subjects localize sound sources better than sighted subjects. Nature 395, 278–280.

Makin, T. R., Holmes, N. P., and Zohary, E. (2007). Is that near my hand? Multisensory representation of peripersonal space in human intraparietal sulcus. J. Neurosci. 27, 731–740.

Noordzij, M. L., Zuidhoek, S., and Postma, A. (2006). The influence of visual experience on the ability to form spatial mental models based on route and survey descriptions. Cognition 100, 321–342.

Overvliet, K. E., Azañón, E., and Soto-Faraco, S. (2011). Somatosensory saccades reveal the timing of tactile spatial remapping. Neuropsychologia 49, 3046–3052.

Pavani, F., Spence, C., and Driver, J. (2000). Visual capture of touch: out-of-the-body experiences with rubber gloves. Psychol. Sci. 11, 353–359.

Pick, H. L., Warren, D. H., and Hay, J. C. (1969). Sensory conflict in judgments of spatial direction. Percept. Psychophys. 6, 203–205.

Rock, I., and Halper, F. (1969). Form perception without a retinal image. Am. J. Psychol. 82, 425–440.

Röder, B., Föcker, J., Hötting, K., and Spence, C. (2008). Spatial coordinate systems for tactile spatial attention depend on developmental vision: evidence from event-related potentials in sighted and congenitally blind adult humans. Eur. J. Neurosci. 28, 475–483.

Röder, B., Kusmierek, A., Spence, C., and Schicke, T. (2007). Developmental vision determines the reference frame for the multisensory control of action. Proc. Natl. Acad. Sci. U.S.A. 104, 4753–4758.

Röder, B., Rösler, F., and Spence, C. (2004). Early vision impairs tactile perception in the blind. Curr. Biol. 14, 121–124.

Sampaio, E., Gouarir, C., and Mvondo, D. (1995). Tactile and visual bisection tasks by sighted and blind children. Dev. Neuropsychol. 11, 109–127.

Shore, D. I., Spry, E., and Spence, C. (2002). Confusing the mind by crossing the hands. Brain Res. Cogn. Brain Res. 14, 153–163.

Vecchi, T., Tinti, C., and Cornoldi, C. (2004). Spatial memory and integration processes in congenital blindness. Neuroreport 15, 2787–2790.

Warren, D. H., and Cleaves, W. T. (1971). Visual-proprioceptive interaction under large amounts of conflict. J. Exp. Psychol. 90, 206–214.

Wong, M., Gnanakumaran, V., and Goldreich, D. (2011). Tactile spatial acuity enhancement in blindness: evidence for experience-dependent mechanisms. J. Neurosci. 31, 7028–7037.

Yamamoto, S., and Kitazawa, S. (2001). Reversal of subjective temporal order due to arm crossing. Nat. Neurosci. 4, 759–765.

Citation: Crollen V and Collignon O (2012) Embodied space in early blind individuals. Front. Psychology 3:272. doi: 10.3389/fpsyg.2012.00272

Received: 08 June 2012; Accepted: 16 July 2012;
Published online: 01 August 2012.

Edited by:

Louise Connell, University of Manchester, UK

Reviewed by:

Stephanie A. Gagnon, Massachusetts General Hospital and Harvard Medical School, USA

Copyright: © 2012 Crollen and Collignon. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in other forums, provided the original authors and source are credited and subject to any copyright notices concerning any third-party graphics etc.

*Correspondence: olivier.collignon@unitn.it

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.