
The detection of multisensory stimuli in an orthogonal sensory space

  • Research Article
  • Published in Experimental Brain Research

Abstract

The detection of a stimulus can be considerably facilitated if the stimulus engages two or more sensory modalities simultaneously. This phenomenon, commonly referred to as multisensory (or cross-modal) facilitation, has been demonstrated behaviorally in cats and humans. A number of rules are thought to govern this phenomenon. These rules state that strong facilitation is to be expected only if the two sensory modalities are stimulated simultaneously and at the same place, and if the stimuli themselves are weak. However, these rules are not sufficient to allow accurate predictions of multimodal stimulus detection probabilities directly from physical stimulus parameters. Here we show that such predictions are possible on the basis of a simple and biologically plausible psychophysical model, which relates the detection of audio-visual, audio-tactile or visual-tactile stimuli to the Euclidean distance that these stimuli span in an orthogonal sensory space.
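The abstract describes a model in which a bimodal stimulus is represented by the Euclidean distance it spans in an orthogonal sensory space, with one axis per modality. The paper's fitted equations are not reproduced on this page, so the following is only a minimal sketch of a model of that kind, assuming Gaussian internal noise with standard deviation sigma and a decision threshold lam (the λ of the abbreviations list); the function name, parameter values and exact functional form are illustrative assumptions, not the authors' model.

```python
import math

def detection_probability(delta_x, delta_y, lam, sigma):
    """Sketch of a Euclidean-distance detection model (assumed form).

    The bimodal stimulus is represented by the Euclidean distance spanned
    by the intensity changes delta_x and delta_y in an orthogonal sensory
    space.  Detection is assumed to occur when a normally distributed
    internal signal exceeds the decision threshold `lam`; `sigma` is the
    standard deviation of that signal.
    """
    distance = math.hypot(delta_x, delta_y)          # Euclidean combination of the two modalities
    z = (distance - lam) / sigma                     # standardised distance to threshold
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # Phi(z), cumulative standard normal

# Illustration: two weak changes that are individually near threshold can,
# combined, span a distance well above threshold.
print(detection_probability(delta_x=0.8, delta_y=0.0, lam=1.0, sigma=0.5))  # unimodal, ~0.34
print(detection_probability(delta_x=0.8, delta_y=0.8, lam=1.0, sigma=0.5))  # bimodal, ~0.60
```

Under these assumptions the facilitation falls out of the geometry: two subthreshold changes along orthogonal axes combine into a single vector whose length can exceed the threshold.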


Notes

  1. Assume that the processing of the two sensory modalities occurs "separately", in the sense that the two processes are statistically independent, as assumed in the formulation of the "independent model" (Eq. 6) above. If the detection probabilities in the individual modalities are px and py, the probabilities of failing to detect the stimulus in each modality are (1−px) and (1−py), respectively. A bimodal stimulus goes undetected only when the observer fails to detect it in modality X and in modality Y; under statistical independence, the probability of this joint failure is the product of the individual failure probabilities, (1−px)(1−py). The bimodal detection probability is therefore one minus this joint failure probability, i.e. pxy = 1 − (1−px)(1−py) = px + py − pxpy. Since pxpy > 0, this is always less than the sum of the unimodal detection probabilities px + py.
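As a concrete illustration of the benchmark described in this note, the short sketch below computes the independent-model prediction for made-up unimodal detection probabilities; the numbers are purely illustrative and not taken from the paper.

```python
def independent_model(p_x, p_y):
    """Bimodal detection probability under statistical independence:
    the stimulus is missed only when it is missed in both modalities."""
    return 1.0 - (1.0 - p_x) * (1.0 - p_y)   # = p_x + p_y - p_x * p_y

# Example: two weak unimodal stimuli, each detected on 30% of trials.
p_xy = independent_model(0.3, 0.3)
print(p_xy)  # 0.51 -- above either unimodal rate, but below their sum (0.6)
```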

Abbreviations

D: Deviance

L: Likelihood

N: Number of trials

N(a,b): Normal (Gaussian) distribution with mean a and variance b

S: Neural population signal (treated as a random variable)

χ²(a,b): Cumulative chi-square distribution at a with b degrees of freedom

Δx, Δy: Change in stimulus intensity in stimulus modality x, y

λ: Decision threshold for the detection of a stimulus

μ: Mean of the neural signal

Φ(z): Cumulative standard normal distribution at z

σ²: Variance of the neural signal
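The abbreviations for deviance D, likelihood L, number of trials N and the cumulative χ² distribution suggest that model fits were assessed with a deviance-based goodness-of-fit test on binomial detection data. The sketch below shows the standard form of such a test; the function name, example counts, predicted probabilities and degrees of freedom are illustrative assumptions, not values from the paper.

```python
import math
from scipy.stats import chi2   # cumulative chi-square distribution

def binomial_deviance(hits, trials, predicted_p):
    """Deviance D = 2 * (log-likelihood of the saturated model minus
    log-likelihood of the fitted model) for binomial detection data:
    hits[i] detections out of trials[i] presentations in condition i,
    with model-predicted detection probability predicted_p[i]."""
    D = 0.0
    for k, n, p in zip(hits, trials, predicted_p):
        if k > 0:                         # guard log(0) for zero detections
            D += 2.0 * k * math.log(k / (n * p))
        if n - k > 0:                     # guard log(0) for perfect detection
            D += 2.0 * (n - k) * math.log((n - k) / (n * (1.0 - p)))
    return D

# Made-up example: 3 stimulus conditions, a model with 2 free parameters.
D = binomial_deviance(hits=[12, 25, 44], trials=[50, 50, 50],
                      predicted_p=[0.22, 0.55, 0.85])
p_value = 1.0 - chi2.cdf(D, df=3 - 2)   # deviance ~ chi-square, (conditions - parameters) d.f.
print(D, p_value)
```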


Acknowledgements

We are grateful to Dr John Bithell for guidance on statistical methodology.

Author information

Corresponding author

Correspondence to Jan W. H. Schnupp.

About this article

Cite this article

Schnupp, J.W.H., Dawe, K.L. & Pollack, G.L. The detection of multisensory stimuli in an orthogonal sensory space. Exp Brain Res 162, 181–190 (2005). https://doi.org/10.1007/s00221-004-2136-2
