Visual homing: an insect perspective
Highlights
► Novel view-reconstruction techniques serve to identify navigational information.
► Comparative modelling helps to identify the computations needed in visual homing.
► New neuroanatomy tools begin to unravel the neurobiology of insect visual homing.
Introduction
Animals locate places in the world by memorizing the appearance of the landmark panorama as seen from these places. Clear evidence for this comes from the observation that insects search for a goal relative to dominant landmarks and not at the absolute position of the goal. Insect research can contribute significantly to the study of animal homing abilities because insect behaviour can be analysed in exquisite detail under the complex natural conditions in which these abilities have evolved.
Within a certain range of a goal location, the difference between memorized and currently experienced views can provide instructions on how to move towards the goal. Cartwright and Collett's snapshot model [1••] was one of the first attempts to specify the information content of views (the relative angular positions of landmarks and their apparent sizes) and to describe a procedure by which the mismatch between remembered and currently experienced views can be exploited to derive navigational guidance for homing.
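The core idea of snapshot matching can be sketched in code. The following is a hypothetical, much-simplified illustration, not the original model: landmarks are reduced to (bearing, apparent size) pairs and are naively paired by index, whereas the original model pairs each current feature with the closest remembered one. Bearing mismatches contribute a tangential correction, apparent-size mismatches a radial one:

```python
import math

def snapshot_home_vector(remembered, current):
    """Toy snapshot matching: each landmark is a (bearing, apparent_size)
    pair, bearings in radians. Bearing errors yield a tangential correction,
    size errors a radial (approach/retreat) correction; the summed vector
    points roughly towards the goal."""
    x = y = 0.0
    for (mb, ms), (cb, cs) in zip(remembered, current):
        # tangential component: perpendicular to the landmark direction,
        # signed so as to reduce the bearing mismatch
        bearing_err = mb - cb
        x += -math.sin(cb) * bearing_err
        y += math.cos(cb) * bearing_err
        # radial component: towards the landmark if it appears too small,
        # away from it if it appears too large
        size_err = ms - cs
        x += math.cos(cb) * size_err
        y += math.sin(cb) * size_err
    return x, y
```

At the goal the remembered and current views agree and the correction vector vanishes; a landmark straight ahead that appears too small pulls the agent forward, as expected of an approach response.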
Since then, many homing algorithms have been developed and tested in robotics research (e.g. [2, 3]) to provide proof of concept, to find parsimonious representations and algorithms, and to test homing efficiency and robustness. Recent attempts to account for the natural, outdoor conditions of visual homing include the suggestion that the strong colour contrast of terrestrial objects against the sky may be of particular importance for localization [4], and the demonstration that in outdoor scenes panoramic image differences increase smoothly with distance from a reference location (translational Image Difference Functions, IDFs) and, in addition, provide robust visual compass information (rotational IDFs; see Figure 1) [5, 6].
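As an illustration of the rotational IDF idea (a hedged sketch under simplifying assumptions, not code from the cited studies), one can circularly shift a panoramic image against a stored snapshot and record the pixel-wise difference at each shift; the minimum of that curve recovers the orientation of best match and so acts as a visual compass:

```python
import numpy as np

def image_difference(a, b):
    """Root-mean-square pixel difference between two panoramic images."""
    return np.sqrt(np.mean((a.astype(float) - b.astype(float)) ** 2))

def rotational_idf(snapshot, current):
    """Rotational IDF: image difference as the current panorama is shifted
    column by column (one column corresponds to one azimuthal step). Its
    minimum gives the best-matching orientation, i.e. a compass reading."""
    cols = current.shape[1]
    return np.array([image_difference(snapshot, np.roll(current, s, axis=1))
                     for s in range(cols)])

# demo on a synthetic panorama: rotate the scene by 5 columns and recover it
rng = np.random.default_rng(0)
snapshot = rng.random((10, 36))          # 10 x 36 "pixels", 10 deg per column
current = np.roll(snapshot, -5, axis=1)  # agent has turned by 5 columns
compass = int(np.argmin(rotational_idf(snapshot, current)))
```

A translational IDF is obtained in the same way by evaluating `image_difference` between the snapshot and views taken at increasing distances from the reference location.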
Insect and robot navigation research has generated three important insights. First, reconstructing what homing animals actually see under natural conditions has shown that vision provides robust and reliable information on locations and routes, even at the very low resolution of insect compound eyes (e.g. [2, 6, 7]). Second, principled and comparative modelling and testing of visual homing schemes has narrowed the search for underlying mechanisms. Third, depending on the navigational information content of their environment, insects incorporate any sensory signature that defines a significant place into their goal-seeking procedures, including visual, olfactory, tactile and path integration information (e.g. [8••, 9, 10, 11]).
What defines a location in space?
Locations and routes are defined by visual, olfactory, textural and magnetic signatures that are uniquely associated with them. The topography, salience and stability of these cues differ between environments, and recent comparative studies show that these differences shape the navigational abilities and mechanisms of animals.
For instance, ants that forage on North African salt-pans, which lack stable and unique visual features, rely on path integration (PI) to navigate back
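The principle behind path integration, continuously summing the distance and direction of travel into a home vector, can be sketched as follows (an illustrative toy, not a model of the ant's actual neural implementation; real ants derive heading from celestial compass cues and distance from an odometer):

```python
import math

def path_integrate(steps):
    """Accumulate (heading, distance) steps, headings in radians, into a
    global displacement; the home vector is simply its negation."""
    x = y = 0.0
    for heading, dist in steps:
        x += dist * math.cos(heading)
        y += dist * math.sin(heading)
    return -x, -y

# demo: 3 units east, then 4 units north; the home vector points back
# to the start, approximately (-3.0, -4.0)
home = path_integrate([(0.0, 3.0), (math.pi / 2, 4.0)])
```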
The navigational information content of panoramic images
Panoramic images contain robust information on location in space and can also be used to define routes. There is an ongoing discussion in the insect and robot literature on whether discrete object recognition and representation are needed for landmark guidance. While the snapshot model of insect homing assumes that scenes are segmented into discrete landmarks [1••], recent optic flow methods work with global image differences, suggesting that segmentation is not essential (e.g. [2, 5]). The
The organization of learning walks and flights
In this context, it is interesting to understand in more detail the choreography of learning walks in ants and of learning flights in bees and wasps, because they demonstrate how animals actively shape the acquisition of navigational information (e.g. [16••, 17, 44]). The common element of these learning procedures is that the insects, upon leaving the nest or a foraging site, turn back towards the goal as they move away from it. They move along arcs, pivoting around the goal and counterturning
Routes and goals
Recent work has confirmed that walking insects following idiosyncratic routes are guided by landmarks and the landmark panorama (e.g. [8••, 14, 45, 46]). Is there a difference between on-route memories and goal memories? The latter appear to function like attractors, while the former appear to be organized as a sequence. To follow routes, it is sufficient to determine heading direction, whereas location information is required to identify goals. To determine heading direction, animals can employ
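The attractor-versus-sequence distinction can be caricatured in code (a hypothetical sketch; the mean-squared view mismatch below stands in for whatever comparison the insect actually performs). Route memories need only supply the heading associated with the best-matching stored view, whereas a goal memory attracts the agent by descent on an image-difference surface:

```python
import numpy as np

def route_heading(current, route):
    """Route memory: an ordered sequence of (stored_view, heading) pairs.
    Following the route only requires recalling the heading associated
    with the stored view that best matches the current view."""
    diffs = [np.mean((current - view) ** 2) for view, _ in route]
    return route[int(np.argmin(diffs))][1]

def goal_step(candidate_views, goal_view):
    """Goal memory as an attractor: among candidate views from neighbouring
    positions, pick the one that most reduces the mismatch with the stored
    goal view (one step of descent on an image-difference surface)."""
    diffs = [np.mean((v - goal_view) ** 2) for v in candidate_views]
    return int(np.argmin(diffs))
```

The route mechanism yields a direction but no estimate of location; the goal mechanism localizes, which is why only the latter behaves like an attractor around the goal.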
Neurobiological implementation
The next big challenge in insect navigation research is to investigate the neural implementation of this ability in the same detail as has been possible in vertebrates (e.g. [50]).
Most progress has been made in understanding the neural basis of the skylight polarization compass [51], but recent advances in functional neuroanatomy now also provide insights into where and when navigational memories are laid down in the insect brain. In ants, the organization of the mushroom body (see Figure 3)
Outlook
In all visual homing studies, it is important to take the viewpoint of the navigating animals. This includes not only their perspective (small ants, for instance, view the world from less than a millimetre above ground), their spatial resolution and their visual field, but also their spectral, polarization and motion sensitivities, all of which determine the salience of navigation-relevant cues. For instance, how terrestrial and celestial cues (e.g. the sun, the pattern of polarized skylight and the
References and recommended reading
Papers of particular interest, published within the period of review, have been highlighted as:
• of special interest
•• of outstanding interest
Acknowledgements
I thank Sara Stieb for providing the image in Figure 3a, Markus Knaden and Wolfgang Rössler for information, Allen Cheung, Piyankarie Jayatilaka and Wolfgang Stürzl for insightful comments on an early draft of the manuscript and the participants of the 2011 Bielefeld conference on Insect Homing: Mechanisms and Models for sharing their ideas. I thank the ARC Centre of Excellence in Vision Science, the Hermon Slade Foundation, the Defence Science and Technology Organization, Australia and the
References (64)
- et al. Visual and tactile learning of ground structures in desert ants. J. Exp. Biol. (2006)
- et al. Goal seeking in honeybees: matching of optic flow snapshots? J. Exp. Biol. (2010)
- et al. Ants use the panoramic skyline as a visual cue during navigation. Curr. Biol. (2009)
- et al. The binding and recall of snapshot memories in wood ants (Formica rufa L.). J. Exp. Biol. (2004)
- et al. The information content of panoramic images. II. View-based navigation in non-rectangular experimental arenas. J. Exp. Psychol.: Anim. Behav. Proc. (2008)
- et al. Ants learn geometry and features. Curr. Biol. (2009)
- et al. A manifold of spatial maps in the brain. Trends Cogn. Sci. (2010)
- et al. Landmark maps for honeybees. Biol. Cybern. (1987)
- et al. Biologically plausible visual homing methods based on optical flow techniques. Connection Sci. (2005)
- et al. Visual homing in insects and robots
- Spectral contrasts for landmark navigation. J. Opt. Soc. Am.
- Depth, contrast and view-based homing in outdoor scenes. Biol. Cybern.
- How might ants use panoramic views for route navigation? J. Exp. Biol.
- Holistic visual encoding of ant-like routes: navigation without waypoints. Adaptive Behav.
- Idiosyncratic route-based memories in desert ants, Melophorus bagoti: how do they interact with path-integration vectors? Neurobiol. Learn. Mem.
- Smells like home: desert ants, Cataglyphis fortis, use olfactory landmarks to pinpoint the nest. Front. Zool.
- Desert ants benefit from combining visual and olfactory landmarks. J. Exp. Biol.
- Desert ants use foraging distance to adapt the nest search to the uncertainty of the path integrator. Behav. Ecol.
- Homing strategies of the Australian desert ant Melophorus bagoti. I. Proportional path-integration takes the ant half-way home. J. Exp. Biol.
- Homing strategies of the Australian desert ant Melophorus bagoti. II. Interaction of the path integrator with visual cue information. J. Exp. Biol.
- Vector-based and landmark-guided navigation in desert ants inhabiting landmark-free and landmark-rich environments. J. Exp. Biol.
- Path integration provides a scaffold for landmark learning in desert ants. Curr. Biol.
- Preferred viewing directions of bumblebees (Bombus terrestris L.) when learning and approaching their nest site. J. Exp. Biol.
- Precision and reliability in animal navigation. Bull. Math. Biol.
- Static and dynamic snapshots for goal localization in insects? Comm. Integr. Biol.
- Honeybees learn the colours of landmarks. J. Comp. Physiol.
- The behavioural relevance of landmark texture for honeybee homing. Front. Behav. Neurosci.
- The properties of the visual system in the Australian desert ant Melophorus bagoti. Arth. Struct. Dev.
- Visual gaze control during peering flight manoeuvres in honeybees. Proc. R. Soc. B
- The fine structure of honeybee head and body yaw movements in a homing task. Proc. R. Soc. B
- Simulated visual homing in desert ant natural environments: efficiency of skyline cues. Biol. Cybern.
- Landmarks or panoramas: what do navigating ants attend to for guidance? Front. Zool.