Abstract
To study colour signals as animals perceive them, visual ecologists usually rely on models of colour vision that do not consider patterns, that is, the spatial arrangement of features within a signal.
HMAX describes a family of models used to study pattern perception in human vision research, which have also inspired many artificial intelligence algorithms. In this article, we highlight that the sensory and brain mechanisms modelled in HMAX are widespread, occurring in most, if not all, vertebrates, thus offering HMAX models a wide range of applications in visual ecology.
We begin with a short description of the neural mechanisms of pattern perception in vertebrates, emphasizing similarities in processes across species. We then provide a detailed description of HMAX, highlighting how the model is linked to biological vision. We further present sparse-HMAX, an extension of HMAX that includes a sparse coding scheme, which makes the model even more biologically realistic and provides a tool for estimating the efficiency of information processing. In an illustrative analysis, we then show that HMAX performs better than two other reference methods (manually positioned landmarks and the SURF algorithm) for estimating similarities between faces in a non-human primate species.
This manuscript is accompanied by MATLAB code providing an efficient implementation of HMAX and sparse-HMAX, which can be flexibly parameterized to model non-human colour vision, with the goal of encouraging visual ecologists to adopt tools from computer vision and computational neuroscience.