RT Journal Article
SR Electronic
T1 Investigating the use of odour and colour foraging cues by rosy-faced lovebirds (Agapornis roseicollis) using deep-learning based behavioural analysis
JF bioRxiv
FD Cold Spring Harbor Laboratory
SP 2024.02.18.580921
DO 10.1101/2024.02.18.580921
A1 Wai Tsang, Winson King
A1 Kei Poon, Emily Shui
A1 Newman, Chris
A1 Buesching, Christina D.
A1 Sin, Simon Yung Wa
YR 2024
UL http://biorxiv.org/content/early/2024/02/21/2024.02.18.580921.abstract
AB Olfaction and vision can play important roles in optimizing the foraging decisions of birds, enabling them to maximize their net rate of energy intake while searching for, handling, and consuming food. Parrots have been used extensively in avian cognition research, and some species use olfactory cues to find food. Here we pioneered machine learning analysis and pose estimation with convolutional neural networks (CNNs) to elucidate the relative importance of visual and olfactory cues for informing foraging decisions in the rosy-faced lovebird (Agapornis roseicollis), a non-typical model species. In a binary choice experiment, we used markerless body pose tracking to analyse bird response behaviours. Rosy-faced lovebirds quickly learnt to discriminate the feeder provisioned with food by forming an association with visual (red/green papers) but not olfactory (banana/almond odour) cues. When visual cues indicated the provisioned and empty feeders, feeder choice was more successful, choice latency was shorter, and interest in the empty feeder was significantly lower. This demonstrates that visual cues alone are sufficient to inform lovebird foraging decisions without the use of olfactory cues, suggesting that selection has not driven olfactory-based foraging in lovebird evolution. Competing Interest Statement: The authors have declared no competing interest.