Abstract
Automated behavioural monitoring is increasingly required for animal welfare and precision agriculture. In pig farming, detailed analyses of sow activity are essential to identify and reduce the risk of piglets being crushed during postural changes of their mothers. Here we introduce a new, non-invasive, fast and accurate method for monitoring sow behaviour based on millimeter-wave radars and deep learning analysis. We used our method to predict postural changes in crated sows and to distinguish the dangerous ones that lie down abruptly from those that lie down carefully using transient postures. Two radars were placed on a metal backing above the head and the upper part of the back of each of ten sows to monitor their activity for 5 hours. We analysed the radar data with a convolutional neural network and identified five postures. The average sensitivity was 96.9% for standing, 90.8% for lying, 91.4% for nursing, 87.6% for sitting, but only 11.9% for kneeling. However, the average specificity and accuracy were greater than 92% for all five postures. Interestingly, two of the ten sows occasionally moved directly from standing to lying, without using the transient postures of sitting and kneeling, thereby displaying behaviours that are risky for their piglets. Our radar-based classifier is more accurate, faster and requires less memory than current computer-vision approaches. Training on data from more sows will further improve the algorithm's performance and facilitate large-scale deployment in animal farming.
Highlights
Automated behavioural analysis is a major challenge for precision farming.
We developed automated detection of lactating sow postures with radars and deep learning.
We identified five postures, including transitions that are risky for the piglets.
Our method is accurate, fast and requires less memory than computer-vision approaches.
Radars thus hold considerable promise for high-throughput recording of livestock activity.
Competing Interest Statement
The authors have declared no competing interest.