Evaluation of the physical activity of a group of gestating sows using an artificial neural network
Abstract
Physical activity influences the energy requirements of group-housed gestating sows, and changes in their activity or behaviour patterns may signal welfare or health disorders. The ear-tag accelerometers usually used to assess activity are fragile, costly, and invasive sensors. Cameras can instead record videos of the group of sows, but manually analysing the sows' different activities requires countless hours. In the present work, the performance of a deep-learning algorithm developed to automatically detect the different activities of gestating sows in images is evaluated. Two groups of 18 sows, housed in two pens, were included in the experiment and followed during two consecutive gestations. Two cameras recorded each pen continuously. Six activities (lying ventrally, lying laterally, sitting, standing, eating, and drinking) were manually annotated by animal behaviour experts on 1,331 images extracted from the videos. This annotated set of images was used to train the algorithm, an object-detection model that uses convolutional neural networks to detect and classify objects in an image. A separate set of 403 images was used to validate the performance of the algorithm, which proved reliable. The classification accuracies for sows lying ventrally and eating were 82% and 87%, respectively. The lowest accuracies were obtained for the sitting (47%) and drinking (53%) activities, probably partly because these activities were underrepresented in the training dataset: sitting accounted for only 3% of the labelled activities, and drinking for only 1%. On a daily basis, the sows spent 75% of their time lying laterally (53%) or ventrally (22%). They were more active (i.e. standing) between 00:30 and 09:00, owing to the start of the new feeding day at 00:00. Physical activity also varied between the two groups of sows and across gestation weeks.
To improve the algorithm, new data will be collected and a tracking module will be integrated, enabling detection of the walking activity and analysis at the individual level.
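As an illustration of the per-class evaluation reported above, the sketch below computes the fraction of annotated instances of each activity that the detector classified correctly. The label names and (annotation, prediction) pairs are hypothetical examples, not the study's data or code:

```python
from collections import Counter

def per_class_accuracy(pairs):
    """Per-class accuracy: for each annotated activity class, the fraction
    of its instances whose predicted label matches the annotation."""
    total, correct = Counter(), Counter()
    for truth, pred in pairs:
        total[truth] += 1
        if truth == pred:
            correct[truth] += 1
    return {cls: correct[cls] / total[cls] for cls in total}

# Hypothetical (annotation, prediction) pairs for the six activity classes.
pairs = [
    ("lying_lateral", "lying_lateral"),
    ("lying_ventral", "lying_ventral"),
    ("lying_ventral", "lying_lateral"),  # confusion between lying postures
    ("sitting", "standing"),             # rare classes are often misclassified
    ("eating", "eating"),
    ("drinking", "eating"),
]

print(per_class_accuracy(pairs))
```

With such a tally, underrepresented classes like sitting and drinking contribute few instances, so their accuracy estimates are both low and noisy, consistent with the pattern reported in the abstract.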