Towards an automated classification of pig calls according to their emotional valence and behavioural context: a comparison of methods

Conference paper. Year: 2021


Marek Špinka
  • Function: Author
Abstract

Emotions can affect vocalizations directly or indirectly through associated changes in the brain, lungs, larynx and/or vocal tract. As a result, vocal expression of emotion has been observed across species and could serve as a reliable, non-invasive indicator for assessing animal emotions. In pigs (Sus scrofa), vocal expression of emotions has been relatively well studied. However, it is not known whether the vocal indicators revealed in previous studies are valid across call types and contexts, and could therefore be used for automated real-time monitoring of pig emotions on-farm. To investigate this question, we analysed an extensive and unique dataset of low-frequency (LF) and high-frequency (HF) calls emitted by pigs of different breeds, sexes and ages across many different commercial situations from birth to slaughter (7414 calls from 411 pigs). We first tested how four vocal parameters that represented a high amount of the variance in the data (duration, amplitude modulation rate, spectral center of gravity, and Wiener entropy) changed as a function of the valence attributed to the contexts of production, and as a function of the contexts themselves, using linear mixed-effects models (LMMs). We then tested two automated methods of classifying the calls: permuted discriminant function analyses (pDFA) based on the four selected vocal parameters, and an image-classification neural network based on spectrograms of the calls. The LMMs revealed that both the valence and the context affected all four vocal parameters in both LF and HF calls (p < 0.001 for all models). The neural network achieved higher classification accuracy than the pDFA, both for the valence (pDFA: weighted average across LF and HF = 85.2% against a chance level of 55.87%; neural network = 91.5%) and for the context (pDFA: weighted average across LF and HF = 24.4% against a chance level of 15.5%; neural network = 83.8%).
Therefore, despite variability in age, sex, body size and situation, the assumed emotional valence and the context of production can be correctly identified above chance levels, particularly when using a neural network to classify spectrograms of the entire vocalizations. These results suggest that an automated recognition system could be developed to monitor pig welfare on-farm and allow real-time discrimination of emotional states according to the valence and/or context of production.
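Two of the four vocal parameters used in the study, spectral center of gravity and Wiener entropy (spectral flatness), have standard signal-processing definitions. The sketch below is an illustrative NumPy example of those two formulas applied to a toy power spectrum, not the authors' actual extraction pipeline; the function names and the synthetic spectra are ours.

```python
import numpy as np

def spectral_centroid(freqs, power):
    """Spectral center of gravity: power-weighted mean frequency (Hz)."""
    return float(np.sum(freqs * power) / np.sum(power))

def wiener_entropy(power, eps=1e-12):
    """Wiener entropy (spectral flatness): geometric mean / arithmetic mean
    of the power spectrum. Close to 1.0 for noise-like (flat) spectra,
    close to 0 for tonal spectra with energy in few bands."""
    p = power + eps  # avoid log(0) for empty bands
    return float(np.exp(np.mean(np.log(p))) / np.mean(p))

# Toy example: a flat (noise-like) spectrum vs. a tonal one
freqs = np.linspace(0.0, 8000.0, 256)
flat = np.ones_like(freqs)
tonal = np.full_like(freqs, 1e-6)
tonal[64] = 1.0  # energy concentrated near 2 kHz

print(round(spectral_centroid(freqs, flat)))  # 4000 (midpoint of the band)
print(round(wiener_entropy(flat), 3))         # 1.0
print(wiener_entropy(tonal) < 0.1)            # True
```

Noisy, harsh calls (e.g. screams) tend toward higher Wiener entropy and a higher spectral center of gravity than tonal grunts, which is why such parameters can separate call types and, as the study tests, contexts.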
No file deposited

Dates and versions

hal-03653802, version 1 (28-04-2022)

Identifiers

  • HAL Id: hal-03653802, version 1

Cite

Elodie Briefer, Ciara Sypherd, Pavel Linhart, Lisette Leliveld, Mónica Padilla de La Torre, et al.. Towards an automated classification of pig calls according to their emotional valence and behavioural context: a comparison of methods. 54th Congress of the ISAE, International Society for Applied Ethology, Aug 2021, Online, France. pp.113. ⟨hal-03653802⟩