Agroecology: When artificial intelligence translates pig vocalisations

Pigs express their emotions through vocalisations. Recognising these sounds, and the emotions they express, would provide the information necessary for farmers to adapt their interventions and ensure the welfare of pigs throughout their lives. This is why INRAE, the Swiss Federal Institute of Technology (ETH) and the University of Copenhagen have coordinated the development of a system for recognising pig vocalisations as part of the European SOUNDWEL project. Their results, published on 7 March in Scientific Reports, point to the possibility of an automatic recognition tool for vocalisations to monitor and improve pig welfare on-farm.

Pigs express their emotions through different types of vocalisations (such as grunts, screams and squeals), each of which has many variations, some of them subtle. Deciphering these sounds would help livestock farmers to better understand the emotions expressed by their animals and to improve their welfare. The researchers' idea is to develop a system that recognises and distinguishes pig vocalisations, the emotions they convey and the situation that produced them, in order to support livestock farmers in their decision-making.

A library of 7,400 vocalisations

In order to build this tool, the researchers started by collecting thousands of vocalisations. In the end, over 7,400 good-quality vocalisations from 411 pigs, gathered in different European laboratories, could be analysed. These sounds were recorded in 19 different contexts, from the birth of the pigs and throughout their lives, in different types of indoor rearing (e.g. on slatted floors or on straw) and in slaughterhouses. These contexts can be sources of positive emotions (suckling, reunion with fellow pigs…) or negative ones (fights, isolation…). The researchers then combined the expertise of ethologists with that of bioacousticians, who analysed the acoustic structure of the recorded vocalisations in detail (pitch, purity of the sound…), and with computational methods from artificial intelligence. On this basis they worked on classifying the vocalisations automatically according to their emotional valence (positive or negative emotion) and the situation in which they were emitted, with a view to possible action by the livestock farmer.
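Purely as an illustration of the general idea (the study's actual pipeline is far more sophisticated, combining detailed bioacoustic measurements with machine learning), classifying a call from simple acoustic descriptors can be sketched in a few lines of Python. Every waveform, feature, label and threshold below is invented for the sketch and does not come from the published work:

```python
import math

def zero_crossing_rate(samples):
    """Fraction of adjacent sample pairs that change sign -- a crude pitch proxy."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0))
    return crossings / (len(samples) - 1)

def mean_amplitude(samples):
    """Average absolute amplitude -- a crude loudness proxy."""
    return sum(abs(s) for s in samples) / len(samples)

def features(samples):
    """Reduce a waveform to a small feature vector."""
    return (zero_crossing_rate(samples), mean_amplitude(samples))

def synth_call(freq_hz, amp, sr=8000, dur_s=0.25):
    """Synthesise a toy pure-tone 'call' (real pig calls are far more complex)."""
    n = int(sr * dur_s)
    return [amp * math.sin(2 * math.pi * freq_hz * t / sr) for t in range(n)]

# Toy labelled library: low-pitched, quiet "grunts" tagged positive vs
# high-pitched, loud "screams" tagged negative. Labels are invented.
library = [
    (features(synth_call(150, 0.2)), "positive"),
    (features(synth_call(180, 0.3)), "positive"),
    (features(synth_call(1200, 0.9)), "negative"),
    (features(synth_call(1500, 0.8)), "negative"),
]

def classify(samples):
    """Nearest neighbour in feature space against the labelled library."""
    f = features(samples)
    return min(library, key=lambda item: math.dist(f, item[0]))[1]

print(classify(synth_call(1400, 0.85)))  # resembles the loud screams: negative
print(classify(synth_call(160, 0.25)))   # resembles the quiet grunts: positive
```

The point of the sketch is only the shape of the task: turn sound into a handful of acoustic features, then label a new call by comparison with an already-labelled library.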

Artificial intelligence to translate the emotion that pigs are experiencing

The results show that artificial intelligence is very effective at recognising not only the emotional valence of the vocalisations (91.5% accuracy) but also the situation in which they were emitted (82% accuracy). On receiving a new sound, the system automatically compares it with previously classified sounds in order to classify it in turn. This system could be of great help to livestock farmers, as it could alert them in real time when a situation requires their immediate intervention, such as a piglet being crushed by its mother, or repeated or prolonged fights within a group, which indicate a problem. It would also allow livestock farmers to reinforce situations that are positive for the pigs, helping them to evaluate, for example, whether new toys or infrastructure enhance the welfare of their animals. The system is also highly innovative for research on the vocalisations of pigs (and of other animals), as it allows work on a much larger scale than the more common, time-consuming manual analyses.
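The real-time alerting idea described above can be sketched as a simple rule layered on top of such a classifier: raise an alert when too many negative-valence calls occur within a short window. The window length, threshold and function names below are illustrative assumptions, not part of the published system:

```python
from collections import deque

def make_alert_monitor(window_s=60, threshold=3):
    """
    Return a function that ingests (timestamp_s, valence) events and reports
    whether the farmer should be alerted: True once `threshold` or more
    negative calls fall inside the last `window_s` seconds. Both parameters
    are illustrative, not values from the study.
    """
    negatives = deque()

    def ingest(timestamp_s, valence):
        if valence == "negative":
            negatives.append(timestamp_s)
        # Drop negative calls that have aged out of the window.
        while negatives and negatives[0] < timestamp_s - window_s:
            negatives.popleft()
        return len(negatives) >= threshold

    return ingest

monitor = make_alert_monitor(window_s=60, threshold=3)
print(monitor(0, "negative"))   # False: one negative call so far
print(monitor(10, "positive"))  # False: positive calls never trigger alerts
print(monitor(20, "negative"))  # False: two negative calls in the window
print(monitor(30, "negative"))  # True: three negative calls within 60 s
```

A real deployment would of course tune such rules per context (e.g. reacting immediately to a single crushing-type scream, but only to repeated fight calls), which is exactly the kind of decision the valence-and-context classification is meant to support.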

Similar acoustic monitoring systems already exist on farms to monitor the health of pigs by analysing the noise of their coughs. The INRAE research team is now working on adding an analysis of pig vocalisations to this listening system in order to combine physical and mental health measures for better welfare on-farm.

REFERENCE
Elodie F. Briefer, Ciara C.-R. Sypherd, Pavel Linhart, Lisette M. C. Leliveld, Monica Padilla de la Torre, Eva R. Read, Carole Guérin, Véronique Deiss, Chloé Monestier, Jeppe H. Rasmussen, Marek Špinka, Sandra Düpjan, Alain Boissy, Andrew M. Janczak, Edna Hillmann and Céline Tallet, "Classification of pig calls produced from birth to slaughter according to their emotional valence and context of production", Scientific Reports (2022). DOI: 10.1038/s41598-022-07174-8