Happy Cow or Thinking Pig? WUR Wolf—Facial Coding Platform for Measuring Emotions in Farm Animals

AI ◽  
2021 ◽  
Vol 2 (3) ◽  
pp. 342-354
Author(s):  
Suresh Neethirajan

Emotions play an indicative and informative role in the investigation of farm animal behaviors. Systems that can measure and respond to emotions provide a natural user interface for the digitalization of animal welfare platforms. The faces of farm animals can be one of the richest channels for expressing emotions. WUR Wolf (Wageningen University & Research: Wolf Mascot), a real-time facial recognition platform that can automatically code the emotions of farm animals, is presented in this study. The developed Python-based algorithms detect and track the facial features of cows and pigs, analyze the appearance, ear postures, and eye white regions, and correlate these with the mental/emotional states of the farm animals. The system is trained on a dataset of facial-feature images of farm animals collected on over six farms and has been optimized to operate with an average accuracy of 85%. From these features, the emotional states of the animals are determined in real time. The software detects 13 facial actions and infers nine emotional states, including whether the animal is aggressive, calm, or neutral. A real-time emotion recognition system based on YOLOv3, a faster YOLOv4-based facial detection platform, and an ensemble of convolutional neural networks (RCNN) is presented. Detecting the facial features of farm animals simultaneously in real time enables many new interfaces for automated decision-making tools for livestock farmers. Emotion sensing offers vast potential for improving animal welfare and animal–human interactions.
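
The detect-then-classify pipeline described above (a YOLO-based face detector feeding an ensemble of CNNs) can be sketched in miniature. The snippet below is a hypothetical illustration, not the WUR Wolf code: only three of the nine emotion labels are named in the abstract, and the "models" here are stand-in scoring functions where real use would substitute trained networks.

```python
import numpy as np

# Three of the nine emotional states named in the abstract; the rest are unspecified.
EMOTIONS = ["aggressive", "calm", "neutral"]

def softmax(logits):
    """Convert raw scores to a probability distribution."""
    e = np.exp(logits - np.max(logits))
    return e / e.sum()

def ensemble_predict(face_crop, models):
    """Average per-model softmax scores (the ensemble-CNN idea) and
    return the most probable emotion label for a detected face crop."""
    probs = np.mean([softmax(m(face_crop)) for m in models], axis=0)
    return EMOTIONS[int(np.argmax(probs))]

# Stand-in "models": fixed logit vectors instead of trained CNNs.
crop = np.zeros((64, 64))  # placeholder for a cropped cow/pig face
models = [lambda c: np.array([0.2, 1.8, 0.4]),
          lambda c: np.array([0.1, 2.1, 0.3])]
print(ensemble_predict(crop, models))  # both stand-in models favor "calm"
```

Averaging softmax outputs across ensemble members is one common way to combine CNN classifiers; the paper does not specify the exact fusion rule.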


2020 ◽  
Author(s):  
Avelyne S. Villain ◽  
Mathilde Lanthony ◽  
Carole Guérin ◽  
Camille Noûs ◽  
Céline Tallet

Enriching the lives of farm animals is an obligation under intensive farming conditions. For pigs, manipulable materials are mandatory when no bedding is available. Like manipulable objects, positive human interactions might be considered enrichment, as they give the animals occasions to interact, increase their activity, and lead to positive emotional states. In this study, we investigated how weaned piglets perceived a manipulable object and a familiar human. After similar familiarization with both stimuli, twenty-four weaned piglets were tested for a potential preference for one of the stimuli and submitted to isolation/reunion tests to evaluate the emotional value of the stimuli. We hypothesized that being reunited with a stimulus would attenuate the stress of social isolation and promote positive behaviors, all the more so if the stimulus had a positive emotional value for piglets. Although our behavioural data did not reveal a preference for one of the stimuli, piglets approached the human more often and were observed lying down only near the human. Using behavioural and bioacoustic data, we showed that reunion with the human decreased the time piglets spent in an attentive state, and their mobility, more than reunion with the object or isolation did. Vocalizations differed between reunions with the object and with the human, and both differed from vocalizations during isolation: the human's presence led to shorter, noisier grunts with a higher frequency range. Finally, both stimuli decreased the isolation stress of piglets, and piglets seemed to be in a more positive emotional state with the human than with the object. This confirms the potential of positive human interactions as pseudo-social enrichment for pigs.


2014 ◽  
Vol 25 (4) ◽  
pp. 279-287 ◽  
Author(s):  
Stefan Hey ◽  
Panagiota Anastasopoulou ◽  
André Bideaux ◽  
Wilhelm Stork

Ambulatory assessment of emotional states, as well as of psychophysiological, cognitive, and behavioral reactions, is an approach that is increasingly used in psychological research. Owing to new developments in information and communication technologies and the improved application of mobile physiological sensors, various new systems have been introduced. Experience-sampling methods allow researchers to assess dynamic changes in subjective evaluations in real time, and new sensor technologies permit the measurement of physiological responses. In addition, new technologies facilitate the interactive assessment of subjective, physiological, and behavioral data in real time. Here, we describe these recent developments from the perspective of engineering science and discuss potential applications in the field of neuropsychology.


2021 ◽  
Vol 11 (11) ◽  
pp. 4758
Author(s):  
Ana Malta ◽  
Mateus Mendes ◽  
Torres Farinha

Maintenance professionals and other technical staff regularly need to learn to identify new parts in car engines and other equipment. The present work proposes a model of a task assistant based on a deep learning neural network. A YOLOv5 network is used to recognize some of the constituent parts of an automobile. A dataset of car engine images was created, and eight car parts were marked in the images. The neural network was then trained to detect each part. The results show that YOLOv5s is able to detect the parts in real-time video streams with high accuracy, making it useful for training professionals to work with new equipment using augmented reality. The architecture of an object recognition system using augmented reality glasses is also designed.
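
YOLO-family detectors such as YOLOv5 produce many overlapping candidate boxes per part, which are pruned by non-maximum suppression (NMS) before a label is shown. A minimal NumPy sketch of that post-processing step follows; it is a generic illustration of the technique, not the authors' code.

```python
import numpy as np

def iou(a, b):
    """Intersection-over-union of two boxes given as [x1, y1, x2, y2]."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def nms(boxes, scores, thr=0.5):
    """Greedily keep the highest-scoring box, dropping boxes that
    overlap an already-kept box by more than thr."""
    order = np.argsort(scores)[::-1]
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))
        rest = order[1:]
        overlaps = np.array([iou(boxes[i], boxes[j]) for j in rest])
        order = rest[overlaps < thr]
    return keep

boxes = np.array([[0, 0, 10, 10], [1, 1, 11, 11], [20, 20, 30, 30]], float)
scores = np.array([0.9, 0.8, 0.7])
print(nms(boxes, scores))  # the second box overlaps the first and is suppressed
```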


2021 ◽  
Vol 17 (7) ◽  
pp. 155014772110248
Author(s):  
Miaoyu Li ◽  
Zhuohan Jiang ◽  
Yutong Liu ◽  
Shuheng Chen ◽  
Marcin Wozniak ◽  
...  

Physical health problems caused by poor sitting postures are becoming increasingly serious and widespread, especially among sedentary students and workers. Existing video-based and sensor-based approaches can achieve high accuracy, but they have limitations such as breaching privacy and relying on specific sensor devices. In this work, we propose Sitsen, a non-contact wireless sitting posture recognition system that uses radio frequency signals alone, which neither compromises privacy nor requires specific sensors. We demonstrate that Sitsen can successfully recognize five habitual sitting postures with just one lightweight and low-cost radio frequency identification tag. The intuition is that different postures induce different phase variations. Because the received phase readings are corrupted by environmental noise and hardware imperfections, we employ a series of signal processing schemes to obtain clean phase readings. By using a sliding window approach to extract effective features from the measured phase sequences and employing an appropriate machine learning algorithm, Sitsen achieves robust and high performance. Extensive experiments were conducted in an office with 10 volunteers. The results show that our system can recognize different sitting postures with an average accuracy of 97.02%.
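
The sliding-window feature extraction described above can be shown with a toy sketch. This is a hypothetical example: the window length, step size, and the three statistics are assumptions for illustration, not the paper's actual feature set.

```python
import numpy as np

def sliding_features(phase, win=4, step=2):
    """Slide a fixed-length window over a cleaned phase sequence and
    compute simple per-window statistics (mean, std, range)."""
    feats = []
    for start in range(0, len(phase) - win + 1, step):
        w = phase[start:start + win]
        feats.append([w.mean(), w.std(), w.max() - w.min()])
    return np.array(feats)

phase = np.arange(10.0)  # stand-in for denoised RFID phase readings
X = sliding_features(phase)
print(X.shape)           # (4, 3): four windows, three features each
```

Rows of this feature matrix would then feed a classifier (the paper only says "an appropriate machine learning algorithm") to label the five sitting postures.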


Animals ◽  
2021 ◽  
Vol 11 (8) ◽  
pp. 2253
Author(s):  
Severiano R. Silva ◽  
José P. Araujo ◽  
Cristina Guedes ◽  
Flávio Silva ◽  
Mariana Almeida ◽  
...  

Specific animal-based indicators that can be used to predict animal welfare have been at the core of protocols for assessing the welfare of farm animals, such as those produced by the Welfare Quality project. At the same time, the contribution of technological tools to the accurate, real-time assessment of farm animal welfare is also evident. Solutions based on technological tools fit into the precision livestock farming (PLF) concept, which has improved productivity, economic sustainability, and animal welfare on dairy farms. PLF has been adopted only recently; nevertheless, the need for technological support on farms is receiving increasing attention and has translated into significant scientific contributions in various fields of the dairy industry, with an emphasis on the health and welfare of cows. This review aims to present recent advances of PLF in dairy cow welfare, particularly in the assessment of lameness, mastitis, and body condition, which are among the most relevant animal-based indicators of cow welfare. Finally, the possibility of integrating the information obtained by PLF into a welfare assessment framework is discussed.


Animals ◽  
2021 ◽  
Vol 11 (3) ◽  
pp. 724
Author(s):  
Alberto Cesarani ◽  
Giuseppe Pulina

The concept of welfare applied to farm animals has undergone a remarkable evolution. The growing awareness of citizens pushes farmers to guarantee the highest possible level of welfare for their animals. New perspectives could be opened by reasoning about animal welfare around the concept of domestic animals, especially farm animals, as partial human artifacts. It is therefore important to understand how far a particular behavior of a farm animal departs from the natural behavior of its ancestors. This paper is a contribution to better understanding the role of the genetics of farm animals in their behavior. This means that the naïve approach to animal welfare, which regards returning animals to their natural state as the goal, should be challenged, and that genetics should be considered in welfare assessment.


Sensors ◽  
2021 ◽  
Vol 21 (2) ◽  
pp. 405
Author(s):  
Marcos Lupión ◽  
Javier Medina-Quero ◽  
Juan F. Sanjuan ◽  
Pilar M. Ortigosa

Activity Recognition (AR) is an active research topic focused on detecting human actions and behaviours in smart environments. In this work, we present DOLARS (Distributed On-line Activity Recognition System), an on-line activity recognition platform in which data from heterogeneous sensors, including binary, wearable, and location sensors, are evaluated in real time. Different descriptors and metrics from the heterogeneous sensor data are integrated into a common feature vector, extracted by a sliding window approach under real-time conditions. DOLARS provides a distributed architecture in which: (i) the stages for processing data in AR are deployed on distributed nodes; (ii) temporal cache modules compute metrics that aggregate sensor data for computing feature vectors efficiently; (iii) publish-subscribe models are integrated both to spread data from sensors and to orchestrate the nodes (communication and replication) for computing AR; and (iv) machine learning algorithms are used to classify and recognize the activities. A successful case study of daily activity recognition, developed in the Smart Lab of the University of Almería (UAL), is presented. The results show encouraging performance in recognizing sequences of activities and demonstrate the need for distributed architectures to achieve real-time recognition.
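
The publish-subscribe model used to spread sensor data and orchestrate nodes can be sketched with a minimal in-process broker. This is an illustrative toy, not the DOLARS implementation (which runs distributed across nodes); the topic name and message shape are invented for the example.

```python
from collections import defaultdict

class Broker:
    """Minimal in-process publish-subscribe broker: publishers and
    subscribers are decoupled through named topics."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self.subscribers[topic]:
            callback(message)

broker = Broker()
received = []
# An AR processing node subscribes to a sensor stream...
broker.subscribe("sensors/wearable", received.append)
# ...and a sensor node publishes readings without knowing its consumers.
broker.publish("sensors/wearable", {"accel": [0.1, 0.0, 9.8]})
print(received)
```

In a distributed setting the same pattern is typically backed by a message broker (e.g. MQTT) rather than in-process callbacks; the paper does not name the transport.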


2021 ◽  
Vol 11 (4) ◽  
pp. 1933
Author(s):  
Hiroomi Hikawa ◽  
Yuta Ichikawa ◽  
Hidetaka Ito ◽  
Yutaka Maeda

In this paper, a real-time dynamic hand gesture recognition system with a gesture-spotting function is proposed. In the proposed system, input video frames are converted to feature vectors, which are used to form a posture sequence vector that represents the input gesture. Gesture identification and gesture spotting are then carried out in the self-organizing map (SOM)-Hebb classifier. The gesture-spotting function detects the end of a gesture by using the vector distance between the posture sequence vector and the winner neuron's weight vector. The proposed gesture recognition method was tested by simulation and in a real-time gesture recognition experiment. The results revealed that the system could recognize nine types of gesture with an accuracy of 96.6% and successfully output the recognition result at the end of each gesture using the spotting result.
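
The spotting rule described above (compare the posture sequence vector with the winner neuron's weight vector) can be sketched as follows. The weight matrix, threshold, and 2-D feature space are illustrative assumptions, not the paper's trained SOM.

```python
import numpy as np

def winner(weights, v):
    """Return the index of the winner neuron (closest weight vector)
    and its Euclidean distance to the input vector."""
    d = np.linalg.norm(weights - v, axis=1)
    i = int(np.argmin(d))
    return i, float(d[i])

def spot_gesture_end(weights, v, threshold=0.5):
    """Declare the gesture finished when the winner distance falls
    below the threshold, and report which neuron (gesture) won."""
    i, dist = winner(weights, v)
    return i, dist <= threshold

# Toy SOM with three neurons in a 2-D feature space.
weights = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0]])
label, ended = spot_gesture_end(weights, np.array([1.1, 0.9]))
print(label, ended)  # neuron 1 wins and its distance is under threshold
```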

