Artificial Intelligence Impact Affective State Recognition in Livestock

Farm animals, numbering over 70 billion worldwide, are increasingly managed in large-scale, intensive farms. With both public awareness and scientific evidence growing that farm animals experience suffering, as well as affective states such as fear, frustration and distress, there is an urgent need to develop efficient and accurate methods for monitoring their welfare. At present, there are no scientifically validated ‘benchmarks’ for quantifying transient emotional (affective) states in farm animals, and no established measures of good welfare, only indicators of poor welfare, such as injury, pain and fear. Conventional approaches to monitoring livestock welfare are time-consuming, interrupt farming processes and involve subjective judgements. Biometric sensor data enabled by artificial intelligence offer an emerging smart solution for monitoring livestock unobtrusively, but their potential for quantifying affective states, and ground-breaking solutions in their application, are yet to be realized.

Keywords: animal welfare; sensors; modelling
Information
Contributor :
View Times: 171
Revisions: 2 times (View History)
Update Date: 25 Apr 2022

    1. Modelling Farm Animal Affective State and Behaviour Using Multimodal Sensor Fusion

    High-fidelity, integrated multimodal imaging and sensing technologies have the potential to revolutionize how livestock are monitored and cared for [1][2]. Currently, there are no commercially available multimodal biosensing platforms capable of monitoring the affective and behavioural states of farm animals in real time [3]. Developing such a platform would allow comprehensive quantitative analyses of these states, potentially leading to significant insights and advances in our understanding of optimal approaches to animal care. The development and integration of next-generation multimodal sensor systems and advanced statistical methods to estimate and predict affective and behavioural states in farm animals would open significant pathways for enhancing animal welfare.
    Establishing a distributed network of unobtrusive, non-invasive sensors to collect real-time behavioural and physiological data from farm animals could be the first step towards realizing such a framework (Figure 1). Non-invasive sensors, comprising video and thermal imaging cameras, microphones, and wearable TNO Holst 3-in-1 patches (monitoring heart rate, respiration rate, and activity), would enable the collection of data on behavioural and affective states. Two kinds of data could be collected: data recorded during natural behaviour, without any interference from experimenters, and data recorded while defined positive and negative affective states are induced in the animals using established protocols. Such protocols include withholding high-value food from animals to induce disappointment [4]; placing animals in crowded situations to induce frustration [5][6]; and petting and socializing the animals to induce contentment [7][8].
    Figure 1. Multimodal affective state recognition data analysis workflow framework of the per-animal quantified approach. EEG—electroencephalogram; FNIRS—functional near-infrared spectroscopy; ML—machine learning; CNN—convolutional neural networks.

    2. Classification and Annotation of Affective States and Behavioural Events

    Common methods to identify affective and behavioural events in farm animals from sensor and AI-enabled sensor data are: (a) Automatic affective state classification, capitalizing on preliminary work [9] conducted by FarmWorx of Wageningen University. A pre-existing trained facial recognition platform for farm animals, such as WUR Wolf (Wageningen University and Research—Wolf Mascot) [9], can be used to classify changes in affective state over time in pigs and cows based on video camera data (Figure 2). (b) Manual annotation of behavioural and emotional events in the data sets by ethologists and behavioural scientists with specific expertise in cow and pig behaviour, providing “gold-standard” annotated data sets. The annotators could evaluate one category of behaviour (e.g., feeding, playing, resting) or affective state (e.g., fearful, happy, relaxed) at a time for all the animals under study, to maintain consistent scoring across animals. Krippendorff’s alpha coefficient could be calculated to quantify reliability across annotators and metrics, and to assess the influence of unequal sample sizes and of disparities between dimensional and categorical variables on the results.
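    As a sketch of the reliability check described above, Krippendorff’s alpha for nominal labels can be computed directly from the labels each annotator assigns to each unit (e.g., one video segment). The category names below are hypothetical examples, not labels from an actual annotation campaign:

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal labels.

    `units` is a list where each element holds the labels that the
    annotators assigned to one unit; units with fewer than two labels
    carry no reliability information and are skipped.
    """
    pair_counts = Counter()   # coincidence-matrix entries (weighted label pairs)
    label_totals = Counter()  # marginal label frequencies
    n = 0                     # total number of pairable labels
    for labels in units:
        m = len(labels)
        if m < 2:
            continue
        n += m
        # Each ordered pair within a unit contributes 1/(m-1), per the
        # standard coincidence-matrix construction.
        for a, b in permutations(labels, 2):
            pair_counts[(a, b)] += 1.0 / (m - 1)
        for a in labels:
            label_totals[a] += 1
    if n == 0:
        return None
    # Observed disagreement: coincidences whose two labels differ.
    d_obs = sum(c for (a, b), c in pair_counts.items() if a != b) / n
    # Expected disagreement under chance pairing of all labels.
    d_exp = sum(label_totals[a] * label_totals[b]
                for a in label_totals for b in label_totals if a != b)
    d_exp /= n * (n - 1)
    return 1.0 - d_obs / d_exp if d_exp > 0 else 1.0

# Perfect agreement between two annotators yields alpha = 1.0;
# systematic disagreement drives alpha below zero.
alpha_good = krippendorff_alpha_nominal([["fearful", "fearful"], ["relaxed", "relaxed"]])
alpha_bad = krippendorff_alpha_nominal([["fearful", "relaxed"], ["fearful", "relaxed"]])
```

    In practice, alpha values above roughly 0.8 are commonly treated as acceptable reliability, which would be the target for the gold-standard annotations.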
    Figure 2. Pipeline of WUR Wolf (Wageningen University and Research—Wolf Mascot) automatic approach [9] in coding affective states from facial features of cows using machine learning models. SVM—support vector machine; AU—action units.

    3. Sensor Network Fusion Protocols and Instrumentation Framework

    Integrating heterogeneous sensor types into a multimodal network involves implementing a sensing platform capable of fusing data streams with differing precisions, update rates, and data formats to produce a common framework in which these data can be correlated and analysed. At present, there exist no platforms that possess the necessary functionality to correlate heterogeneous data streams, integrate diverse data sets, and identify data from individual animals [3].
    There is a need to develop an instrumentation framework capable of integrating sensor data from diverse sensor types, opening the door to acquiring and analysing large multimodal data sets on animal behaviour and affective state for the first time. Such a framework should focus on establishing the hardware infrastructure to reliably gather large quantities of multimodal sensor data, along with the high-performance cloud server architecture to store and process these data.
    To stream data in real time from multiple sensor types simultaneously, long-range wide area network (LoRaWAN) communications technology, which is rapidly emerging as the state of the art in smart farming [10][11][12], would be ideal. LoRaWAN can wirelessly transmit data from 300 different types of sensors at a time, allowing researchers to avoid the technical complexity and cost of a conventional wired setup. Extending the functionality of the LoRaWAN system with low-energy Bluetooth technology would increase the length of time that data can be acquired from portable sensors before they need to be recharged [13][14][15], saving time and resources. To accelerate and facilitate real-time analysis of the data, cloud servers connected via the internet can be used to store and process the data [1][16], avoiding the need to install complex and expensive computer servers at each individual farm site. The Microsoft Azure platform is a commercial offering that could allow seamless integration between the sensor data streams and the high-performance AI and ML methods used to analyse the data.
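    Because LoRaWAN uplinks are bandwidth-constrained, wearable-patch readings are typically packed into compact binary payloads and decoded server-side. The sketch below assumes a hypothetical 10-byte field layout and scaling factors for the heart-rate, respiration, and activity data described above; it is not a published patch specification:

```python
import struct

# Hypothetical 10-byte uplink layout (big-endian):
#   uint16 animal_id | uint16 heart rate (units of 0.1 bpm) |
#   uint16 respiration (0.1 breaths/min) | uint16 activity index |
#   int16 skin temperature (0.01 degC)
PAYLOAD_FMT = ">HHHHh"

def encode_uplink(animal_id, hr_bpm, rr_bpm, activity, temp_c):
    """Pack one sensor reading into the assumed 10-byte payload."""
    return struct.pack(PAYLOAD_FMT, animal_id, round(hr_bpm * 10),
                       round(rr_bpm * 10), activity, round(temp_c * 100))

def decode_uplink(payload):
    """Unpack a payload back into engineering units on the server side."""
    animal_id, hr, rr, act, temp = struct.unpack(PAYLOAD_FMT, payload)
    return {
        "animal_id": animal_id,
        "heart_rate_bpm": hr / 10.0,
        "respiration_bpm": rr / 10.0,
        "activity_index": act,
        "skin_temp_c": temp / 100.0,
    }

# Round-trip example: a cow's reading fits in 10 bytes per uplink.
packet = encode_uplink(42, 72.5, 18.0, 310, 38.65)
reading = decode_uplink(packet)
```

    Keeping each reading this small leaves ample headroom within typical LoRaWAN payload limits, even at the lowest data rates.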

    4. Building Predictive Models of Affective State and Behaviour

    Using the data sets collected from the distributed sensor network, robust predictive models of farm animal behaviour and affective states can be built. Advanced statistical analyses applied to the annotated data sets using supervised AI and ML methods, namely latent growth curve modelling, random forest and support vector machine models [17][18][19], offer established approaches to capturing and measuring patterns in dynamic interactive variables, such as the behaviour and affective states of farm animals. These methods, employed to extract features from the visual, thermal, auditory, physiological and activity sensor data, enable different behavioural and affective states to be distinguished with high accuracy, sensitivity, and selectivity [20][21][22][23].
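    As a minimal sketch of the feature-extraction step feeding such classifiers, fixed-length windows of a one-dimensional sensor stream (e.g., activity counts or heart-rate samples) are commonly summarized with statistics like those below; the exact feature set is illustrative, not the platform’s specification:

```python
import math
from statistics import mean, pstdev

def window_features(samples):
    """Summary features for one fixed-length window of a 1-D sensor
    stream; the resulting vector would be the input to a random
    forest or SVM classifier."""
    return {
        "mean": mean(samples),
        "std": pstdev(samples),
        # root-mean-square amplitude of the window
        "rms": math.sqrt(sum(x * x for x in samples) / len(samples)),
        "peak_to_peak": max(samples) - min(samples),
        # crude motion-energy proxy: mean absolute first difference
        "mad_diff": mean(abs(b - a) for a, b in zip(samples, samples[1:])),
    }

# Example: one window of four activity-count samples.
features = window_features([1, 2, 3, 4])
```

    In a full pipeline, one such feature vector per window and per modality would be concatenated with the corresponding annotation to form the supervised training set.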
    Following the supervised training stage, unsupervised ML models can aid in determining clusters of similar behavioural and affective state descriptors from unannotated sensor data obtained from farm animals [24][25][26]. These descriptors function as numerical “fingerprints” that allow distinct behavioural or affective states to be reliably identified, even in entirely novel data. The best features from each sensor modality corresponding to these descriptors can then be selected to define high-level specific indicators, which are fused to build an ML classifier-based model. There are two potential approaches to fusing sensor data from different modalities to predict behavioural and affective states: (i) decision-level fusion, in which prediction scores from the unimodal models are linearly combined; and (ii) feature- and indicator-level fusion, in which feature vectors and indicators are integrated across modalities to yield the prediction scores. The performance of different ML methods at estimating behavioural and affective states can be assessed using regression methods [27][28][29].
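    The first fusion approach, decision-level fusion, can be sketched as a weighted linear combination of per-class scores from the unimodal models. The modality names, class labels, and weights below are hypothetical placeholders:

```python
def fuse_decision_level(unimodal_scores, weights=None):
    """Decision-level fusion: linearly combine per-class prediction
    scores from unimodal models.

    `unimodal_scores` maps modality name -> {class: score};
    `weights` maps modality name -> weight (defaults to equal
    weighting). Returns the fused per-class scores and the winner.
    """
    if weights is None:
        weights = {m: 1.0 / len(unimodal_scores) for m in unimodal_scores}
    classes = {c for scores in unimodal_scores.values() for c in scores}
    fused = {c: sum(weights[m] * scores.get(c, 0.0)
                    for m, scores in unimodal_scores.items())
             for c in classes}
    return fused, max(fused, key=fused.get)

# Hypothetical unimodal outputs: the video model leans "relaxed",
# the audio model leans "fearful"; equal weights break the tie.
scores = {"video": {"relaxed": 0.7, "fearful": 0.3},
          "audio": {"relaxed": 0.4, "fearful": 0.6}}
fused, predicted = fuse_decision_level(scores)
```

    Feature- and indicator-level fusion would instead concatenate the modality feature vectors before classification, trading interpretability per modality for the chance to learn cross-modal interactions.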

    5. Challenges in the Quantification and Validation of Performance Models for Affective State Measurement

    Assessing the effectiveness of the platform at estimating behaviour and affective state in real time from farm animals is quite challenging. The predictive model can be evaluated by calculating its accuracy at estimating affective and behavioural states in novel data sets collected from the sensor network. In addition, the accuracy of the model can be further validated by correlating the affective and behavioural states it identifies against: (i) quantitative assays of cortisol, lactate and oxytocin levels in blood and/or saliva samples from the animals [30][31], which provide a reliable biochemical reference measure of emotional arousal and stress; and (ii) physiological indices associated with specific affective states in the animals, such as heart rate, respiratory rate, and body temperature. Physiological signals are more reflective of autonomic nervous system activity than non-physiological signals [32], such as facial expressions or vocalizations. Autonomic nervous system activation during emotional expression is involuntary in animals and therefore provides an unambiguous, quantitative reference measure for evaluating affective states.
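    The correlation step above reduces to computing an association measure, for example Pearson’s r, between the model-derived arousal scores and the biochemical reference series. A minimal sketch (the paired series would be, say, per-sample arousal scores and cortisol concentrations):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two paired series, e.g.
    model-derived arousal scores and salivary cortisol levels."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

    A strong positive r between predicted arousal and cortisol would support the model’s validity; in practice a rank-based measure such as Spearman’s rho may be preferable, since the score-to-hormone relationship need not be linear.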

    5.1. Sensor Durability

    There is a risk that wearable sensors cannot be attached securely to the animals, or that the animals may damage the sensors by chewing or crushing them. To mitigate the former, animal scientists or researchers could improve the adhesion protocol or use a belly belt, which is more secure.

    5.2. Low Sensitivity of the Model at Detecting Affective and Behavioural States

    To address this, optimizing the AI algorithms and the sensors to increase sensitivity can prove useful.

    5.3. Lack of Correlation between Sensor Data and Biochemical Reference Values

    Researchers could collaborate with veterinarians to set up the biochemical validation assays.

    5.4. Limiting the Numbers of Animals Used in the Experiments

    Increasing the sample size opens up ethical and practical issues [33]. The numbers of pigs and cows used in animal experiments should meet optimal research standards and experimental design requirements, while also complying with the 3Rs (replace, reduce, refine) policies. Bayesian approaches could be used to increase the statistical power of the animal experiments using historical control data [33] while developing indices.
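    One simple way historical control data raise statistical power is by serving as an informative prior for the new control group. The beta-binomial sketch below illustrates the idea for a binary welfare outcome; the counts are hypothetical and real analyses (e.g., [33]) use more guarded prior-weighting schemes:

```python
def beta_posterior(successes, failures, prior=(1.0, 1.0)):
    """Conjugate Beta-Binomial update for a binary outcome
    (e.g. whether an animal shows a positive welfare indicator)."""
    a, b = prior
    return a + successes, b + failures

# Hypothetical historical controls: 18 of 20 past animals showed the
# indicator. Their posterior becomes the prior for the new study.
hist_prior = beta_posterior(successes=18, failures=2)

# The new experiment then needs only 10 control animals to reach a
# posterior as concentrated as a much larger stand-alone group.
post = beta_posterior(successes=9, failures=1, prior=hist_prior)
post_mean = post[0] / (post[0] + post[1])
```

    Because the prior already carries the information of the 20 historical animals, the combined posterior behaves like one estimated from about 30 controls, consistent with the 3Rs goal of reducing animal numbers.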

    References

    1. Neethirajan, S. The role of sensors, big data and machine learning in modern animal farming. Sens. Bio-Sens. Res. 2020, 29, 100367.
    2. Neethirajan, S.; Kemp, B. Digital Livestock Farming. Sens. Bio-Sens. Res. 2021, 32, 100408.
    3. Neethirajan, S.; Reimert, I.; Kemp, B. Measuring Farm Animal Emotions—Sensor-Based Approaches. Sensors 2021, 21, 553.
    4. Baciadonna, L.; Briefer, E.F.; McElligott, A.G. Investigation of reward quality-related behaviour as a tool to assess emotions. Appl. Anim. Behav. Sci. 2020, 225, 104968.
    5. Relić, R.; Sossidou, E.; Xexaki, A.; Perić, L.; Božičković, I.; Đukić-Stojčić, M. Behavioral and health problems of poultry related to rearing systems. Ank. Üniversitesi Vet. Fakültesi Derg. 2019, 66, 423–428.
    6. Verdon, M.; Rault, J.-L. 8—Aggression in Group Housed Sows and Fattening Pigs; Špinka, M., Ed.; Woodhead Publishing: Sawston, UK, 2018; pp. 235–260.
    7. Ujita, A.; El Faro, L.; Vicentini, R.R.; Lima, M.L.P.; Fernandes, L.D.O.; Oliveira, A.P.; Veroneze, R.; Negrão, J.A. Effect of positive tactile stimulation and prepartum milking routine training on behavior, cortisol and oxytocin in milking, milk composition, and milk yield in Gyr cows in early lactation. Appl. Anim. Behav. Sci. 2021, 234, 105205.
    8. Lürzel, S.; Bückendorf, L.; Waiblinger, S.; Rault, J.-L. Salivary oxytocin in pigs, cattle, and goats during positive human-animal interactions. Psychoneuroendocrinology 2020, 115, 104636.
    9. Neethirajan, S. Happy Cow or Thinking Pig? WUR Wolf—Facial Coding Platform for Measuring Emotions in Farm Animals. AI 2021, 2, 342–354.
    10. Abdullahi, U.S.; Nyabam, M.; Orisekeh, K.; Umar, S.; Sani, B.; David, E.; Umoru, A.A. Exploiting IoT and LoRaWAN Technologies for Effective Livestock Monitoring in Nigeria. AZOJETE 2019, 15, 146–159. Available online: https://azojete.com.ng/index.php/azojete/article/view/22 (accessed on 5 April 2021).
    11. Waterhouse, A.; Holland, J.P.; McLaren, A.; Arthur, R.; Duthie, C.A.; Kodam, S.; Wishart, H.M. Opportunities and challenges for real-time management (RTM) in extensive livestock systems. In Proceedings of the European Conference in Precision Livestock Farming, Cork, Ireland, 26–29 August 2019; Available online: https://pure.sruc.ac.uk/en/publications/opportunities-and-challenges-for-real-time-management-rtm-in-exte (accessed on 6 April 2021).
    12. Citoni, B.; Fioranelli, F.; Imran, M.A.; Abbasi, Q.H. Internet of Things and LoRaWAN-Enabled Future Smart Farming. IEEE Internet Things Mag. 2019, 2, 14–19.
    13. Liu, L.-S.; Ni, J.-Q.; Zhao, R.-Q.; Shen, M.-X.; He, C.-L.; Lu, M.-Z. Design and test of a low-power acceleration sensor with Bluetooth Low Energy on ear tags for sow behaviour monitoring. Biosyst. Eng. 2018, 176, 162–171.
    14. Trogh, J.; Plets, D.; Martens, L.; Joseph, W. Bluetooth low energy based location tracking for livestock monitoring. In Proceedings of the 8th European Conference on Precision Livestock Farming, Nantes, France, 12–14 September 2017; Available online: http://hdl.handle.net/1854/LU-8544264 (accessed on 8 April 2021).
    15. Bloch, V.; Pastell, M. Monitoring of Cow Location in a Barn by an Open-Source, Low-Cost, Low-Energy Bluetooth Tag System. Sensors 2020, 20, 3841.
    16. Fote, F.N.; Mahmoudi, S.; Roukh, A.; Mahmoudi, S.A. Big data storage and analysis for smart farming. In Proceedings of the 2020 5th International Conference on Cloud Computing and Artificial Intelligence: Technologies and Applications (CloudTech), Marrakesh, Morocco, 24–26 November 2020; Available online: https://ieeexplore.ieee.org/abstract/document/9365869 (accessed on 8 April 2021).
    17. Zhang, W.; Liu, H.; Silenzio, V.M.B.; Qiu, P.; Gong, W. Machine Learning Models for the Prediction of Postpartum Depression: Application and Comparison Based on a Cohort Study. JMIR Med. Inform. 2020, 8, e15516.
    18. Meire, M.; Ballings, M.; Poel, D.V.D. The added value of auxiliary data in sentiment analysis of Facebook posts. Decis. Support Syst. 2016, 89, 98–112.
    19. Elhai, J.D.; Tiamiyu, M.F.; Weeks, J.W.; Levine, J.C.; Picard, K.J.; Hall, B. Depression and emotion regulation predict objective smartphone use measured over one week. Pers. Individ. Differ. 2018, 133, 21–28.
    20. Dhall, A. EmotiW 2019: Automatic emotion, engagement and cohesion prediction tasks. In Proceedings of the 2019 International Conference on Multimodal Interaction, Suzhou, China, 14–18 October 2019; Available online: https://dl.acm.org/doi/10.1145/3340555.3355710 (accessed on 5 April 2021).
    21. Liu, C.; Tang, T.; Lv, K.; Wang, M. Multi-feature based emotion recognition for video clips. In Proceedings of the 20th ACM International Conference on Multimodal Interaction, Boulder, CO, USA, 16–20 October 2018; Available online: https://dl.acm.org/doi/10.1145/3242969.3264989 (accessed on 5 April 2021).
    22. Pei, E.; Jiang, D.; Alioscha-Perez, M.; Sahli, H. Continuous affect recognition with weakly supervised learning. Multimed. Tools Appl. 2019, 78, 19387–19412.
    23. Chang, F.-J.; Tran, A.T.; Hassner, T.; Masi, I.; Nevatia, R.; Medioni, G. Deep, Landmark-Free FAME: Face Alignment, Modeling, and Expression Estimation. Int. J. Comput. Vis. 2019, 127, 930–956.
    24. Liakos, K.G.; Busato, P.; Moshou, D.; Pearson, S.; Bochtis, D. Machine learning in agriculture: A review. Sensors 2018, 18, 2674.
    25. Valletta, J.J.; Torney, C.; Kings, M.; Thornton, A.; Madden, J. Applications of machine learning in animal behaviour studies. Anim. Behav. 2017, 124, 203–220.
    26. Gris, K.V.; Coutu, J.-P.; Gris, D. Supervised and Unsupervised Learning Technology in the Study of Rodent Behavior. Front. Behav. Neurosci. 2017, 11, 141.
    27. Chandrasekaran, B.; Gangadhar, S.; Conrad, J.M. A survey of multisensor fusion techniques, architectures and methodologies. In Proceedings of the SoutheastCon, Concord, NC, USA, 30 March–2 April 2017; Available online: https://ieeexplore.ieee.org/abstract/document/7925311 (accessed on 7 April 2021).
    28. Shah, N.H.; Milstein, A.; Bagley, S.C. Making Machine Learning Models Clinically Useful. JAMA 2019, 322, 1351.
    29. Lapuschkin, S.; Wäldchen, S.; Binder, A.; Montavon, G.; Samek, W.; Müller, K.-R. Unmasking Clever Hans predictors and assessing what machines really learn. Nat. Commun. 2019, 10, 1096.
    30. Tuteja, S.K.; Ormsby, C.; Neethirajan, S. Noninvasive Label-Free Detection of Cortisol and Lactate Using Graphene Embedded Screen-Printed Electrode. Nano-Micro Lett. 2018, 10, 41.
    31. Bienboire-Frosini, C.; Chabaud, C.; Cozzi, A.; Codecasa, E.; Pageat, P. Validation of a Commercially Available Enzyme ImmunoAssay for the Determination of Oxytocin in Plasma Samples from Seven Domestic Animal Species. Front. Neurosci. 2017, 11, 524.
    32. Siegel, P.B.; Gross, W.B. General Principles of Stress and Well-Being; Grandin, T., Ed.; CABI: Wallingford, UK, 2000; pp. 27–41.
    33. Bonapersona, V.; Hoijtink, H.; Sarabdjitsingh, R.A.; Joëls, M. Increasing the statistical power of animal experiments with historical control data. Nat. Neurosci. 2021, 24, 470–477.
      Neethirajan, S. Artificial Intelligence Impact Affective State Recognition in Livestock. Encyclopedia. Available online: https://encyclopedia.pub/entry/22215 (accessed on 03 February 2023).