Technology to Automatically Record Eating Behavior

To monitor adherence to diets and to design and evaluate nutritional interventions, it is essential to obtain objective knowledge about eating behavior. In most research, measures of eating behavior are based on self-reporting, such as 24-h recalls, food records (food diaries) and food frequency questionnaires. Self-reporting is prone to error because recall is subjective, often inaccurate and subject to other biases. Recording behavior in daily life with unobtrusive technology would overcome these limitations.

  • eating
  • drinking
  • daily life
  • sensors
  • behavior

1. Introduction

As stated by the World Health Organization (WHO), “Nutrition is coming to the fore as a major modifiable determinant of chronic disease, with scientific evidence increasingly supporting the view that alterations in diet have strong effects, both positive and negative, on health throughout life” [1]. It is therefore of key importance to find efficient and solid methodologies to study eating behavior and food intake in order to help reduce potential long-term health problems caused by unhealthy diets. Past research on eating behaviors and attitudes relies heavily on self-report tools, such as 24-h recalls, food records (food diaries) and food frequency questionnaires (FFQ) [2][3][4]. However, there is an increasing understanding of the limitations of this classical approach to studying eating behaviors and attitudes. One major limitation is that self-report tools rely on participants’ recall, which may be inaccurate or biased (especially when estimating the actual amount of food or liquid consumed [5]). Recall biases can be caused by demand characteristics, i.e., cues that may reveal the study’s aims to participants and lead them to change their behaviors or responses based on what they think the research is about [6], or more generally by the desire to comply with social norms and expectations regarding food intake [7][8].
There is growing interest in identifying technologies able to improve the quality and validity of data collected to advance nutrition science. Such technologies should enable eating behavior to be measured passively (i.e., without requiring action or mental effort on the part of the users), objectively and reliably in realistic contexts. To maximize the efficiency of real-life measurement, it is vital to develop technologies that capture eating behavior patterns in a low-cost, unobtrusive and easy-to-analyze way. For real-world practicality, the technologies should be comfortable and acceptable so that they can be used in naturalistic settings for extended periods while respecting the users’ privacy.

2. Technology to Automatically Record Eating Behavior 

2.1. Eating and Drinking Activity Detection

For “eating/drinking activity detection”, many systems have been reported that measure eating- and drinking-related motions. In particular, many papers report measuring these actions with motion sensors such as inertial measurement units (IMUs). IMUs typically combine several sensors, such as an accelerometer, a gyroscope and a magnetometer, and are embedded in smartphones and wearable devices such as smartwatches. In [9], researchers collected IMU signals with off-the-shelf smartwatches to identify hand-based eating- and drinking-related activities; participants wore the smartwatches on their preferred wrists. Other studies have employed IMUs worn on the wrist, upper arm, head, neck and combinations thereof [10][11][12][13]. IMUs worn on the wrist or upper arm can collect movement data relatively unobtrusively during natural eating activities such as lifting food or bringing utensils to the mouth. Recent research has also improved head- and neck-worn IMUs by integrating the sensors into glasses or necklaces, making them less bulky and less noticeable to the wearer. Besides IMUs, proximity sensors, piezoelectric sensors and radar sensors are used to detect hand-to-mouth gestures or jawbone movements [14][15][16]. Pressure sensors are used to measure eating activity as well. For instance, in [17], eating activities and the amount of consumed food are measured by a pressure-sensitive tablecloth and tray. These devices provide information on food-intake-related actions such as cutting, scooping, stirring or the identification of the plate or container on which the action is executed, and they allow the tracking of weight changes of plates and containers. Microphones, RGB-D cameras and video cameras are also used to detect eating- and drinking-related motions.
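To make the IMU-based approach concrete, the following minimal sketch screens windowed wrist accelerometer and gyroscope data for hand-to-mouth-like gestures. It is only an illustration of the general idea: the window length, the energy and lift thresholds and the assumed device axes are illustrative assumptions, not parameters from the cited studies, which typically train classifiers on labeled data rather than using hand-tuned rules.

```python
import numpy as np

def detect_intake_gestures(acc, gyr, fs=50, win_s=1.28, overlap=0.5,
                           gyr_energy_thr=2.0, lift_thr=0.6):
    """Flag windows of wrist IMU data that resemble hand-to-mouth gestures.

    acc, gyr: (N, 3) arrays of accelerometer [g] and gyroscope [rad/s] samples.
    All thresholds are illustrative assumptions, not published values.
    """
    win = int(win_s * fs)
    step = max(1, int(win * (1 - overlap)))
    flagged = []
    for start in range(0, len(acc) - win, step):
        a = acc[start:start + win]
        g = gyr[start:start + win]
        # Mean rotational energy: intake gestures involve a pronounced forearm roll.
        gyr_energy = np.mean(np.sum(g ** 2, axis=1))
        # Peak-to-peak vertical acceleration as a crude proxy for lifting the
        # hand toward the mouth (the vertical axis is assumed to be axis 2).
        lift = a[:, 2].max() - a[:, 2].min()
        if gyr_energy > gyr_energy_thr and lift > lift_thr:
            flagged.append((start / fs, (start + win) / fs))
    return flagged  # list of (t_start, t_end) in seconds
```

In practice, the studies cited above replace such hand-tuned rules with trained models (e.g., end-to-end deep networks) operating on the same windowed IMU streams.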

2.2. Bite, Chewing or Swallowing Detection

Motion sensors and video are used to detect and count bites. For instance, off-the-shelf pose-estimation software (OpenPose) has been used to count bites from video [18]. To assess bite weight, weight sensors and acoustic sensors have been used [19][20]. In [19], the bite weight measurements also provide an estimate of the full portion. Chewing and swallowing are the most well-studied eating- and drinking-related activities, as reflected by the number of papers focusing on them (31 papers). Motion sensors and microphones are frequently employed for this purpose. For instance, in [21], a gyroscope is used for chewing detection, an accelerometer for swallowing detection and a proximity sensor to detect hand-to-mouth gestures. Microphones are typically used to register chewing and swallowing sounds; in most cases, commercially available microphones are used, while the detection algorithms are custom-made. Video, electroglottograph (EGG) and electromyography (EMG) devices are also used to detect chewing and swallowing. EGG detects the variations in electrical impedance caused by the passage of food during swallowing, while EMG in these studies monitors masseter and temporalis muscle activation to record chewing strokes. The advantages of EGG and EMG are that they can directly detect swallowing and chewing while eating and are less affected by other body movements than motion sensors. However, EMG devices are not wireless and EGG sensors need to be worn around the face, which is not optimal for everyday eating situations.
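As a rough illustration of the microphone-based approach, the sketch below counts chewing-like events by peak-picking a smoothed short-time energy envelope of an audio recording. The frame length, smoothing window and peak-picking parameters are assumptions chosen for illustration; the published systems use purpose-built, often trained, detection algorithms.

```python
import numpy as np
from scipy.signal import find_peaks

def count_chew_events(audio, fs, frame_ms=20, min_interval_s=0.4):
    """Rough chew counter: peak-pick the short-time energy envelope of a recording.

    audio: 1-D array of microphone samples; fs: sampling rate in Hz.
    Parameter values are illustrative, not taken from the cited studies.
    """
    frame = int(fs * frame_ms / 1000)
    n_frames = len(audio) // frame
    # Short-time energy per frame.
    energy = np.array([np.sum(audio[i * frame:(i + 1) * frame] ** 2)
                       for i in range(n_frames)])
    # Smooth the envelope to suppress spurious peaks.
    envelope = np.convolve(energy, np.ones(5) / 5, mode="same")
    # Successive chews are assumed to be at least min_interval_s apart;
    # the height threshold adapts to the recording's overall energy level.
    distance = int(min_interval_s * 1000 / frame_ms)
    peaks, _ = find_peaks(envelope, distance=distance,
                          height=envelope.mean() + envelope.std())
    return len(peaks)
```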

2.3. Portion Size Estimation

Portion size is estimated mainly by using weight sensors and food image analysis. With weight sensors, the amount of food consumed is calculated by comparing the weights of plates before and after eating. An open-source system consisting of a wireless pocket-sized kitchen scale connected to a mobile application has been reported in [22]. As shown in Figure 3, a system that turns an everyday smartphone into a weighing scale is also available [23]: the vibration produced by the phone’s vibration motor, as measured by its built-in accelerometer, is used to estimate the weight of food placed on the smartphone. Off-the-shelf smartphone cameras are typically used for volume estimation from food images, and several studies use RGB-D images to obtain more accurate volume estimates from information on the height of the target food. For image-based approaches, AI-based algorithms are often employed to calculate portion size. Some studies have built prototype systems applicable to real-life situations. In [24], acoustic data from a microphone were collected along with food images to measure the distance from the camera to the food; this enables the food in the image to be scaled to its actual size without training images or reference objects. Most other image-based approaches, however, rely on a reference object to scale the food size.
Figure 3. VibroScale (reproduced from [23] with permission).
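The weighing principle behind this kind of system can be illustrated with a simple calibration sketch: the vibration measured by the phone’s accelerometer is damped as more mass is placed on the phone, so a mapping from vibration intensity to weight can be fitted on a few reference weights. The linear model, the intensity feature and the calibration values below are simplifying assumptions for illustration, not the published VibroScale algorithm.

```python
import numpy as np

def vibration_intensity(acc_z):
    """Assumed feature: standard deviation of the vertical accelerometer signal
    recorded while the phone's vibration motor runs."""
    return np.std(acc_z)

def fit_weight_model(intensities, known_weights_g):
    """Fit a straight line mapping vibration intensity to weight, using a few
    reference weights. A linear model is an illustrative simplification."""
    slope, intercept = np.polyfit(intensities, known_weights_g, deg=1)
    return slope, intercept

def estimate_weight(intensity, slope, intercept):
    return slope * intensity + intercept

# Hypothetical calibration: three reference weights and their measured intensities.
cal_intensity = np.array([0.92, 0.71, 0.55])  # arbitrary units
cal_weight_g = np.array([0.0, 100.0, 200.0])  # grams
m, b = fit_weight_model(cal_intensity, cal_weight_g)
print(estimate_weight(0.63, m, b))            # rough weight of an unknown item
```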
For estimating the portion size of drinks, several kinds of sensors have been tested. An IMU in a smartwatch was used to estimate drink intake volume from sip duration [25]. In [26], liquid sensors such as capacitive and conductivity sensors were used to monitor the filling level of a cup. Some research groups have developed so-called smart fridges that automatically register food items and quantities; in [27], image analysis of a thermal image taken by an infrared (IR) sensor embedded in a fridge provides an estimate of drink volume. Another study proposed the Playful Bottle system [28], which consists of a smartphone attached to a common drinking mug. Drinking motions such as picking up the mug, tilting it back and placing it on the desk are detected by the phone’s accelerometer. After the drinking action is completed and the water line becomes steady, the phone’s camera captures an image of the amount of liquid left in the mug (Figure 4).
Figure 4. Playful Bottle system (reproduced from [27] with permission).
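Returning to the sip-duration approach of [25], the minimal sketch below converts detected sip intervals into a fluid-intake estimate using a per-user flow-rate constant. Both the flow rate and the example sip intervals are hypothetical; the cited work derives sips and intake estimates from smartwatch IMU data with trained models.

```python
# Minimal sketch: estimate fluid intake from sip durations, assuming each user's
# average flow rate (ml per second of sipping) has been calibrated beforehand.
def estimate_fluid_intake_ml(sip_intervals_s, flow_rate_ml_per_s=8.0):
    """sip_intervals_s: list of (start, end) times of detected sips, in seconds.
    flow_rate_ml_per_s is a hypothetical per-user calibration constant."""
    return sum((end - start) * flow_rate_ml_per_s for start, end in sip_intervals_s)

# Example: three sips of 1.5 s, 2.0 s and 1.2 s.
print(estimate_fluid_intake_ml([(0.0, 1.5), (10.0, 12.0), (25.0, 26.2)]))  # ~37.6 ml
```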

2.4. Sensor Location


Figure 5 indicates where sensors are typically located per objective. The sensor locations are classified as body-attached (e.g., ear, neck, head, glasses), embedded in objects (e.g., plates, cutlery) and in the environment (e.g., distant cameras, magnetic trackers). For eating/drinking activity detection, sensors are mostly worn on the body, followed by sensors embedded in objects. Body-worn sensors are also used for bite/chewing/swallowing detection. For portion size estimation, on the other hand, object-embedded and handheld sensors are mainly chosen, depending on the measurement target.
Figure 5. Sensor placement per objective.

References

  1. World Health Organization. Diet, Nutrition, and the Prevention of Chronic Diseases: Report of a Joint WHO/FAO Expert Consultation; World Health Organization: Geneva, Switzerland, 2003; Volume 916.
  2. Magarey, A.; Watson, J.; Golley, R.K.; Burrows, T.; Sutherland, R.; McNaughton, S.A.; Denney-Wilson, E.; Campbell, K.; Collins, C. Assessing dietary intake in children and adolescents: Considerations and recommendations for obesity research. Int. J. Pediatr. Obes. 2011, 6, 2–11.
  3. Shim, J.-S.; Oh, K.; Kim, H.C. Dietary assessment methods in epidemiologic studies. Epidemiol. Health 2014, 36, e2014009.
  4. Thompson, F.E.; Subar, A.F.; Loria, C.M.; Reedy, J.L.; Baranowski, T. Need for technological innovation in dietary assessment. J. Am. Diet. Assoc. 2010, 110, 48–51.
  5. Almiron-Roig, E.; Solis-Trapala, I.; Dodd, J.; Jebb, S.A. Estimating food portions. Influence of unit number, meal type and energy density. Appetite 2013, 71, 95–103.
  6. Sharpe, D.; Whelton, W.J. Frightened by an old scarecrow: The remarkable resilience of demand characteristics. Rev. Gen. Psychol. 2016, 20, 349–368.
  7. Nix, E.; Wengreen, H.J. Social approval bias in self-reported fruit and vegetable intake after presentation of a normative message in college students. Appetite 2017, 116, 552–558.
  8. Robinson, E.; Kersbergen, I.; Brunstrom, J.M.; Field, M. I’m watching you. Awareness that food consumption is being monitored is a demand characteristic in eating-behaviour experiments. Appetite 2014, 83, 19–25.
  9. Diamantidou, E.; Giakoumis, D.; Votis, K.; Tzovaras, D.; Likothanassis, S. Comparing deep learning and human crafted features for recognising hand activities of daily living from wearables. In Proceedings of the 2022 23rd IEEE International Conference on Mobile Data Management (MDM), Paphos, Cyprus, 6–9 June 2022; pp. 381–384.
  10. Kyritsis, K.; Diou, C.; Delopoulos, A. A data driven end-to-end approach for in-the-wild monitoring of eating behavior using smartwatches. IEEE J. Biomed. Health Inform. 2021, 25, 22–34.
  11. Kyritsis, K.; Tatli, C.L.; Diou, C.; Delopoulos, A. Automated analysis of in meal eating behavior using a commercial wristband IMU sensor. In Proceedings of the 2017 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Jeju, Republic of Korea, 11–15 July 2017; pp. 2843–2846.
  12. Mirtchouk, M.; Lustig, D.; Smith, A.; Ching, I.; Zheng, M.; Kleinberg, S. Recognizing eating from body-worn sensors: Combining free-living and laboratory data. In Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies; Association for Computing Machinery: New York, NY, USA, 2017; Volume 1, p. 85.
  13. Zhang, S.; Zhao, Y.; Nguyen, D.T.; Xu, R.; Sen, S.; Hester, J.; Alshurafa, N. NeckSense: A multi-sensor necklace for detecting eating activities in free-living conditions. In Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies; Association for Computing Machinery: New York, NY, USA, 2020; Volume 4, p. 72.
  14. Chun, K.S.; Bhattacharya, S.; Thomaz, E. Detecting eating episodes by tracking jawbone movements with a non-contact wearable sensor. In Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies; Association for Computing Machinery: New York, NY, USA, 2018; Volume 2, p. 4.
  15. Farooq, M.; Doulah, A.; Parton, J.; McCrory, M.A.; Higgins, J.A.; Sazonov, E. Validation of sensor-based food intake detection by multicamera video observation in an unconstrained environment. Nutrients 2019, 11, 609.
  16. Wang, C.; Kumar, T.S.; De Raedt, W.; Camps, G.; Hallez, H.; Vanrumste, B. Eat-Radar: Continuous fine-grained eating gesture detection using FMCW radar and 3D temporal convolutional network. arXiv 2022, arXiv:2211.04253.
  17. Zhou, B.; Cheng, J.; Sundholm, M.; Reiss, A.; Huang, W.; Amft, O.; Lukowicz, P. Smart table surface: A novel approach to pervasive dining monitoring. In Proceedings of the 2015 IEEE International Conference on Pervasive Computing and Communications (PerCom), St. Louis, MO, USA, 23–27 March 2015; pp. 155–162.
  18. Qiu, J.; Lo, F.P.W.; Lo, B. Assessing individual dietary intake in food sharing scenarios with a 360 camera and deep learning. In Proceedings of the 2019 IEEE 16th International Conference on Wearable and Implantable Body Sensor Networks (BSN), Chicago, IL, USA, 19–22 May 2019; pp. 1–4.
  19. Mertes, G.; Ding, L.; Chen, W.; Hallez, H.; Jia, J.; Vanrumste, B. Measuring and localizing individual bites using a sensor augmented plate during unrestricted eating for the aging population. IEEE J. Biomed. Health Inform. 2020, 24, 1509–1518.
  20. Papapanagiotou, V.; Ganotakis, S.; Delopoulos, A. Bite-weight estimation using commercial ear buds. In Proceedings of the 43rd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Guadalajara, Jalisco, Mexico, 1–5 November 2021; pp. 7182–7185.
  21. Bedri, A.; Li, D.; Khurana, R.; Bhuwalka, K.; Goel, M. FitByte: Automatic diet monitoring in unconstrained situations using multimodal sensing on eyeglasses. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; pp. 1–12.
  22. Biasizzo, A.; Koroušić Seljak, B.; Valenčič, E.; Pavlin, M.; Santo Zarnik, M.; Blažica, B.; O’Kelly, D.; Papa, G. An open-source approach to solving the problem of accurate food-intake monitoring. IEEE Access 2021, 9, 162835–162846.
  23. Zhang, S.; Xu, Q.; Sen, S.; Alshurafa, N. VibroScale: Turning your smartphone into a weighing scale. In Adjunct Proceedings of the 2020 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2020 ACM International Symposium on Wearable Computers, Virtual Event, Mexico, 12–17 September 2020; Association for Computing Machinery: New York, NY, USA; pp. 176–179.
  24. Gao, J.; Tan, W.; Ma, L.; Wang, Y.; Tang, W. MUSEFood: Multi-Sensor-Based Food Volume Estimation on Smartphones. In Proceedings of the 2019 IEEE SmartWorld, Ubiquitous Intelligence & Computing, Advanced & Trusted Computing, Scalable Computing & Communications, Cloud & Big Data Computing, Internet of People and Smart City Innovation (SmartWorld/SCALCOM/UIC/ATC/CBDCom/IOP/SCI), Leicester, UK, 19–23 August 2019; pp. 899–906.
  25. Hamatani, T.; Elhamshary, M.; Uchiyama, A.; Higashino, T. FluidMeter: Gauging the human daily fluid intake using smartwatches. In Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies; Association for Computing Machinery: New York, NY, USA, 2018; Volume 2, p. 113.
  26. Kreutzer, J.F.; Deist, J.; Hein, C.M.; Lueth, T.C. Sensor systems for monitoring fluid intake indirectly and directly. In Proceedings of the 2016 IEEE 13th International Conference on Wearable and Implantable Body Sensor Networks (BSN), San Francisco, CA, USA, 14–17 June 2016; pp. 1–6.
  27. Sharma, A.; Misra, A.; Subramaniam, V.; Lee, Y. SmrtFridge: IoT-based, user interaction-driven food item & quantity sensing. In Proceedings of the 17th Conference on Embedded Networked Sensor Systems, New York, NY, USA, 10–13 November 2019; pp. 245–257.
  28. Chiu, M.-C.; Chang, S.-P.; Chang, Y.-C.; Chu, H.-H.; Chen, C.C.-H.; Hsiao, F.-H.; Ko, J.-C. Playful bottle: A mobile social persuasion system to motivate healthy water intake. In Proceedings of the 11th International Conference on Ubiquitous Computing, Orlando, FL, USA, 30 September–3 October 2009; pp. 185–194.