Assessing Eating Behaviour Using Upper Limb Mounted Motion Sensors: A Systematic Review
Abstract
1. Introduction
2. Materials and Methods
2.1. Definition of Common Terms
2.2. Search Strategy
2.3. Selection Process
3. Results
3.1. Study Design
3.1.1. Participant Demographics
3.1.2. Country of Data Collection
3.1.3. Study Year
3.1.4. Environment
3.1.5. Eating Utensils
3.1.6. Food
3.1.7. Comparator
3.2. Sensor Configuration
3.2.1. Sensor Selection on the Upper Limbs
3.2.2. Sensor Device
3.2.3. Sensor Position on Upper Limbs
3.2.4. Sensor Fusion
3.2.5. Sensor Sampling Frequency
3.3. Detection Approach
3.3.1. Action Classes
3.3.2. Approach Category
3.3.3. Algorithm
3.4. Eating Behaviour Assessment
3.4.1. Eating Gesture Classification
3.4.2. Eating Activity Classification
3.4.3. Eating Characteristics Classification
4. Discussion
4.1. Research Environments and Ground Truth
4.2. Eating Context and Population Groups
4.3. Advanced Models and Deep Learning
4.4. Public Database Development
4.5. Granularity of Eating Behaviour Detection and Sensor Fusion
4.6. Applicability in Dietary Assessment and Eating Behaviour Interventions
4.7. Strengths and Limitations of the Current Review
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
Appendix A
Search String |
---|
(accelerometer OR gyroscope OR smartwatch OR “inertial sensor” OR “inertial sensors” OR “inertial sensing” OR smartphone OR “cell phone” OR wristband) AND (“dietary intake” OR “dietary assessment” OR “food intake” OR “nutrition assessment” OR “eating activity” OR “eating activities” OR “eating behavior” OR “eating behaviour” OR “energy intake” OR “detecting eating” OR “detect eating” OR “eating episodes” OR “eating period”) AND (“bite counting” OR “counting bites” OR “hand gesture” OR “hand gestures” OR “arm gesture” OR “arm gestures” OR “wrist gesture” OR “wrist gestures” OR “hand motion” OR “hand motions” OR “arm motion” OR “arm motions” OR “wrist motion” OR “wrist motions” OR “hand movement” OR “hand movements” OR “arm movement” OR “arm movements” OR “wrist movement” OR “wrist movements” OR “hand to mouth” OR “hand-to-mouth” OR “wrist-worn” OR “wrist-mounted”) |
Database name | Result |
---|---|
ACM | 10 |
AIS Electronic Library | 55 |
CINAHL | 16 |
EMBASE 1 | |
IEEE 2 | 140 |
MEDLINE 1 | |
Ovid databases 3 | 54 |
ScienceDirect 4 | 133 |
Scopus | 161 |
SpringerLink | 205 |
Web of Science | 18 |
Total results before removing duplicates | 792 |
Total results after removing duplicates | 653 |
References
- Shen, Y.; Salley, J.; Muth, E.; Hoover, A. Assessing the accuracy of a wrist motion tracking method for counting bites across demographic and food variables. IEEE J. Biomed. Health Inform. 2017, 21, 599–606.
- Thomaz, E.; Essa, I.; Abowd, G.D. A Practical Approach for Recognizing Eating Moments with Wrist-Mounted Inertial Sensing. In Proceedings of the ACM International Joint Conference on Pervasive and Ubiquitous Computing, Osaka, Japan, 7–11 September 2015; pp. 1029–1040.
- Dong, Y.; Hoover, A.; Scisco, J.; Muth, E. A new method for measuring meal intake in humans via automated wrist motion tracking. Appl. Psychophysiol. Biofeedback 2012, 37, 205–215.
- Kim, J.; Lee, M.; Lee, K.-J.; Lee, T.; Bae, B.-C.; Cho, J.-D. An Eating Speed Guide System Using a Wristband and Tabletop Unit. In Proceedings of the ACM International Joint Conference on Pervasive and Ubiquitous Computing Adjunct, Heidelberg, Germany, 12–16 September 2016; pp. 121–124.
- Farooq, M.; Fontana, J.M.; Boateng, A.F.; Mccrory, M.A.; Sazonov, E. A Comparative Study of Food Intake Detection Using Artificial Neural Network and Support Vector Machine. In Proceedings of the 12th International Conference on Machine Learning and Applications, Miami, FL, USA, 4–7 December 2013; pp. 153–157.
- Fontana, J.M.; Farooq, M.; Sazonov, E. Automatic ingestion monitor: A novel wearable device for monitoring of ingestive behavior. IEEE Trans. Biomed. Eng. 2014, 61, 1772–1779.
- Sen, S.; Subbaraju, V.; Misra, A.; Balan, R.K.; Lee, Y. Experiences in Building a Real-World Eating Recogniser. In Proceedings of the IEEE International Workshop on Physical Analytics, New York, NY, USA, 19 June 2017; pp. 7–12.
- Varkey, J.P.; Pompili, D.; Walls, T.A. Human motion recognition using a wireless sensor-based wearable system. Pers. Ubiquitous Comput. 2012, 16, 897–910.
- Zhou, Y.; Cheng, Z.; Jing, L.; Tatsuya, H. Towards unobtrusive detection and realistic attribute analysis of daily activity sequences using a finger-worn device. Appl. Intell. 2015, 43, 386–396.
- Kalantarian, H.; Alshurafa, N.; Sarrafzadeh, M. A survey of diet monitoring technology. IEEE Pervasive Comput. 2017, 16, 57–65.
- Rouast, P.; Adam, M.; Burrows, T.; Chiong, R. Using Deep Learning and 360 Video to Detect Eating Behavior for User Assistance Systems. In Proceedings of the European Conference on Information Systems, Portsmouth, UK, 23–28 June 2018; pp. 1–11.
- Ramos-Garcia, R.I.; Muth, E.R.; Gowdy, J.N.; Hoover, A.W. Improving the recognition of eating gestures using intergesture sequential dependencies. IEEE J. Biomed. Health Inform. 2015, 19, 825–831.
- Dong, Y.; Hoover, A.; Muth, E. A Device for Detecting and Counting Bites of Food Taken by a Person during Eating. In Proceedings of the IEEE International Conference on Bioinformatics and Biomedicine, Washington, DC, USA, 1–4 November 2009; pp. 265–268.
- Dong, B.; Biswas, S. Meal-time and duration monitoring using wearable sensors. Biomed. Signal Process. Control 2017, 32, 97–109.
- Parate, A.; Ganesan, D. Detecting Eating and Smoking Behaviors Using Smartwatches. In Mobile Health; Rehg, J.M., Murphy, S.A., Kumar, S., Eds.; Springer: Cham, Switzerland, 2017; pp. 175–201. ISBN 978-3-319-51393-5.
- Prioleau, T.; Moore II, E.; Ghovanloo, M. Unobtrusive and wearable systems for automatic dietary monitoring. IEEE Trans. Biomed. Eng. 2017, 64, 2075–2089.
- Zhang, S.; Stogin, W.; Alshurafa, N. I sense overeating: Motif-based machine learning framework to detect overeating using wrist-worn sensing. Inf. Fusion 2018, 41, 37–47.
- Hassannejad, H.; Matrella, G.; Ciampolini, P.; De Munari, I.; Mordonini, M.; Cagnoni, S. Automatic diet monitoring: A review of computer vision and wearable sensor-based methods. Int. J. Food Sci. Nutr. 2017, 68, 656–670.
- Vu, T.; Lin, F.; Alshurafa, N.; Xu, W. Wearable food intake monitoring technologies: A comprehensive review. Computers 2017, 6, 4.
- Doulah, A.; Mccrory, M.A.; Higgins, J.A.; Sazonov, E. A Systematic Review of Technology-Driven Methodologies for Estimation of Energy Intake. IEEE Access 2019, 7, 49653–49668.
- Covidence: Better Systematic Review Management. Available online: https://www.covidence.org (accessed on 21 February 2019).
- UN World Economic Situation and Prospects: Statistical Annex. Available online: https://www.un.org/development/desa/dpad/wp-content/uploads/sites/45/WESP2018_Annex.pdf (accessed on 21 February 2019).
- Amft, O.; Junker, H.; Tröster, G. Detection of Eating and Drinking Arm Gestures Using Inertial Body-Worn Sensors. In Proceedings of the Ninth IEEE International Symposium on Wearable Computers, Osaka, Japan, 18–21 October 2005; pp. 1–4.
- Amft, O.; Kusserow, M.; Tröster, G. Probabilistic Parsing of Dietary Activity Events. In Proceedings of the 4th International Workshop on Wearable and Implantable Body Sensor Networks, Aachen, Germany, 26–28 March 2007; pp. 242–247.
- Amft, O.; Tröster, G. Recognition of dietary activity events using on-body sensors. Artif. Intell. Med. 2008, 42, 121–136.
- Junker, H.; Amft, O.; Lukowicz, P.; Tröster, G. Gesture spotting with body-worn inertial sensors to detect user activities. Pattern Recognit. 2008, 41, 2010–2024.
- Van Laerhoven, K.; Kilian, D.; Schiele, B. Using Rhythm Awareness in Long-Term Activity Recognition. In Proceedings of the IEEE International Symposium on Wearable Computers, Pittsburgh, PA, USA, 28 September–1 October 2008; pp. 63–66.
- Pirkl, G.; Stockinger, K.; Kunze, K.; Lukowicz, P. Adapting Magnetic Resonant Coupling Based Relative Positioning Technology for Wearable Activitiy Recogniton. In Proceedings of the 12th IEEE International Symposium on Wearable Computers, Pittsburgh, PA, USA, 28 September–1 October 2008; pp. 47–54.
- Tolstikov, A.; Biswas, J.; Tham, C.-K.; Yap, P. Eating Activity Primitives Detection—A Step Towards ADL Recognition. In Proceedings of the 10th International Conference on e-Health Networking, Applications and Services, Singapore, 7–9 July 2008; pp. 35–41.
- Zhang, S.; Ang, M.H.; Xiao, W.; Tham, C.K. Detection of Activities for Daily Life Surveillance: Eating and Drinking. In Proceedings of the IEEE International Conference on e-Health Networking, Applications and Services, Singapore, 7–9 July 2008; pp. 171–176.
- Teixeira, T.; Jung, D.; Dublon, G.; Savvides, A. Recognizing Activities from Context and Arm Pose Using Finite State Machines. In Proceedings of the International Conference on Distributed Smart Cameras, Como, Italy, 30 August–2 September 2009; pp. 1–8.
- Grosse-Puppendahl, T.; Berlin, E.; Borazio, M. Enhancing accelerometer-based activity recognition with capacitive proximity sensing. In Ambient Intelligence; Paternò, F., de Ruyter, B., Markopoulos, P., Santoro, C., van Loenen, E., Luyten, K., Eds.; Springer: Pisa, Italy, 2012; pp. 17–32.
- Kim, H.J.; Kim, M.; Lee, S.J.; Choi, Y.S. An Analysis of Eating Activities for Automatic Food Type Recognition. In Proceedings of the Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, Hollywood, CA, USA, 3–6 December 2012; pp. 1–5.
- Mousavi Hondori, H.; Khademi, M.; Videira Lopes, C. Monitoring Intake Gestures Using Sensor Fusion (Microsoft Kinect and Inertial Sensors) for Smart Home Tele-Rehab Setting. In Proceedings of the 1st Annual Healthcare Innovation Conference of the IEEE EMBS, Houston, TX, USA, 7–9 November 2012; pp. 36–39.
- Fontana, J.M.; Farooq, M.; Sazonov, E. Estimation of Feature Importance for Food Intake Detection Based on Random Forests Classification. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Osaka, Japan, 3–7 July 2013; pp. 6756–6759.
- Kim, H.-J.; Choi, Y.S. Eating Activity Recognition for Health and Wellness: A Case Study on Asian Eating Style. In Proceedings of the IEEE International Conference on Consumer Electronics, Las Vegas, NV, USA, 11–14 January 2013; pp. 446–447.
- Ramos-Garcia, R.I.; Hoover, A.W. A Study of Temporal Action Sequencing during Consumption of a Meal. In Proceedings of the International Conference on Bioinformatics, Computational Biology and Biomedical Informatics, Washington, DC, USA, 22–25 September 2013; pp. 68–75.
- Desendorf, J.; Bassett, D.R.; Raynor, H.A.; Coe, D.P. Validity of the Bite Counter device in a controlled laboratory setting. Eat. Behav. 2014, 15, 502–504.
- Dong, Y.; Scisco, J.; Wilson, M.; Muth, E.; Hoover, A. Detecting periods of eating during free-living by tracking wrist motion. IEEE J. Biomed. Health Inform. 2014, 18, 1253–1260.
- Sen, S.; Subbaraju, V.; Misra, A.; Balan, R.K.; Lee, Y. The Case for Smartwatch-Based Diet Monitoring. In Proceedings of the IEEE International Conference on Pervasive Computing and Communication Workshops, St. Louis, MO, USA, 23–27 March 2015; pp. 585–590.
- Ye, X.; Chen, G.; Cao, Y. Automatic Eating Detection Using Head-Mount and Wrist-Worn Accelerometers. In Proceedings of the 17th International Conference on E-Health Networking, Application and Services, Boston, MA, USA, 14–17 October 2015; pp. 578–581.
- Fan, D.; Gong, J.; Lach, J. Eating Gestures Detection by Tracking Finger Motion. In Proceedings of the IEEE Wireless Health, Bethesda, MD, USA, 25–27 October 2016; pp. 1–6.
- Farooq, M.; Sazonov, E. Detection of Chewing from Piezoelectric Film Sensor Signals Using Ensemble Classifiers. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Orlando, FL, USA, 16–20 August 2016; pp. 4929–4932.
- Fortuna, C.; Giraud-Carrier, C.; West, J. Hand-to-Mouth Motion Tracking in Free-Living Conditions for Improved Weight Control. In Proceedings of the IEEE International Conference on Healthcare Informatics, Chicago, IL, USA, 4–7 October 2016; pp. 341–348.
- Maramis, C.; Kilintzis, V.; Maglaveras, N. Real-Time Bite Detection from Smartwatch Orientation Sensor Data. In Proceedings of the ACM International Conference Proceeding Series, Thessaloniki, Greece, 18–20 May 2016; pp. 1–4.
- Mirtchouk, M.; Merck, C.; Kleinberg, S. Automated Estimation of Food Type and Amount Consumed from Body-Worn Audio and Motion Sensors. In Proceedings of the ACM International Joint Conference on Pervasive and Ubiquitous Computing, Heidelberg, Germany, 12–16 September 2016; pp. 451–462.
- Parra-Sánchez, S.; Gomez-González, J.M.; Ortega, I.A.Q.; Mendoza-Novelo, B.; Delgado-García, J.; Cruz, M.C.; Vega-González, A. Recognition of Activities of Daily Living Based on the Vertical Displacement of the Wrist. In Proceedings of the 14th Mexican Symposium on Medical Physics, Mexico City, Mexico, 16–17 March 2016; pp. 1–6.
- Rahman, T.; Czerwinski, M.; Gilad-Bachrach, R.; Johns, P. Predicting “about-to-eat” Moments for Just-in-Time Eating Intervention. In Proceedings of the Digital Health Conference, Montreal, QC, Canada, 11–13 April 2016; pp. 141–150.
- Sharma, S.; Jasper, P.; Muth, E.; Hoover, A. Automatic Detection of Periods of Eating Using Wrist Motion Tracking. In Proceedings of the IEEE International Conference on Connected Health, Washington, DC, USA, 27–29 June 2016.
- Shen, Y.; Muth, E.; Hoover, A. Recognizing Eating Gestures Using Context Dependent Hidden Markov Models. In Proceedings of the IEEE 1st International Conference on Connected Health: Applications, Systems and Engineering Technologies, Washington, DC, USA, 27–29 June 2016; pp. 248–253.
- Shoaib, M.; Scholten, H.; Havinga, P.J.M.; Incel, O.D. A Hierarchical Lazy Smoking Detection Algorithm Using Smartwatch Sensors. In Proceedings of the IEEE 18th International Conference on e-Health Networking, Applications and Services, Munich, Germany, 14–16 September 2016; pp. 1–6.
- Ye, X.; Chen, G.; Gao, Y.; Wang, H.; Cao, Y. Assisting Food Journaling with Automatic Eating Detection. In Proceedings of the Conference on Human Factors in Computing Systems, San Jose, CA, USA, 7–12 May 2016; pp. 3255–3262.
- Zhang, S.; Alharbi, R.; Stogin, W.; Pourhomayun, M.; Spring, B.; Alshurafa, N. Food Watch: Detecting and Characterizing Eating Episodes through Feeding Gestures. In Proceedings of the 11th EAI International Conference on Body Area Networks, Turin, Italy, 15–16 December 2016; pp. 91–96.
- Alexander, B.; Karakas, K.; Kohout, C.; Sakarya, H.; Singh, N.; Stachtiaris, J.; Barnes, L.E.; Gerber, M.S. A Behavioral Sensing System that Promotes Positive Lifestyle Changes and Improves Metabolic Control among Adults with Type 2 Diabetes. In Proceedings of the IEEE Systems and Information Engineering Design Symposium, Charlottesville, VA, USA, 28 April 2017; pp. 283–288.
- Bi, C.; Xing, G.; Hao, T.; Huh, J.; Peng, W.; Ma, M. FamilyLog: A Mobile System for Monitoring Family Mealtime Activities. In Proceedings of the IEEE International Conference on Pervasive Computing and Communications, Kona, HI, USA, 13–17 March 2017; pp. 1–10.
- Egilmez, B.; Poyraz, E.; Zhou, W.; Memik, G.; Dinda, P.; Alshurafa, N. UStress: Understanding College Student Subjective Stress Using Wrist-Based Passive Sensing. In Proceedings of the IEEE WristSense Workshop on Sensing Systems and Applications Using Wrist Worn Smart Devices, Kona, HI, USA, 13–17 March 2017; pp. 1–6.
- Garcia-Ceja, E.; Galván-Tejada, C.E.; Brena, R. Multi-view stacking for activity recognition with sound and accelerometer data. Inf. Fusion 2017, 40, 45–56.
- Kyritsis, K.; Tatli, C.L.; Diou, C.; Delopoulos, A. Automated Analysis of in Meal Eating Behavior Using a Commercial Wristband IMU Sensor. In Proceedings of the International Conference of the IEEE Engineering in Medicine and Biology Society, Seogwipo, Korea, 11–15 July 2017; pp. 2843–2846.
- Kyritsis, K.; Diou, C.; Delopoulos, A. Food Intake Detection from Inertial Sensors Using LSTM Networks. In Proceedings of the International Conference on Image Analysis and Processing, Catania, Italy, 11–15 September 2017; pp. 411–418.
- Loke, S.W.; Abkenar, A.B. Assigning group activity semantics to multi-device mobile sensor data. KI—Künstliche Intelligenz 2017, 31, 349–355.
- Moschetti, A.; Fiorini, L.; Esposito, D.; Dario, P.; Cavallo, F. Daily Activity Recognition with Inertial Ring and Bracelet: An Unsupervised Approach. In Proceedings of the IEEE International Conference on Robotics and Automation, Singapore, 29 May–3 June 2017; pp. 3250–3255.
- Soubam, S.; Agrawal, M.; Naik, V. Using an Arduino and a Smartwatch to Measure Liquid Consumed from any Container. In Proceedings of the 9th International Conference on Communication Systems and Networks, Bangalore, India, 4–8 January 2017; pp. 464–467.
- Thomaz, E.; Bedri, A.; Prioleau, T.; Essa, I.; Abowd, G.D. Exploring Symmetric and Asymmetric Bimanual Eating Detection with Inertial Sensors on the Wrist. In Proceedings of the 1st Workshop on Digital Biomarkers, Niagara Falls, NY, USA, 23 June 2017; pp. 21–26.
- Yoneda, K.; Weiss, G.M. Mobile Sensor-Based Biometrics Using Common Daily Activities. In Proceedings of the IEEE Annual Ubiquitous Computing, Electronics and Mobile Communication Conference, New York, NY, USA, 19–21 October 2017; pp. 584–590.
- Zhang, S.; Alharbi, R.; Nicholson, M.; Alshurafa, N. When Generalized Eating Detection Machine Learning Models Fail in the Field. In Proceedings of the ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the ACM International Symposium on Wearable Computers, Maui, HI, USA, 11–15 September 2017; pp. 613–622.
- Anderez, D.O.; Lotfi, A.; Langensiepen, C. A Hierarchical Approach in Food and Drink Intake Recognition Using Wearable Inertial Sensors. In Proceedings of the 11th PErvasive Technologies Related to Assistive Environments Conference, Corfu, Greece, 26–29 June 2018; pp. 552–557.
- Anderez, D.O.; Lotfi, A.; Langensiepen, C. A novel crossings-based segmentation approach for gesture recognition. In Advances in Computational Intelligence Systems; Springer: Cham, Switzerland, 2018; pp. 383–391.
- Balaji, R.; Bhavsar, K.; Bhowmick, B.; Mithun, B.S.; Chakravarty, K.; Chatterjee, D.; Ghose, A.; Gupta, P.; Jaiswal, D.; Kimbahune, S.; et al. A Framework for Pervasive and Ubiquitous Geriatric Monitoring. In Proceedings of the Human Aspects of IT for the Aged Population. Applications in Health, Assistance, and Entertainment, Las Vegas, NV, USA, 15–20 July 2018; pp. 205–230.
- Cho, J.; Choi, A. Asian-Style Food Intake Pattern Estimation Based on Convolutional Neural Network. In Proceedings of the IEEE International Conference on Consumer Electronics, Las Vegas, NV, USA, 12–14 January 2018; pp. 1–2.
- Clapés, A.; Pardo, À.; Pujol Vila, O.; Escalera, S. Action detection fusing multiple Kinects and a WIMU: An application to in-home assistive technology for the elderly. Mach. Vis. Appl. 2018, 29, 765–788.
- Kyritsis, K.; Diou, C.; Delopoulos, A. End-to-End Learning for Measuring in-Meal Eating Behavior from a Smartwatch. In Proceedings of the 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Honolulu, HI, USA, 18–21 July 2018; pp. 5511–5514.
- Manzi, A.; Moschetti, A.; Limosani, R.; Fiorini, L.; Cavallo, F. Enhancing activity recognition of self-localized robot through depth camera and wearable sensors. IEEE Sens. J. 2018, 18, 9324–9331.
- Papadopoulos, A.; Kyritsis, K.; Sarafis, I.; Delopoulos, A. Personalised Meal Eating Behaviour Analysis via Semi-Supervised Learning. In Proceedings of the 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Honolulu, HI, USA, 18–21 July 2018; pp. 4768–4771.
- Schibon, G.; Amft, O. Saving Energy on Wrist-Mounted Inertial Sensors by Motion-Adaptive Duty-Cycling in Free-Living. In Proceedings of the IEEE 15th International Conference on Wearable and Implantable Body Sensor Networks, Las Vegas, NV, USA, 4–7 March 2018; pp. 197–200.
- Shen, Y.; Muth, E.; Hoover, A. The Impact of Quantity of Training Data on Recognition of Eating Gestures. Available online: https://arxiv.org/pdf/1812.04513.pdf (accessed on 7 February 2019).
- Zambrana, C.; Idelsohn-Zielonka, S.; Claramunt-Molet, M.; Almenara-Masbernat, M.; Opisso, E.; Tormos, J.M.; Miralles, F.; Vargiu, E. Monitoring of upper-limb movements through inertial sensors—Preliminary results. Smart Health 2018.
- Burrows, T.; Collins, C.; Adam, M.; Duncanson, K.; Rollo, M. Dietary assessment of shared plate eating: A missing link. Nutrients 2019, 11, 789.
- Savy, M.; Martin-Prével, Y.; Sawadogo, P.; Kameli, Y.; Delpeuch, F. Use of variety/diversity scores for diet quality measurement: Relation with nutritional status of women in a rural area in Burkina Faso. Eur. J. Clin. Nutr. 2005, 59, 703–716.
- Savy, M.; Martin-Prével, Y.; Traissac, P.; Delpeuch, F. Measuring dietary diversity in rural Burkina Faso: Comparison of a 1-day and a 3-day dietary recall. Public Health Nutr. 2007, 10, 71–78.
- Rouast, P.V.; Adam, M.T.P.; Chiong, R. Deep learning for human affect recognition: Insights and new developments. IEEE Trans. Affect. Comput. 2019, 1–20.
- Mollahosseini, A.; Hasani, B.; Mahoor, M.H. AffectNet: A database for facial expression, valence, and arousal computing in the wild. IEEE Trans. Affect. Comput. 2019, 10, 18–31.
- Boushey, C.J.; Spoden, M.; Zhu, F.M.; Delp, E.J.; Kerr, D.A. New mobile methods for dietary assessment: Review of image-assisted and image-based dietary assessment methods. Proc. Nutr. Soc. 2017, 76, 283–294.
- Rollo, M.E.; Ash, S.; Lyons-Wall, P.; Russell, A.W. Evaluation of a mobile phone image-based dietary assessment method in adults with type 2 diabetes. Nutrients 2015, 7, 4897–4910.
- Turner-McGrievy, G.M.; Boutté, A.; Crimarco, A.; Wilcox, S.; Hutto, B.E.; Hoover, A.; Muth, E.R. Byte by bite: Use of a mobile Bite Counter and weekly behavioral challenges to promote weight loss. Smart Health 2017, 3–4, 20–26.
- Scisco, J.L.; Muth, E.R.; Dong, Y.; Hoover, A.W. Slowing bite-rate reduces energy intake: An application of the Bite Counter device. J. Am. Diet. Assoc. 2011, 111, 1231–1235.
- Noorbergen, T.J.; Adam, M.T.P.; Attia, J.R.; Cornforth, D.J.; Minichiello, M. Exploring the design of mHealth systems for health behavior change using mobile biosensors. Commun. Assoc. Inf. Syst. 2019, 1–37, in press.
Term | Synonyms Used in the Literature | Definition |
---|---|---|
Action classes | Event/activity classifications/categories | Different categories that the classifiers (detection models) are trained and tested to classify |
Artificial intelligence model | Artificial intelligence approach, machine learning algorithm | The approach used for automatic eating behaviour detection |
Backward search | | Search through references of included studies to find other relevant studies that are not found through database search |
Eating activity | Eating and drinking activity | |
Eating behaviour assessment | Food intake detection, eating detection, ingestion monitoring | Assessing whether the participant is eating (including drinking) and what their eating characteristics are |
Forward search | | Search for relevant studies that cited included studies |
F-score | F1 score, F-measure | A measure of classification accuracy. While accuracy is the number of correctly classified items divided by all classified items, the F-score is the harmonic mean of precision and recall.
Hand-to-mouth gesture | Hand-to-mouth movement | The movement of hand carrying food with or without utensils to the mouth |
Motion sensors | Motion tracking sensors, motion detection sensors, activity tracker | Sensors used to detect movements. Wearable motion sensors focused on in the current review include upper limb-mounted motion sensors. |
Participant | Subject | An individual who has successfully participated in a study (i.e., not counting individuals who were invited but did not participate or individuals with failed measurements) |
Upper limb | Arm | Region of body that includes shoulder, upper arm, lower arm, wrist, and hand |
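As a worked illustration of the F-score definition in the table above (illustrative only, not drawn from any reviewed study), the following minimal Python sketch computes the harmonic mean of precision and recall:

```python
def f_score(precision: float, recall: float) -> float:
    """F1 score: the harmonic mean of precision and recall."""
    if precision + recall == 0:
        # Degenerate case: no true positives at all.
        return 0.0
    return 2 * precision * recall / (precision + recall)


# Hypothetical example: an eating-gesture classifier with
# precision 0.8 and recall 0.6 scores well below plain accuracy
# would suggest, because the harmonic mean penalises imbalance.
print(round(f_score(0.8, 0.6), 3))  # 0.686
```

Because the harmonic mean is dominated by the smaller of the two values, the F-score is a stricter summary than accuracy for the class-imbalanced data typical of free-living eating detection, where non-eating samples vastly outnumber eating samples.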
Author (Year), Country [Ref] | Type | Participants (Environment) | Upper Limb Sensor (Frequency) [Position] | Comparator | Algorithm (Detection) |
---|---|---|---|---|---|
Amft et al. (2005), Switzerland [23] | Conference | 2 (lab) | Acc/Gyro (100 Hz) [wrist, both] | NR | FSS, HMM (GD) |
Amft et al. (2007), Switzerland [24] | Conference | 1 (lab) | Acc/Gyro (NR) [lower arm, both] | NR | PCFG (AD) |
Amft et al. (2008), Switzerland [25] | Journal | 4 (lab) | Acc/Gyro (100 Hz) [lower arm, both] | NR | FSS (GD) |
Junker et al. (2008), Switzerland [26] | Journal | 4 (lab) | Acc/Gyro (100 Hz) [lower arm, both] | NR | HMM (GD)
Laerhoven et al. (2008), Germany [27] | Conference | 2 (free-living) | Acc (NR) [wrist, dominant] | Other self-report | KNN (AD) |
Pirkl et al. (2008), Germany [28] | Conference | 1 (lab) | Acc/Gyro (50 Hz), Prox [lower arm, dominant] | NR | DT (GD) |
Tolstikov et al. (2008), Singapore [29] | Conference | NR (lab) | Acc (50 Hz), Prox [wrist, dominant] | Camera | DBN (AD) |
Zhang et al. (2008), Singapore [30] | Conference | NR (lab) | Acc (NR) [wrist, both] | NR | HTM (GD) |
Dong et al. (2009), USA [13] | Conference | 10 (lab) | Acc/Gyro (60 Hz) [wrist, dominant] | Camera | C/RB (GD) |
Teixeira et al. (2009), USA [31] | Conference | 1 (free-living) | Acc/Gyro (NR) [wrist, dominant] | NR | FSM (AD) |
Amft et al. (2010), Netherlands [32] | Conference | 9 (lab) | Acc/Gyro (87 Hz), Prox [wrist, dominant] | Camera | DT, FSS (GD) |
Dong et al. (2011), USA [33] | Conference | 4 (free-living) | Acc/Gyro (60 Hz) [wrist, dominant] | Diary | C/RB (AD) |
Dong et al. (2012), USA [3] | Journal | 102 (both) | Acc/Gyro (NR) [wrist, dominant] | Camera, Diary | C/RB (GD) |
Grosse-Puppendahl et al. (2012), Germany [34] | Chapter | 7 (lab) | Acc (NR), Prox (NR) [wrist, dominant] | NR | SVM (AD) |
Kim et al. (2012), South Korea [35] | Conference | 13 (lab) | Acc (30 Hz) [wrist, dominant] | Camera | DT (CD, GD) |
Mousavi Hondori et al. (2012), USA [36] | Conference | 1 (lab) | Acc (NR) [utensil, both] | NR | NR (NR) |
Varkey et al. (2012), USA [8] | Journal | 1 (lab) | Acc/Gyro (20 Hz) [wrist, dominant] | NR | SVM (AD) |
Farooq et al. (2013), USA [5] | Conference | 13 (free-living) | Prox (NR) [wrist, dominant] | Diary, Push Button | ANN, SVM (AD) |
Fontana et al. (2013), USA [37] | Conference | 12 (free-living) | Prox (10 Hz) [wrist, dominant] | Push Button | RF (AD) |
Kim & Choi (2013), South Korea [38] | Conference | 8 (lab) | Acc (30 Hz) [wrist, dominant] | Camera | DT, NB (AD, CD, GD) |
Ramos-Garcia & Hoover (2013), USA [39] | Conference | 273 (lab) | Acc/Gyro (15 Hz) [wrist, dominant] | Camera | HMM, KNN (GD) |
Desendorf et al. (2014), USA [40] | Journal | 15 (lab) | Acc (80 Hz) [wrist, dominant] | NR | C/RB (GD) |
Dong et al. (2014), USA [41] | Journal | 43 (free-living) | Acc/Gyro (NR) [lower arm, dominant] | Mobile App | C/RB, NB (GD, AD) |
Fontana et al. (2014), USA [6] | Journal | 12 (free-living) | Prox (NR) [wrist, dominant] | Mobile App, Push Button | ANN (AD) |
Ramos-Garcia et al. (2015), USA [12] | Journal | 25 (lab) | Acc/Gyro (15 Hz) [wrist, dominant] | Camera | HMM, KNN (GD) |
Sen et al. (2015), Singapore [42] | Conference | 6 (lab) | Acc/Gyro (100 Hz) [wrist, dominant] | Wearable Camera | C/RB (GD) |
Thomaz et al. (2015), USA [2] | Conference | 28 (both) | Acc (25 Hz) [wrist, dominant] | Camera, Wearable Camera | DBSCAN, KNN, RF, SVM (AD, GD) |
Ye et al. (2015), USA [43] | Conference | 10 (lab) | Acc (50 Hz) [wrist, dominant] | Camera | DT, NB, SVM (GD) |
Zhou et al. (2015), Japan [9] | Journal | 5 (lab) | Acc (NR) [finger, dominant] | NR | DT, KNN (GAD) |
Fan et al. (2016), USA [44] | Journal | 1 (lab) | Acc/Gyro (50.1 Hz) [wrist, dominant] | NR | ANN, DT, KNN, NB, Reg, RF, SVM (AD) |
Farooq & Sazonov (2016), USA [45] | Conference | 12 (free-living) | Prox (NR) [wrist, dominant] | Mobile App, Push Button | DT, Reg (AD)
Fortuna et al. (2016), USA [46] | Conference | 3 (free-living) | Acc/Gyro (10 Hz) [wrist, dominant] | Camera, Wearable Camera | NB (GD) |
Kim et al. (2016), South Korea [4] | Conference | 15 (lab) | Acc/Gyro (NR) [wrist, dominant] | Camera | C/RB (GD) |
Maramis et al. (2016), Greece [47] | Conference | 8 (lab) | Acc/Gyro (NR) [wrist, dominant] | Camera | SVM (GD) |
Mirtchouk et al. (2016), USA [48] | Conference | 6 (lab) | Acc/Gyro (15 Hz) [wrist, both] | Camera | RF (CD)
Parra-Sánchez et al. (2016), Mexico [49] | Conference | 7 (lab) | Electro-hydraulic (NR) [arm, both] | Camera | DT, HMM (GAD)
Rahman et al. (2016), Canada [50] | Conference | 8 (free-living) | Acc/Gyro (15 Hz) [wrist, dominant] | Mobile App | DT, Reg, RF, SVM (AD, CD) |
Sharma et al. (2016), USA [51] | Conference | 94 (free-living) | Acc/Gyro (15 Hz) [wrist, dominant] | NR | C/RB, NB (AD, GD) |
Shen et al. (2016), USA [52] | Conference | 215 (lab) | Acc/Gyro (15 Hz) [wrist, dominant] | Camera | HMM (GD) |
Shoaib et al. (2016), Netherlands [53] | Conference | 11 (lab) | Acc/Gyro (50 Hz) [wrist, dominant] | NR | DT, RF, SVM (AD) |
Ye et al. (2016), USA [54] | Conference | 7 (free-living) | Acc (50 Hz) [wrist, dominant] | Mobile App | SVM (GD) |
Zhang et al. (2016), USA [55] | Conference | 15 (lab) | Acc/Gyro (31 Hz) [wrist, both] | Camera | DBSCAN, DT, NB, Reg, RF, SVM (GD, AD) |
Alexander et al. (2017), USA [56] | Journal | 4 (lab) | Acc/Gyro (20 Hz) [wrist, dominant] | Time Sync | NB (AD) |
Bi et al. (2017), USA [57] | Journal | 37 (free-living) | Acc (80 Hz) [wrist, dominant] | Other self-report | HMM, SVM (AD) |
Dong & Biswas (2017), USA [14] | Journal | 14 (lab) | Acc (100 Hz) [wrist, dominant] | Camera, Push Button | C/RB, HMM, SVM (GD, AD) |
Egilmez et al. (2017), USA [58] | Journal | 9 (lab) | Acc/Gyro (5 Hz) [wrist, dominant] | NR | NB, Reg, RF, SVM (GAD) |
Garcia-Ceja et al. (2017), Mexico [59] | Journal | 3 (lab) | Acc/Gyro (31 Hz) [wrist, dominant] | NR | RF (GAD) |
Kyritsis et al. (2017), Greece [60] | Conference | 8 (lab) | Acc/Gyro (62 Hz) [wrist, dominant] | Camera | HMM, SVM (AD, GD) |
Kyritsis et al. (2017), Greece [61] | Conference | 10 (lab) | Acc/Gyro (62 Hz) [wrist, dominant] | Camera | DL, SVM (AD, GD) |
Loke & Abkenar (2017), Australia [62] | Journal | 4 (lab) | Acc (NR) [wrist, dominant] | NR | DT (GAD) |
Moschetti et al. (2017), Italy [63] | Conference | 12 (lab) | Acc/Gyro (50 Hz) [wrist, dominant] | NR | GMM, KM, RF, SVM (GD) |
Sen et al. (2017), Singapore [7] | Journal | 28 (both) | Acc/Gyro (NR) [wrist, dominant] | Camera, Diary | DT, RF, SVM (AD, GD) |
Shen et al. (2017), USA [1] | Journal | 271 (lab) | Acc/Gyro (15 Hz) [wrist, dominant] | Camera | C/RB (GD) |
Soubam et al. (2017), India [64] | Conference | 11 (lab) | Acc (186 Hz) [wrist, dominant] | Time Sync | DT, NB, RF, SVM (CD, GD) |
Thomaz et al. (2017), USA [65] | Conference | 14 (lab) | Acc/Gyro (30 Hz) [wrist, both] | Camera | RF (AD) |
Yoneda & Weiss (2017), USA [66] | Conference | 51 (lab) | Acc/Gyro (20 Hz) [wrist, dominant] | NR | DT, KNN, RF (GAD) |
Zhang et al. (2017), USA [67] | Conference | 8 (free-living) | Acc/Gyro (31 Hz) [wrist, both] | Wearable Camera | RF (GD) |
Anderez et al. (2018), UK [68] | Conference | NR (lab) | Acc/Gyro (100 Hz) [wrist, dominant] | NR | DL, RF (GD) |
Anderez et al. (2018), UK [69] | Conference | NR (lab) | Acc (100 Hz) [wrist, dominant] | NR | KNN (GD) |
Balaji et al. (2018), India [70] | Conference | NR (NR) | Acc/Gyro (100 Hz) [wrist, NR] | NR | C/RB (GAD) |
Cho & Choi (2018), South Korea [71] | Conference | 8 (lab) | Acc (50 Hz) [wrist, dominant] | Camera | DL (CD, GD) |
Clapés et al. (2018), Spain [72] | Journal | 14 (lab) | Acc/Gyro (25 Hz) [wrist, dominant] | Camera | ANN, Opt (GAD, GD) |
Kyritsis et al. (2018), Greece [73] | Conference | 10 (lab) | Acc/Gyro (100 Hz) [wrist, dominant] | Camera | DL (AD) |
Manzi et al. (2018), Italy [74] | Journal | 20 (lab) | Acc/Gyro (NR) [wrist, dominant] | NR | RF (GAD) |
Papadopoulos et al. (2018), Greece [75] | Conference | 10 (lab) | Acc/Gyro (62 Hz) [wrist, dominant] | Camera | semi-supervised DL (AD) |
Schibon & Amft (2018), Germany [76] | Conference | 6 (free-living) | Acc/Gyro (NR) [wrist, both] | Diary | SVM (GD) |
Shen et al. (2018), USA [77] | arXiv | 269 (lab) | Acc/Gyro (15 Hz) [wrist, dominant] | Camera | HMM (GD) |
Zambrana et al. (2018), Spain [78] | Journal | 21 (lab) | Acc (20 Hz) [wrist, both] | Camera | KNN, RF, SVM (GAD) |
Zhang et al. (2018), USA [17] | Journal | 10 (lab) | Acc/Gyro (NR) [wrist, dominant] | Camera | RF (GD) |
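The sensor configurations above report sampling frequencies from 5 to 186 Hz, and most detection approaches first segment the raw wrist-motion stream into fixed-length windows before computing features. As an illustrative sketch only (the window length, overlap, and feature set below are assumptions, not taken from any single reviewed study), sliding-window segmentation with simple time-domain features can be expressed as:

```python
import numpy as np

def sliding_windows(signal, fs, win_s=2.0, overlap=0.5):
    """Segment an (n_samples, 3) accelerometer array into fixed-length windows."""
    size = int(fs * win_s)
    step = int(fs * win_s * (1 - overlap))
    return [signal[i:i + size] for i in range(0, len(signal) - size + 1, step)]

def time_domain_features(window):
    """Per-axis mean and standard deviation, plus signal magnitude area."""
    mean = window.mean(axis=0)
    std = window.std(axis=0)
    sma = np.abs(window).sum() / len(window)
    return np.concatenate([mean, std, [sma]])

# Example: 10 s of synthetic 3-axis accelerometer data at 50 Hz
# (50 Hz is one of the rates reported in the table above).
rng = np.random.default_rng(0)
acc = rng.normal(size=(500, 3))
windows = sliding_windows(acc, fs=50)
features = np.stack([time_domain_features(w) for w in windows])
print(features.shape)  # (9, 7): one 7-dimensional feature vector per window
```

Two-second windows with 50% overlap are a common but not universal choice; for example, [78] explicitly compared two- and four-second windows.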
| Algorithm (# and % Studies)—Approach | Best Performing Algorithm (vs. Comparison Algorithm/s) | Performance Comparison Results |
|---|---|---|
| SVM (21, 30.4%) AD: [5,8,34,44,53,57], AD/CD: [50], CD/GD: [64], GAD: [58,78], GD: [43,47,54,63,76], GD/AD: [2,7,14,55,60,61] | SVM (vs. DT, NB) [43] (GD) | Using only wrist or head motion data, SVM accuracy for eating detection ranged from 0.895 to 0.951 (combined wrist/head: 0.970). |
| | SVM (vs. KNN, RF) [78] (GAD) | Best accuracy of SVM using time-domain features was 0.957 (F = 0.957, two-second window). Best accuracy of the RF model using frequency-domain features was 0.939 (F = 0.940, four-second window). |
| | SVM (vs. DT, RF) [53] (AD) | F-scores for detecting eating, drinking, and smoking with SVM were 0.910, 0.780, and 0.830, compared to RF with 0.860, 0.780, and 0.840, and DT with 0.820, 0.690, and 0.780. |
| RF (19, 27.5%) AD: [37,44,53,65], AD/CD: [50], CD: [48], CD/GD: [64], GAD: [58,59,66,74,78], GD: [17,63,67,68], GD/AD: [2,7,55] | RF (vs. SVM, KNN) [2] (GD) | RF outperformed SVM and 3-NN in detecting eating gestures in two free-living settings (seven participants over one day, F = 0.761; one participant over 31 days, F = 0.713). |
| | RF (vs. DT, NB, Reg, SVM) [55] (GD) | Eating gesture detection with RF yielded F = 0.753 compared to F = 0.708 (SVM), F = 0.694 (Reg), F = 0.647 (DT), and F = 0.634 (NB). |
| | RF (vs. DT, SVM) [7] (GD) | The accuracies achieved by RF, DT, and SVM were 0.982, 0.966, and 0.857, respectively. |
| | RF (vs. SVM) [63] (GD) | Using the leave-one-person-out cross-validation method, the accuracies achieved by RF and SVM were 0.943 (F = 0.949) and 0.882 (F = 0.895). |
| | RF (vs. NB, Reg, SVM) [58] (GAD) | F-scores for general activity detection with RF, Reg, SVM, and NB were 0.788, 0.661, 0.613, and 0.268, respectively (eating was among the action classes). |
| | RF (vs. DT, NB, SVM) [64] (GD) | The accuracies of RF, SVM, DT, and NB in person-independent drinking versus eating detection were 0.924, 0.905, 0.881, and 0.871, respectively. |
| | RF (vs. DT, Reg, SVM) [50] (AD) | Person-independent "about-to-eat" detection achieved F = 0.690 (RF) compared to 0.660 (SVM), 0.660 (Reg), and 0.640 (DT). |
| | RF (vs. DT, KNN) [66] (GAD) | Accuracies of RF, DT, and KNN were 0.997, 0.980, and 0.888, respectively (using accelerometer data). |
| DT (16, 23.2%) AD: [44,45,53], AD/CD: [50], CD/GD: [35,64], CD/GD/AD: [38], GAD: [9,49,62,66], GD: [28,32,43], GD/AD: [7,55] | DT (vs. NB) [38] (AD/CD) | Best F-scores of DT and NB were 0.930 and 0.900 (eating activity detection) and 0.780 and 0.700 (eating type detection), respectively. |
| | DT (vs. NB) [35] (CD/GD) | Best F-scores of DT and NB were 0.750 and 0.650 (eating utensil detection) and 0.280 and 0.190 (eating action detection), respectively. |
| HMM (10, 14.5%) AD: [57], GAD: [49], GD: [12,23,26,39,52,77], GD/AD: [14,60] | HMM (vs. SVM) [57] (AD) | HMM outperformed SVM by 6.82% on recall in family meal detection, with average precision and recall of 0.807 and 0.895, respectively. |
| | HMM (vs. KNN) [39] (GD) | Accuracies of HMM and KNN were 0.843 and 0.717, respectively. |
| | HMM-1 (vs. HMM-S, HMM-N, N: 2–6) [77] (GD) | Accuracies of HMM-S and gesture-to-gesture HMM-1 were 0.852 and 0.895, respectively. According to the figure provided in the study, accuracy remained similar for HMM-2 to HMM-4 and decreased for HMM-5 and HMM-6. |
| | HMM-6 (vs. HMM-N, N: 1–5, KNN, S-HMM) [12] (GD) | The accuracies of HMM-6, HMM-5, HMM-4, HMM-3, HMM-2, HMM-1, sub-gesture HMM, and KNN were 0.965, 0.946, 0.922, 0.896, 0.880, 0.877, 0.843, and 0.758, respectively. |
| KNN (9, 13.0%) AD: [27,44], GAD: [9,66,78], GD: [12,39,69], GD/AD: [2] | KNN (vs. ANN, DT, NB, Reg, RF, SVM) [44] (AD) | The accuracies of KNN, RF, Reg, SVM, ANN, NB, and DT were 0.936, 0.933, 0.923, 0.920, 0.913, 0.906, and 0.893, respectively. |
| | KNN (vs. DT, NB) [9] (GAD) | The precision (and recall) values of KNN, DT, and NB were 0.710 (0.719), 0.670 (0.686), and 0.657 (0.635), respectively. |
| DL (5, 7.2%) AD: [73,75], CD/GD: [71], GD: [68], GD/AD: [61] | RNN (vs. HMM) [61,73] (AD) | Replacing HMM with RNN in an SVM-HMM model improved the F-score from 0.814 to 0.892 [61]. In a subsequent study, a single-step end-to-end RNN reached almost the same performance (F = 0.884) [73]. |
| ANN (4, 5.8%) AD: [5,6,44], GD/GAD: [72] | ANN (vs. SVM) [5] (AD) | ANN achieved an accuracy of 0.869 (±0.065) compared to SVM (0.819 ± 0.092) for eating activity detection (12 participants). ANN achieved an accuracy of 0.727 compared to SVM (0.636 ± 0.092) for number-of-meals detection (1 participant). |
| KM (1, 1.4%) GD: [63] | KM (vs. GMM) [63] (GD) | In an inter-person comparison, the accuracies of the unsupervised approaches KM and GMM were 0.917 (F = 0.920) and 0.796 (F = 0.805), respectively. |
| Other: C/RB (11, 15.9%, AD [33], GAD [70], GD [1,3,4,13,40,42], GD/AD [14,41,51]); NB (11, 15.9%, AD [44,56], CD/GD [35,64], CD/GD/AD [38], GAD [58], GD [43,46], GD/AD [41,51,55]); Reg (5, 7.2%, AD [44,45], AD/CD [50], GAD [58], GD/AD [55]); FSS (3, 4.3%, GD [23,25,32]); DBSCAN (2, 2.9%, GD/AD [2,55]); DBN (1, 1.4%, AD [29]); FSM (1, 1.4%, AD [31]); HTM (1, 1.4%, GD [30]); Opt (1, 1.4%, GD/GAD [72]); PCFG (1, 1.4%, AD [24]) | | |
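The head-to-head comparisons in the table above (e.g., RF vs. SVM vs. DT on windowed wrist-motion features) follow a common evaluation pattern: train several classifiers on the same feature matrix and compare accuracy or F-score under cross-validation. A minimal sketch of that protocol using scikit-learn; the feature matrix and labels here are synthetic stand-ins, not data from any reviewed study:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for windowed wrist-motion features:
# 200 windows x 7 features, labelled eating (1) vs. non-eating (0).
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 7))
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)  # a learnable toy rule

models = {
    "RF": RandomForestClassifier(n_estimators=100, random_state=0),
    "SVM": SVC(kernel="rbf"),
    "DT": DecisionTreeClassifier(random_state=0),
}
# 5-fold cross-validated F-score, mirroring the metric reported above
scores = {name: cross_val_score(model, X, y, cv=5, scoring="f1").mean()
          for name, model in models.items()}
for name, score in scores.items():
    print(f"{name}: F = {score:.3f}")
```

Note that several reviewed studies instead used leave-one-person-out validation (e.g., [63]), which better reflects person-independent performance than the random k-fold split sketched here.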
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Heydarian, H.; Adam, M.; Burrows, T.; Collins, C.; Rollo, M.E. Assessing Eating Behaviour Using Upper Limb Mounted Motion Sensors: A Systematic Review. Nutrients 2019, 11, 1168. https://doi.org/10.3390/nu11051168