Preliminary Technical Validation of LittleBeats™: A Multimodal Sensing Platform to Capture Cardiac Physiology, Motion, and Vocalizations
Abstract
1. Introduction
1.1. Contribution of the LittleBeats™ Platform
1.2. Wearable Sensor Systems and Technical Validation as a Critical Step
1.3. Validation of the ECG (Studies 1 and 2), IMU (Study 3), and Audio (Studies 4 and 5) Sensors
2. Overview of LittleBeats™ Platform
2.1. Hardware Design
2.1.1. Processing Unit
2.1.2. Memory Unit
2.1.3. Time-Keeping Unit
2.1.4. Power Unit
2.1.5. Sensing Unit
2.2. Data Acquisition
2.3. Data Synchronization
3. Study 1: Validation of ECG Sensor–Adult Sample
3.1. Materials and Methods
3.1.1. Participants
3.1.2. Study Procedure
3.1.3. Data Processing
3.2. Results
4. Study 2: Validation of ECG Sensor–Infant Sample
4.1. Materials and Methods
4.1.1. Participants
4.1.2. Study Procedure
4.1.3. Data Processing
4.2. Results
5. Study 3: Validation of Motion Sensor–Activity Recognition
5.1. Materials and Methods
5.1.1. Participants
5.1.2. Study Procedure
- The participant sits on a chair and watches a video for 2 min.
- Between each activity, the participant stands for 30 s.
- The participant walks to the end of the room and back three times.
- The participant glides or steps sideways to one end of the room and then back to the other end, continuing for 1 min.
- The participant performs squats (deep knee bends) for 1 min.
- The participant sits in an office chair and rotates slowly five times. (A window-labeling sketch follows this list.)
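The protocol above yields ground-truth activity intervals against which fixed-length IMU windows can be labeled for classifier training. The following is a minimal sketch of that labeling step, not the authors' pipeline: the sampling rate, window length, interval annotations, and function names are illustrative assumptions.

```python
# Sketch: slice a 3-axis accelerometer stream into fixed-length windows and
# label each window from annotated (start_s, end_s, label) activity intervals.
# FS and WIN_S are assumed values for illustration, not from the paper.
import numpy as np

FS = 50      # assumed IMU sampling rate (Hz)
WIN_S = 2.0  # assumed window length (s)

def window_and_label(acc, intervals, fs=FS, win_s=WIN_S):
    """acc: (n_samples, 3) accelerometer array; intervals: [(start_s, end_s, label), ...]."""
    win = int(fs * win_s)
    X, y = [], []
    for start in range(0, len(acc) - win + 1, win):
        t_mid = (start + win / 2) / fs  # window midpoint in seconds
        for s, e, label in intervals:
            if s <= t_mid < e:
                X.append(acc[start:start + win])
                y.append(label)
                break  # keep only windows that fall inside one annotated activity
    return np.stack(X), np.array(y)

# Example: 2 min sitting, 30 s standing, then walking (labels hypothetical).
acc = np.random.randn(FS * 240, 3)  # placeholder signal
intervals = [(0, 120, "upright"), (120, 150, "upright"), (150, 240, "walk")]
X, y = window_and_label(acc, intervals)
print(X.shape, np.unique(y, return_counts=True))
```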
5.1.3. Data Processing
5.2. Results
5.2.1. Data Distribution and Balancing
5.2.2. Classification
6. Study 4: Validation of Audio Sensor–Speech Emotion Recognition
6.1. Materials and Methods
6.1.1. Participants
6.1.2. Study Procedure
6.1.3. Cross-Validation Check on Emotional Speech Corpus
6.1.4. Audio Data Processing
6.2. Results
7. Study 5: Validation of Audio Sensor–Automatic Speech Recognition
7.1. Materials and Methods
7.1.1. Participants
7.1.2. Study Procedure
7.2. Results
8. Discussion
8.1. ECG Sensor
8.2. IMU Sensor
8.3. Audio Sensor
8.4. Limitations and Future Directions
9. Conclusions
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
Table: Study 1 (adult sample). Error and Bland–Altman agreement between LittleBeats™ and the criterion device, by session.

| Session (n Observations) | Absolute Mean Error | MAPE (%) | Mean Error (SD) | Bland–Altman Lower LoA | Bland–Altman Upper LoA |
|---|---|---|---|---|---|
| Baseline (n = 3355) | 49.6 | 5.93 | 11.1 (77.3) | −162.54 | 140.33 |
| Tangram puzzle (n = 3744) | 41.9 | 5.29 | 4.5 (62.8) | −127.59 | 118.65 |
| Recovery (n = 2777) | 49.7 | 5.97 | 12.7 (73.1) | −156.02 | 130.55 |
| Matrices (n = 4589) | 45.6 | 5.62 | 10.6 (68.3) | −144.51 | 123.28 |
Table: Study 2 (infant sample). Error and Bland–Altman agreement between LittleBeats™ and the criterion device, by still-face paradigm (SFP) episode.

| Session (n Observations) | Absolute Mean Error | MAPE (%) | Mean Error (SD) | Bland–Altman Lower LoA | Bland–Altman Upper LoA |
|---|---|---|---|---|---|
| Baseline (n = 907) | 5.4 | 1.17 | 1.3 (7.22) | −15.45 | 12.84 |
| SFP play episode (n = 1075) | 4.4 | 0.96 | 2.0 (6.58) | −14.87 | 10.93 |
| SFP still episode (n = 936) | 6.9 | 1.66 | 1.7 (10.93) | −23.09 | 19.75 |
| SFP reunion episode (n = 1472) | 5.6 | 1.22 | 1.7 (9.29) | −19.92 | 16.49 |
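The columns in the two tables above can be reproduced from the paired device measurements. Below is a minimal sketch, assuming two beat-matched interbeat-interval (IBI) series (one per device) and the conventional 95% limits of agreement, bias ± 1.96 SD of the paired differences; the variable names and placeholder values are illustrative, not study data.

```python
# Sketch: absolute mean error, MAPE, mean error (SD), and Bland–Altman
# 95% limits of agreement from two beat-matched measurement series.
import numpy as np

def agreement_metrics(ref, test):
    """ref, test: paired arrays from the criterion device and LittleBeats."""
    diff = test - ref
    mae  = np.mean(np.abs(diff))                # absolute mean error
    mape = np.mean(np.abs(diff) / ref) * 100.0  # MAPE (%)
    bias = np.mean(diff)                        # mean error
    sd   = np.std(diff, ddof=1)                 # SD of paired differences
    lo_loa = bias - 1.96 * sd                   # Bland–Altman lower LoA
    hi_loa = bias + 1.96 * sd                   # Bland–Altman upper LoA
    return mae, mape, bias, sd, lo_loa, hi_loa

# Placeholder IBI-like values (ms), purely illustrative.
ref  = np.array([812.0, 798.0, 830.0, 805.0])
test = np.array([820.0, 790.0, 845.0, 800.0])
print(agreement_metrics(ref, test))
```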
Table: Study 3. Activity-recognition confusion matrices for LittleBeats™ and smartphone data; cells show counts, with row proportions in parentheses.

| Ground Truth Labels | LittleBeats™: UP | LittleBeats™: WA | LittleBeats™: GL | LittleBeats™: SQ | Smartphone: UP | Smartphone: WA | Smartphone: GL | Smartphone: SQ |
|---|---|---|---|---|---|---|---|---|
| Upright (UP) | 805 (0.991) | 3 (0.004) | 1 (0.001) | 3 (0.004) | 803 (0.989) | 0 (0) | 7 (0.009) | 2 (0.002) |
| Walk (WA) | 45 (0.300) | 88 (0.587) | 8 (0.053) | 9 (0.060) | 1 (0.007) | 142 (0.947) | 3 (0.020) | 4 (0.027) |
| Glide (GL) | 2 (0.011) | 11 (0.063) | 146 (0.830) | 17 (0.097) | 6 (0.034) | 1 (0.006) | 150 (0.852) | 19 (0.108) |
| Squat (SQ) | 9 (0.078) | 13 (0.112) | 17 (0.147) | 77 (0.664) | 0 (0) | 8 (0.069) | 40 (0.345) | 68 (0.586) |
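A minimal sketch of how a confusion matrix of this form is tabulated, using scikit-learn; the label arrays below are placeholders, not the study's predictions.

```python
# Sketch: confusion matrix with row-normalized proportions (per ground-truth
# class), matching the "count (proportion)" cell format in the table above.
from sklearn.metrics import confusion_matrix

labels = ["UP", "WA", "GL", "SQ"]
y_true = ["UP", "UP", "WA", "GL", "SQ", "WA"]  # placeholder ground truth
y_pred = ["UP", "UP", "UP", "GL", "SQ", "WA"]  # placeholder predictions

counts = confusion_matrix(y_true, y_pred, labels=labels)
props = counts / counts.sum(axis=1, keepdims=True)  # row-wise proportions
for name, row_c, row_p in zip(labels, counts, props):
    print(name, ["%d (%.3f)" % (c, p) for c, p in zip(row_c, row_p)])
```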
Table: Study 3. Cross-tabulation of samples classified correctly or incorrectly by each device.

| | LittleBeats™ Correct | LittleBeats™ Incorrect |
|---|---|---|
| Smartphone Correct | 1065 | 95 |
| Smartphone Incorrect | 61 | 33 |
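The discordant cells of the 2 × 2 table above (95 and 61 samples on which exactly one device was correct) are what a McNemar-style comparison of the two devices' error rates would use. Whether the authors ran this exact test is an assumption; the sketch below shows the exact (binomial) form.

```python
# Sketch: exact McNemar test via a two-sided binomial test on the
# discordant counts from the 2x2 agreement table above.
from scipy.stats import binomtest

b = 95  # smartphone correct, LittleBeats incorrect
c = 61  # smartphone incorrect, LittleBeats correct
result = binomtest(b, n=b + c, p=0.5)  # under H0, discordances split 50/50
print(f"exact McNemar p = {result.pvalue:.4f}")
```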
Table: Study 4. Speech-emotion-recognition confusion matrices for LittleBeats™ and smartphone audio; cells show counts, with row proportions in parentheses.

| Ground Truth Labels | LittleBeats™: NEU | LittleBeats™: HAP | LittleBeats™: SAD | LittleBeats™: ANG | Smartphone: NEU | Smartphone: HAP | Smartphone: SAD | Smartphone: ANG |
|---|---|---|---|---|---|---|---|---|
| Neutral (NEU) | 22 (0.786) | 2 (0.071) | 2 (0.071) | 2 (0.071) | 17 (0.607) | 1 (0.036) | 6 (0.214) | 4 (0.143) |
| Happy (HAP) | 0 (0) | 24 (0.649) | 2 (0.054) | 11 (0.297) | 1 (0.027) | 25 (0.676) | 2 (0.054) | 9 (0.243) |
| Sad (SAD) | 6 (0.154) | 8 (0.205) | 25 (0.641) | 0 (0) | 6 (0.158) | 4 (0.105) | 27 (0.711) | 1 (0.026) |
| Angry (ANG) | 0 (0) | 6 (0.158) | 4 (0.105) | 28 (0.737) | 1 (0.026) | 12 (0.316) | 3 (0.079) | 22 (0.579) |
Table: Study 4. Cross-tabulation of samples classified correctly or incorrectly by each device.

| | LittleBeats™ Correct | LittleBeats™ Incorrect |
|---|---|---|
| Smartphone Correct | 75 | 16 |
| Smartphone Incorrect | 23 | 27 |
Table: Study 5. Word error rates (WER) for automatic speech recognition on LittleBeats™ and smartphone audio, by decoding strategy.

| Models | LittleBeats™ WER | Smartphone WER |
|---|---|---|
| Greedy decoding | 5.75% | 3.58% |
| Beam search | 5.80% | 3.63% |
| Beam search + language model | 4.16% | 2.73% |
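WER in the table above is the word-level Levenshtein distance (substitutions + insertions + deletions) between hypothesis and reference, divided by the number of reference words. A minimal sketch of the scoring step follows; the decoding strategies themselves (greedy, beam search, language-model rescoring) are not shown, and the example strings are illustrative.

```python
# Sketch: word error rate via dynamic-programming edit distance over words.
def wer(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i  # deleting i reference words
    for j in range(len(hyp) + 1):
        dp[0][j] = j  # inserting j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = dp[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            dp[i][j] = min(sub, dp[i - 1][j] + 1, dp[i][j - 1] + 1)
    return dp[-1][-1] / max(len(ref), 1)

# Two deletions against an 8-word reference -> WER = 0.25.
print(wer("the rainbow is a division of white light",
          "the rainbow is division of light"))
```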