Optimal Sensor Placement and Multimodal Fusion for Human Activity Recognition in Agricultural Tasks
Abstract
1. Introduction
2. Materials and Methods
2.1. Data Acquisition
2.2. Data Pre-Processing and Feature Extraction
2.2.1. Handling of Outliers and Unsynchronized Sensor Data
2.2.2. Noise Reduction
2.2.3. Activity Count and Class Imbalance in Sensor Data
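Class imbalance between frequent activity windows (e.g., walking) and infrequent ones (e.g., standing, placing crate) is often mitigated with inverse-frequency class weights passed to the loss function. The sketch below illustrates that generic remedy only; it is not necessarily the strategy used in this study. The integer label codes follow Section 2.2.5.

```python
# Hypothetical sketch: inverse-frequency class weights to counter the
# imbalance between frequent and infrequent activity windows. This is a
# common remedy, not necessarily the authors' method.
import numpy as np

def class_weights(labels: np.ndarray) -> dict:
    """Inverse-frequency weights: rarer classes receive larger weights."""
    classes, counts = np.unique(labels, return_counts=True)
    total = counts.sum()
    return {int(c): total / (len(classes) * n) for c, n in zip(classes, counts)}

# Example with the six activity codes 0..5 used in Section 2.2.5
labels = np.array([0, 1, 1, 1, 2, 3, 4, 4, 4, 5])
print(class_weights(labels))  # e.g. {0: 1.67, 1: 0.56, ...}
```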
2.2.4. Temporal Window Selection and Overlapping Approach
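As a concrete illustration of the windowing named in this subsection's title, the sketch below segments a continuous multi-channel stream into fixed-length windows with a configurable overlap. The window length, overlap ratio, sampling rate, and channel count are placeholder assumptions, not values taken from the paper.

```python
# Minimal sketch of segmenting continuous IMU streams into fixed-length,
# overlapping windows. All numeric settings below are illustrative.
import numpy as np

def sliding_windows(signal: np.ndarray, window: int, overlap: float) -> np.ndarray:
    """Split a (samples, channels) array into (n_windows, window, channels)."""
    step = max(1, int(window * (1.0 - overlap)))
    starts = range(0, signal.shape[0] - window + 1, step)
    return np.stack([signal[s:s + window] for s in starts])

fs = 100                      # Hz, assumed for illustration
x = np.random.randn(1000, 9)  # 9 channels: acc + gyro + mag (3 axes each)
w = sliding_windows(x, window=2 * fs, overlap=0.5)  # 2 s windows, 50% overlap
print(w.shape)                # (9, 200, 9)
```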
2.2.5. Encoding Categorical Data
- Standing, encoded as “0”, was assigned the one-hot vector [1,0,0,0,0,0];
- Walking (without crate), encoded as “1”, was assigned the one-hot vector [0,1,0,0,0,0];
- Bending, encoded as “2”, was assigned the one-hot vector [0,0,1,0,0,0];
- Lifting crate, encoded as “3”, was assigned the one-hot vector [0,0,0,1,0,0];
- Walking (with crate), encoded as “4”, was assigned the one-hot vector [0,0,0,0,1,0];
- Placing crate, encoded as “5”, was assigned the one-hot vector [0,0,0,0,0,1].
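A minimal sketch of the encoding listed above, using an identity-matrix lookup in NumPy; keras.utils.to_categorical or sklearn.preprocessing.OneHotEncoder would produce the same vectors.

```python
# One-hot encoding of the six activity labels exactly as listed above
# (0 = standing ... 5 = placing crate).
import numpy as np

NUM_CLASSES = 6
labels = np.array([0, 1, 2, 3, 4, 5])  # integer-coded activities
one_hot = np.eye(NUM_CLASSES)[labels]  # shape (6, 6)
print(one_hot[3])  # lifting crate -> [0. 0. 0. 1. 0. 0.]
```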
2.2.6. Train/Test Split
2.2.7. Feature Scaling
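The reference list cites scikit-learn's StandardScaler, which standardizes each feature to zero mean and unit variance. Below is a minimal sketch under assumed array shapes: the scaler is fitted on the training split only and then applied to the test split, avoiding leakage of test statistics into training.

```python
# Standardization sketch consistent with the cited StandardScaler
# documentation. The window shapes are illustrative placeholders.
import numpy as np
from sklearn.preprocessing import StandardScaler

X_train = np.random.randn(9, 200, 9)  # (windows, timesteps, channels)
X_test = np.random.randn(4, 200, 9)

scaler = StandardScaler()
n_ch = X_train.shape[-1]
# Fit per channel across all training windows and timesteps
X_train_s = scaler.fit_transform(X_train.reshape(-1, n_ch)).reshape(X_train.shape)
# Apply the training statistics unchanged to the test split
X_test_s = scaler.transform(X_test.reshape(-1, n_ch)).reshape(X_test.shape)
```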
2.3. LSTM Model Training and Evaluation
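For orientation, the sketch below is a hypothetical Keras LSTM classifier for the six activity classes. The layer sizes, dropout rate, and optimizer are illustrative assumptions, not the architecture reported in the paper.

```python
# Hypothetical Keras sketch of an LSTM classifier for six activities.
# All hyper-parameters below are assumptions for illustration.
import tensorflow as tf

WINDOW, CHANNELS, NUM_CLASSES = 200, 9, 6

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, CHANNELS)),
    tf.keras.layers.LSTM(64, return_sequences=True),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",  # one-hot targets (Section 2.2.5)
              metrics=["accuracy"])
model.summary()
```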
3. Results
3.1. Effect of Hyper-Parameters on Model Performance
3.2. Sensor Placement Combinations
3.3. Multimodal Sensor Fusion
4. Discussion
5. Conclusions
- Model performance: overall, the present LSTM-based model demonstrated high accuracy in classifying the investigated activities;
- Optimal sensor placement: prioritizing upper-body sensor placements for capturing the examined activities, the chest emerged as the most effective anatomical location, achieving the highest F1-score of 0.939;
- Multimodal sensor fusion: fusing data from all sensors, namely accelerometers, gyroscopes, and magnetometers, substantially enhanced classification accuracy.
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Bechar, A.; Vigneault, C. Agricultural robots for field operations: Concepts and components. Biosyst. Eng. 2016, 149, 94–111.
- Huo, D.; Malik, A.W.; Ravana, S.D.; Rahman, A.U.; Ahmedy, I. Mapping smart farming: Addressing agricultural challenges in data-driven era. Renew. Sustain. Energy Rev. 2024, 189, 113858.
- Benos, L.; Moysiadis, V.; Kateris, D.; Tagarakis, A.C.; Busato, P.; Pearson, S.; Bochtis, D. Human-Robot Interaction in Agriculture: A Systematic Review. Sensors 2023, 23, 6776.
- Lytridis, C.; Kaburlasos, V.G.; Pachidis, T.; Manios, M.; Vrochidou, E.; Kalampokas, T.; Chatzistamatis, S. An Overview of Cooperative Robotics in Agriculture. Agronomy 2021, 11, 1818.
- Vasconez, J.P.; Kantor, G.A.; Auat Cheein, F.A. Human–robot interaction in agriculture: A survey and current challenges. Biosyst. Eng. 2019, 179, 35–48.
- Liu, H.; Gamboa, H.; Schultz, T. Human Activity Recognition, Monitoring, and Analysis Facilitated by Novel and Widespread Applications of Sensors. Sensors 2024, 24, 5250.
- Bhola, G.; Vishwakarma, D.K. A review of vision-based indoor HAR: State-of-the-art, challenges, and future prospects. Multimed. Tools Appl. 2024, 83, 1965–2005.
- Donisi, L.; Cesarelli, G.; Pisani, N.; Ponsiglione, A.M.; Ricciardi, C.; Capodaglio, E. Wearable Sensors and Artificial Intelligence for Physical Ergonomics: A Systematic Review of Literature. Diagnostics 2022, 12, 3048.
- Moysiadis, V.; Benos, L.; Karras, G.; Kateris, D.; Peruzzi, A.; Berruto, R.; Papageorgiou, E.; Bochtis, D. Human–Robot Interaction through Dynamic Movement Recognition for Agricultural Environments. AgriEngineering 2024, 6, 2494–2512.
- Upadhyay, A.; Zhang, Y.; Koparan, C.; Rai, N.; Howatt, K.; Bajwa, S.; Sun, X. Advances in ground robotic technologies for site-specific weed management in precision agriculture: A review. Comput. Electron. Agric. 2024, 225, 109363.
- Tagarakis, A.C.; Benos, L.; Kyriakarakos, G.; Pearson, S.; Sørensen, C.G.; Bochtis, D. Digital Twins in Agriculture and Forestry: A Review. Sensors 2024, 24, 3117.
- Moysiadis, V.; Katikaridis, D.; Benos, L.; Busato, P.; Anagnostis, A.; Kateris, D.; Pearson, S.; Bochtis, D. An Integrated Real-Time Hand Gesture Recognition Framework for Human-Robot Interaction in Agriculture. Appl. Sci. 2022, 12, 8160.
- Han, C.; Zhang, L.; Tang, Y.; Huang, W.; Min, F.; He, J. Human activity recognition using wearable sensors by heterogeneous convolutional neural networks. Expert Syst. Appl. 2022, 198, 116764.
- Minh Dang, L.; Min, K.; Wang, H.; Jalil Piran, M.; Hee Lee, C.; Moon, H. Sensor-based and vision-based human activity recognition: A comprehensive survey. Pattern Recognit. 2020, 108, 107561.
- Bian, S.; Liu, M.; Zhou, B.; Lukowicz, P. The State-of-the-Art Sensing Techniques in Human Activity Recognition: A Survey. Sensors 2022, 22, 4596.
- Rana, M.; Mittal, V. Wearable Sensors for Real-Time Kinematics Analysis in Sports: A Review. IEEE Sens. J. 2021, 21, 1187–1207.
- Gokul, S.; Dhiksith, R.; Sundaresh, S.A.; Gopinath, M. Gesture Controlled Wireless Agricultural Weeding Robot. In Proceedings of the 2019 5th International Conference on Advanced Computing & Communication Systems (ICACCS), Coimbatore, India, 15–16 March 2019; pp. 926–929.
- Patil, P.A.; Jagyasi, B.G.; Raval, J.; Warke, N.; Vaidya, P.P. Design and development of wearable sensor textile for precision agriculture. In Proceedings of the 7th International Conference on Communication Systems and Networks (COMSNETS), Bangalore, India, 6–10 January 2015.
- Sharma, S.; Raval, J.; Jagyasi, B. Mobile sensing for agriculture activities detection. In Proceedings of the IEEE Global Humanitarian Technology Conference (GHTC), San Jose, CA, USA, 20–23 October 2013; pp. 337–342.
- Sharma, S.; Raval, J.; Jagyasi, B. Neural network based agriculture activity detection using mobile accelerometer sensors. In Proceedings of the Annual IEEE India Conference (INDICON), Pune, India, 11–13 December 2014.
- Sharma, S.; Jagyasi, B.; Raval, J.; Patil, P. AgriAcT: Agricultural Activity Training using multimedia and wearable sensing. In Proceedings of the IEEE International Conference on Pervasive Computing and Communication Workshops (PerCom Workshops), St. Louis, MO, USA, 23–27 March 2015; pp. 439–444.
- Aiello, G.; Catania, P.; Vallone, M.; Venticinque, M. Worker safety in agriculture 4.0: A new approach for mapping operator’s vibration risk through Machine Learning activity recognition. Comput. Electron. Agric. 2022, 193, 106637.
- Tagarakis, A.C.; Benos, L.; Aivazidou, E.; Anagnostis, A.; Kateris, D.; Bochtis, D. Wearable Sensors for Identifying Activity Signatures in Human-Robot Collaborative Agricultural Environments. Eng. Proc. 2021, 9, 5.
- Anagnostis, A.; Benos, L.; Tsaopoulos, D.; Tagarakis, A.; Tsolakis, N.; Bochtis, D. Human activity recognition through recurrent neural networks for human-robot interaction in agriculture. Appl. Sci. 2021, 11, 2188.
- Open Datasets—iBO. Available online: https://ibo.certh.gr/open-datasets/ (accessed on 16 September 2024).
- Rozenstein, O.; Cohen, Y.; Alchanatis, V.; Behrendt, K.; Bonfil, D.J.; Eshel, G.; Harari, A.; Harris, W.E.; Klapp, I.; Laor, Y.; et al. Data-driven agriculture and sustainable farming: Friends or foes? Precis. Agric. 2024, 25, 520–531.
- Atik, C. Towards Comprehensive European Agricultural Data Governance: Moving Beyond the “Data Ownership” Debate. IIC-Int. Rev. Intellect. Prop. Compet. Law 2022, 53, 701–742.
- Botta, A.; Cavallone, P.; Baglieri, L.; Colucci, G.; Tagliavini, L.; Quaglia, G. A Review of Robots, Perception, and Tasks in Precision Agriculture. Appl. Mech. 2022, 3, 830–854.
- Lavender, S.A.; Li, Y.C.; Andersson, G.B.; Natarajan, R.N. The effects of lifting speed on the peak external forward bending, lateral bending, and twisting spine moments. Ergonomics 1999, 42, 111–125.
- Winter, L.; Bellenger, C.; Grimshaw, P.; Crowther, R.G. Analysis of Movement Variability in Cycling: An Exploratory Study. Sensors 2023, 23, 4972.
- Zehr, J.D.; Carnegie, D.R.; Welsh, T.N.; Beach, T.A.C. A comparative analysis of lumbar spine mechanics during barbell- and crate-lifting: Implications for occupational lifting task assessments. Int. J. Occup. Saf. Ergon. 2020, 26, 1439872.
- Huysamen, K.; Power, V.; O’Sullivan, L. Elongation of the surface of the spine during lifting and lowering, and implications for design of an upper body industrial exoskeleton. Appl. Ergon. 2018, 72, 10–16.
- Hlucny, S.D.; Novak, D. Characterizing Human Box-Lifting Behavior Using Wearable Inertial Motion Sensors. Sensors 2020, 20, 2323.
- Ghislieri, M.; Gastaldi, L.; Pastorelli, S.; Tadano, S.; Agostini, V. Wearable Inertial Sensors to Assess Standing Balance: A Systematic Review. Sensors 2019, 19, 4075.
- Nazarahari, M.; Rouhani, H. Detection of daily postures and walking modalities using a single chest-mounted tri-axial accelerometer. Med. Eng. Phys. 2018, 57, 75–81.
- Capture.U—IMeasureU. Available online: https://imeasureu.com/capture-u/ (accessed on 19 August 2024).
- Del Vecchio, L. Choosing a Lifting Posture: Squat, Semi-Squat or Stoop. MOJ Yoga Phys. Ther. 2017, 2, 56–62.
- VICON Blue Trident Inertial Measurement Unit. Available online: https://www.vicon.com/hardware/blue-trident/ (accessed on 26 August 2024).
- Sullivan, J.H.; Warkentin, M.; Wallace, L. So many ways for assessing outliers: What really works and does it matter? J. Bus. Res. 2021, 132, 530–543.
- Afsar, M.M.; Saqib, S.; Aladfaj, M.; Alatiyyah, M.H.; Alnowaiser, K.; Aljuaid, H.; Jalal, A.; Park, J. Body-Worn Sensors for Recognizing Physical Sports Activities in Exergaming via Deep Learning Model. IEEE Access 2023, 11, 12460–12473.
- Li, J.; Xu, Y.; Shi, H. Bidirectional LSTM with Hierarchical Attention for Text Classification. In Proceedings of the IEEE 4th Advanced Information Technology, Electronic and Automation Control Conference (IAEAC), Chengdu, China, 20–22 December 2019; pp. 456–459.
- sklearn.preprocessing.StandardScaler—Scikit-Learn 0.24.1 Documentation. Available online: https://scikit-learn.org/stable/modules/generated/sklearn.preprocessing.StandardScaler.html (accessed on 20 January 2021).
- Hewamalage, H.; Bergmeir, C.; Bandara, K. Recurrent Neural Networks for Time Series Forecasting: Current status and future directions. Int. J. Forecast. 2021, 37, 388–427.
- Rithani, M.; Kumar, R.P.; Doss, S. A review on big data based on deep neural network approaches. Artif. Intell. Rev. 2023, 56, 14765–14801.
- Landi, F.; Baraldi, L.; Cornia, M.; Cucchiara, R. Working Memory Connections for LSTM. Neural Netw. 2021, 144, 334–341.
- Hu, Y.; Zhang, X.-Q.; Xu, L.; Xian He, F.; Tian, Z.; She, W.; Liu, W. Harmonic Loss Function for Sensor-Based Human Activity Recognition Based on LSTM Recurrent Neural Networks. IEEE Access 2020, 8, 135617–135627.
- Scikit-Learn User Guide. Available online: https://scikit-learn.org/stable/modules/cross_validation.html (accessed on 12 September 2024).
- Xia, K.; Huang, J.; Wang, H. LSTM-CNN Architecture for Human Activity Recognition. IEEE Access 2020, 8, 56855–56866.
- Chung, S.; Lim, J.; Noh, K.J.; Kim, G.; Jeong, H. Sensor Data Acquisition and Multimodal Sensor Fusion for Human Activity Recognition Using Deep Learning. Sensors 2019, 19, 1716.
- Nataraj, R.; Sanford, S.; Liu, M.; Harel, N.Y. Hand dominance in the performance and perceptions of virtual reach control. Acta Psychol. (Amst.) 2022, 223, 103494.
- Nazari, F.; Mohajer, N.; Nahavandi, D.; Khosravi, A.; Nahavandi, S. Comparison Study of Inertial Sensor Signal Combination for Human Activity Recognition based on Convolutional Neural Networks. In Proceedings of the 15th International Conference on Human System Interaction (HSI), Melbourne, Australia, 28–31 July 2022; pp. 1–6.
- Marinoudi, V.; Benos, L.; Villa, C.C.; Lampridi, M.; Kateris, D.; Berruto, R.; Pearson, S.; Sørensen, C.G.; Bochtis, D. Adapting to the Agricultural Labor Market Shaped by Robotization. Sustainability 2024, 16, 7061.
- Benos, L.; Bechar, A.; Bochtis, D. Safety and ergonomics in human-robot interactive agricultural operations. Biosyst. Eng. 2020, 200, 55–72.
- Giallanza, A.; La Scalia, G.; Micale, R.; La Fata, C.M. Occupational health and safety issues in human-robot collaboration: State of the art and open challenges. Saf. Sci. 2024, 169, 106313.
| Combinations | F1-Score |
| --- | --- |
| **1 anatomical location** | |
| Cervix | 0.933 |
| Chest | 0.939 |
| Lumbar region | 0.932 |
| Right wrist | 0.919 |
| Left wrist | 0.888 |
| **2 anatomical locations** | |
| Cervix, Chest | 0.931 |
| Cervix, Lumbar region | 0.927 |
| Cervix, Right wrist | 0.915 |
| Cervix, Left wrist | 0.906 |
| Chest, Lumbar region | 0.932 |
| Chest, Right wrist | 0.919 |
| Chest, Left wrist | 0.908 |
| Lumbar region, Right wrist | 0.922 |
| Lumbar region, Left wrist | 0.904 |
| Right wrist, Left wrist | 0.898 |
| **3 anatomical locations** | |
| Cervix, Chest, Lumbar region | 0.931 |
| Cervix, Chest, Right wrist | 0.918 |
| Cervix, Chest, Left wrist | 0.913 |
| Cervix, Lumbar region, Right wrist | 0.920 |
| Cervix, Lumbar region, Left wrist | 0.909 |
| Cervix, Right wrist, Left wrist | 0.901 |
| Chest, Lumbar region, Right wrist | 0.924 |
| Chest, Lumbar region, Left wrist | 0.909 |
| Chest, Right wrist, Left wrist | 0.894 |
| Lumbar region, Right wrist, Left wrist | 0.904 |
| **4 anatomical locations** | |
| Cervix, Chest, Lumbar region, Right wrist | 0.922 |
| Cervix, Chest, Lumbar region, Left wrist | 0.914 |
| Cervix, Chest, Right wrist, Left wrist | 0.906 |
| Cervix, Lumbar region, Right wrist, Left wrist | 0.905 |
| Chest, Lumbar region, Right wrist, Left wrist | 0.909 |
| **5 anatomical locations** | |
| Cervix, Chest, Lumbar region, Right wrist, Left wrist | 0.908 |
| True Label ↓ / Predicted Label → | Standing | Walking (without Crate) | Bending | Lifting Crate | Walking (with Crate) | Placing Crate |
| --- | --- | --- | --- | --- | --- | --- |
| Standing | 4456 | 497 | 3 | 3 | 0 | 0 |
| Walking (without crate) | 548 | 14,685 | 303 | 59 | 42 | 2 |
| Bending | 5 | 524 | 4385 | 350 | 31 | 0 |
| Lifting crate | 3 | 111 | 366 | 5950 | 425 | 0 |
| Walking (with crate) | 0 | 37 | 20 | 260 | 15,280 | 394 |
| Placing crate | 2 | 18 | 1 | 2 | 767 | 2979 |
| True Label ↓ / Predicted Label → | Standing | Walking (without Crate) | Bending | Lifting Crate | Walking (with Crate) | Placing Crate |
| --- | --- | --- | --- | --- | --- | --- |
| Standing | 934 | 60 | 0 | 0 | 0 | 0 |
| Walking (without crate) | 127 | 2947 | 39 | 4 | 0 | 0 |
| Bending | 0 | 33 | 1043 | 43 | 0 | 0 |
| Lifting crate | 0 | 4 | 46 | 1238 | 48 | 0 |
| Walking (with crate) | 0 | 0 | 0 | 23 | 3064 | 83 |
| Placing crate | 0 | 0 | 0 | 0 | 126 | 640 |
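To relate the confusion matrices to the reported F1-scores, per-class precision, recall, and F1 follow directly from a matrix whose rows are true labels and columns are predicted labels. The sketch below uses the second matrix above; averaging the per-class F1 weighted by class support yields approximately 0.939, matching the chest F1-score in the placement table (plain macro averaging gives a somewhat lower value, and which convention the paper uses is an assumption here).

```python
# Per-class and support-weighted F1 computed from the second confusion
# matrix above (rows: true labels, columns: predictions).
import numpy as np

cm = np.array([
    [934,   60,    0,    0,    0,   0],
    [127, 2947,   39,    4,    0,   0],
    [  0,   33, 1043,   43,    0,   0],
    [  0,    4,   46, 1238,   48,   0],
    [  0,    0,    0,   23, 3064,  83],
    [  0,    0,    0,    0,  126, 640],
])
tp = np.diag(cm).astype(float)
precision = tp / cm.sum(axis=0)   # column sums: all predictions per class
recall = tp / cm.sum(axis=1)      # row sums: all true windows per class
f1 = 2 * precision * recall / (precision + recall)

support = cm.sum(axis=1)
weighted_f1 = (f1 * support).sum() / support.sum()
print(np.round(f1, 3))            # per-class F1
print(round(weighted_f1, 3))      # ≈ 0.939
```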
| Accelerometer | Gyroscope | Magnetometer | F1-Score |
| --- | --- | --- | --- |
| √ | | | 0.554 |
| | √ | | 0.637 |
| | | √ | 0.783 |
| √ | √ | | 0.774 |
| √ | | √ | 0.887 |
| | √ | √ | 0.915 |
| √ | √ | √ | 0.939 |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Benos, L.; Tsaopoulos, D.; Tagarakis, A.C.; Kateris, D.; Bochtis, D. Optimal Sensor Placement and Multimodal Fusion for Human Activity Recognition in Agricultural Tasks. Appl. Sci. 2024, 14, 8520. https://doi.org/10.3390/app14188520