Transportation Mode Detection Using Learning Methods and Self-Contained Sensors: Review
Abstract
1. Introduction
2. State-of-the-Art for TMD Systems
2.1. TMD Data Collection
2.1.1. Main Sensors in TMD Systems
2.1.2. Existing TMD Datasets
- Sensors: L: Light, S: Sound, A: Accelerometer, G: Gyroscope, M: Magnetometer, B: Barometer, LA: Linear accelerometer, O: Orientation.
- Device: IMU: Inertial measurement unit, Mob: Mobile phone.
- Transportation modes: S: Still, W: Walk, R: Run, Sr: Stairs, E: Elevators, Bi: Bike, MC: Motorcycle, B: Bus, C: Car, T: Train, Tr: Tram, HSR: High-speed rail, Sub: Subway, M: Metro, KS: Kick-Scooter.
2.2. Challenges with Real-World Data Collection
2.2.1. Variable Sampling Frequency
2.2.2. Data Privacy Issues
2.2.3. Variable Smartphone Sensors
2.2.4. Variable Circumstances in Data Collection
2.2.5. Variable Smartphone Sensors Orientation
2.2.6. TMD Data Quality
- Network-Level Strategy: sensor failures are detected through network-level management and the tracking of network packets. This technique relies on Markov models to distinguish normal from abnormal sensor responses [44].
- Homogeneous Strategy: this technique uses several identical sensors to detect a malfunctioning one. Since adjacent sensors of the same type should provide the same output, a response that is uncorrelated with the others reveals the faulty sensor [44] (see the sketch after this list).
- Heterogeneous Strategy: this technique merges data points from different types of sensors. A classifier trained to recognize consistent data points across various subsets of sensors identifies outputs that deviate from them as failures [44].
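A minimal sketch of the homogeneous redundancy check is given below. It assumes several identical, co-located sensors whose outputs should be mutually correlated; the 0.5 correlation threshold and the synthetic channels are illustrative choices, not values taken from [44].

```python
# Homogeneous-strategy sketch: identical, co-located sensors should produce
# correlated outputs, so the channel least correlated with its peers is
# flagged as suspect. Threshold and test data are illustrative assumptions.
import numpy as np

def flag_suspect_channel(readings: np.ndarray, threshold: float = 0.5):
    """readings: array of shape (n_sensors, n_samples) from identical sensors."""
    corr = np.corrcoef(readings)                      # pairwise correlation matrix
    n = corr.shape[0]
    # Mean correlation of each sensor with the others (self-correlation excluded).
    mean_corr = (corr.sum(axis=1) - 1.0) / (n - 1)
    suspect = int(np.argmin(mean_corr))
    return suspect if mean_corr[suspect] < threshold else None

# Example: three healthy accelerometer channels plus one faulty (noise-only) channel.
rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, 20, 500))
healthy = signal + 0.05 * rng.standard_normal((3, 500))
faulty = 0.05 * rng.standard_normal((1, 500))
print(flag_suspect_channel(np.vstack([healthy, faulty])))  # -> 3
```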
2.3. Window Length
2.4. Feature Extraction for TMD
2.5. Feature Optimization
2.5.1. Within-Class Variance in TMD Systems
2.5.2. Between-Class Variance in TMD Systems
2.6. Categories of Methods for Learning-Based AI
2.7. Performance Evaluation in Classification
- Accuracy: it is the percentage of correct predictions over the total number of predictions [1,2] and summarizes the overall classification performance across all classes, which makes it a commonly used metric for assessing a classification model. The choice of machine learning algorithm in supervised classification considerably affects the accuracy. Three methods are commonly employed to estimate a classifier’s accuracy (illustrated in the sketch after this list). The first is a hold-out split that reserves two-thirds of the dataset for training and one-third for testing. The second is cross-validation [72], where the dataset is divided into equal-sized subsets and, for each subset, the classifier is trained on the combined data of all the other subsets; the classifier’s error rate is then the average of the error rates over the subsets. The third is leave-one-out validation [73], a particular case of cross-validation in which every validation subset contains a single sample. Although this type of validation requires more computational resources, it is valuable when a precise estimate of a classifier’s error rate is needed [74].
- The quantity of training data also has an important influence on a classifier’s accuracy [75]. A large amount of data provides the machine learning algorithm with more information, enabling it to identify different scenarios and the correlations between them before making predictions, which increases accuracy.
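The three evaluation schemes above can be illustrated with a short, hedged sketch using scikit-learn [72,73]; the random forest classifier and the synthetic data are placeholders standing in for a real TMD pipeline, not part of any cited study.

```python
# Sketch of the three evaluation schemes: a 2/3-1/3 hold-out split,
# k-fold cross-validation, and leave-one-out validation.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import (
    train_test_split, cross_val_score, KFold, LeaveOneOut)

X, y = make_classification(n_samples=150, n_features=10, n_classes=3,
                           n_informative=5, random_state=0)
clf = RandomForestClassifier(random_state=0)

# 1) Hold-out: two-thirds for training, one-third for testing.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=1/3, random_state=0)
holdout_acc = clf.fit(X_tr, y_tr).score(X_te, y_te)

# 2) k-fold cross-validation: average accuracy over equal-sized subsets.
cv_acc = cross_val_score(clf, X, y,
                         cv=KFold(n_splits=5, shuffle=True, random_state=0)).mean()

# 3) Leave-one-out: each validation subset holds a single sample.
loo_acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()

print(f"hold-out {holdout_acc:.3f}  5-fold {cv_acc:.3f}  LOO {loo_acc:.3f}")
```

Leave-one-out reuses `cross_val_score` with a splitter that yields one test sample per fold, which is why its cost grows with the dataset size.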
2.8. Overview of Previous Studies on TMD Systems
| Approach | Ref | Sampling Frequency [Hz] | Classified Modes | Sensors | Features | Dataset | Classifier | Window Size | Accuracy (%) |
|---|---|---|---|---|---|---|---|---|---|
| IMU | [29] | 100 | S, W, R, Bi, C, B, T, Sub | A, G, M, B | Mag, jerk, max, min, std, mean, var, kurtosis, skewness, energy, and entropy | SHL2018 [71] | RNN with LSTM | 5 s | 67.5 |
| | [76] | 10 | B, Sub, HSR, elevator | A, G, M | Max, mean, range, std, RMS, mean-cross rate, zero-cross rate, slope sign change, spectral centroid, spectral flatness, spectral spread, spectral roll-off, and spectral crest | HTC dataset [58] | LSTM | 12 s | 97 |
| | [24] | 10 | B, Sub, HSR, elevator | A, G, M | Mag, max, min, mean, range, std, root mean, cnt zero, cnt mean, cnt slope, spectral centroid, spectral flatness, RMS, max index, and max rate | Smartphone sensors | LSTM | 10 s (elevator), 60 s otherwise | 92 |
| | [77] | 50 | S, W, Bi, B, C, T, Sub | A | Mag and FFT | Accelerometer sensors in smartphones | CNN | 10.24 s | 94.48 |
| | [18] | 20 | B, W, C, Bi, T, Tr, Sub, boat, plane | A, G | Min, max, avg, and std | Applications | Random forest, random tree, Bayesian network, and naïve Bayes | 5 s | 95 |
| | [9] | - | S, W, R, Bi, C, B, M, T | A, G, M | Mean, std, highest FFT value | HTC dataset | ANN | 17.06 s | 87 |
| | [32] | 50 | S, W, C, T, B | A, G | Min, max, avg, std | Smartphone sensors | Bi-LSTM | 2.56 s | 92.8 |
| | [78] | 1 | S, W, R, Bi, C, B, T, Sub | M, A, G, and pressure sensor | Mean value of the 3 or 4 axes of acceleration, mag, O, gravity and LA, temperature, pressure, altitude | SHL dataset | Bidirectional Encoder Representations from Transformers (BERT) | - | 98.8 |
| | [79] | 20 | W, S, T, C, B | A, G, LA, O, S, G | Min, max, mean, std | Smartphone embedded sensors | Stacked learning technique (12 machine learning algorithms) | 5 s | >89 |
| | [33] | 0.067 and 0.2 | W, Bi, C, B, T, Sub | A, G | (Max, avg) resultant acceleration, std, skewness, kurtosis, pitch and roll (gyroscope) | Smartphone sensors | Random forest | 10 min | 95.40, 98.78 |
| Localisation | [80] | 1 | W, Bi, B, C | GNSS | Jerk, mean, std, (10th, 50th, 90th) percentile, skewness | Android app | KNN, RF, MLP | 30 s | >74 |
| | [81] | - | W, Bi, C, B | GPS | Speed, acceleration, jerk, bearing rate | Geolife data [90] | LSTM | - | 83, 81 |
| | [82] | 1 | W, Bi, Tr, B, taxi, C | GPS | Time, latitude, longitude, altitude, speed | Smartphone sensors | Decision tree | - | 94.9 |
| | [83] | Max 1 | W, Bi, B, C, MC | GPS, WiFi, cellular | Altitude, latitude, longitude, precision, acceleration | 37 volunteers in Rio de Janeiro | Hierarchical classifier | 60, 90, 120 s | >40 |
| | [84] | - | W, Bi, B, C, T | GPS | Length, mean, covariance, top three velocities and top three accelerations from each segment, speed | GeoLife dataset [90] | Genetic programming | - | >77 |
| | [85] | - | W, Bi, B, T, C | GPS | Speed, altitude, turning angle, net displacement, distance | GPS-enabled mobile applications | Extreme gradient boost, multilayer perceptron | - | 96 |
| | [86] | - | W, Bi, B, C, T | GPS | (Avg, min, max) speed, acceleration, jerk, distance, bearing rate, turning change rate, time difference, total duration | GPS tracking data | LSTM | - | 93.94 |
| | [87] | - | W, Bi, B, C, T | GPS | Speed, acceleration, jerk, bearing | GPS tracking data | k-means | 30 s | >52 |
| | [39] | - | W, Bi, C, B, T | GPS | (Avg, mean, max) speed, total distance, total time, avg bearing | GPS tracking data | K-means clustering with the ANP-PSO hybrid method | - | 88 |
| | [69] | 1 | W, Bi, B, C, T | GPS | Speed, acceleration, jerk, bearing rate | GeoLife dataset [90] | Unsupervised deep learning | - | 86.7 |
| | [88] | - | W, Bi, C, T, B | GPS | Date, time, longitude, latitude, speed, average speed, average acceleration, maximum and minimum speed, acceleration during each segment, segment distance, direction, duration, bearing | GPS tracking data of 20 different people in Falun | Random forest | 300 s | 99 |
| Hybrid | [31] | 50 | MC, W, B, Sub, Tr, S, C | GPS, A, G | Mean, std, skewness, kurtosis, (5th, 95th) percentile, avg | Sensor readings | CNN, nearest neighbor, RF, statistical analysis, SVM | 2 s | >75 |
| | [27] | GPS: 1; A, G, M: 100 | S, W, Bi, R, B, C, T, Sub | A, G, M, GPS | Mag, mean, std, energy, kurtosis, skewness, highest FFT value, frequency | SHL dataset | Decision tree | 5.12 s | >50 |
| | [89] | 50 | S, W, R, Bi, C, B, T, Sub | A, G, M, GPS | - | SHL and TMD datasets [91] | Hybrid DL classifier | - | >90 |
| | [92] | - | Bi, public transport | A, GPS, heart rate data | Mean, median, std, min, max, 10th and 90th percentiles | 126 participants living in the Ile-de-France region | Random forest | - | 65 (public transport), 95 (biking) |
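Most IMU-based entries in the table compute statistical and spectral features on the acceleration magnitude over fixed windows. The sketch below illustrates this common step under assumed settings (5 s windows at 100 Hz and a small, hand-picked feature set); it is not the feature extractor of any specific cited work.

```python
# Windowed feature extraction on the acceleration magnitude:
# basic statistics, dominant FFT frequency, and signal energy.
# Window length, sampling rate, and feature choice are illustrative.
import numpy as np
from scipy.stats import kurtosis, skew

FS = 100          # sampling frequency [Hz]
WINDOW_S = 5      # window length [s]

def window_features(acc_xyz: np.ndarray) -> np.ndarray:
    """acc_xyz: (n_samples, 3) raw accelerometer window -> feature vector."""
    mag = np.linalg.norm(acc_xyz, axis=1)             # magnitude removes device orientation
    spectrum = np.abs(np.fft.rfft(mag - mag.mean()))  # spectrum without the DC component
    freqs = np.fft.rfftfreq(mag.size, d=1.0 / FS)
    return np.array([
        mag.mean(), mag.std(), mag.max(), mag.min(),
        skew(mag), kurtosis(mag),
        freqs[spectrum.argmax()],                     # dominant frequency
        (mag ** 2).mean(),                            # signal energy
    ])

# Example on one synthetic 5 s window with a 2 Hz oscillation on two axes.
t = np.arange(0, WINDOW_S, 1.0 / FS)
window = np.c_[np.sin(2 * np.pi * 2 * t),
               np.zeros_like(t),
               9.81 + 0.1 * np.cos(2 * np.pi * 2 * t)]
print(window_features(window).round(3))
```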
3. Sensor Types and Locations for TMD Models
4. Existing Android Applications for TMD Data Collection
5. Standardized TMD Datasets
6. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Cheng, S.; Liu, Y. Research on transportation mode recognition based on multi-head attention temporal convolutional network. Sensors 2023, 23, 3585. [Google Scholar] [CrossRef] [PubMed]
- Siargkas, C.; Papapanagiotou, V.; Delopoulos, A. Transportation mode recognition based on low-rate acceleration and location signals with an attention-based multiple-instance learning network. IEEE Trans. Intell. Transp. Syst. 2024, 25, 14376–14388. [Google Scholar] [CrossRef]
- Lee, D.; Camacho, D.; Jung, J.J. Smart mobility with Big Data: Approaches, applications, and challenges. Appl. Sci. 2023, 13, 7244. [Google Scholar] [CrossRef]
- Ning, Z.; Xia, F.; Ullah, N.; Kong, X.; Hu, X. Vehicular social networks: Enabling smart mobility. IEEE Commun. Mag. 2017, 55, 16–55. [Google Scholar] [CrossRef]
- Habitat, U. Scenarios of Urban Futures: Degree of Urbanization: World Cities Report. 2022. Available online: https://unhabitat.org/sites/default/files/2022/07/chapter_2_wcr_2022.pdf (accessed on 7 July 2024).
- Kamalian, M.; Ferreira, P.; Jul, E. A survey on local transport mode detection on the edge of the network. Appl. Intell. 2022, 52, 16021–16050. [Google Scholar] [CrossRef]
- Handte, M.; Kraus, L.; Marrón, P.J.; Proff, H. Analyzing the Mobility of University Members for InnaMoRuhr. In Next Chapter in Mobility: Technische und Betriebswirtschaftliche Aspekte; Springer: Berlin/Heidelberg, Germany, 2024; pp. 461–474. [Google Scholar]
- Jiang, G.; Lam, S.K.; He, P.; Ou, C.; Ai, D. A multi-scale attributes attention model for transport mode identification. IEEE Trans. Intell. Transp. Syst. 2020, 23, 152–164. [Google Scholar] [CrossRef]
- Taherinavid, S.; Moravvej, S.V.; Chen, Y.L.; Yang, J.; Ku, C.S.; Por, L.Y. Automatic Transportation Mode Classification Using a Deep Reinforcement Learning Approach With Smartphone Sensors. IEEE Access 2023, 12, 514–533. [Google Scholar] [CrossRef]
- Yan, H.; Huang, X.; Ma, Y.; Yao, R.; Zhu, Z.; Zhang, Y.; Lu, X. AttenDenseNet for the Sussex-Huawei Locomotion-Transportation (SHL) Recognition Challenge. In Proceedings of the 2023 ACM International Joint Conference on Pervasive and Ubiquitous Computing & the 2023 ACM International Symposium on Wearable Computing, Cancún, Mexico, 8–12 October 2023; pp. 569–574. [Google Scholar]
- Zhao, Y.; Song, L.; Ni, C.; Zhang, Y.; Lu, X. Road network enhanced transportation mode recognition with an ensemble machine learning model. In Proceedings of the 2023 ACM International Joint Conference on Pervasive and Ubiquitous Computing & the 2023 ACM International Symposium on Wearable Computing, Cancún, Mexico, 8–12 October 2023; pp. 528–533. [Google Scholar]
- Chang, Y. Multimodal Data Integration for Real-Time Indoor Navigation Using a Smartphone. Master’s Thesis, City University of New York, New York, NY, USA, 2020. [Google Scholar]
- Chen, R.; Ning, T.; Zhu, Y.; Guo, S.; Luo, H.; Zhao, F. Enhancing transportation mode detection using multi-scale sensor fusion and spatial-topological attention. In Proceedings of the 2023 ACM International Joint Conference on Pervasive and Ubiquitous Computing & the 2023 ACM International Symposium on Wearable Computing, Cancún, Mexico, 8–12 October 2023; pp. 534–539. [Google Scholar]
- Hwang, S.; Cho, Y.; Kim, K. User-Independent Motion and Location Analysis for Sussex-Huawei Locomotion Data. In Proceedings of the 2023 ACM International Joint Conference on Pervasive and Ubiquitous Computing & the 2023 ACM International Symposium on Wearable Computing, Cancún, Mexico, 8–12 October 2023; pp. 517–522. [Google Scholar]
- Wang, K.; Qian, X.; Fitch, D.T.; Lee, Y.; Malik, J.; Circella, G. What travel modes do shared e-scooters displace? A review of recent research findings. Transp. Rev. 2023, 43, 5–31. [Google Scholar] [CrossRef]
- Hosseinzadeh, A.; Karimpour, A.; Kluger, R. Factors influencing shared micromobility services: An analysis of e-scooters and bikeshare. Transp. Res. Part D Transp. Environ. 2021, 100, 103047. [Google Scholar] [CrossRef]
- Oeschger, G.; Carroll, P.; Caulfield, B. Micromobility and public transport integration: The current state of knowledge. Transp. Res. Part D Transp. Environ. 2020, 89, 102628. [Google Scholar] [CrossRef]
- Hedemalm, E.; Kor, A.L.; Hallberg, J.; Andersson, K.; Pattinson, C.; Chinnici, M. Application of Online Transportation Mode Recognition in Games. Appl. Sci. 2021, 11, 8901. [Google Scholar] [CrossRef]
- Huang, H.; Cheng, Y.; Weibel, R. Transport mode detection based on mobile phone network data: A systematic review. Transp. Res. Part C Emerg. Technol. 2019, 101, 297–312. [Google Scholar] [CrossRef]
- Nikolic, M.; Bierlaire, M. Review of transportation mode detection approaches based on smartphone data. In Proceedings of the 17th Swiss Transport Research Conference, Ascona, Switzerland, 17–19 May 2017. [Google Scholar]
- Prelipcean, A.C.; Gidófalvi, G.; Susilo, Y.O. Transportation mode detection–an in-depth review of applicability and reliability. Transp. Rev. 2017, 37, 442–464. [Google Scholar] [CrossRef]
- Ballı, S.; Sağbaş, E.A. Diagnosis of transportation modes on mobile phone using logistic regression classification. IET Softw. 2018, 12, 142–151. [Google Scholar] [CrossRef]
- Alaoui, F.T.; Fourati, H.; Kibangou, A.; Robu, B.; Vuillerme, N. Urban transportation mode detection from inertial and barometric data in pedestrian mobility. IEEE Sens. J. 2021, 22, 4772–4780. [Google Scholar] [CrossRef]
- Wang, S.; Yao, S.; Niu, K.; Dong, C.; Qin, C.; Zhuang, H. Intelligent scene recognition based on deep learning. IEEE Access 2021, 9, 24984–24993. [Google Scholar] [CrossRef]
- Practical Guide to Accelerometers. 2023. Available online: https://www.phidgets.com/docs/Accelerometer_Guide?srsltid=AfmBOooC7ZrRSCQFMVdXbXKdSNKh82gK_-fhTstJM_tW5fMVtfgPvzps#Tracking_Movement (accessed on 6 May 2024).
- Jeyakumar, J.V.; Lee, E.S.; Xia, Z.; Sandha, S.S.; Tausik, N.; Srivastava, M. Deep convolutional bidirectional LSTM based transportation mode recognition. In Proceedings of the 2018 ACM International Joint Conference and 2018 International Symposium on Pervasive and Ubiquitous Computing and Wearable Computers, Singapore, 8–12 October 2018; pp. 1606–1615. [Google Scholar]
- Wang, L.; Gjoreski, H.; Ciliberto, M.; Mekki, S.; Valentin, S.; Roggen, D. Enabling reproducible research in sensor-based transportation mode recognition with the Sussex-Huawei dataset. IEEE Access 2019, 7, 10870–10891. [Google Scholar] [CrossRef]
- Shao, W.; Zhao, F.; Wang, C.; Luo, H.; Muhammad Zahid, T.; Wang, Q.; Li, D. Location fingerprint extraction for magnetic field magnitude based indoor positioning. J. Sens. 2016, 2016, 1945695. [Google Scholar] [CrossRef]
- Ahmed, M.; Antar, A.D.; Hossain, T.; Inoue, S.; Ahad, M.A.R. Poiden: Position and orientation independent deep ensemble network for the classification of locomotion and transportation modes. In Proceedings of the 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2019 ACM International Symposium on Wearable Computers, London, UK, 9–13 September 2019; pp. 674–679. [Google Scholar]
- Wang, P.; Jiang, Y. Transportation mode detection using temporal convolutional networks based on sensors integrated into smartphones. Sensors 2022, 22, 6712. [Google Scholar] [CrossRef]
- Delli Priscoli, F.; Giuseppi, A.; Lisi, F. Automatic transportation mode recognition on smartphone data based on deep neural networks. Sensors 2020, 20, 7228. [Google Scholar] [CrossRef]
- Zhao, H.; Hou, C.; Alrobassy, H.; Zeng, X. Recognition of Transportation State by Smartphone Sensors Using Deep Bi-LSTM Neural Network. J. Comput. Netw. Commun. 2019, 2019, 4967261. [Google Scholar] [CrossRef]
- Shafique, M.A.; Hato, E. Improving the Accuracy of Travel Mode Detection for Low Data Collection Frequencies. Pak. J. Eng. Appl. Sci. 2020, 27. [Google Scholar]
- Taia Alaoui, F.; Fourati, H.; Vuillerme, N.; Kibangou, A.; Robu, B.; Villemazet, C. Captimove Dataset. Captimove-TMD. 2020. Available online: https://perscido.univ-grenoble-alpes.fr/datasets/DS310 (accessed on 20 June 2024).
- Carpineti, C.; Lomonaco, V.; Bedogni, L.; Di Felice, M.; Bononi, L. Custom dual transportation mode detection by smartphone devices exploiting sensor diversity. In Proceedings of the 2018 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops), Athens, Greece, 19–23 March 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 367–372. [Google Scholar]
- Wazirali, R. A Review on Privacy Preservation of Location-Based Services in Internet of Things. Intell. Autom. Soft Comput. 2022, 31, 767–769. [Google Scholar] [CrossRef]
- Monogios, S.; Magos, K.; Limniotis, K.; Kolokotronis, N.; Shiaeles, S. Privacy issues in Android applications: The cases of GPS navigators and fitness trackers. Int. J. Electron. Gov. 2022, 14, 83–111. [Google Scholar] [CrossRef]
- Android Developers. Permissions Overview. 2024. Available online: https://developer.android.com/?hl=fr (accessed on 19 July 2024).
- Sadeghian, P. Enhanced Clustering Approach for Transportation Mode Classification Using GPS Data and Particle Swarm Optimization. Master’s Thesis, Dalarna University, Falun, Sweden, 2024. [Google Scholar]
- Aggarwal, C.C. An Introduction to Outlier Analysis; Springer: Cham, Switzerland, 2017. [Google Scholar]
- Bosman, H.H.; Iacca, G.; Tejada, A.; Wörtche, H.J.; Liotta, A. Spatial anomaly detection in sensor networks using neighborhood information. Inf. Fusion 2017, 33, 41–56. [Google Scholar] [CrossRef]
- Yang, J.; Lin, L.; Sun, Z.; Chen, Y.; Jiang, S. Data validation of multifunctional sensors using independent and related variables. Sens. Actuators A Phys. 2017, 263, 76–90. [Google Scholar] [CrossRef]
- Mansouri, M.; Harkat, M.F.; Nounou, M.; Nounou, H. Midpoint-radii principal component analysis-based EWMA and application to air quality monitoring network. Chemom. Intell. Lab. Syst. 2018, 175, 55–64. [Google Scholar] [CrossRef]
- Gaddam, A.; Wilkin, T.; Angelova, M.; Gaddam, J. Detecting sensor faults, anomalies and outliers in the internet of things: A survey on the challenges and solutions. Electronics 2020, 9, 511. [Google Scholar] [CrossRef]
- Dunia, R.; Qin, S.J.; Edgar, T.F.; McAvoy, T.J. Use of principal component analysis for sensor fault identification. Comput. Chem. Eng. 1996, 20, S713–S718. [Google Scholar] [CrossRef]
- Ahmad, S.; Lavin, A.; Purdy, S.; Agha, Z. Unsupervised real-time anomaly detection for streaming data. Neurocomputing 2017, 262, 134–147. [Google Scholar] [CrossRef]
- Xiao, H.; Huang, D.; Pan, Y.; Liu, Y.; Song, K. Fault diagnosis and prognosis of wastewater processes with incomplete data by the auto-associative neural networks and ARMA model. Chemom. Intell. Lab. Syst. 2017, 161, 96–107. [Google Scholar] [CrossRef]
- Rahman, A.; Smith, D.V.; Timms, G. A novel machine learning approach toward quality assessment of sensor data. IEEE Sens. J. 2013, 14, 1035–1047. [Google Scholar] [CrossRef]
- Liang, X.; Wang, G. A convolutional neural network for transportation mode detection based on smartphone platform. In Proceedings of the 2017 IEEE 14th International Conference on Mobile Ad Hoc and Sensor Systems (MASS), Orlando, FL, USA, 22–25 October 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 338–342. [Google Scholar]
- Soares, E.F.d.S.; Campos, C.A.V.; de Lucena, S.C. Online travel mode detection method using automated machine learning and feature engineering. Future Gener. Comput. Syst. 2019, 101, 1201–1212. [Google Scholar] [CrossRef]
- Su, X.; Caceres, H.; Tong, H.; He, Q. Online travel mode identification using smartphones with battery saving considerations. IEEE Trans. Intell. Transp. Syst. 2016, 17, 2921–2934. [Google Scholar] [CrossRef]
- Tan, C.W.; Petitjean, F.; Keogh, E.; Webb, G.I. Time series classification for varying length series. arXiv 2019, arXiv:1910.04341. [Google Scholar]
- Guvensan, M.A.; Dusun, B.; Can, B.; Turkmen, H.I. A novel segment-based approach for improving classification performance of transport mode detection. Sensors 2017, 18, 87. [Google Scholar] [CrossRef]
- Drosouli, I.; Voulodimos, A.; Miaoulis, G.; Mastorocostas, P.; Ghazanfarpour, D. Transportation mode detection using an optimized long short-term memory model on multimodal sensor data. Entropy 2021, 23, 1457. [Google Scholar] [CrossRef]
- Guyon, I. A Scaling Law for the Validation-Set Training-Set Size Ratio; AT&T Bell Laboratories: Murray Hill, NJ, USA, 1997; Volume 1. [Google Scholar]
- Schilling, A.; Maier, A.; Gerum, R.; Metzner, C.; Krauss, P. Quantifying the separability of data classes in neural networks. Neural Netw. 2021, 139, 278–293. [Google Scholar] [CrossRef]
- ElMorshedy, M.M.; Fathalla, R.; El-Sonbaty, Y. Feature transformation framework for enhancing compactness and separability of data points in feature space for small datasets. Appl. Sci. 2022, 12, 1713. [Google Scholar] [CrossRef]
- Yu, M.C.; Yu, T.; Wang, S.C.; Lin, C.J.; Chang, E.Y. Big data small footprint: The design of a low-power classifier for detecting transportation modes. Proc. VLDB Endow. 2014, 7, 1429–1440. [Google Scholar] [CrossRef]
- Røislien, J.; Skare, Ø.; Gustavsen, M.; Broch, N.L.; Rennie, L.; Opheim, A. Simultaneous estimation of effects of gender, age and walking speed on kinematic gait data. Gait Posture 2009, 30, 441–445. [Google Scholar] [CrossRef] [PubMed]
- Rosso, V.; Agostini, V.; Takeda, R.; Tadano, S.; Gastaldi, L. Influence of BMI on gait characteristics of young adults: 3D evaluation using inertial sensors. Sensors 2019, 19, 4221. [Google Scholar] [CrossRef] [PubMed]
- Alaoui, F.T.; Fourati, H.; Kibangou, A.; Robu, B.; Vuillerme, N. Kick-scooters detection in sensor-based transportation mode classification methods. IFAC-PapersOnLine 2021, 54, 81–86. [Google Scholar] [CrossRef]
- Alaoui, F.T.; Fourati, H.; Kibangou, A.; Robu, B.; Vuillerme, N. Kick-scooters identification in the context of transportation mode detection using inertial sensors: Methods and accuracy. J. Intell. Transp. Syst. 2022. [Google Scholar] [CrossRef]
- Benko, Z.; Bábel, T.; Somogyvári, Z. Model-free detection of unique events in time series. Sci. Rep. 2022, 12, 227. [Google Scholar] [CrossRef] [PubMed]
- Günnemann-Gholizadeh, N. Machine Learning Methods for Detecting Rare Events in Temporal Data. Ph.D. Thesis, Technische Universität München, Munich, Germany, 2018. [Google Scholar]
- Dabiri, S.; Lu, C.T.; Heaslip, K.; Reddy, C.K. Semi-supervised deep learning approach for transportation mode identification using GPS trajectory data. IEEE Trans. Knowl. Data Eng. 2019, 32, 1010–1023. [Google Scholar] [CrossRef]
- James, J. Semi-supervised deep ensemble learning for travel mode identification. Transp. Res. Part C Emerg. Technol. 2020, 112, 120–135. [Google Scholar]
- Dabiri, S.; Heaslip, K. Inferring transportation modes from GPS trajectories using a convolutional neural network. Transp. Res. Part C Emerg. Technol. 2018, 86, 360–371. [Google Scholar] [CrossRef]
- Yazdizadeh, A.; Patterson, Z.; Farooq, B. Ensemble convolutional neural networks for mode inference in smartphone travel survey. IEEE Trans. Intell. Transp. Syst. 2019, 21, 2232–2239. [Google Scholar] [CrossRef]
- Li, L.; Zhu, J.; Zhang, H.; Tan, H.; Du, B.; Ran, B. Coupled application of generative adversarial networks and conventional neural networks for travel mode detection using GPS data. Transp. Res. Part A Policy Pract. 2020, 136, 282–292. [Google Scholar] [CrossRef]
- Markos, C.; James, J. Unsupervised deep learning for GPS-based transportation mode identification. In Proceedings of the 2020 IEEE 23rd International Conference on Intelligent Transportation Systems (ITSC), Rhodes, Greece, 20–23 September 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 1–6. [Google Scholar]
- Wang, L.; Gjoreski, H.; Murao, K.; Okita, T.; Roggen, D. Summary of the Sussex-Huawei locomotion-transportation recognition challenge. In Proceedings of the 2018 ACM International Joint Conference and 2018 International Symposium on Pervasive and Ubiquitous Computing and Wearable Computers, Singapore, 8–12 October 2018; pp. 1521–1530. [Google Scholar]
- Cross Validation. 2024. Available online: https://scikit-learn.org/stable/modules/cross_validation.html (accessed on 11 August 2024).
- LeaveOneOut. 2024. Available online: https://scikit-learn.org/stable/modules/generated/sklearn.model_selection.LeaveOneOut.html (accessed on 11 August 2024).
- Kotsiantis, S.B.; Zaharakis, I.; Pintelas, P. Supervised machine learning: A review of classification techniques. Emerg. Artif. Intell. Appl. Comput. Eng. 2007, 160, 3–24. [Google Scholar]
- Sun, C.; Shrivastava, A.; Singh, S.; Gupta, A. Revisiting unreasonable effectiveness of data in deep learning era. In Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy, 22–29 October 2017; pp. 843–852. [Google Scholar]
- Asci, G.; Guvensan, M.A. A novel input set for LSTM-based transport mode detection. In Proceedings of the 2019 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops), Kyoto, Japan, 11–15 March 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 107–112. [Google Scholar]
- Liang, X.; Zhang, Y.; Wang, G.; Xu, S. A deep learning model for transportation mode detection based on smartphone sensing data. IEEE Trans. Intell. Transp. Syst. 2019, 21, 5223–5235. [Google Scholar] [CrossRef]
- Drosouli, I.; Voulodimos, A.; Mastorocostas, P.; Miaoulis, G.; Ghazanfarpour, D. TMD-BERT: A Transformer-Based Model for Transportation Mode Detection. Electronics 2023, 12, 581. [Google Scholar] [CrossRef]
- Alotaibi, B. Transportation mode detection by embedded sensors based on ensemble learning. IEEE Access 2020, 8, 145552–145563. [Google Scholar] [CrossRef]
- Zeng, J.; Zhang, G.; Hu, Y.; Wang, D. Addressing robust travel mode identification with individual trip-chain trajectory noise reduction. IET Intell. Transp. Syst. 2023, 17, 129–143. [Google Scholar] [CrossRef]
- Nawaz, A.; Zhiqiu, H.; Senzhang, W.; Hussain, Y.; Khan, I.; Khan, Z. Convolutional LSTM based transportation mode learning from raw GPS trajectories. IET Intell. Transp. Syst. 2020, 14, 570–577. [Google Scholar] [CrossRef]
- Molina-Campoverde, J.J.; Rivera-Campoverde, N.; Molina Campoverde, P.A.; Bermeo Naula, A.K. Urban Mobility Pattern Detection: Development of a Classification Algorithm Based on Machine Learning and GPS. Sensors 2024, 24, 3884. [Google Scholar] [CrossRef]
- Soares, E.F.d.S.; de MS Quintella, C.A.; Campos, C.A.V. Smartphone-based real-time travel mode detection for intelligent transportation systems. IEEE Trans. Veh. Technol. 2021, 70, 1179–1189. [Google Scholar] [CrossRef]
- Namdarpour, F.; Mesbah, M.; Gandomi, A.H.; Assemi, B. Using genetic programming on GPS trajectories for travel mode detection. IET Intell. Transp. Syst. 2022, 16, 99–113. [Google Scholar] [CrossRef]
- Roy, A.; Fuller, D.; Nelson, T.; Kedron, P. Assessing the role of geographic context in transportation mode detection from GPS data. J. Transp. Geogr. 2022, 100, 103330. [Google Scholar] [CrossRef]
- Sadeghian, P.; Golshan, A.; Zhao, M.X.; Håkansson, J. A deep semi-supervised machine learning algorithm for detecting transportation modes based on GPS tracking data. Transportation 2024. [Google Scholar] [CrossRef]
- Dutta, S.; Patra, B.K. Inferencing transportation mode using unsupervised deep learning approach exploiting GPS point-level characteristics. Appl. Intell. 2023, 53, 12489–12503. [Google Scholar] [CrossRef]
- Sadeghian, P.; Zhao, X.; Golshan, A.; Håkansson, J. A stepwise methodology for transport mode detection in GPS tracking data. Travel Behav. Soc. 2022, 26, 159–167. [Google Scholar] [CrossRef]
- Sharma, A.; Singh, S.K.; Udmale, S.S.; Singh, A.K.; Singh, R. Early transportation mode detection using smartphone sensing data. IEEE Sens. J. 2020, 21, 15651–15659. [Google Scholar] [CrossRef]
- Zheng, Y.; Zhang, L.; Xie, X.; Ma, W.Y. Mining interesting locations and travel sequences from GPS trajectories. In Proceedings of the 18th International Conference on World Wide Web, Madrid, Spain, 20–24 April 2009; pp. 791–800. [Google Scholar]
- Bedogni, L.; Di Felice, M.; Bononi, L. Context-aware Android applications through transportation mode detection techniques. Wirel. Commun. Mob. Comput. 2016, 16, 2523–2541. [Google Scholar] [CrossRef]
- Giri, S.; Brondeel, R.; El Aarbaoui, T.; Chaix, B. Application of machine learning to predict transport modes from GPS, accelerometer, and heart rate data. Int. J. Health Geogr. 2022, 21, 19. [Google Scholar] [CrossRef]
- Mousa, M.; Sharma, K.; Claudel, C.G. Inertial measurement units-based probe vehicles: Automatic calibration, trajectory estimation, and context detection. IEEE Trans. Intell. Transp. Syst. 2017, 19, 3133–3143. [Google Scholar] [CrossRef]
- Croce, D.; Giarre, L.; Pascucci, F.; Tinnirello, I.; Galioto, G.E.; Garlisi, D.; Valvo, A.L. An indoor and outdoor navigation system for visually impaired people. IEEE Access 2019, 7, 170406–170418. [Google Scholar] [CrossRef]
- Silva, C.S.; Wimalaratne, P. Towards a grid based sensor fusion for visually impaired navigation using sonar and vision measurements. In Proceedings of the 2017 IEEE Region 10 Humanitarian Technology Conference (R10-HTC), Dhaka, Bangladesh, 21–23 December 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 784–787. [Google Scholar]
- Fan, K.; Lyu, C.; Liu, Y.; Zhou, W.; Jiang, X.; Li, P.; Chen, H. Hardware implementation of a virtual blind cane on FPGA. In Proceedings of the 2017 IEEE International Conference on Real-Time Computing and Robotics (RCAR), Okinawa, Japan, 14–18 July 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 344–348. [Google Scholar]
- Digital Motion Analytics Platform. 2024. Available online: https://physilog.com/ (accessed on 16 August 2024).
- Zhou, Y.; Zhang, Y.; Yuan, Q.; Yang, C.; Guo, T.; Wang, Y. The smartphone-based person travel survey system: Data collection, trip extraction, and travel mode detection. IEEE Trans. Intell. Transp. Syst. 2022, 23, 23399–23407. [Google Scholar] [CrossRef]
- Ferreira, P.; Zavgorodnii, C.; Veiga, L. edgeTrans-Edge transport mode detection. Pervasive Mob. Comput. 2020, 69, 101268. [Google Scholar] [CrossRef]
- Gjoreski, H.; Ciliberto, M.; Wang, L.; Morales, F.J.O.; Mekki, S.; Valentin, S.; Roggen, D. The university of sussex-huawei locomotion and transportation dataset for multimodal analytics with mobile devices. IEEE Access 2018, 6, 42592–42604. [Google Scholar] [CrossRef]
| Reference | Year | Subjects | Sensors | Freq (Hz) | Device | Sensor Positions | Transportation Modes | Total Duration (h) | Spatial Scale | Time Scale | Minimum Duration |
|---|---|---|---|---|---|---|---|---|---|---|---|
| [34] | 2020 | 34 | A, G, B | 32 | IMU | Hand, Wrist, Trousers’ pocket, Waist, Foot | S, W, Sr, E, Bi, B, T, KS | 48 | Grenoble (France) | 3 months | 1 h |
| [31] | 2020 | 18 | A, G, M, GPS | 50 | Mob | Pocket, Hand, Car dashboard | S, W, B, C, T, Sub, MC | 140 | - | - | - |
| [35] | 2018 | 13 | A, G, M, B, S, L | <20 | Mob | - | S, W, B, C, T | 31 | - | - | 1.75 h |
| [27] | 2019 | 3 | A, G, M, GPS | 100 | Mob | Bag, Hips, Torso, Hand | S, W, R, Bi, C, B, T, Sub | 703 | London (UK) | 7 months | 21.5 h |
| Sensors | Features |
|---|---|
| GPS: speed, acceleration, turn angle, trajectory | Mean, std, sinuosity, range, interquartile range, max, quantile k, three maximum values, three minimum values, autocorrelation, kurtosis, skewness, heading change rate, velocity change rate, stop rate, speed, acceleration, turn angle, trajectory |
| IMU: accelerometer, gyroscope, magnetometer | Mean, std, mean crossing rate, energy, autocorrelation, kurtosis, skewness, min, max, median, range, quantile k, interquartile range, frequency with highest FFT value, ratio between the first and second highest FFT peaks, FFT value |
| Barometer: pressure | Spectral centroid, spectral spread, number of zero crossings after scaling, main frequency component, power of the main frequency component, spectral energy at 1 Hz, 2 Hz, …, 10 Hz |
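For the GPS features listed above, speed, acceleration, jerk, and bearing rate are typically derived from consecutive fixes. The sketch below assumes 1 Hz latitude/longitude samples and a local flat-Earth distance approximation; the per-trajectory aggregation (mean, std, 90th percentile) is an illustrative choice rather than the feature set of any cited study.

```python
# Kinematic GPS features (speed, acceleration, jerk, bearing rate) from
# successive 1 Hz latitude/longitude fixes, summarized per trajectory.
import numpy as np

EARTH_R = 6_371_000.0  # mean Earth radius [m]

def gps_kinematic_features(lat_deg, lon_deg, dt=1.0):
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    # Local flat-Earth displacements between consecutive fixes.
    dy = EARTH_R * np.diff(lat)
    dx = EARTH_R * np.cos(lat[:-1]) * np.diff(lon)
    speed = np.hypot(dx, dy) / dt                             # [m/s]
    accel = np.diff(speed) / dt                               # [m/s^2]
    jerk = np.diff(accel) / dt                                # [m/s^3]
    bearing = np.arctan2(dx, dy)                              # heading of each segment
    bearing_rate = np.abs(np.diff(np.unwrap(bearing))) / dt   # [rad/s]
    summarize = lambda v: (v.mean(), v.std(), np.percentile(v, 90))
    return {name: summarize(v) for name, v in
            [("speed", speed), ("accel", accel),
             ("jerk", jerk), ("bearing_rate", bearing_rate)]}

# Example: a short straight segment travelled at roughly constant speed.
lat = 45.18 + np.cumsum(np.full(60, 1e-4))   # about 11 m northwards per second
lon = np.full(60, 5.72)
print(gps_kinematic_features(lat, lon)["speed"])
```

A haversine distance or a projected coordinate system would be more precise over long trajectories; the flat-Earth approximation is only adequate for short segments.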
| | Machine Learning | | | | Deep Learning | | | |
|---|---|---|---|---|---|---|---|---|
| | Ref | FS | CM | BM | Ref | FS | CM | BM |
| No FS | [33] | - | NB, SVM, DT, BDT | BDT | [33] | - | FF NN | - |
| | [31] | - | SVM | - | [31] | - | FF NN, RNN, CNN | CNN |
| | [49] | - | NB, J48, kNN, SVM | - | [32] | - | Bi-LSTM | - |
| | - | - | - | - | [61] | - | CNN | - |
| | - | - | - | - | [49] | - | CNN | CNN |
| FM | [71] | MI, MRMR | DT | - | - | - | - | - |
| WM | [35] | manual | SVM, DT | - | [23] | automated | CNN | CNN |
| | [22] | manual | NB, BN, kNN, LR, J48, DTb, RT | LR | [35] | manual | FF NN | - |
| EM | [33] | RF | - | - | - | - | - | - |
| | [31] | - | RF | - | - | - | - | - |
| | [23] | - | RF | - | - | - | - | - |
| | [61] | - | RF | RF | - | - | - | - |
| | [35] | - | RF | RF | - | - | - | - |
| | [49] | - | AdaBoost, RF | - | - | - | - | - |
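Two of the feature-selection strategies in the table, filter-style ranking by mutual information (MI) and embedded selection via random-forest importances, can be prototyped as in the hedged sketch below; the synthetic data and the choice to keep five features are assumptions, and mRMR itself is not reproduced here.

```python
# Filter method: rank features by mutual information with the class label.
# Embedded method: rank features by random-forest impurity importance.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, mutual_info_classif

X, y = make_classification(n_samples=500, n_features=20, n_informative=6,
                           random_state=0)

# Filter method: keep the 5 features with the highest MI score.
filter_sel = SelectKBest(mutual_info_classif, k=5).fit(X, y)
print("MI-selected features:", np.flatnonzero(filter_sel.get_support()))

# Embedded method: feature importances learned while fitting the classifier.
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
print("Top RF features:", np.argsort(rf.feature_importances_)[::-1][:5])
```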