Fuzzy Computing Model of Activity Recognition on WSN Movement Data for Ubiquitous Healthcare Measurement
Abstract
1. Introduction
2. Measurement Methods
2.1. WSN Sensors and Signals
2.2. Movement Features
- gAR and gBR: the relative acceleration vectors of sensors A and B, respectively. The relative acceleration helps detect the onset of motion when the body starts moving from a static position. Because of gravity, the acceleration magnitude of a static sensor should equal 1 g. Thus, the vectors used in the calculations were obtained by subtracting the initial acceleration from the present data, i.e., gAR = gA − gA0 and gBR = gB − gB0, where gA0 and gB0 are the initial acceleration vectors measured by sensors A and B, respectively.
- ωAR and ωBR: the relative angular velocity vectors of sensors A and B, respectively. They aid in recognizing body rotations. An initial value was subtracted from each angular velocity to calculate the rotating components in the three directions: ωAR = ωA − ωA0 and ωBR = ωB − ωB0, where ωA0 and ωB0 are the initial angular velocity vectors measured by sensors A and B, respectively. In addition, we considered several features derived from those above as candidates for the recognition process.
- θA and θB: the tilt angles of sensors A (at the chest) and B (at the thigh) with respect to the initial status. The acceleration vectors at the i-th time step, e.g., gAi and gBi for sensors A and B, can be used to compute the tilt angle of each sensor with respect to its initial position. The tilt angle of sensor A is computed by cos(θA) = (gAi∙gA0)/(|gAi||gA0|), and similarly for sensor B. Let θAB be the tilt angle between the two sensors; θAB = cos−1((gA∙gB)/(|gA||gB|)) lies within the domain [0, π]. Here, the operator "∙" denotes the inner product of two vectors, and |gA| = (gA1² + gA2² + gA3²)^(1/2). Theoretically, θA should approximate 90° when sensor A changes from the sit or stand posture to lie; similarly, θB should approach 90° when sensor B changes from stand to sit or lie. Only a small angle is observed in θB between sit and lie because sensor B on the thigh cannot be perfectly horizontal to the ground. Low, medium, and high degrees of the angle features can be defined for fuzzification.
- σgA, σgB and σωA, σωB: the standard deviations of the acceleration and angular velocity at the chest and thigh, respectively. When the movement is unsteady (e.g., walking or running), the sensed data return significant variations in acceleration. Similarly, the shoulders and limbs sway during motion, producing angular velocity. These features thus help distinguish the motion statuses of walk and run even when the body continues in uniform motion without significant acceleration. The mean value (μ) and standard deviation (σ) of the PDF can be computed to evaluate the level of movement.
- νgAR and νgBR: the difference rates of the relative acceleration of the chest and thigh, respectively, with respect to the initial status. Similar to the standard deviation, these features, given by Equation (2), return the normalized differences of the accelerations at the i-th time step with respect to the initial status. In particular, apparent variations of the features extracted from sensor B were observed for dynamic motion of the thigh.
- γgA, γgB and γωA, γωB: the gradients of the acceleration and angular velocity of the chest and thigh, respectively. For either sensor A or B, each gradient component of the acceleration or angular velocity on the 1-, 2-, and 3-axes at the i-th time step is given by Equation (3).
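The features above reduce to a few vector operations; the sketch below illustrates them in Python. The variable names (`g_a0`, `g_ai`, etc.) and sample readings are illustrative, not taken from the authors' implementation.

```python
import numpy as np

def relative_acceleration(g_i, g_0):
    """Relative acceleration vector, e.g. gAR = gA - gA0."""
    return np.asarray(g_i, float) - np.asarray(g_0, float)

def tilt_angle_deg(g_i, g_0):
    """Tilt angle of a sensor w.r.t. its initial orientation:
    cos(theta) = (g_i . g_0) / (|g_i| |g_0|), returned in degrees."""
    g_i, g_0 = np.asarray(g_i, float), np.asarray(g_0, float)
    c = np.dot(g_i, g_0) / (np.linalg.norm(g_i) * np.linalg.norm(g_0))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

# Example: sensor A upright at start (gravity on axis 3), then lying down.
g_a0 = [0.0, 0.0, 1.0]                 # initial static reading, ~1 g
g_ai = [1.0, 0.0, 0.0]                 # reading after posture change
theta_a = tilt_angle_deg(g_ai, g_a0)   # ~90 degrees for sit/stand -> lie

# Per-axis standard deviation over a short window (e.g. sigma_gB):
# large values indicate unsteady motion such as walking or running.
window = np.array([[0.1, 0.0, 1.0], [0.3, 0.1, 0.9], [0.2, -0.1, 1.1]])
sigma_g = window.std(axis=0)
```

As in the text, the tilt angle comes out near 90° when the sensor rotates from vertical to horizontal, and the windowed standard deviation separates static postures from dynamic motions.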
2.3. Fuzzy Inference System
- (i) Select the qualified input and output features required by the algorithm. Proper features are adopted by comparing the variations of the feature values against the practical activities. Different inputs can collaborate to yield a similar output. For example, {θA, θB, ωBRx, gBRx, σgBx} can be combined as the input feature set of SET-1 to obtain the output activities of lie, sit, stand, walk, and run. Similarly, an input feature set such as {θA, θB, γωAx, γgBx, σgAx} for SET-2 can yield equivalent outputs based on different criteria, where the subscript x denotes the x-axis component in the global coordinates.
- (ii) Create the membership functions for the input features, so that fuzzification defines the degree to which each feature participates in an activity. The FIS allows various distribution criteria for the membership functions. For example, θB presents clearly separated low- and high-angle degrees in the two PDFs of stand and sit, as shown in Figure 3a. In this case, Figure 3b plots the membership functions, for which a simple trapezoidal distribution was applied in the FIS.
- (iii) Induce the fuzzy rules for activity recognition. These rules are created with "if-then" fuzzy logic to map the input features to the output features; for example, if (θB = "high-angle", θA = "low-angle", and ωBRx = "low", neglecting gBRx and σgBx), then the output = sit. The membership function of the output feature for the activities lie, sit, stand, walk, and run can be quantified with a triangular distribution, as shown in Figure 3c, for the Mamdani-type FIS.
- (iv) Substitute the fuzzy rules with the input and output features into the defuzzification process to produce the resultant recognition patterns. One set of fuzzy rules yields one pattern criterion, from which the corresponding output feature is obtained for the given input features.
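The four steps can be sketched as a minimal Mamdani-style inference in pure Python. The trapezoidal breakpoints loosely follow the θ rows of the membership tables, but the rules and the max-strength defuzzification below are a deliberate simplification of the paper's centroid-based Mamdani FIS, for illustration only.

```python
def trapmf(x, a, b, c, d):
    """Trapezoidal membership: 0 below a, ramp up on [a, b],
    flat 1 on [b, c], ramp down on [c, d], 0 above d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# (ii) Illustrative membership functions for the two tilt-angle inputs.
def theta_low(x):  return trapmf(x, -1, 0, 16, 36)
def theta_high(x): return trapmf(x, 30, 60, 90, 91)

# (iii) One Mamdani rule: if thetaA is low and thetaB is high then "sit".
# The fuzzy AND is the usual min operator.
def rule_sit(theta_a, theta_b):
    return min(theta_low(theta_a), theta_high(theta_b))

# (iv) Toy defuzzification: report the activity whose rule fires strongest
# (a stand-in for the triangular-output centroid defuzzification).
def classify(theta_a, theta_b):
    strengths = {
        "stand": min(theta_low(theta_a), theta_low(theta_b)),
        "sit":   min(theta_low(theta_a), theta_high(theta_b)),
        "lie":   min(theta_high(theta_a), theta_high(theta_b)),
    }
    return max(strengths, key=strengths.get)

print(classify(5, 80))   # chest upright, thigh tilted -> prints "sit"
```

With chest tilt θA = 5° and thigh tilt θB = 80°, only the sit rule fires fully, matching the posture logic described in step (iii).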
2.4. Adaptive Neuro-Fuzzy Inference System
- (i) Define the motion index of the output feature. In this study, we defined ranges of arbitrary values as the indexes of the dynamic motions; e.g., random numbers in the ranges [1, 2] and [3, 4] (e.g., 1.65 and 3.63) represent walk and run, respectively.
- (ii) Assign membership functions to the chosen input features. In this case, we chose the three components of σgB as input features and assigned Gaussian membership functions to the fuzzy set. If there are m input features with n membership functions each, the ANFIS model requires n^m constants for the output features. We therefore used three Gaussian membership functions for each input feature and 3^3 = 27 weighted-average numbers for the crisp output to create the initial FIS.
- (iii) Load the training data set. A column of motion indexes was appended to the input-feature data set, and both were loaded with the initial FIS into the ANFIS for training. In addition, the applied toolbox supports grid partitioning to automatically generate the fuzzy set of the initial FIS from the loaded data.
- (iv) Repeat the training process until it is steady. The ANFIS adapts the necessary parameters of the initial FIS during training and returns the RMSE of each epoch. The process is repeated until the RMSE reaches a steady value. Consequently, a final FIS is obtained for estimating the output.
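The forward pass of the Sugeno-type model that ANFIS tunes can be sketched as follows. The Gaussian (σ, μ) values mirror the initial FIS reported below (σ = 0.3397, means at 0, 1, 2); the 27 rule constants here are placeholders spread over the output range, not the trained weights.

```python
import itertools
import math

def gaussmf(x, sigma, mu):
    """Gaussian membership grade of x."""
    return math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

SIGMA = 0.3397
MUS = [0.0, 1.0, 2.0]            # three MFs per input feature

def anfis_output(inputs, rule_consts):
    """Zeroth-order Sugeno output: weighted average of the 3^3 = 27 rule
    constants, each weight being the product of the MF grades of the
    three inputs for that rule."""
    weights, terms = 0.0, 0.0
    for k, combo in enumerate(itertools.product(range(3), repeat=3)):
        w = 1.0
        for x, j in zip(inputs, combo):
            w *= gaussmf(x, SIGMA, MUS[j])
        weights += w
        terms += w * rule_consts[k]
    return terms / weights

# Placeholder constants spread over the output index range [1, 4].
consts = [1.0 + 3.0 * k / 26 for k in range(27)]
print(anfis_output([0.1, 0.9, 1.8], consts))
```

Training would adjust the (σ, μ) pairs and the 27 constants to minimize the RMSE per epoch; only the fixed-parameter forward computation is shown here.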
3. Recognition Procedure and Results
3.1. Membership Functions and Fuzzification
3.2. Fuzzy Logic Rules and Defuzzification
3.3. Results from FIS Modeling
3.4. Improvement with ANFIS Modeling
4. Discussion and Application
4.1. Discussion on Fuzzy Computing Model
4.2. Application in Ubiquitous Healthcare
5. Conclusions
Acknowledgments
Author Contributions
Conflicts of Interest
References
- Colombo, R.; Pisano, F.; Micera, S.; Mazzone, A.; Delconte, C.; Carrozza, M.C.; Dario, P.; Minuco, G. Robotic techniques for upper limb evaluation and rehabilitation of stroke patients. IEEE Trans. Neural Syst. Rehabil. Eng. 2005, 13, 311–324.
- Zollo, L.; Rossini, L.; Bravi, M.; Magrone, G.; Sterzi, S.; Guglielmelli, E. Quantitative evaluation of upper-limb motor control in robot-aided rehabilitation. Med. Biol. Eng. Comput. 2011, 49, 1131–1144.
- Moreno, J.C.; del Ama, A.J.; de los Reyes-Guzmán, A.; Gil-Agudo, Á.; Ceres, R.; Pons, J.L. Neurorobotic and hybrid management of lower limb motor disorders: A review. Med. Biol. Eng. Comput. 2011, 49, 1119–1130.
- Kim, J.-N.; Ryu, M.-H.; Choi, H.-R.; Yang, Y.-S.; Kim, T.-K. Development and functional evaluation of an upper extremity rehabilitation system based on inertial sensors and virtual reality. Int. J. Distrib. Sens. Netw. 2013, 2013, 168078.
- Darwish, A.; Hassanien, A.E. Wearable and implantable wireless sensor network solutions for healthcare monitoring. Sensors 2011, 11, 5561–5595.
- Büsching, F.; Kulau, U.; Gietzelt, M.; Wolf, L. Comparison and validation of capacitive accelerometers for health care applications. Comput. Methods Programs Biomed. 2012, 106, 79–88.
- Park, K.H. A ubiquitous motion tracking system using sensors in a personal health device. Int. J. Distrib. Sens. Netw. 2013, 2013, 298209.
- Klingeberg, T.; Schilling, M. Mobile wearable device for long term monitoring of vital signs. Comput. Methods Programs Biomed. 2012, 106, 89–96.
- Ren, H.; Li, H.; Liang, X.; He, S.; Dai, Y.; Zhao, L. Privacy-enhanced and multifunctional health data aggregation under differential privacy guarantees. Sensors 2016, 16, 1463.
- Hsieh, N.-C.; Hung, L.-P.; Park, J.H.; Yen, N.Y. Ensuring healthcare services provision: An integrated approach of resident contexts extraction and analysis via smart objects. Int. J. Distrib. Sens. Netw. 2014, 2014, 481952.
- Shin, S.; Um, J.; Seo, D.; Choi, S.-P.; Lee, S.; Jung, H.; Yi, M.Y. Platform to build the knowledge base by combining sensor data and context data. Int. J. Distrib. Sens. Netw. 2014, 2014, 542764.
- Najafi, B.; Aminian, K.; Paraschiv-Ionescu, A.; Loew, F.; Bula, C.J.; Robert, P. Ambulatory system for human motion analysis using a kinematic sensor: Monitoring of daily physical activity in the elderly. IEEE Trans. Biomed. Eng. 2003, 50, 711–723.
- Vo, Q.V.; Hoang, M.T.; Choi, D.J. Personalization in mobile activity recognition system using K-medoids clustering algorithm. Int. J. Distrib. Sens. Netw. 2013, 2013, 315841.
- Jovanov, E.; Milenkovic, A.; Otto, C.; de Groen, P.C. A wireless body area network of intelligent motion sensors for computer assisted physical rehabilitation. J. NeuroEng. Rehabil. 2005.
- Kim, G.-S. Development of 6-axis force/moment sensor for a humanoid robot's foot. IET Sci. Meas. Technol. 2008, 2, 122–133.
- Fortino, G.; Giannantonio, R.; Gravina, R.; Kuryloski, P.; Jafari, R. Enabling effective programming and flexible management of efficient body sensor network applications. IEEE Trans. Hum. Mach. Syst. 2013, 43, 115–133.
- Ghasemzadeh, H.; Panuccio, P.; Trovato, S.; Fortino, G.; Jafari, R. Power-aware activity monitoring using distributed wearable sensors. IEEE Trans. Hum. Mach. Syst. 2014, 44, 537–544.
- Fortino, G.; Galzarano, S.; Gravina, R.; Li, W. A framework for collaborative computing and multi-sensor data fusion in body sensor networks. Inf. Fusion 2015, 22, 50–70.
- Akyildiz, I.F.; Su, W.; Sankarasubramaniam, Y.; Cayirci, E. Wireless sensor networks: A survey. Comput. Netw. 2002, 38, 393–422.
- Gharghan, S.K.; Nordin, R.; Ismail, M. A wireless sensor network with soft computing localization techniques for track cycling applications. Sensors 2016, 16, 1043.
- Ghasemzadeh, H.; Jafari, R. Coordination analysis of human movements with body sensor networks: A signal processing model to evaluate baseball swings. IEEE Sens. J. 2011, 3, 603–610.
- Moschetti, A.; Fiorini, L.; Esposito, D.; Dario, P.; Cavallo, F. Recognition of daily gestures with wearable inertial rings and bracelets. Sensors 2016, 16, 1341.
- Chen, B.R.; Patel, S.; Buckley, T.; Rednic, R.; McClure, D.J.; Shih, L.; Tarsy, D.; Welsh, M.; Bonato, P. A web-based system for home monitoring of patients with Parkinson's disease using wearable sensors. IEEE Trans. Biomed. Eng. 2011, 58, 831–836.
- Kim, H.-M.; Yoon, J.; Kim, G.-S. Development of a six-axis force/moment sensor for a spherical-type finger force measuring system. IET Sci. Meas. Technol. 2012, 6, 96–104.
- Fortino, G.; Gravina, R. A cloud-assisted wearable system for physical rehabilitation. In ICTs for Improving Patients Rehabilitation Research Techniques (REHAB 2014); Series Communications in Computer and Information Science; Springer: Berlin/Heidelberg, Germany, 2014; Volume 515, pp. 168–182.
- Fortino, G.; Gravina, R. Rehab-aaService: A cloud-based motor rehabilitation digital assistant. In Proceedings of the 8th International Conference on Pervasive Computing Technologies for Healthcare (PervasiveHealth '14), Oldenburg, Germany, 20–23 May 2014; pp. 305–308.
- Greene, B.R.; McGrath, D.; O'Neill, R.; O'Donovan, K.J.; Burns, A.; Caulfield, B. An adaptive gyroscope-based algorithm for temporal gait analysis. Med. Biol. Eng. Comput. 2011, 48, 1251–1260.
- Khan, A.M.; Lee, Y.-K.; Lee, S.; Kim, T.-S. Accelerometer's position independent physical activity recognition system for long-term activity monitoring in the elderly. Med. Biol. Eng. Comput. 2011, 48, 1271–1279.
- Lee, C.C. Fuzzy logic in control systems: Fuzzy logic controller part II. IEEE Trans. Syst. Man Cybern. 1990, 20, 419–435.
- Bardossy, A.; Duckstein, L. Fuzzy Rule-Based Modeling with Applications to Geophysical, Biological and Engineering Systems; CRC Press: Boca Raton, FL, USA, 1995.
- Herrero, D.; Martinez, H. Fuzzy mobile-robot positioning in intelligent spaces using wireless sensor networks. Sensors 2011, 11, 10820–10839.
- Khan, A.M.; Lee, Y.K.; Lee, S.Y.; Kim, T.S. A tri-axial accelerometer sensor-based human activity recognition via augmented signal features and hierarchical recognizer. IEEE Trans. Inf. Technol. Biomed. 2010, 14, 1166–1172.
- Lin, H.-C.; Chiang, S.-Y.; Lee, K.; Kan, Y.-C. An activity recognition model using inertial sensor nodes in a wireless sensor network for frozen shoulder rehabilitation exercises. Sensors 2015, 15, 2181–2204.
- Zadeh, L.A. Outline of a new approach to the analysis of complex systems and decision processes. IEEE Trans. Syst. Man Cybern. 1973, 3, 28–44.
- Mamdani, E.H.; Assilian, S. An experiment in linguistic synthesis with a fuzzy logic controller. Int. J. Man Mach. Stud. 1975, 7, 1–13.
- Takagi, T.; Sugeno, M. Derivation of fuzzy control rules from human operator's control actions. In Proceedings of the IFAC Symposium on Fuzzy Information, Knowledge Representation and Decision Analysis, Marseilles, France, 19–21 July 1983; pp. 55–60.
- Jin, G.H.; Lee, S.B.; Lee, T.S. Context awareness of human motion states using accelerometer. J. Med. Syst. 2008, 32, 93–100.
- Chang, J.-Y.; Shyu, J.-J.; Cho, C.-W. Fuzzy rule inference based human activity recognition. In Proceedings of the 2009 IEEE International Symposium on Control Applications & Intelligent Control (CCA&ISIC), Saint Petersburg, Russia, 8–10 July 2009; pp. 211–215.
- Helmi, M.; AlModarresi, S.M.T. Human activity recognition using a fuzzy inference system. In Proceedings of the IEEE International Conference on Fuzzy Systems (FUZZ-IEEE 2009), Jeju, Korea, 20–24 August 2009.
- Iglesias, J.A.; Angelov, P.; Ledezma, A.; Sanchis, A. Human activity recognition based on evolving fuzzy systems. Int. J. Neural Syst. 2010, 20, 355–364.
- Kim, E.; Helal, S. Training-free fuzzy logic based human activity recognition. J. Inf. Process Syst. 2014, 10, 335–354.
- Jang, J.-S.R. ANFIS: Adaptive-network-based fuzzy inference system. IEEE Trans. Syst. Man Cybern. 1993, 23, 665–685.
- Hu, W.; Xie, D.; Tan, T.; Maybank, S. Learning activity patterns using fuzzy self-organizing neural network. IEEE Trans. Syst. Man Cybern. B Cybern. 2004, 34, 1618–1626.
- Yang, J.-Y.; Chen, Y.-P.; Lee, G.-Y.; Liou, S.-N.; Wang, J.-S. Activity recognition using one triaxial accelerometer: A neuro-fuzzy classifier with feature reduction. In Entertainment Computing—ICEC 2007; Springer: Berlin/Heidelberg, Germany, 2007; Volume 4740, pp. 395–400.
- Liu, C.-T.; Chan, C.-T. A fuzzy logic prompting mechanism based on pattern recognition and accumulated activity effective index using a smartphone embedded sensor. Sensors 2016, 16, 1322.
- Kan, Y.-C.; Chen, C.-K. A wearable inertial sensor node for body motion analysis. IEEE Sens. J. 2012, 12, 651–657.
- Lin, H.-C.; Kan, Y.-C.; Hong, Y.-M. The comprehensive gateway model for diverse environmental monitoring upon wireless sensor network. IEEE Sens. J. 2011, 11, 1293–1303.
- Bao, L.; Intille, S.S. Activity recognition from user-annotated acceleration data. In Pervasive Computing; Series Lecture Notes in Computer Science (LNCS); Springer: Berlin/Heidelberg, Germany, 2004; Volume 3001, pp. 1–17.
- Ravi, N.; Dandekar, N.; Mysore, P.; Littman, M.L. Activity recognition from accelerometer data. In Proceedings of the Conference on Innovative Applications of Artificial Intelligence (IAAI'05), Pittsburgh, PA, USA, 9–13 July 2005; Volume 3, pp. 1541–1546.
- Dong, W.; Shah, H. Vertex method for computing functions of fuzzy variables. Fuzzy Sets Syst. Arch. 1987, 24, 65–78.
- Bezdek, J.C.; Pal, S.K. Fuzzy Models for Pattern Recognition, Methods that Search for Structures in Data; IEEE Press: New York, NY, USA, 1992.
- Pedrycz, W. Fuzzy logic in development of fundamentals of pattern recognition. Int. J. Approx. Reason. 1991, 5, 251–264.
- Pärkkä, J.; Ermes, M.; Korpipää, P.; Mäntyjärvi, J.; Peltola, J.; Korhonen, I. Activity classification using realistic data from wearable sensors. IEEE Trans. Inf. Technol. Biomed. 2006, 10, 119–128.
- Chan, C.S.; Liu, H. Fuzzy qualitative human motion recognition. IEEE Trans. Fuzzy Syst. 2009, 17, 851–862.
- Ponce, H.; de Lourdes Martínez-Villaseñor, M.; Miralles-Pechuán, L. A novel wearable sensor-based human activity recognition approach using artificial hydrocarbon networks. Sensors 2016, 16, 1033.
- Preece, S.J.; Goulermas, J.Y.; Kenney, L.P.J.; Howard, D. A comparison of feature extraction methods for the classification of dynamic activities from accelerometer data. IEEE Trans. Biomed. Eng. 2009, 56, 871–879.
- Jain, A.K.; Duin, R.P.W.; Mao, J. Statistical pattern recognition: A review. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 4–37.
- Kern, N.; Schiele, B.; Schmidt, A. Multi-sensor activity context detection for wearable computing. In Ambient Intelligence—EUSAI 2003; Series Lecture Notes in Computer Science (LNCS); Springer: Berlin/Heidelberg, Germany, 2003; Volume 2875, pp. 220–232.
- Junker, H.; Amft, O.; Lukowicz, P.; Tröster, G. Gesture spotting with body-worn inertial sensors to detect user activities. Pattern Recognit. 2008, 41, 2010–2024.
- Chiang, S.-Y.; Kan, Y.-C.; Tu, Y.-C.; Lin, H.-C. Activity recognition by fuzzy logic system in wireless sensor network for physical therapy. In Intelligent Decision Technologies (IDT '12); Series Smart Innovation, Systems and Technologies; Springer: Berlin/Heidelberg, Germany, 2012; Volume 16, pp. 191–200.
SET-1

| Feature Sets | [D1]: (δ10, δ11) | [D2]: (δ20, δ21) | [D3]: (δ30, δ31) |
|---|---|---|---|
| Ф(θA)low & Ф(θB)low | [1]: (0, 16) | [1→0]: (16, 36) | [0]: (36, 90) |
| Ф(θA)high & Ф(θB)high | [0]: (0, 30) | [0→1]: (30, 60) | [1]: (60, 90) |
| Ф(ωBRx)low | [1]: (0, 25) | - | - |
| Ф(ωBRx)high | [1]: (60, 300) | - | - |
| Ф(gBRx)low | [0→1]: (0, 0.08) | [1]: (0.08, 0.28) | [1→0]: (0.28, 0.4) |
| Ф(gBRx)high | [0]: (0, 0.28) | [0→1]: (0.28, 0.38) | [1]: (0.38, 1.35) |
| Ф(σgBx)low | [0→1]: (0, 0.07) | [1]: (0.07, 0.28) | [1→0]: (0.28, 0.32) |
| Ф(σgBx)high | [0→1]: (0.28, 0.4) | [1]: (0.4, 1) | - |
SET-2

| Feature Sets | [D1]: (δ10, δ11) | [D2]: (δ20, δ21) | [D3]: (δ30, δ31) |
|---|---|---|---|
| Ф(θA)low & Ф(θB)low | [1]: (0, 16) | [1→0]: (16, 36) | [0]: (36, 90) |
| Ф(θA)high & Ф(θB)high | [0]: (0, 30) | [0→1]: (30, 60) | [1]: (60, 90) |
| Ф(γωARx)low | [1]: (0, 15) | [1→0]: (15, 22) | [0]: (22, 60) |
| Ф(γωARx)high | [0]: (0, 15) | [0→1]: (15, 30) | [1]: (30, 60) |
| Ф(γgBRx)low | [1]: (0, 0.02) | [1→0]: (0.02, 0.03) | [0]: (0.03, 0.1) |
| Ф(γgBRx)medium | [0→1]: (0, 0.02) | [1]: (0.02, 0.18) | [1→0]: (0.18, 0.2) |
| Ф(γgBRx)high | [0]: (0, 0.18) | [0→1]: (0.18, 0.23) | [1]: (0.23, 1) |
| Ф(σgAx)low | [0→1]: (0, 0.02) | [1]: (0.02, 0.2) | [1→0]: (0.2, 0.23) |
| Ф(σgAx)high | [0→1]: (0.2, 0.3) | [1]: (0.3, 1) | - |
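Each table row can be read as a piecewise-linear membership function: [1] is a flat segment at grade 1, [0] at grade 0, and [1→0]/[0→1] are linear ramps over the interval (δ0, δ1). The helper below is an assumed reading of that notation, not the authors' code.

```python
def eval_segments(x, segments):
    """Evaluate a piecewise-linear membership described as
    (level, d0, d1) triples, where level is "1", "0", "0->1" or "1->0"."""
    for level, d0, d1 in segments:
        if d0 <= x <= d1:
            if level == "1":
                return 1.0
            if level == "0":
                return 0.0
            t = (x - d0) / (d1 - d0)          # position along the ramp
            return t if level == "0->1" else 1.0 - t
    return 0.0

# Row "Ф(θB)high" from SET-1: [0]:(0, 30), [0->1]:(30, 60), [1]:(60, 90)
theta_b_high = [("0", 0, 30), ("0->1", 30, 60), ("1", 60, 90)]
print(eval_segments(45, theta_b_high))   # midway up the ramp -> 0.5
```

For example, a thigh tilt of 45° sits halfway along the 30°–60° ramp, giving a "high-angle" grade of 0.5, while anything above 60° is fully "high".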
Rule 1

| θA | θB | ωBRx | gBRx | σgBx | Output |
|---|---|---|---|---|---|
| L | L | L | - | - | stand |
| L | H | L | - | - | sit |
| H | H | L | - | - | lie |
| - | - | H | L | L | walk |
| - | - | H | H | H | run |

Rule 2

| θA | θB | γωAx | γgBx | σgAx | Output |
|---|---|---|---|---|---|
| L | L | L | L | - | stand |
| L | H | L | L | - | sit |
| H | H | L | L | - | lie |
| - | - | H | M | L | walk |
| - | - | H | H | H | run |
| Feature Set | Lie | Sit | Stand | Walk | Run | Average |
|---|---|---|---|---|---|---|
| Sample Test | | | | | | |
| SET-1 | 100 | 100 | 100 | 100 | 100 | 100 |
| SET-2 | 100 | 100 | 100 | 100 | 100 | 100 |
| Blind Test | | | | | | |
| SET-1 | 96 | 100 | 100 | 86.68 | 81.55 | 92.85 |
| (static posture : dynamic motion) = (99 : 84) | | | | | | |
| SET-2 | 98 | 100 | 100 | 96.31 | 90.48 | 96.96 |
| (static posture : dynamic motion) = (99 : 93) | | | | | | |
Initial Status

Input range: [0, 2]
(σ, μ) of the three Gaussian MFs (1) for each input feature: (0.3397, 0), (0.3397, 1), and (0.3397, 2)
Output range: [1, 4]

| mf1 | mf2 | mf3 | mf4 | mf5 | mf6 | mf7 | mf8 | mf9 |
|---|---|---|---|---|---|---|---|---|
| 1.0769 | 1.1538 | 1.2308 | 1.3077 | 1.3846 | 1.4615 | 1.5385 | 1.6154 | 1.6923 |
| mf10 | mf11 | mf12 | mf13 | mf14 | mf15 | mf16 | mf17 | mf18 |
| 1.7692 | 1.8462 | 1.9231 | 2.0000 | 3.0769 | 3.1538 | 3.2308 | 3.3077 | 3.3846 |
| mf19 | mf20 | mf21 | mf22 | mf23 | mf24 | mf25 | mf26 | mf27 |
| 3.4615 | 3.5385 | 3.6154 | 3.6923 | 3.7692 | 3.8462 | 3.9231 | 4.0000 | 2.5000 |

After training process

| Input range | mf1 (σ, μ) | mf2 (σ, μ) | mf3 (σ, μ) |
|---|---|---|---|
| In1: [0.0045, 1.6348] | (0.1941, −0.0506) | (0.1736, 1.0145) | (0.4448, 1.9665) |
| In2: [0.0076, 1.8080] | (0.2840, −0.0230) | (0.2951, 0.9999) | (0.3830, 1.9823) |
| In3: [0.0077, 1.5483] | (0.4087, 0.0492) | (0.2451, 1.0645) | (0.3420, 1.9982) |

Output range: [1.01, 4]

| mf1 | mf2 | mf3 | mf4 | mf5 | mf6 | mf7 | mf8 | mf9 |
|---|---|---|---|---|---|---|---|---|
| 1.6017 | 1.0093 | 171.5 | 2.1715 | 2.8963 | −242.3 | 4.5227 | 3.4755 | 3.5141 |
| mf10 | mf11 | mf12 | mf13 | mf14 | mf15 | mf16 | mf17 | mf18 |
| −38.4 | 187.5 | 11.7 | 3.1452 | 3.6483 | −0.9198 | −1.855 | 0.9098 | 3.6164 |
| mf19 | mf20 | mf21 | mf22 | mf23 | mf24 | mf25 | mf26 | mf27 |
| 3.3686 | 4.1074 | 2.7958 | −10.763 | 28.7 | 3.7789 | 2.1909 | 15.8998 | −28.038 |
Recognition rate (%) of dynamic motions by (σgBx, σgBy, σgBz).

| Group | Walk | Run | Record | Epoch | Uncertain (2) | RMSE (3) |
|---|---|---|---|---|---|---|
| Training | - | - | 187 | 40 | - | 0.4182 |
| Testing 1 | 91.7 (100) (4), (11/12) (5) | 100 (100), (14/14) | 26 | - | 1 | 0.3889 |
| Testing 2 | 100 (100), (11/11) | 92.3 (100), (12/13) | 24 | - | 1 | 0.3322 |
| Testing 3 | 90 (100), (9/10) | 88.9 (100), (16/18) | 28 | - | 3 | 0.3386 |
| Feature | Action Type | Recognized Angle (Degree) | Threshold (Degree) | Pass/Count |
|---|---|---|---|---|
| θC | (a) | 40, 42, 39, 40, 40 | 40~45 | 4/5 |
| θD | (b) | 45, 78, 130, 172 | {40~45, 85~90, 130~135, 175~180} | 2/4 |
Share and Cite
Chiang, S.-Y.; Kan, Y.-C.; Chen, Y.-S.; Tu, Y.-C.; Lin, H.-C. Fuzzy Computing Model of Activity Recognition on WSN Movement Data for Ubiquitous Healthcare Measurement. Sensors 2016, 16, 2053. https://doi.org/10.3390/s16122053