Article

A Proposal of a Motion Measurement System to Support Visually Impaired People in Rehabilitation Using Low-Cost Inertial Sensors

by Karla Miriam Reyes Leiva 1,2,*, Milagros Jaén-Vargas 1, Miguel Ángel Cuba 1, Sergio Sánchez Lara 1 and José Javier Serrano Olmedo 1,3
1 Escuela Superior Técnica de Ingenieros de Telecomunicación, Universidad Politécnica de Madrid, 28013 Madrid, Spain
2 Engineering Faculty, Universidad Tecnológica Centroamericana UNITEC, San Pedro Sula 211001, Honduras
3 Networking Center of Biomedical Research for Bioengineering Biomaterials and Nanomedicine, Instituto de Salud Carlos III, 28029 Madrid, Spain
* Author to whom correspondence should be addressed.
Entropy 2021, 23(7), 848; https://doi.org/10.3390/e23070848
Submission received: 28 April 2021 / Revised: 21 June 2021 / Accepted: 29 June 2021 / Published: 1 July 2021

Abstract

The rehabilitation of a visually impaired person (VIP) is a systematic process in which the person is provided with tools to cope with the impairment and achieve personal autonomy and independence, such as training in the use of the long cane for orientation and mobility (O&M). This training must be delivered in person by specialists, which strains human, technological and structural resources in some regions, especially those in economically constrained circumstances. A system based on low-cost inertial sensors was developed to capture the motion of the long cane and the leg and to provide quantitative parameters, such as sweeping coverage and gait measures, that are currently assessed visually during rehabilitation. The system was tested with 10 blindfolded volunteers under laboratory conditions performing the constant contact, two-point touch, and three-point touch travel techniques. The results indicate that the quantification system reliably measures grip rotation, safety zone, sweeping amplitude and hand position from orientation angles, with an accuracy of around 97.62%. However, a new method or improved hardware is needed for the gait parameters, since the step length measurement reached a mean accuracy of only 94.62%. The system requires further development before it can be used as an aid in the rehabilitation of VIP; at present, it is a simple, low-cost technological aid with the potential to improve the current practice of O&M.

1. Introduction

People with visual impairments face many daily challenges that limit their quality of life, including basic activities such as finding and keeping a job, mobility and displacement, and using public transport. When a person is born with a visual disability, or suffers a trauma or disease that leads to a visual impairment, they must be assisted through a rehabilitation process. During this process, the person is provided with tools to deal with the visual impairment with greater independence and self-confidence: learning braille, learning how to use a long cane, sightless feeding, optimizing the use of residual vision, and other skills that improve visual functioning in daily life, as well as O&M training delivered by specialists [1,2,3,4,5,6]. Rehabilitation is tailored to the cognitive capacities of each user; regular rehabilitation programs worldwide, as reported by the World Blind Union, include several stages, such as activities of daily living services, career exploration services, travel-training services/O&M, and others [7]. Several references [6,8,9,10,11,12] emphasize the importance of O&M services and training for improving quality of life [5,13]. Accordingly, there is a specific health discipline in charge of the study, development, and improvement of O&M training for VIP [14,15,16]. The latest report on international approaches to rehabilitation programs from the World Blind Union [7] presents two important challenges that motivated this project: (1) the limited resources available to provide basic rehabilitation services, and (2) transportation and geographic limitations, whereby many VIP must travel to other cities to access rehabilitation services, which in some cases is impossible.
A fundamental part of mobility training is the use of the long cane: the VIP must learn how to hold and grip it correctly, how to walk with it and sweep it to detect obstacles, the different exploration techniques, and other parameters according to the complexity of the environment in which they will navigate [17,18]. This training is usually done in person with an O&M specialist, which, as mentioned before, creates an accessibility problem in rural communities and also constrains the duration of rehabilitation and the number of VIP who can be rehabilitated at the same time. In this training, a technique is recommended depending on the scenario, and the scenarios change as the training advances in complexity [19]. However, the parameters used to evaluate correct use remain the same regardless of the scenario; this makes it possible to register quantitative values of the motion of the person and the long cane [20] in order to support O&M training during the rehabilitation process.
According to the literature, a variety of technological aids have been designed for orientation and mobility, such as Electronic Travel Aid Systems (ETAS) [2], which obtain information from the environment and provide it to the visually impaired person to assist autonomous navigation. There have been many attempts to enhance the long cane with technology [21,22,23,24,25]. These systems rely on technologies such as the Global Positioning System, BLE beacons and RFID (radio frequency identification) to obtain position and displacement information, and on optical sensors (RGB-D cameras, laser), inertial sensors, speed sensors and others for object detection [26,27,28,29,30,31,32]. However, the use of any of these ETAS requires previous O&M training [33,34]. This leaves a gap: the development of assistive technologies specifically focused on the evaluation and assistance of the training process itself, so that it can become more accessible for users.
Three articles on assistive rehabilitation tools for O&M were found in the literature. Schloerb et al. [35] developed a virtual environment system named BlindAid to enhance O&M training. It is a software tool with haptic and auditory feedback in which the user can virtually visit different unknown places in order to build cognitive mental maps of those places. Oliveira et al. [36] created a programming language named GoDonnie to aid in the resolution of spatial problems involving O&M. The language was designed around accessibility and usability criteria for VIP, with the assumption that by using GoDonnie users could improve both programming and O&M skills, since they are able to create mental maps of the environments and related objects. Gong et al. [37] developed HeliCoach, an O&M training system created to help VIP train audio orientation. The training environment is composed of a drone that moves through 3D space and is used as a sound source, and a belt with a set of vibration motors for haptic feedback; the belt contains a BNO055 IMU and six vibration motors controlled by an Arduino DFRobot Leonardo + Xbee. The system requires high-accuracy indoor localization for its perspective-driven interaction, provided by Ultra-Wide Bandwidth microwave positioning: four base stations and two tracking tags embedded in the drone and the cap of the user, respectively.
In contrast to the technologies mentioned above, the aim of this research was to develop a simple hardware architecture using low-cost inertial sensors for data acquisition and to test its reliability in the quantitative analysis of the parameters evaluated during the rehabilitation of VIP, by obtaining metrics that O&M specialists currently examine in person while training travel techniques.
The system can provide information about the hand grip rotation, the safety zone, the hand height during the travel techniques, the amplitude and patterns of the sweeping, and gait parameters, with high accuracy, using only two inertial sensors.
Technologies based on inertial measurement unit (IMU) sensors are used in a large and ever-growing number of applications, such as intelligent guidance, self-driving robots [38,39], full-body motion tracking [40,41,42,43] and navigation [26,44,45,46]. An accelerometer measures the external specific force acting on the sensor, which includes both the sensor's acceleration and the acceleration due to the earth's gravity. A gyroscope measures angular velocity, i.e., the rate of change of the sensor's orientation; integrating gyroscope measurements therefore provides information about the orientation of the sensor. Magnetometers complement accelerometers by providing the sensor heading (orientation around the gravity vector), information that accelerometers and gyroscopes cannot provide. By fusing the accelerometer, gyroscope and magnetometer measurements, the orientation is estimated based on the direction of the magnetic field [39,44]. In the system presented in this paper, the O&M parameters are calculated from the absolute orientation values produced by the sensor fusion of the BNO055 IMU module. Note that the present article is an extended version of [47], where the algorithms to measure the amplitude of the sweeping techniques and the orientation of the long cane were tested with 97% and 98% accuracy, respectively.
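As an illustration of how such fused absolute orientation is typically read out, the sketch below polls the Euler angles of a single BNO055 over I2C. It is a minimal example assuming the Adafruit BNO055 Arduino library and the sensor's default I2C address (0x28); it is not the exact acquisition firmware used in this work.

```cpp
#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>

// One BNO055 on the default I2C address; a second sensor can share the bus at 0x29.
Adafruit_BNO055 bno(55, 0x28);

void setup() {
  Serial.begin(115200);
  if (!bno.begin()) {            // starts the sensor in its fusion (NDOF) mode
    Serial.println("BNO055 not detected");
    while (true) {}
  }
  bno.setExtCrystalUse(true);
}

void loop() {
  // Fused absolute orientation in degrees: x = heading (yaw), y = roll, z = pitch
  imu::Vector<3> euler = bno.getVector(Adafruit_BNO055::VECTOR_EULER);
  Serial.print(euler.x()); Serial.print(',');
  Serial.print(euler.y()); Serial.print(',');
  Serial.println(euler.z());
  delay(10);                     // ~0.01 s sampling period, as in the acquisition system
}
```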

2. Materials and Methods

A tool was developed to evaluate the rehabilitation parameters during the experimental procedure. For data acquisition, an Arduino MKR1010 microcontroller board was used with two 9DOF Bosch BNO055 IMU sensors: one placed on the outer side of the leg of each participant and the other on the upper part of a 117 cm long cane. Communication with the sensors was done via the I2C protocol at a sampling period of 0.01 s. In order to remove noise components from the signal, low-pass filtering with a cutoff frequency of 20 Hz was performed. The microcontroller was wired to an SD card module via the SPI protocol and to two push buttons set as inputs to control the acquisitions manually. Using the Euler roll angle θ_leg and the interpretation of step detection from the filtered absolute orientation values, an algorithm was developed to calculate the step length from the local coordinates of the sensor placed on the leg. Additionally, to obtain the sweeping metrics from the local coordinates of the sensor placed on the cane, the Euler roll φ_cane, pitch θ_cane and yaw γ_cane angles were used to provide the grip rotation, the safety zone metrics and the sweeping characteristics, respectively.
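The 20 Hz low-pass filtering could be realized in several ways; the following first-order IIR filter is one minimal possibility consistent with the stated 0.01 s sampling period, not necessarily the exact filter used by the authors.

```cpp
#include <cmath>

// First-order low-pass filter for one orientation channel.
// fc = cutoff frequency in Hz, dt = sampling period in seconds.
struct LowPass {
    double alpha;
    double state;
    bool initialized = false;

    LowPass(double fc, double dt) {
        const double rc = 1.0 / (2.0 * M_PI * fc);
        alpha = dt / (rc + dt);          // ~0.56 for fc = 20 Hz, dt = 0.01 s
    }

    double filter(double sample) {
        if (!initialized) {              // seed with the first sample to avoid a startup transient
            state = sample;
            initialized = true;
        }
        state += alpha * (sample - state);
        return state;
    }
};
```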
For the experimental procedure, acquisitions were performed with 10 blindfolded volunteers. First, the volunteers were instructed and trained in each travel technique while sighted. A path was marked on the floor for the sweep training, with an amplitude of around 1 m, and the volunteers were asked to practice each technique walking 20 steps, three times. They were then blindfolded and asked to perform the travel techniques while walking around 20 steps in the indicated direction, as described in Table 1. Each acquisition was repeated blindfolded three times, yielding nine comparative measurements per participant. The total time and displacement were measured with a 50 m measuring tape and a chronometer; these values served as references to evaluate the accuracy of the measured gait parameters.

3. Results

3.1. Measurement of the Hand Height and the Safety Zone

The Hand Height (HH) and Safety Zone (SZ) are reference parameters used to evaluate the reaction distance in O&M, which refers to the warning distance provided by the cane for an object in one's path, i.e., the time the cane gives the user to be warned about an object or danger [48]. Using trigonometric ratios and the local coordinates of the sensor, the pitch angle θ_cane (about the transversal axis, equivalent to the angle between the floor plane and the long cane) was continuously measured to obtain the height of the hand and the distance between the tip of the cane and the leg during the repetitions of the three travel techniques. The HH corresponds to the side opposite θ_cane and the SZ to the side adjacent to it, as shown in Figure 1.
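Under this trigonometric relation, with the cane (grip to tip) acting as the hypotenuse, the computation reduces to two lines. The helper below is an illustrative sketch; the function name and the assumption that θ_cane is expressed in degrees are ours.

```cpp
#include <cmath>

// Hand height (HH) and safety zone (SZ) from the cane pitch angle.
// theta_cane_deg: angle between the floor plane and the cane, in degrees.
// cane_length_cm: length of the long cane (117 cm in this study).
void handHeightAndSafetyZone(double theta_cane_deg, double cane_length_cm,
                             double &hh_cm, double &sz_cm) {
    const double theta = theta_cane_deg * M_PI / 180.0;
    hh_cm = cane_length_cm * std::sin(theta);   // opposite side: height of the hand
    sz_cm = cane_length_cm * std::cos(theta);   // adjacent side: tip-to-leg distance
}
```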
An extract of the measured HH and SZ values for each subject is presented in Table 2. These values are compared with the real value (RV), which is the self-reported HH and the SZ calculated according to the Pythagorean theorem. The mean value is the mean of the HH and SZ measurements over the nine travel technique acquisitions. The standard deviation (SD) and %Error vary for each subject. The highest precision and accuracy were obtained with S01: an SD of only 1.39 cm (1.46% of the mean HH) and 1.95 cm (2.83% of the mean SZ), with %Errors of 0.63% and 1.37%, respectively. S09 also presented a very low %Error but a high SD (6.15 cm); together with S05, it showed the lowest repeatability, with SDs amounting to 5.03% of the mean HH and 8.60% of the SZ. The lowest accuracy was shown by S06 (%Error of 4.62% for HH and 4.86% for SZ), followed by S07 and S08. Finally, a mean accuracy of around 97.62% was obtained over all subjects and both measurements, showing that the applied algorithm reliably measures these O&M parameters from absolute orientation angles.

3.2. Measurement of the Grip Rotation

A proper grip is one of the first parameters observed by the rehabilitators during the very first stage of O&M training. With the inertial sensors it is not possible to analyze every characteristic of the grip, but it is possible to determine the variation of the rotation of the cane, which is a consequence of the grip rotation, by analyzing the absolute orientation angles, as shown in Table 3. This table presents the SD, in degrees, for each travel technique and subject. For this, the full raw record of the roll angle φ_cane was used, which, according to the local coordinates of the placed sensor, represents the rotation of the user's grip during the execution of the travel techniques. As shown in Table 3, this value can be used for technical analysis of the performance of the travel techniques independently of the stage and scenario in which the user is being rehabilitated. It can also provide a numerical basis for establishing what is considered an adequate and acceptable grip rotation for each travel technique.
Note that each column indicates how much the rotation of the hand varied during the corresponding acquisition. The results show that S04 and S10 present less grip rotation in the 2PT and 3PT techniques, which indicates a better execution than, for instance, S03 and S09. This gives the specialist a direct indication of the acceptable degree of rotation for each travel technique and of which technique is more appropriate for the visually impaired person; it also allows performance to be tracked across the rehabilitation stages.
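For reference, the grip rotation metric of Table 3 amounts to the standard deviation of the logged roll samples of one acquisition; a minimal sketch of such a computation is given below (the function name and vector-based interface are illustrative).

```cpp
#include <cmath>
#include <vector>

// Grip rotation metric: sample standard deviation (degrees) of the cane roll angle
// phi_cane over one acquisition of a travel technique (as reported in Table 3).
double gripRotationSD(const std::vector<double> &phi_cane_deg) {
    if (phi_cane_deg.size() < 2) return 0.0;
    double mean = 0.0;
    for (double v : phi_cane_deg) mean += v;
    mean /= phi_cane_deg.size();
    double ss = 0.0;
    for (double v : phi_cane_deg) ss += (v - mean) * (v - mean);
    return std::sqrt(ss / (phi_cane_deg.size() - 1));
}
```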

3.3. Representation of the Sweeping

In [47], it was demonstrated that absolute orientation angles can reliably measure the amplitude of the sweeps performed with the long cane. As described by Blasch and LaGrow [48], the performance of O&M rehabilitation can be evaluated in terms of the "coverage" provided by the long cane, where full coverage includes, for instance, object preview: the capacity to identify objects in the path of travel with a correct sweep of the long cane. Since the travel techniques consist of oscillatory sweeping movements, extracting the motion of the yaw angle γ_cane makes it possible to graphically represent the movement of the long cane alongside the value of the sweeping amplitude, as shown in Figure 2.
This graphical representation is valuable for estimating the performance of the travel techniques while the user is in training, since it is a detailed, millisecond-by-millisecond characterization of the movement of the cane under dynamic conditions. It can help the rehabilitators evaluate the coverage being provided at that moment of the execution of the travel techniques, and it allows the user to self-correct any lack of coverage with immediate feedback, preventing accidents while correcting the amplitude and execution of the sweep during training. It can also help the user and the rehabilitator quantitatively determine which travel technique is most appropriate for the user. As shown in Figure 2, many differences are observed in the execution of the travel techniques by two subjects (A and B) with the same characteristics. This points to one last advantage of this tool: the possibility of registering the performance of each user during the entire rehabilitation process for future data analysis.
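A simple way to recover per-sweep amplitudes from the logged yaw signal is to collect its successive local extrema and take the differences between consecutive ones. The sketch below illustrates this idea; it assumes a pre-filtered yaw series and is not the authors' exact implementation.

```cpp
#include <cmath>
#include <vector>

// Per-sweep amplitudes (degrees) extracted from the filtered yaw angle of the cane.
// Each amplitude is the absolute difference between consecutive local extrema,
// i.e., the leftmost and rightmost points of one oscillation.
std::vector<double> sweepAmplitudes(const std::vector<double> &yaw_deg) {
    std::vector<double> extrema;
    for (size_t i = 1; i + 1 < yaw_deg.size(); ++i) {
        const bool isMax = yaw_deg[i] > yaw_deg[i - 1] && yaw_deg[i] >= yaw_deg[i + 1];
        const bool isMin = yaw_deg[i] < yaw_deg[i - 1] && yaw_deg[i] <= yaw_deg[i + 1];
        if (isMax || isMin) extrema.push_back(yaw_deg[i]);
    }
    std::vector<double> amplitudes;
    for (size_t i = 1; i < extrema.size(); ++i)
        amplitudes.push_back(std::fabs(extrema[i] - extrema[i - 1]));
    return amplitudes;
}
```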

3.4. Measurement of Gait Parameters

In terms of coverage, an appropriate gait is crucial for the development of O&M abilities [49]; therefore, during O&M training, the gait velocity and the stride length are constantly evaluated visually by the rehabilitator. With this tool, a method to estimate the step length, and from it the gait parameters (stride length and gait velocity), was also developed using absolute orientation angles. With the inertial sensor placed on the outer side of the leg, with the same local coordinates as the sensor placed on the cane, the pitch angle was used to calculate the step length during a walking cycle and during two of the travel techniques (see Figure 1). The step length was calculated by an algorithm that averages the estimated displacement of the leg during the gait cycle, following the peak-to-peak differences of the oscillatory movements of the pitch angle, where each peak represents the maximum value of each phase of the gait cycle. Therefore, knowing the leg length of each user and continuously tracking the values of θ_legmax and θ_legmin, the step length can be calculated using the following equation:
SL = 2 × sin((θ_legmax − θ_legmin)/2) × LL
where SL is the length of the step and LL is the length of the leg of the user. The algorithm detects whether a step is being executed using the θ_legmax and θ_legmin thresholds. Table 4 summarizes the measurements obtained in each experiment. Note that the reported SL value is the average of the three measurements obtained for each repetition, and the mean difference (MD), in centimeters, is computed from the three resulting averages.
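The equation above translates directly into code once θ_legmax and θ_legmin have been identified for a gait cycle (for example, with the same kind of extrema detection sketched for the sweep analysis). The helper below is an illustrative sketch of that final computation only.

```cpp
#include <cmath>

// Step length from the leg pitch excursion, following the chord relation
// SL = 2 * sin((theta_legmax - theta_legmin) / 2) * LL,
// where the angles are the peak leg-pitch values of one gait cycle (degrees)
// and LL is the user's leg length (same unit as the returned step length).
double stepLength(double theta_leg_max_deg, double theta_leg_min_deg, double leg_length_m) {
    const double half_swing_rad =
        (theta_leg_max_deg - theta_leg_min_deg) * M_PI / 180.0 / 2.0;
    return 2.0 * std::sin(half_swing_rad) * leg_length_m;
}
```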
The difference in centimeters between the actual value and the measured value is low in most cases (2.704 cm–12.370 cm), which indicates that the system is also reliable for estimating the step length. However, in order to calculate traveled distances from this value, it would be necessary to characterize the measurement error and thus correct for accumulated errors. This was not possible because the mean %Error varies widely from one subject to another, from 1.07% to 15.06%. The reason for this variation is unknown; it may be that the proposed method does not account for hip displacement in the gait cycle. Another possible reason is sensor decalibration; although the accuracy of the sensed absolute angles varies very little with calibration, the step length values are on the order of centimeters, so this could still contribute to the variation of the %Error. Overall, the step length measurement presented a mean accuracy of 94.62%.

4. Discussion

Kim et al. [20] presented a quantification of the characteristics of long cane usage. In the present work, similar parameters are evaluated in terms of the coverage of the travel techniques in relation to the rotation angles of the movement of the long cane. However, that study required optical tracking cameras in addition to an inertial sensor placed on the long cane. The presented tool allows the dynamic quantification of the characteristics of the movement of the long cane with a lower-cost, less complex device and with high precision. With the inertial sensors and the presented metrics, it will be possible to obtain outcome measures such as stride rate, gait velocity (meters per minute) and grip characteristics. Additionally, the provided coverage and long cane mechanics will allow interpretation of the sweeping characteristics, such as amplitude, frequency and the ability to detect obstacles in the path, as has been done previously either with more complex acquisition systems [15,50,51], by simulation [52] or, in some cases, manually [53].
The presented SL measurement can be considered for the estimation of gait parameters in O&M. Considering the limitations of the method, the most remarkable aspect of this tool is that it provides the measurement with the simplicity of a single inertial sensor placed on the leg, using only one absolute orientation angle. Most of the algorithms found in the literature consider, besides orientation angles, acceleration values for step detection and displacement calculation [54], i.e., at least one additional sensing modality, which brings other limitations and more complexity to the development of the algorithm [34]. This article presents a simple method for computing clinically relevant gait parameters with acceptable precision and accuracy, as in [55]. Nevertheless, more precision could be obtained by implementing a new method that considers the details of the swing phase of the gait cycle, or by applying artificial intelligence, for instance [55,56].
Currently, no motion analysis device able to evaluate the percentage of coverage provided by a travel technique according to the specific parameters of a user can be found in the literature. The RoboCane software [48], which was designed to calculate coverage from direct (manual) measurement of the specific variables of the user, was not widely adopted by the O&M research community in the last decade. The proposed tool, on the other hand, will allow O&M specialists to obtain a realistic estimate of the coverage that users provide for themselves under dynamic conditions, which will also help them be more objective in the evaluation of O&M training. Among O&M specialists and researchers, it is known that there is no standardization of training methods, which may vary according to the experience of each specialist; research in O&M can therefore also benefit from this tool. Being a low-cost, portable device, the presented tool permits these mobility parameters to be evaluated independently of the complexity of the environment to which the training is gradually subjected [57]. Moreover, further development is needed to obtain more quantitative characterizations of the O&M performance of the VIP; by adding one more sensor to the body, for instance, parameters of postural stability and balance could be obtained.

5. Conclusions

This article proposed a system able to provide an overview of the quantitative parameters of O&M for VIP that are currently analyzed visually by O&M specialists during rehabilitation, such as sweeping coverage and gait analysis. The proposed tool provides motion analysis of the long cane and the leg using low-cost inertial measurement unit (IMU) sensors. The system was tested in laboratory conditions with 10 blindfolded volunteers following three travel techniques trained by VIP during rehabilitation. The experimental results indicate that the system is reliable for measuring grip rotation, safety zone, sweeping amplitude and hand position using orientation angles, with an accuracy of around 97%. As future work, further development is required for the system to be implemented as a rehabilitation aid; in particular, a more precise step length method is needed, since the mean %Error varies between 1.07% and 15.06% among experiments. More O&M parameters could also be analyzed from the IMU's absolute angles, including postural stability and balance. Finally, the proposed system is a new, simple and low-cost technological aid that has the potential to improve the current practice of O&M.

Author Contributions

Conceptualization, K.M.R.L. and J.J.S.O.; methodology, K.M.R.L.; software, K.M.R.L., M.Á.C., S.S.L.; validation, K.M.R.L., M.J.-V. and J.J.S.O.; formal analysis, K.M.R.L. and J.J.S.O.; investigation, K.M.R.L.; resources, J.J.S.O.; data curation, K.M.R.L., M.Á.C., S.S.L.; writing—original draft preparation, K.M.R.L.; writing—review and editing, M.J.-V., M.Á.C., S.S.L. and J.J.S.O.; visualization, K.M.R.L., M.J.-V.; supervision, J.J.S.O. All authors have read and agreed to the published version of the manuscript.

Funding

This work was partially financed by the Ministerio de Ciencia, Innovación y Universidades, Ref.: PGC2018-097531-B-I00.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki, and approved by the Institutional Ethics Committee of Universidad Politécnica de Madrid (Ref. ID 2020000224, 30 October 2020).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data supporting reported results can be found in the following link: https://docs.google.com/spreadsheets/d/1-4NMnHSUrvO-NimwVJImX7G-dZibNwdqc_E2CO-AR00/edit?usp=sharing, accessed on 1 July 2021.

Acknowledgments

The author Karla Miriam Reyes acknowledges scholarship support from the Fundación Carolina FC and the Universidad Tecnológica Centroamericana UNITEC. The author Milagros Jaén-Vargas would like to thank the Secretaria Nacional de Ciencia y Tecnología SENACYT for her scholarship in the IFARHU-SENACYT program.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Brady, E.; Morris, M.R.; Zhong, Y.; White, S.; Bigham, J.P. Visual challenges in the everyday lives of blind people. Conf. Hum. Factors Comput. Syst. Proc. 2013, 2117–2126. [Google Scholar] [CrossRef]
  2. Real, S.; Araujo, A. Navigation systems for the blind and visually impaired: Past work, challenges, and open problems. Sensors 2019, 19, 3404. [Google Scholar] [CrossRef] [Green Version]
  3. Aciem, T.M.; Mazzotta, M.J.d. Personal and social autonomy of visually impaired people who were assisted by rehabilitation services. Rev. Bras. Oftalmol. 2013, 72, 261–267. [Google Scholar] [CrossRef] [Green Version]
  4. Kacorri, H.; Kitani, K.M.; Bigham, J.P.; Asakawa, C. People with visual impairment training personal object recognizers: Feasibility and challenges. Conf. Hum. Factors Comput. Syst. Proc. 2017, 5839–5849. [Google Scholar] [CrossRef]
  5. Stelmack, J. Quality of life of low-vision patients and outcomes of low-vision rehabilitation. Optom. Vis. Sci. 2001, 78, 335–342. [Google Scholar] [CrossRef]
  6. Lopera, G.; Aguirre, Á.; Parada, P.; Baquet, J. Manual Tecnico De Servicios De Rehabilitacion Integral Para Personas Ciegas O Con Baja Vision En America Latina Unión Latinoamericana De Ciegos -Ulac. 2010. Available online: http://www.ulacdigital.org/downloads/manual_de_rehabilitacion.pdf (accessed on 20 April 2021).
  7. American Foundation for the Blind. International Approaches to Rehabilitation Programs for Adults who are Blind or Visually Impaired: Delivery Models. In Services, Challenges, and Trends; American Foundation for the Blind: Arlington, VA, USA, 2016; Available online: https://www.foal.es/es/content/international-approaches-rehabilitation-programs-adults-who-are-blind-or-visually-impaired (accessed on 13 February 2021).
  8. National Rehabilitation Center for the disabled Japan. Rehabilitation Manual, tactile ground surface indicators for blind persons. 2003. Available online: http://www.rehab.go.jp/english/whoclbc/pdf/E13.pdf (accessed on 1 February 2021).
  9. Welsh, R.L.; Blasch, B.B. Manpower needs in orientation and mobility. New Outlook Blind 1974, 68, 433–443. [Google Scholar] [CrossRef]
  10. Blasch, B.; Gallimore, D. Back to the Future: Expanding the Profession—O&M for People with Disabilities. Int. J. Orientat. Mobil. 2013, 6, 21–33. [Google Scholar] [CrossRef] [Green Version]
  11. Zijlstra, G.A.R.; Ballemans, J.; Kempen, G.I.J.M. Orientation and mobility training for adults with low vision: A new standardized approach. Clin. Rehabil. 2013, 27, 3–18. [Google Scholar] [CrossRef] [Green Version]
  12. Szabo, J.; Panikkar, R.K. Bridging the gap between physical therapy and orientation and mobility in schools: Using a collaborative team approach for students with visual impairments. J. Vis. Impair. Blind. 2017, 111, 495–510. [Google Scholar] [CrossRef]
  13. Cuturi, L.F.; Aggius-Vella, E.; Campus, C.; Parmiggiani, A.; Gori, M. From science to technology: Orientation and mobility in blind children and adults. Neurosci. Biobehav. Rev. 2016, 71, 240–251. [Google Scholar] [CrossRef] [Green Version]
  14. Teskeredžić, A. The significance of orientation of blind pupils to their body in regard to mobility and space orientation. Human 2018, 8, 10–16. [Google Scholar] [CrossRef]
  15. Ramsey, V.K.; Blasch, B.B.; Kita, A. Effects of Mobility Training on Gait and Balance. J. Vis. Impair. Blind. 2003, 97, 720–726. [Google Scholar] [CrossRef]
  16. Scott, B.S. Opening Up the World: Early Childhood Orientation and Mobility Intervention as Perceived by Young Children Who are Blind, Their Parents, and Specialist Teachers. 2015. Available online: https://search.proquest.com/docview/1925329675?accountid=14548%0Ahttps://julac.hosted.exlibrisgroup.com/openurl/HKU_ALMA/SERVICES_PAGE??url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&genre=dissertations+%26+theses&sid=ProQ:Australian+Ed (accessed on 13 February 2021).
  17. Blasch, B.B.; la Grow, S.; Penrod, W. Environmental Rating Scale for Orientation and Mobility. Int. J. Orientat. Mobil. 2008, 1, 9–16. [Google Scholar] [CrossRef] [Green Version]
  18. Pissaloux, E.; Velázquez, R. Mobility of visually impaired people: Fundamentals and ICT assistive technologies. Mobil. Vis. Impair. People Fundam. ICT Assist. Technol. 2017, 1–652. [Google Scholar] [CrossRef]
  19. Organización Nacional de Ciegos Españoles. Discapacidad Visual y Autonomía Personal. Enfoque Práctico de la Rehabilitación. 2011. Available online: https://sid.usal.es/idocs/F8/FDO26230/discap_visual.pdf (accessed on 1 February 2021).
  20. Kim, Y.; Moncada-Torres, A.; Furrer, J.; Riesch, M.; Gassert, R. Quantification of long cane usage characteristics with the constant contact technique. Appl. Ergon. 2016, 55, 216–225. [Google Scholar] [CrossRef] [Green Version]
  21. Fan, K.; Lyu, C.; Liu, Y.; Zhou, W.; Jiang, X.; Li, P.; Chen, H. Hardware implementation of a virtual blind cane on FPGA. In Proceedings of the 2017 IEEE International Conference on Real-time Computing and Robotics (RCAR), Okinawa, Japan, 14–18 July 2017; pp. 344–348. [Google Scholar] [CrossRef]
  22. Dastider, A.; Basak, B.; Safayatullah, M.; Shahnaz, C.; Fattah, S.A. Cost efficient autonomous navigation system (e-cane) for visually impaired human beings. In Proceedings of the 2017 IEEE region 10 humanitarian technology conference (R10-HTC), Dhaka, Bangladesh, 21–23 December 2017; pp. 650–653. [Google Scholar] [CrossRef]
  23. Meshram, V.V.; Patil, K.; Meshram, V.A.; Shu, F.C. An Astute Assistive Device for Mobility and Object Recognition for Visually Impaired People. IEEE Trans. Human Mach. Syst. 2019, 49, 449–460. [Google Scholar] [CrossRef]
  24. Zhang, H.; Ye, C. A Visual Positioning System for Indoor Blind Navigation. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 9079–9085. [Google Scholar] [CrossRef]
  25. Bernieri, G.; Faramondi, L.; Pascucci, F. Augmenting white cane reliability using smart glove for visually impaired people. In Proceedings of the 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; pp. 8046–8049. [Google Scholar] [CrossRef]
  26. Islam, M.M.; Sadi, M.S.; Zamli, K.Z.; Ahmed, M.M. Developing Walking Assistants for Visually Impaired People: A Review. IEEE Sens. J. 2019, 19, 2814–2828. [Google Scholar] [CrossRef]
  27. Biswas, M.; Dhoom, T.; Pathan, R.K.; Chaiti, M.S. Shortest Path Based Trained Indoor Smart Jacket Navigation System for Visually Impaired Person. In Proceedings of the 2020 IEEE International Conference on Smart Internet of Things (SmartIoT), Beijing, China, 14–16 August 2020; pp. 228–235. [Google Scholar] [CrossRef]
  28. Ferrand, S.; Alouges, F.; Aussal, M. An Augmented Reality Audio Device Helping Blind People Navigation. In International Conference on Computers Helping People with Special Needs; Springer International Publishing: Cham, Switzerland, 2018; Volume 10897, Lecture Notes in Computer Science. [Google Scholar]
  29. Ferrand, S.; Alouges, F.; Aussal, M. An electronic travel aid device to help blind people playing sport. IEEE Instrum. Meas. Mag. 2020, 23, 14–21. [Google Scholar] [CrossRef]
  30. Jabbar, M.S.; Hussain, G.; Cho, J. Indoor Positioning System: Improved deep learning approach based on LSTM and multi-stage activity classification. In Proceedings of the 2020 IEEE International Conference on Consumer Electronics-Asia (ICCE-Asia), Seoul, Korea, 1–3 November 2020; pp. 15–18. [Google Scholar] [CrossRef]
  31. Guerreiro, J.; Sato, D.; Asakawa, S.; Dong, H.; Kitani, K.M.; Asakawa, C. Cabot: Designing and evaluating an autonomous navigation robot for blind people. In Proceedings of the 21st International ACM SIGACCESS Conference on Computers and Accessibility, Pittsburgh, PA, USA, 28–30 October 2019; pp. 68–82. [Google Scholar] [CrossRef] [Green Version]
  32. Paredes, N.E.G.; Cobo, A.; Martín, C.; Serrano, J.J. Methodology for building virtual reality mobile applications for blind people on advanced visits to unknown interior spaces. In Proceedings of the 14th International Conference on Mobile Learning, Lisbon, Portugal, 14–16 April 2018; pp. 3–14. [Google Scholar]
  33. Davies, T.C.; Burns, C.M.; Pinder, S.D. Mobility interfaces for the visually impaired: What’s missing? ACM Int. Conf. Proc. Ser. 2007, 254, 41–47. [Google Scholar] [CrossRef]
  34. Kandalan, R.N.; Namuduri, K. Techniques for Constructing Indoor Navigation Systems for the Visually Impaired: A Review. IEEE Trans. Hum. Mach. Syst. 2020, 50, 492–506. [Google Scholar] [CrossRef]
  35. Schloerb, D.W.; Lahav, O.; Desloge, J.G.; Srinivasan, M.A. BlindAid: Virtual environment system for self-reliant trip planning and orientation and mobility training. In Proceedings of the 2010 IEEE Haptics Symposium, Waltham, MA, USA, 25–26 March 2010; pp. 363–370. [Google Scholar] [CrossRef]
  36. Oliveira, J.D.; Campos, M.D.; Bordini, R.H.; Amory, A. Godonnie: A robot programming language to improve orientation and mobility skills in people who are visually impaired. In Proceedings of the 21st International ACM SIGACCESS Conference on Computers and Accessibility, Pittsburgh, PA, USA, 28–30 October 2019; pp. 679–681. [Google Scholar] [CrossRef]
  37. Gong, J.; Ding, Q.; Xu, P.; Zhang, Y.; Zhang, L.; Wang, Q. HeliCoach: An Adaptive Multimodal Orientation and Mobility Training System in a Drone-Based Simulated 3D Audio Space. Jisuanji Fuzhu Sheji Yu Tuxingxue Xuebao/J. Comput. Des. Comput. Graph. 2020, 32, 1129–1136. [Google Scholar] [CrossRef]
  38. Zheng, Y. Miniature Inertial Measurement Unit. Space Microsyst. Micro/Nano Satell. 2018, 233–293. [Google Scholar] [CrossRef]
  39. Kok, M.; Hol, J.D.; Schön, T.B. Using Inertial Sensors for Position and Orientation Estimation. Found. Trends Signal Process. 2017, 11, 1–153. [Google Scholar] [CrossRef] [Green Version]
  40. Filippeschi, A.; Schmitz, N.; Miezal, M.; Bleser, G.; Ruffaldi, E.; Stricker, D. Survey of motion tracking methods based on inertial sensors: A focus on upper limb human motion. Sensors 2017, 17, 1257. [Google Scholar] [CrossRef] [Green Version]
  41. Ligorio, G.; Zanotto, D.; Sabatini, A.M.; Agrawal, S.K. A novel functional calibration method for real-time elbow joint angles estimation with magnetic-inertial sensors. J. Biomech. 2017, 54, 106–110. [Google Scholar] [CrossRef]
  42. Roetenberg, D.; Luinge, H.; Slycke, P. Xsens MVN: Full 6DOF human motion tracking using miniature inertial sensors. Xsens Motion Technol. BV Tech. Rep. 2009. Available online: http://human.kyst.com.tw/upload/pdfs120702543998066.pdf (accessed on 2 February 2021).
  43. Zhu, R.; Zhou, Z. A real-time articulated human motion tracking using tri-axis inertial/magnetic sensors package. IEEE Trans. Neural Syst. Rehabil. Eng. 2004, 12, 295–302. [Google Scholar] [CrossRef]
  44. Shaeffer, D.K. MEMS inertial sensors: A tutorial overview. IEEE Commun. Mag. 2013, 51, 100–109. [Google Scholar] [CrossRef]
  45. Simdiankin, A.; Byshov, N.; Uspensky, I. A method of vehicle positioning using a non-satellite navigation system. Transp. Res. Procedia 2018, 36, 732–740. [Google Scholar] [CrossRef]
  46. Mahida, P.; Shahrestani, S.; Cheung, H. Deep learning-based positioning of visually impaired people in indoor environments. Sensors 2020, 20, 6238. [Google Scholar] [CrossRef]
  47. Leiva, K.M.R.; Lara, S.S.; Olmedo, J.J.S. Development of a motion measurement system of a white cane for Visually Impaired People rehabilitation. In Proceedings of the XXXVIII Congreso Anual de la Sociedad Española de Ingeniería Biomédica (CASEIB 2020), Virtual Congress, 25–27 November 2020. [Google Scholar]
  48. Blasch, B.B.; LaGrow, S.J.; de L’Aune, W.R. Three aspects of coverage provided by the long cane: Object, surface, and foot-placement preview. J. Vis. Impair. Blind. 1996, 90, 295–301. [Google Scholar] [CrossRef]
  49. Sankako, A.N.; Marília, P.; Lucareli, P.R.G.; de Carvalho, S.M.R.; Braccialli, L.M.P. Temporal spatial parameters analysis of the gait in children with vision impairment. Int. J. Orientat. Mobil. 2016, 8, 90–100. [Google Scholar] [CrossRef] [Green Version]
  50. Ramsey, V.K.; Blasch, B.B.; Kita, A.; Johnson, B.F. A biomechanical evaluation of visually impaired persons’ gait and long-cane mechanics. J. Rehabil. Res. Dev. 1999, 36, 323–332. [Google Scholar] [PubMed]
  51. Emerson, R.W.; Kim, D.S.; Naghshineh, K.; Myers, K.R. Biomechanics of Long Cane Use. J. Vis. Impair. Blind. 2019, 113, 235–247. [Google Scholar] [CrossRef]
  52. Blasch, B.B.; de L’aune, W.R.; Coombs, F.K. Computer Simulation of Cane Techniques Used by People with Visual Impairments for Accessibility Analysis. In Enabling Environments. Plenum Series in Rehabilitation and Health; Springer: Boston, MA, USA, 1999. [Google Scholar]
  53. LaGrow, S.J.; Blasch, B.B.; de L’Aune, W. Efficacy of the touch technique for surface and foot-placement preview. J. Vis. Impair. Blind. 1997, 91, 47–52. [Google Scholar] [CrossRef]
  54. Rampp, A.; Barth, J.; Schülein, S.; Gaßmann, K.G.; Klucken, J.; Eskofier, B.M. Inertial Sensor-Based Stride Parameter Calculation From Gait Sequences in Geriatric Patients. IEEE Trans. Biomed. Eng. 2015, 62, 1089–1097. [Google Scholar] [CrossRef] [PubMed]
  55. Flores, G.H.; Manduchi, R. WeAllWalk. ACM Trans. Access. Comput. 2018, 11, 1–28. [Google Scholar] [CrossRef]
  56. Xing, H.; Li, J.; Hou, B.; Zhang, Y.; Guo, M. Pedestrian Stride Length Estimation from IMU Measurements and ANN Based Algorithm. J. Sens. 2017, 2017, 6091261. [Google Scholar] [CrossRef] [Green Version]
  57. Finger, R.P.; Ayton, L.N.; Deverell, L.; O’Hare, F.; McSweeney, S.C.; Luu, C.D.; Fenwick, E.K.; Keeffe, J.E.; Guymer, R.H.; Bentley, S.A. Developing a very low vision orientation and mobility test battery (O&M-VLV). Optom. Vis. Sci. 2016, 93, 1127–1136. [Google Scholar] [PubMed]
Figure 1. Local coordinate system of the sensor placed on the cane (A) and local coordinate system of the sensor placed in the leg (B).
Figure 2. Sweeping preview (γ_cane) in the 20-step displacement for each travelling technique, S05 (A) and S06 (B).
Table 1. Description and representation of the top view of the travel techniques for the experimental evaluation of the developed system.
Constant Contact Technique (CCT): sweeping the long cane on the floor between two points, keeping constant contact with the ground, with an approximate amplitude of 1 m, in order to provide coverage of the walking path.
Two Points Touch Technique (2PT): sweeping the long cane on the floor between two points, lifting the cane off the ground and creating an arc of around 5 cm, with an approximate amplitude of 1 m.
Three Points Touch Technique (3PT): sweeping the long cane on the floor between three points (one on the left, one in the center and one on the right), lifting the cane off the ground at each point and creating an arc of around 5 cm.
Table 2. Extract of the measured Hand Height and Safety Zone and statistic characteristics.
Columns 2–5 refer to the Hand Height (HH); columns 6–9 to the Safety Zone (SZ).
Subject | RV (cm) | Mean (cm) | SD (cm) | %Error | RV (cm) | Mean (cm) | SD (cm) | %Error
S01 | 94.00 | 94.59 | 1.39 | 0.63 | 69.66 | 68.71 | 1.95 | 1.37
S02 | 89.00 | 90.17 | 3.17 | 1.31 | 76.00 | 74.34 | 3.77 | 2.18
S03 | 95.00 | 94.28 | 3.37 | 0.76 | 68.29 | 69.35 | 5.03 | 1.55
S04 | 86.00 | 82.64 | 2.60 | 3.90 | 79.32 | 82.68 | 2.59 | 4.24
S05 | 94.00 | 92.56 | 4.66 | 1.53 | 68.66 | 71.04 | 6.11 | 3.47
S06 | 86.00 | 82.03 | 3.75 | 4.62 | 79.32 | 83.17 | 3.54 | 4.86
S07 | 87.00 | 90.85 | 2.38 | 4.43 | 77.10 | 73.50 | 2.86 | 4.68
S08 | 88.00 | 84.13 | 2.51 | 4.40 | 78.23 | 80.21 | 4.35 | 2.54
S09 | 82.00 | 81.86 | 6.15 | 0.17 | 83.46 | 83.08 | 6.32 | 0.46
S10 | 83.00 | 81.15 | 2.50 | 2.23 | 82.46 | 84.19 | 2.39 | 2.09
Table 3. Standard deviation of the measured grip rotation for each subject in the acquisitions of the different travelling techniques.
Values are the SD in degrees of the cane roll angle, with the three repetitions of each technique reported as rep. 1 / rep. 2 / rep. 3.
Subject | CCT | 2PT | 3PT
S01 | 4.55 / 5.68 / 5.31 | 7.21 / 6.47 / 6.29 | 5.34 / 4.97 / 4.25
S02 | 3.44 / 2.76 / 6.14 | 3.05 / 4.12 / 4.93 | 2.82 / 3.84 / 3.09
S03 | 4.34 / 1.88 / 4.84 | 4.43 / 4.45 / 5.18 | 2.88 / 3.06 / 2.90
S04 | 3.72 / 1.93 / 2.66 | 3.12 / 2.11 / 2.12 | 2.13 / 2.03 / 2.22
S05 | 4.45 / 3.67 / 3.43 | 4.28 / 3.49 / 3.42 | 3.01 / 3.01 / 3.28
S06 | 3.10 / 3.47 / 2.97 | 3.92 / 3.06 / 2.84 | 3.92 / 3.06 / 2.84
S07 | 8.37 / 7.40 / 7.97 | 5.61 / 6.55 / 6.24 | 3.92 / 4.72 / 4.50
S08 | 5.68 / 6.80 / 6.18 | 3.63 / 4.16 / 5.10 | 3.50 / 2.91 / 3.11
S09 | 7.43 / 8.84 / 6.13 | 7.43 / 7.12 / 7.15 | 6.19 / 4.40 / 5.75
S10 | 2.17 / 6.15 / 4.35 | 2.88 / 2.98 / 3.10 | 2.47 / 2.41 / 2.21
Table 4. Step length measurement analysis.
Subject | W (SL / RSL, m) | CCT (SL / RSL, m) | 3PT (SL / RSL, m) | MD (cm)
S01 | 0.553 / 0.500 | 0.517 / 0.405 | 0.577 / 0.539 | 7.046
S02 | 0.553 / 0.551 | 0.590 / 0.601 | 0.577 / 0.583 | 2.704
S03 | 0.447 / 0.432 | 0.530 / 0.534 | 0.483 / 0.542 | 4.937
S04 | 0.603 / 0.592 | 0.563 / 0.603 | 0.580 / 0.550 | 3.333
S05 | 0.637 / 0.498 | 0.603 / 0.567 | 0.673 / 0.554 | 9.800
S06 | 0.560 / 0.546 | 0.490 / 0.443 | 0.500 / 0.478 | 2.769
S07 | 0.678 / 0.625 | 0.731 / 0.716 | 0.664 / 0.607 | 4.224
S08 | 0.567 / 0.502 | 0.581 / 0.536 | 0.572 / 0.540 | 12.370
S09 | 0.520 / 0.502 | 0.500 / 0.536 | 0.536 / 0.540 | 3.047
S10 | 0.538 / 0.505 | 0.534 / 0.566 | 0.515 / 0.425 | 5.152
W = walking.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
