Assessment Methods of Usability and Cognitive Workload of Rehabilitative Exoskeletons: A Systematic Review
Abstract
Featured Application
1. Introduction
1.1. Usability in the Use of Rehabilitative Exoskeletons
1.2. Cognitive Workload in the Use of Rehabilitative Exoskeletons
1.3. Aim of the Study
RQ1: “Which methods are deployed to assess the usability of rehabilitative exoskeletons?”; RQ1a: “How can usability assessment methods be categorized, in terms of the deployed type of measure?”; RQ1b: “What are the psychometric properties (i.e., validity and reliability) of the identified usability assessment methods?”.
RQ2: “Which methods are deployed to assess cognitive workload in the use of rehabilitative exoskeletons?”; RQ2a: “How can cognitive workload assessment methods be categorized, in terms of the deployed type of measure?”; RQ2b: “What are the psychometric properties (i.e., validity and reliability) of the identified cognitive workload assessment methods?”.
2. Materials and Methods
2.1. Keywords and Search Query
2.2. Bibliographical Databases
2.3. Inclusion and Exclusion Criteria
2.4. Framework for Critical Appraisal of Retrieved Assessment Methods
2.5. Data Extraction
3. Results
3.1. Assessment Methods of Usability in the Use of Rehabilitative Exoskeletons
3.1.1. Quantitative and Subjective Assessment Methods of Usability in the Use of Rehabilitative Exoskeletons
3.1.2. Qualitative and Subjective Assessment Methods of Usability in the Use of Rehabilitative Exoskeletons
3.1.3. Quantitative and Objective Assessment Methods of Usability in the Use of Rehabilitative Exoskeletons
3.1.4. Mixed Types of Assessment Methods of Usability in the Use of Rehabilitative Exoskeletons
3.2. Assessment Methods of Cognitive Workload in the Use of Rehabilitative Exoskeletons
3.2.1. Quantitative and Subjective Assessment Methods of Cognitive Workload in the Use of Rehabilitative Exoskeletons
3.2.2. Quantitative and Objective Assessment Methods of Cognitive Workload in the Use of Rehabilitative Exoskeletons
3.3. Critical Appraisal of Retrieved Assessment Methods
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A
Reference | Construct | Methods | Measure Type | Dimensions | Items | Participants | Psychometrics | Comments |
---|---|---|---|---|---|---|---
Almenara et al. (2017) [31] | Usability | A. Heuristic evaluation and questionnaire for SMEs; B. Questionnaire for end-users. Both administered at T0 and one-year follow-up | Qualitative, subjective for both | 1 for both | A. 21 (open-ended) at T0, 29 at follow-up; B. 13 (6-point Likert) at T0, 14 at follow-up | 37 (28 SMEs and 9 end-users) | N/A | Usability testing for developmental purposes |
Ambrosini et al. (2019) [24] | Usability | System Usability Scale (SUS) | Quantitative, subjective | 1 | 10 (5-point Likert) | 7 | N/A | N/A |
Amirabdollahian et al. (2014) [25] | Usability | A. Think aloud protocol (development); B. SUS (testing) | A. Qualitative, subjective; B. Quantitative, subjective | B. 1 | B. 10 (5-point Likert) | 23 (12 completed) | N/A | Methods may differ based on the phase of exoskeleton development |
Bortole et al. (2015) [39] | Usability | A. Unstructured observation; B. Ratings of ease of use | A. Qualitative, objective; B. Quantitative, subjective | 1 for both | B. 1 (10-point Likert) | 3 | N/A | Usability was conceptualized as ease of use |
Bryce et al. (2015) [44] | Usability | Framework of Usability for Robotic Exoskeletal Orthoses (FUREO) | Quantitative and qualitative, subjective and objective | 6 (applications, personal, device, external factors, activities, and health outcomes) | N/A | N/A | N/A | N/A |
Casey et al. (2019) [26] | Usability | SUS | Quantitative, subjective | 1 | 10 (5-point Likert) | 4 | Reliability. High internal consistency; Validity. Reference to validation study | Learnability was also considered as a dimension of usability |
Deeny et al. (2014) [46] | Cognitive workload | Event-related potential (ERP) amplitude analysis | Quantitative, objective | N/A | N/A | 20 | Validity. The study is a validation study in itself | N/A |
Dudley et al. (2019) [27] | Usability | A. SUS; B. Quebec User Evaluation of Satisfaction with assistive Technology (QUEST) | Quantitative, subjective for both | A. 1; B. 2 (device and services) | A. 10 (5-point Likert); B. 12 (5-point Likert) | 1 | N/A | Only 8 items of the device sub-scale of QUEST were used |
Eicher et al. (2019) [28] | Usability | A. SUS; B. AttrakDiff | Quantitative, subjective for both | A. 1; B. 3 (pragmatic quality, hedonic quality, and attractiveness) | A. 10 (5-point Likert); B. 28 (7-point Likert) | 13 | N/A | N/A |
Heinemann et al. (2020) [41] | Usability | Focus group | Qualitative, subjective | N/A | N/A | 35 | Reliability. A semi-structured facilitator guide was used to standardize facilitator’s conduct across focus groups | Focus group data were analyzed through thematic analysis |
Koumpuros (2016) [29] | Usability | A. SUS; B. QUEST | Quantitative, subjective | N/A | N/A | 31 articles | Validity. Reference to validation studies | N/A |
Lajeunesse et al. (2018) [42] | Usability | Semi-structured individual interview | Qualitative, subjective | 3 (capabilities, life habits, and expected technical characteristics) | N/A | 13 | Reliability. Semi-structured protocol and parallel coding of text data | QUEST was used as a reference |
Liu et al. (2017) [45] | Cognitive workload | A. Subjective Workload Assessment Technique (SWAT); B. NASA Task Load Index (NASA-TLX) | Quantitative, subjective for both | A. 3 (time load, mental effort load, and psychological stress load); B. 6 (mental demand, physical demand, temporal demand, performance, effort, and frustration) | A. 3; B. 6 (10-point Likert) | 6 | Validity. Good content and construct validity. Reliability. Cronbach’s α > 0.8 for both | N/A
López-Larraz et al. (2016) [37] | Usability | QUEST (modified) | Quantitative, subjective | 2 (device and services) | 9 (5-point Likert) | 4 | N/A | Only the device sub-scale of QUEST was used |
Nam et al. (2019) [32] | Usability | Ad-hoc questionnaire | Quantitative, subjective | 1 | 10 (7-point Likert) | 20 (only 2 patients) | N/A | Usability assessment for developmental purposes |
Ozkul and Barkana (2013) [35] | Usability | A. Perceived Rate of Exertion Scale (PRES); B. Visual Analog Scale (VAS); C. Ad-hoc questionnaire | Quantitative, subjective for all | 1 each | A. 1 (10-point Likert); B. 1 (10-point Likert); C. 10 (5-point Likert) | 9 | N/A | Comfort, performance, and safety were also part of the assessment |
Resquín et al. (2017) [38] | Usability | A. QUEST; B. Self-Assessment Manikin (SAM) | Quantitative, subjective for both | A. 2 (device and services); B. 3 (pleasure, arousal, and dominance) | A. 7 (5-point Likert); B. 3 (9-point Likert) | 3 | N/A | Only the device sub-scale of QUEST was used |
Schneider et al. (2016) [33] | Usability | A. Ad-hoc questionnaire; B. Visual analysis of facial expressions | A. Quantitative, subjective; B. Quantitative, objective | 1 each | A. 3 (5-point Likert) | 8 | N/A | Usability was conceptualized as enjoyment, difficulty, and comfort |
Schrade et al. (2018) [40] | Usability | Field behavioral observation | Qualitative, subjective | N/A | N/A | 2 | N/A | Usability was conceptualized as rehabilitative progress and effectiveness |
Stellin et al. (2018) [34] | Usability | Ad-hoc questionnaire | Quantitative, subjective | 1 | 10 (5-point Likert) | 16 (of which 6 patients) | N/A | Usability was conceptualized as satisfaction |
Tsai et al. (2019) [30] | Usability | A. SUS; B. Ad-hoc questionnaire | Quantitative, subjective for both | 1 each | A. 10 (5-point Likert); B. 5 (5-point Likert) | 22 (of which 18 patients) | Reliability. A. Cronbach’s α = 0.64; B. α = 0.85 | N/A |
Vitiello et al. (2016) [43] | Usability | Experimental characterization | Quantitative, objective | N/A | N/A | 3 | N/A | Torque and speed were used |
Yoo et al. (2019) [36] | Usability | A. Toronto Rehabilitation Institute Hand Function Test (TRI-HFT); B. Korean-Quebec User Evaluation of Satisfaction with assistive Technology 2.0 (K-QUEST 2.0) | A. Quantitative, objective; B. Quantitative, subjective | A. 2 (ability to manipulate and grasp strength); B. 2 (device and services) | A. 19; B. 12 (5-point Likert) | 10 | A. Inter-rater reliability = 1.0 (p < 0.01) | Only the device sub-scale of QUEST was used |
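Many of the studies in the table above rely on the System Usability Scale (SUS), a 10-item questionnaire with 5-point Likert responses. For reference, Brooke’s standard scoring procedure converts the raw responses to a 0–100 score: odd-numbered (positively worded) items contribute (response − 1), even-numbered (negatively worded) items contribute (5 − response), and the sum is multiplied by 2.5. A minimal sketch of this scoring rule (the function name is illustrative, not from any of the reviewed studies):

```python
def sus_score(responses):
    """Compute the SUS score (0-100) from 10 Likert responses (each 1-5).

    Odd-numbered items (indices 0, 2, ...) are positively worded and
    contribute (response - 1); even-numbered items (indices 1, 3, ...)
    are negatively worded and contribute (5 - response).
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses):
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5  # scale the 0-40 sum to 0-100
```

For example, the most favorable response pattern (5 on every odd item, 1 on every even item) yields the maximum score of 100, while a uniformly neutral pattern (all 3s) yields 50.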
References
1. Gams, A.; Petrič, T.; Debevec, T.; Babič, J. Effects of robotic knee exoskeleton on human energy expenditure. IEEE Trans. Biomed. Eng. 2013, 60, 1636–1644.
2. Wall, A.; Borg, J.; Palmcrantz, S. Clinical application of the Hybrid Assistive Limb (HAL) for gait training—A systematic review. Front. Syst. Neurosci. 2015, 9, 1–10.
3. Lo, H.S.; Xie, S.Q. Exoskeleton robots for upper-limb rehabilitation: State of the art and future prospects. Med. Eng. Phys. 2012, 34, 261–268.
4. Chen, G.; Chan, C.K.; Guo, Z.; Yu, H. A review of lower extremity assistive robotic exoskeletons in rehabilitation therapy. Crit. Rev. Biomed. Eng. 2013, 41, 343–363.
5. Kim, T. Factors influencing usability of rehabilitation robotic devices for lower limbs. Sustainability 2020, 12, 598.
6. Stirling, L.; Siu, H.C.; Jones, E.; Duda, K. Human factors considerations for enabling functional use of exosystems in operational environments. IEEE Syst. J. 2018, 13, 1072–1083.
7. International Organization for Standardization. Ergonomic Requirements for Office Work with Visual Display Terminals. Part 11: Guidance on Usability; ISO Standard No. 9241-11:1998; International Organization for Standardization: Geneva, Switzerland, 1998; Available online: https://www.iso.org/standard/16883.html (accessed on 1 August 2021).
8. Earthy, J.; Jones, B.; Bevan, N. ISO standards for user-centered design and the specification of usability. In Usability in Government Systems; Buie, E., Murray, D., Eds.; Morgan Kaufmann: Burlington, MA, USA, 2012; Volume 1, pp. 267–283.
9. Kaasinen, E.; Roto, V.; Hakulinen, J.; Heimonen, T.; Jokinen, J.P.; Karvonen, H.; Tokkonen, H. Defining user experience goals to guide the design of industrial systems. Behav. Inf. Technol. 2015, 34, 976–991.
10. Hill, D.; Holloway, C.S.; Ramirez, D.Z.M.; Smitham, P.; Pappas, Y. What are user perspectives of exoskeleton technology? A literature review. Int. J. Technol. Assess. Health Care 2017, 33, 160–167.
11. Dumas, J.S.; Salzman, M.C. Usability assessment methods. Rev. Hum. Factors Ergon. 2006, 2, 109–140.
12. Helander, M. Design of human-computer interaction. In A Guide to Human Factors and Ergonomics, 2nd ed.; Helander, M., Ed.; Taylor & Francis: Boca Raton, FL, USA, 2006; Volume 1, pp. 119–143.
13. Furniss, D.; Blandford, A.; Curzon, P. Usability evaluation methods in practice: Understanding the context in which they are embedded. In Proceedings of the 14th European Conference on Cognitive Ergonomics: Invent! Explore!, 28–31 August 2007; Brinkman, W.P., Ham, D.H., Wong, B.L.W., Eds.; Association for Computing Machinery: New York, NY, USA, 2007; pp. 253–256.
14. Pietrantoni, L.; Rainieri, G.; Fraboni, F.; Tušl, M.; Giusino, D.; De Angelis, M.; Tria, A. Usability of exosystems: A review. In CHItaly19: Adjunct Proceedings of the 13th Biannual Conference of the Italian SIGCHI Chapter: Designing the Next Interaction, Padua, Italy, 23–25 September 2019; Gamberini, L., Pittarello, F., Spagnolli, A., Eds.; Association for Computing Machinery: New York, NY, USA, 2019; Available online: https://www.researchgate.net/publication/336044737_Usability_of_exosystems_A_review (accessed on 23 June 2021).
15. Young, N.; Stanton, N. Mental workload: Theory, measurement, and application. In International Encyclopedia of Ergonomics and Human Factors, 2nd ed.; Karwowski, W., Ed.; Taylor & Francis: Boca Raton, FL, USA, 2006; Volume 3, pp. 866–869.
16. Stanton, N.A.; Salmon, P.M.; Rafferty, L.A.; Walker, G.H.; Baber, C.; Jenkins, D.P. Mental workload assessment methods. In Human Factors Methods: A Practical Guide for Engineering and Design, 2nd ed.; Stanton, N.A., Salmon, P.M., Rafferty, L.A., Walker, G.H., Baber, C., Jenkins, D.P., Eds.; Taylor & Francis: Boca Raton, FL, USA, 2017; pp. 301–364.
17. Young, M.S.; Brookhuis, K.A.; Wickens, C.D.; Hancock, P.A. State of science: Mental workload in ergonomics. Ergonomics 2015, 58, 1–17.
18. Yakobi, O. Determinants of association and dissociation between subjective and objective measures of workload. Proc. Hum. Factors Ergon. 2018, 62, 222–226.
19. Souza, A.C.D.; Alexandre, N.M.C.; Guirardello, E.D.B. Psychometric properties in instruments evaluation of reliability and validity. Epidemiol. Serv. Saúde 2017, 26, 649–659.
20. Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. Ann. Intern. Med. 2009, 151, 264–269.
21. Mingers, J.; Leydesdorff, L. A review of theory and practice in scientometrics. Eur. J. Oper. Res. 2015, 246, 1–19.
22. Francis, D.O.; McPheeters, M.L.; Noud, M.; Penson, D.F.; Feurer, I.D. Checklist to operationalize measurement characteristics of patient-reported outcome measures. Syst. Rev. 2016, 5, 1–11.
23. UserFocus. Usability test data. Available online: https://www.userfocus.co.uk/articles/datathink.html (accessed on 25 June 2021).
24. Ambrosini, E.; Zajc, J.; Ferrante, S.; Ferrigno, G.; Gasperina, S.D.; Bulgheroni, M.; Baccinellli, W.; Schauer, T.; Wiesener, C.; Russold, M.; et al. A hybrid robotic system for arm training of stroke survivors: Concept and first evaluation. IEEE Trans. Biomed. Eng. 2019, 66, 3290–3300.
25. Amirabdollahian, F.; Ates, S.; Basteris, A.; Cesario, A.; Buurke, J.; Hermens, H.; Hofs, D.; Johansson, E.; Mountain, G.; Nasr, N.; et al. Design, development and deployment of a hand/wrist exoskeleton for home-based rehabilitation after stroke—SCRIPT project. Robotica 2014, 32, 1331–1346.
26. Casey, A.; Azhar, H.; Grzes, M.; Sakel, M. BCI controlled robotic arm as assistance to the rehabilitation of neurologically disabled patients. Disabil. Rehabil. Assist. Technol. 2021, 16, 525–537.
27. Dudley, D.R.; Knarr, B.A.; Siu, K.C.; Peck, J.; Ricks, B.; Zuniga, J.M. Testing of a 3D printed hand exoskeleton for an individual with stroke: A case study. Disabil. Rehabil. Assist. Technol. 2021, 16, 209–213.
28. Eicher, C.; Haesner, M.; Spranger, M.; Kuzmicheva, O.; Gräser, A.; Steinhagen-Thiessen, E. Usability and acceptability by a younger and older user group regarding a mobile robot-supported gait rehabilitation system. Assist. Technol. 2019, 31, 25–33.
29. Koumpouros, Y. A systematic review on existing measures for the subjective assessment of rehabilitation and assistive robot devices. J. Healthc. Eng. 2016, 2016, 1–10.
30. Tsai, Y.; Huang, J.; Pu, S.; Chen, H.; Hsu, S.; Chang, J.; Pei, Y. Usability assessment of a cable-driven exoskeletal robot for hand rehabilitation. Front. Neurorobot. 2019, 13, 1–5.
31. Almenara, M.; Cempini, M.; Gómez, C.; Cortese, M.; Martín, C.; Medina, J.; Vitiello, N.; Opisso, E. Usability test of a hand exoskeleton for activities of daily living: An example of user-centered design. Disabil. Rehabil. Assist. Technol. 2017, 12, 84–96.
32. Nam, H.S.; Hong, N.; Cho, M.; Lee, C.; Seo, H.G.; Kim, S. Vision-assisted interactive human-in-the-loop distal upper limb rehabilitation robot and its clinical usability test. Appl. Sci. 2019, 9, 3106.
33. Schneider, J.C.; Ozsecen, M.Y.; Muraoka, N.K.; Mancinelli, C.; Della Croce, U.; Ryan, C.M.; Bonato, P. Feasibility of an exoskeleton-based interactive video game system for upper extremity burn contractures. PM&R 2016, 8, 445–452.
34. Stellin, G.; Sale, P.; Masiero, S.; Becchi, F.; Sieklicki, W. Development and test of fex, a fingers extending exoskeleton for rehabilitation and regaining mobility. Int. J. Mech. Control 2018, 19, 3–15.
35. Ozkul, F.; Barkana, D.E. Upper-extremity rehabilitation robot RehabRoby: Methodology, design, usability and validation. Int. J. Adv. Robot. Syst. 2013, 10, 1–13.
36. Yoo, H.J.; Lee, S.; Kim, J.; Park, C.; Lee, B. Development of 3D-printed myoelectric hand orthosis for patients with spinal cord injury. J. Neuroeng. Rehabil. 2019, 16, 1–14.
37. López-Larraz, E.; Trincado-Alonso, F.; Rajasekaran, V.; Pérez-Nombela, S.; Del-Ama, A.; Aranda, J.; Minguez, J.; Gil-Agudo, A.; Montesano, L. Control of an ambulatory exoskeleton with a brain-machine interface for spinal cord injury gait rehabilitation. Front. Neurosci. 2016, 10, 1–15.
38. Resquín, F.; Gonzalez-Vargas, J.; Ibáñez, J.; Brunetti, F.; Dimbwadyo, I.; Carrasco, L.; Alves, S.; Gonzalez-Alted, C.; Gomez-Blanco, A.; Pons, J.L. Adaptive hybrid robotic system for rehabilitation of reaching movement after a brain injury: A usability study. J. Neuroeng. Rehabil. 2017, 14, 1–15.
39. Bortole, M.; Venkatakrishnan, A.; Zhu, F.; Moreno, J.C.; Francisco, G.E.; Pons, J.L.; Contreras-Vidal, J. The H2 robotic exoskeleton for gait rehabilitation after stroke: Early findings from a clinical study. J. Neuroeng. Rehabil. 2015, 12, 1–14.
40. Schrade, S.O.; Dätwyler, K.; Stücheli, M.; Studer, K.; Türk, D.; Meboldt, M.; Gassert, R.; Lambercy, O. Development of VariLeg, an exoskeleton with variable stiffness actuation: First results and user evaluation from the CYBATHLON 2016. J. Neuroeng. Rehabil. 2018, 15, 1–18.
41. Heinemann, A.W.; Kinnett-Hopkins, D.; Mummidisetty, C.K.; Bond, R.A.; Ehrlich-Jones, L.; Furbish, C.; Field-Fote, E.; Jayaraman, A. Appraisals of robotic locomotor exoskeletons for gait: Focus group insights from potential users with spinal cord injuries. Disabil. Rehabil. Assist. Technol. 2020, 15, 1–11.
42. Lajeunesse, V.; Routhier, F.; Vincent, C.; Lettre, J.; Michaud, F. Perspectives of individuals with incomplete spinal cord injury concerning the usability of lower limb exoskeletons: An exploratory study. Technol. Disabil. 2018, 30, 63–76.
43. Vitiello, N.; Cempini, M.; Crea, S.; Giovacchini, F.; Cortese, M.; Moisè, M.; Posteraro, F.; Carrozza, M.C. Functional design of a powered elbow orthosis toward its clinical employment. IEEE/ASME Trans. Mechatron. 2016, 21, 1880–1891.
44. Bryce, T.N.; Dijkers, M.P.; Kozlowski, A.J. Framework for assessment of the usability of lower-extremity robotic exoskeletal orthoses. Am. J. Phys. Med. Rehabil. 2015, 94, 1000–1014.
45. Liu, D.; Chen, W.; Pei, Z.; Wang, J. A brain-controlled lower-limb exoskeleton for human gait training. Rev. Sci. Instrum. 2017, 88, 1–13.
46. Deeny, S.; Chicoine, C.; Hargrove, L.; Parrish, T.; Jayaraman, A. A simple ERP method for quantitative analysis of cognitive workload in myoelectric prosthesis control and human-machine interaction. PLoS ONE 2014, 9, e112091.
47. Torricelli, D.; Gonzalez-Vargas, J.; Veneman, J.; Mombaur, K.; Tsagarakis, N.; del Ama, A.; Gil-Agudo, A.; Moreno, J.; Pons, L.J. Benchmarking bipedal locomotion: A unified scheme for humanoids, wearable robots, and humans. IEEE Rob. Autom. Mag. 2015, 22, 103–115.
A. Conceptual Model
1. Has the construct of usability/cognitive workload been specifically defined?
2. Has the intended respondent population been described?
3. Does the instrument’s conceptual model address whether a single construct/scale or multiple subscales are expected?

B. Content Validity
4. Is there evidence that members of the targeted respondent population were involved in the instrument’s development?
5. Is there evidence that content experts were involved in the instrument’s development?
6. Is there a description of the methodology by which items/questions were determined (e.g., focus groups and interviews)?

C. Reliability
7. Is there evidence that the instrument’s reliability was tested (e.g., test–retest and internal consistency)?
8. Are the reported indices of reliability adequate (e.g., ideal: r ≥ 0.80; adequate: r ≥ 0.70) or otherwise justified?

D. Construct Validity
9. Is there reported quantitative justification that single or multiple subscales exist in the instrument (e.g., factor analysis or item response theory)?
10. Is the instrument intended to measure change over time? If YES, is there evidence of both test–retest reliability AND responsiveness to change? Otherwise, award 1 point if there is an explicit statement that the instrument is NOT intended to measure change over time.
11. Are there findings supporting expected associations with existing instruments or with other relevant data?

E. Scoring and Interpretation
12. Is there documentation regarding how to score the instrument (e.g., a scoring method such as summing or an algorithm)?
13. Has a plan for managing and/or interpreting missing responses been described (i.e., how to score incomplete surveys)?
14. Is information provided about how to interpret the instrument’s scores (e.g., scaling/anchors and what high and low scores represent) and/or normative data?

F. Respondent Burden and Presentation
15. Is the time to complete reported and reasonable? OR, if it is NOT reported, is the number of questions appropriate for the intended application?
16. Is there a description of the literacy level required by the instrument?
17. Is the entire instrument available for public viewing (e.g., published with the citation or information provided about how to access a copy)?
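Checklist items 7–8 concern internal-consistency reliability, which several reviewed studies report as Cronbach’s α (e.g., α = 0.64 and 0.85 for the two instruments in Tsai et al.). For reference, α for k items is k/(k−1) × (1 − Σσ²ᵢ/σ²ₜ), where σ²ᵢ are the per-item score variances and σ²ₜ is the variance of respondents’ total scores. A minimal sketch of this standard formula (the function name is illustrative):

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a list of per-item response lists.

    item_scores[i][j] is respondent j's score on item i; all items must
    have the same number of respondents. Uses population variances
    consistently for both the per-item and total-score terms.
    """
    k = len(item_scores)
    if k < 2:
        raise ValueError("alpha requires at least two items")
    # Sum of the individual item variances
    sum_item_vars = sum(pvariance(item) for item in item_scores)
    # Variance of each respondent's total score across all items
    totals = [sum(resp) for resp in zip(*item_scores)]
    total_var = pvariance(totals)
    return k / (k - 1) * (1 - sum_item_vars / total_var)
```

As a sanity check, k perfectly redundant items (identical response patterns) yield α = 1.0, the theoretical maximum; values at or above the checklist’s 0.70–0.80 thresholds indicate adequate internal consistency.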
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
La Bara, L.M.A.; Meloni, L.; Giusino, D.; Pietrantoni, L. Assessment Methods of Usability and Cognitive Workload of Rehabilitative Exoskeletons: A Systematic Review. Appl. Sci. 2021, 11, 7146. https://doi.org/10.3390/app11157146