Towards a Refined Heuristic Evaluation: Incorporating Hierarchical Analysis for Weighted Usability Assessment
Abstract
1. Introduction
2. Background
- Heuristic set: The heuristic set originally contained nine heuristics extracted from the work by Molich and Nielsen [13]. It provides a way to categorize usability problems, but it does not indicate how to solve them.
- Number of evaluators: The authors found that the number of evaluators was a critical factor: three to five evaluators is typically optimal, and using more than ten is rarely worthwhile.
- Evaluator biases: Evaluators' answers depend on their expertise, previous experience, and personal judgment, which introduces potential limitations and biases into the results.
3. Materials and Methods
3.1. Heuristic Instrument
- Visibility and system state (five questions): Focuses on ensuring that users are always aware of what the system is doing and their position within it.
- Connection with the real world (four questions): Prioritizes using familiar language, metaphors, and concepts, aligning the system with real-world analogs.
- User control and freedom (three questions): Emphasizes allowing users to navigate and undo actions easily.
- Consistency and standards (six questions): Ensures uniformity in the interface, with consistent actions and standards across different elements.
- Recognition rather than memory (five questions): Aims to design systems that minimize the need for remembering information, enhancing user learning and anticipation.
- Flexibility and efficiency (six questions): Focuses on providing shortcuts and efficient paths for experienced users while remaining accessible to novices.
- Help users recognize, diagnose, and recover from errors (four questions): Focuses on designing systems that provide clear, understandable error messages, aiding users in recognizing and rectifying issues efficiently.
- Error prevention (three questions): Involves designing systems to prevent errors before they occur.
- Aesthetic and minimalist design (four questions): Encourages visually appealing designs and minimal unnecessary elements.
- Help and documentation (five questions): Stresses the importance of accessible, clear help and documentation for users.
- Save the state and protect the work (three questions): Addresses the need to save user progress and protect against data loss.
- Color and readability (four questions): Ensures that text is readable with appropriate color contrast and size.
- Autonomy (three questions): Allows users to make personal choices and customizations in the system.
- Defaults (three questions): Focuses on providing sensible default settings while allowing users to revert to these defaults when needed.
- Latency reduction (two questions): Aims to minimize delays and provide feedback during processes that require time.
3.2. Analytical Hierarchical Process
- Input: Pairwise comparison matrices for criteria and alternatives.
- Output: Priority vector (weights) for criteria and alternatives.
- For each pairwise comparison matrix:
  - Normalize the matrix by column.
  - Compute the principal eigenvector to determine weights.
  - Calculate the consistency ratio (CR).
  - If CR is less than 0.1, accept the weights.
  - Otherwise, re-evaluate the comparisons.
- Report the weights for final decision making.
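The steps above can be sketched in plain Python. This is a minimal illustration (not the paper's code) that uses the common column-normalization approximation of the principal eigenvector and Saaty's published random-index values for the consistency check:

```python
# Minimal AHP sketch: approximate the principal eigenvector by
# column-normalizing and row-averaging, then check consistency.
# Illustrative only; function and variable names are our own.

# Saaty's random consistency index (RI) by matrix size.
RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24,
                7: 1.32, 8: 1.41, 9: 1.45, 10: 1.49, 11: 1.51,
                12: 1.54, 13: 1.56, 14: 1.57, 15: 1.59}

def ahp_weights(matrix):
    """Return (weights, consistency_ratio) for a pairwise comparison matrix."""
    n = len(matrix)
    col_sums = [sum(row[j] for row in matrix) for j in range(n)]
    # Normalize by column, then average each row: eigenvector approximation.
    weights = [sum(matrix[i][j] / col_sums[j] for j in range(n)) / n
               for i in range(n)]
    # Estimate the principal eigenvalue lambda_max from (A w) / w.
    aw = [sum(matrix[i][j] * weights[j] for j in range(n)) for i in range(n)]
    lambda_max = sum(aw[i] / weights[i] for i in range(n)) / n
    if n <= 2:
        return weights, 0.0  # 1x1 and 2x2 matrices are always consistent
    ci = (lambda_max - n) / (n - 1)   # consistency index
    cr = ci / RANDOM_INDEX[n]         # consistency ratio
    return weights, cr

# A perfectly consistent 3x3 example: weights in ratio 4:2:1, CR near 0.
m = [[1, 2, 4], [0.5, 1, 2], [0.25, 0.5, 1]]
w, cr = ahp_weights(m)
```

If CR turns out to be 0.1 or above, the comparisons are revisited, as in the procedure above.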
3.3. Simulation
3.4. Data Acquisition
4. Results
4.1. Algorithm for Pairwise Comparison
Algorithm Performance
4.2. Weighted Heuristic Application
- H1 Comparisons
  - H1 vs. H5: H1 and H5 are considered to have equal importance.
  - H1 vs. H2, H4, H9, H12, H13: H1 is moderately more important than H2, H4, H9, H12, and H13.
  - H1 vs. H3, H6, H7, H8: H1 is strongly more important than H3, H6, H7, and H8.
  - H1 vs. H10, H11, H14, H15: H1 is very strongly more important than H10, H11, H14, and H15.
- H2 Comparisons
  - H2 vs. H5: H2 is moderately less important than H5.
  - H2 vs. H9, H12, H13: H2 has equal importance compared to H9, H12, and H13.
  - H2 vs. H3, H4, H6, H7, H8: H2 is moderately more important than H3, H4, H6, H7, and H8.
  - H2 vs. H10, H11, H14, H15: H2 is strongly more important than H10, H11, H14, and H15.
- H3 Comparisons
  - H3 vs. H5: H3 is very strongly less important than H5.
  - H3 vs. H4, H9, H12, H13: H3 has equal importance compared to H4, H9, H12, and H13.
  - H3 vs. H6, H7, H8: H3 is moderately more important than H6, H7, and H8.
  - H3 vs. H10, H11, H14, H15: H3 is strongly more important than H10, H11, H14, and H15.
- H4 Comparisons
  - H4 vs. H5: H4 is moderately less important than H5.
  - H4 vs. H6, H7, H8: H4 has equal importance compared to H6, H7, and H8.
  - H4 vs. H10, H11, H14, H15: H4 is moderately more important than H10, H11, H14, and H15.
- H5 Comparisons
  - H5 vs. H9, H12, H13: H5 is moderately more important than H9, H12, and H13.
  - H5 vs. H6, H7, H8: H5 is strongly more important than H6, H7, and H8.
  - H5 vs. H10, H11, H14, H15: H5 is very strongly more important than H10, H11, H14, and H15.
- H6 Comparisons
  - H6 vs. H9, H12, H13: H6 is moderately less important than H9, H12, and H13.
  - H6 vs. H10, H11, H14, H15: H6 is strongly more important than H10, H11, H14, and H15.
- H7 Comparisons
  - H7 vs. H9, H12, H13: H7 is moderately less important than H9, H12, and H13.
  - H7 vs. H10, H11, H14, H15: H7 is strongly more important than H10, H11, H14, and H15.
- H8 Comparisons
  - H8 vs. H9, H12, H13: H8 is moderately less important than H9, H12, and H13.
  - H8 vs. H10, H11, H14, H15: H8 is strongly more important than H10, H11, H14, and H15.
- H9 Comparisons
  - H9 vs. H10, H11, H14, H15: H9 is strongly more important than H10, H11, H14, and H15.
- H10 Comparisons
  - H10 vs. H11, H14, H15: H10 has equal importance compared to H11, H14, and H15.
  - H10 vs. H12, H13: H10 is strongly less important than H12 and H13.
- H11 Comparisons
  - H11 vs. H12, H13: H11 is strongly less important than H12 and H13.
- H12 Comparisons
  - H12 vs. H14, H15: H12 is strongly more important than H14 and H15.
- H13 Comparisons
  - H13 vs. H14, H15: H13 is strongly more important than H14 and H15.
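Verbal judgments like those above map onto Saaty's 1–9 intensity scale (equal = 1, moderate = 3, strong = 5, very strong = 7), with the reciprocal entered for the mirrored comparison. A minimal sketch (the `SCALE` mapping and variable names are ours, not the paper's code) that builds the H1 rows and columns of a 15-by-15 matrix from the H1 judgments listed above:

```python
# Map verbal comparison intensities to Saaty scale values (illustrative).
SCALE = {"equal": 1, "moderate": 3, "strong": 5, "very strong": 7}

# H1's judgments as listed above: heuristic index -> intensity of H1 over it.
H1_JUDGMENTS = {
    5: "equal",
    2: "moderate", 4: "moderate", 9: "moderate", 12: "moderate", 13: "moderate",
    3: "strong", 6: "strong", 7: "strong", 8: "strong",
    10: "very strong", 11: "very strong", 14: "very strong", 15: "very strong",
}

n = 15
A = [[1.0] * n for _ in range(n)]  # diagonal entries stay 1 (self-comparison)
for other, verbal in H1_JUDGMENTS.items():
    v = float(SCALE[verbal])
    A[0][other - 1] = v            # H1 vs. H_other
    A[other - 1][0] = 1.0 / v      # reciprocal: H_other vs. H1
```

The remaining rows are filled the same way from the H2 through H13 judgments; every pairwise matrix built like this is reciprocal by construction.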
5. Discussion
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
AHP | Analytic Hierarchy Process
CI | Consistency Index
CR | Consistency Ratio
DOC | Document (Microsoft Word Format)
HCI | Human–Computer Interaction
ISO | International Organization for Standardization
PDF | Portable Document Format
PSSUQ | Post-Study System Usability Questionnaire
QUIS | Questionnaire for User Interaction Satisfaction
SUS | System Usability Scale
SUMI | Software Usability Measurement Inventory
UI | User Interface
UP | Usability Percentage
UTL | Usability Testing Leader
XLS | Excel Spreadsheet (Microsoft Excel Format)
Appendix A. Tailored Algorithm for AHP
References
- Vlachogianni, P.; Tselios, N. Perceived usability evaluation of educational technology using the System Usability Scale (SUS): A systematic review. J. Res. Technol. Educ. 2022, 54, 392–409. [Google Scholar] [CrossRef]
- Giacomin, J. What is human centred design? Des. J. 2014, 17, 606–623. [Google Scholar] [CrossRef]
- Holeman, I.; Kane, D. Human-centered design for global health equity. Inf. Technol. Dev. 2020, 26, 477–505. [Google Scholar] [CrossRef] [PubMed]
- Peruzzini, M.; Carassai, S.; Pellicciari, M. The Benefits of Human-centred Design in Industrial Practices: Re-design of Workstations in Pipe Industry. Procedia Manuf. 2017, 11, 1247–1254. [Google Scholar] [CrossRef]
- Ng, J.; Arness, D.; Gronowski, A.; Qu, Z.; Lau, C.W.; Catchpoole, D.; Nguyen, Q.V. Exocentric and Egocentric Views for Biomedical Data Analytics in Virtual Environments—A Usability Study. J. Imaging 2024, 10, 3. [Google Scholar] [CrossRef] [PubMed]
- Harrison, R.; Flood, D.; Duce, D. Usability of mobile applications: Literature review and rationale for a new usability model. J. Interact. Sci. 2013, 1, 1–16. [Google Scholar] [CrossRef]
- Sari, I.; Tj, H.W.; Wahyoedi, S.; Widjaja, B.T. The Effect of Usability, Information Quality, and Service Interaction on E-Loyalty Mediated by E-Satisfaction on Hallobumil Application Users. KnE Soc. Sci. 2023, 8, 211–229. [Google Scholar] [CrossRef]
- Tullis, T.; Albert, W. Measuring the User Experience, Second Edition: Collecting, Analyzing, and Presenting Usability Metrics, 2nd ed.; Morgan Kaufmann Publishers Inc.: San Francisco, CA, USA, 2013. [Google Scholar]
- Brooke, J. SUS: A `Quick and Dirty’ Usability Scale; CRC Press: Boca Raton, FL, USA, 1996. [Google Scholar] [CrossRef]
- Chin, J.P.; Diehl, V.A.; Norman, K.L. Development of an instrument measuring user satisfaction of the human-computer interface. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Washington, DC, USA, 15–19 May 1988. Part F130202. [Google Scholar] [CrossRef]
- Kirakowski, J.; Cierlik, B. Measuring the Usability of Web Sites. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 1998, 42, 424–428. [Google Scholar] [CrossRef]
- Lewis, J.R. Psychometric evaluation of the post-study system usability questionnaire: The PSSUQ. In Proceedings of the Human factors Society Annual Meeting; Sage Publications: Los Angeles, CA, USA; 1992; Volume 2. [Google Scholar] [CrossRef]
- Nielsen, J.; Molich, R. Heuristic Evaluation of User Interfaces; ACM Press: New York, NY, USA, 1990; pp. 249–256. [Google Scholar] [CrossRef]
- Bangor, A.; Kortum, P.T.; Miller, J.T. An empirical evaluation of the system usability scale. Int. J. Hum.-Comput. Interact. 2008, 24, 574–594. [Google Scholar] [CrossRef]
- Păsărelu, C.R.; Kertesz, R.; Dobrean, A. The Development and Usability of a Mobile App for Parents of Children with ADHD. Children 2023, 10, 164. [Google Scholar] [CrossRef]
- Weichbroth, P. Usability Testing of Mobile Applications: A Methodological Framework. Appl. Sci. 2024, 14, 1792. [Google Scholar] [CrossRef]
- Rosenzweig, E. Usability Inspection Methods; Elsevier: Amsterdam, The Netherlands, 2015; pp. 115–130. [Google Scholar] [CrossRef]
- Maqbool, B.; Herold, S. Potential effectiveness and efficiency issues in usability evaluation within digital health: A systematic literature review. J. Syst. Softw. 2024, 208, 111881. [Google Scholar] [CrossRef]
- Generosi, A.; Villafan, J.Y.; Giraldi, L.; Ceccacci, S.; Mengoni, M. A Test Management System to Support Remote Usability Assessment of Web Applications. Information 2022, 13, 505. [Google Scholar] [CrossRef]
- Veral, R.; Macías, J.A. Supporting user-perceived usability benchmarking through a developed quantitative metric. Int. J. Hum. Comput. Stud. 2019, 122, 184–195. [Google Scholar] [CrossRef]
- Bugayenko, Y.; Bakare, A.; Cheverda, A.; Farina, M.; Kruglov, A.; Plaksin, Y.; Pedrycz, W.; Succi, G. Prioritizing tasks in software development: A systematic literature review. PLoS ONE 2023, 18, e0283838. [Google Scholar] [CrossRef] [PubMed]
- Israel, G.D.; Taylor, C. Can response order bias evaluations? Eval. Program Plan. 1990, 13, 365–371. [Google Scholar] [CrossRef]
- Paz, F.; Pow-Sang, J.A. A systematic mapping review of usability evaluation methods for software development process. Int. J. Softw. Eng. Its Appl. 2016, 10, 165–178. [Google Scholar] [CrossRef]
- Saaty, R.W. The analytic hierarchy process-what it is and how it is used. Math. Model. 1987, 9, 161–176. [Google Scholar] [CrossRef]
- Dhillon, B.S. Usability Engineering Life-Cycle Stages and Important Associated Areas; CRC Press: Boca Raton, FL, USA, 2019. [Google Scholar] [CrossRef]
- Alonso-Ríos, D.; Vázquez-García, A.; Mosqueira-Rey, E.; Moret-Bonillo, V. Usability: A Critical Analysis and a Taxonomy. Int. J. Hum.-Comput. Interact. 2009, 26, 53–74. [Google Scholar] [CrossRef]
- Villani, V.; Lotti, G.; Battilani, N.; Fantuzzi, C. Survey on usability assessment for industrial user interfaces. IFAC-PapersOnLine 2019, 52, 25–30. [Google Scholar] [CrossRef]
- Shneiderman, B. Designing the user interface strategies for effective human-computer interaction. ACM SIGBIO Newsl. 1987, 9. [Google Scholar] [CrossRef]
- Norman, D. The Design of Everyday Things; Vahlen: Munich, Germany, 2016. [Google Scholar] [CrossRef]
- Tognazzini, B. First Principles, HCI Design, Human Computer Interaction (HCI), Principles of HCI Design, Usability Testing. 2014. Available online: http://www.asktog.com/basics/firstPrinciples.html (accessed on 1 January 2024).
- Shyr, W.J.; Wei, B.L.; Liang, Y.C. Evaluating Students’ Acceptance Intention of Augmented Reality in Automation Systems Using the Technology Acceptance Model. Sustainability 2024, 16, 2015. [Google Scholar] [CrossRef]
- Nielsen, J.; Molich, R. Heuristic evaluation of user interfaces. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, New York, NY, USA, 1–5 April 1990; pp. 249–256. [Google Scholar]
- Scholtz, J. Beyond Usability: Evaluation Aspects of Visual Analytic Environments. In Proceedings of the 2006 IEEE Symposium On Visual Analytics Science And Technology, Baltimore, MD, USA, 31 October–2 November 2006; pp. 145–150. [Google Scholar] [CrossRef]
- Lewis, C.; Polson, P.; Wharton, C.; Rieman, J. Testing a walkthrough methodology for theory-based design of walk-up-and-use interfaces. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, New York, NY, USA, 1–5 April 1990. [Google Scholar] [CrossRef]
- Thomas, C.; Bevan, N. (Eds.) Usability Context Analysis: A Practical Guide, Version 4.04; Loughborough University: Loughborough, UK, 1996. prepared for Serco Usability Services. Available online: https://hdl.handle.net/2134/2652 (accessed on 1 January 2024).
- Jaspers, M.W. A comparison of usability methods for testing interactive health technologies: Methodological aspects and empirical evidence. Int. J. Med. Inform. 2009, 78, 340–353. [Google Scholar] [CrossRef]
- Rubin, J.; Chisnell, D.; Spool, J. Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests, 2nd ed.; Wiley: Hoboken, NJ, USA, 2008; p. 384. [Google Scholar]
- Law, E.L.C.; Hvannberg, E.T. Analysis of combinatorial user effect in international usability tests. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Vienna, Austria, 24–29 April 2004; pp. 9–16. [Google Scholar] [CrossRef]
- Molich, R.; Nielsen, J. Improving a Human-Computer Dialogue. Commun. ACM 1990, 33, 338–348. [Google Scholar] [CrossRef]
- Hartson, R.; Pyla, P.S. Rapid Evaluation Methods; Elsevier: Amsterdam, The Netherlands, 2012; pp. 467–501. [Google Scholar] [CrossRef]
- Delice, E.K.; Güngör, Z. The usability analysis with heuristic evaluation and analytic hierarchy process. Int. J. Ind. Ergon. 2009, 39, 934–939. [Google Scholar] [CrossRef]
- Virzi, R.A.; Sorce, J.F.; Herbert, L.B. Comparison of three usability evaluation methods: Heuristic, think-aloud, and performance testing. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting; Sage Publications: Los Angeles, CA, USA, 1993; Volume 1. [Google Scholar] [CrossRef]
- Lazar, J.; Feng, J.H.; Hochheiser, H. Usability Testing; Elsevier: Amsterdam, The Netherlands, 2017; pp. 263–298. [Google Scholar] [CrossRef]
- Quiñones, D.; Rusu, C. How to develop usability heuristics: A systematic literature review. Comput. Stand. Interfaces 2017, 53, 89–122. [Google Scholar] [CrossRef]
- Jaferian, P.; Hawkey, K.; Sotirakopoulos, A.; Velez-Rojas, M.; Beznosov, K. Heuristics for evaluating IT security management tools. In Proceedings of the Seventh Symposium on Usable Privacy and Security, Menlo Park, CA, USA, 9–11 July 2014; Volume 29. [Google Scholar] [CrossRef]
- Lechner, B.; Fruhling, A.L.; Petter, S.; Siy, H.P. The Chicken and the Pig: User Involvement in Developing Usability Heuristics. In 19th Americas Conference on Information Systems, AMCIS 2013 - Hyperconnected World: Anything, Anywhere, Anytime; Association for Information Systems: Chicago, IL, USA, 2013; Volume 5, pp. 3263–3270. [Google Scholar]
- Sim, G.; Read, J.C.; Cockton, G. Evidence based design of heuristics for computer assisted assessment. In Proceedings of the Human-Computer Interaction–INTERACT 2009: 12th IFIP TC 13 International Conference; Uppsala, Sweden, 24–28 August 2009, Springer: Berlin/Heidelberg, Germany, 2009; Volume 5726. [Google Scholar] [CrossRef]
- Ling, C.; Salvendy, G. Extension of heuristic evaluation method: A review and reappraisal. Ergon. IJE HF 2005, 27, 179–197. [Google Scholar]
- Paddison, C.; Englefield, P. Applying heuristics to accessibility inspections. Interact. Comput. 2004, 16, 507–521. [Google Scholar] [CrossRef]
- Inostroza, R.; Rusu, C.; Roncagliolo, S.; Rusu, V.; Collazos, C.A. Developing SMASH: A set of SMArtphone’s uSability Heuristics. Comput. Stand. Interfaces 2016, 43, 40–52. [Google Scholar] [CrossRef]
- Hermawati, S.; Lawson, G. Establishing usability heuristics for heuristics evaluation in a specific domain: Is there a consensus? Appl. Ergon. 2016, 56, 34–51. [Google Scholar] [CrossRef]
- Bailey, R.W.; Wolfson, C.A.; Nall, J.; Koyani, S. Performance-Based Usability Testing: Metrics That Have the Greatest Impact for Improving a System's Usability. In Human Centered Design; Springer: Berlin/Heidelberg, Germany, 2009; pp. 3–12. [Google Scholar] [CrossRef]
- Mitta, D.A. A Methodology for Quantifying Expert System Usability. Hum. Factors J. Hum. Factors Ergon. Soc. 1991, 33, 233–245. [Google Scholar] [CrossRef]
- Benaida, M. Developing and extending usability heuristics evaluation for user interface design via AHP. Soft Comput. 2023, 27, 9693–9707. [Google Scholar] [CrossRef]
- Granollers, T. Usability Evaluation with Heuristics, Beyond Nielsen’s List; ThinkMind Digital Library: Lisbon, Portugal, 2018; pp. 60–65. [Google Scholar]
- Sharp, H.; Preece, J.; Rogers, Y. Interaction Design: Beyond Human-Computer Interaction, 5th ed.; John Wiley & Sons Inc.: Hoboken, NJ, USA, 2019; p. 656. [Google Scholar]
- Bonastre, L.; Granollers, T. A set of heuristics for user experience evaluation in E-commerce websites. In Proceedings of the 7th International Conference on Advances in Computer-Human Interactions, Barcelona, Spain, 23–27 March 2014. [Google Scholar]
- Paz, F.; Paz, F.A.; Sánchez, M.; Moquillaza, A.; Collantes, L. Quantifying the usability through a variant of the traditional heuristic evaluation process. In Design, User Experience, and Usability: Theory and Practice: 7th International Conference, DUXU 2018, Held as Part of HCI International 2018, Las Vegas, NV, USA, July 15–20, 2018; Springer International Publishing: Cham, Switzerland, 2018; Volume 10918, LNCS. [Google Scholar] [CrossRef]
- Kemp, E.A.; Thompson, A.J.; Johnson, R.S. Interface evaluation for invisibility and ubiquity—An example from E-learning. In Proceedings of the 9th ACM SIGCHI New Zealand Chapter’s International Conference on Human-Computer Interaction: Design Centered HCI, Wellington, New Zealand, 2 July 2008. [Google Scholar] [CrossRef]
- Pierotti, D. Heuristic Evaluation: A System Checklist; Xerox Corporation: Norwalk, CT, USA, 1995; pp. 1–22. Available online: https://users.polytech.unice.fr/~pinna/MODULEIHM/ANNEE2010/CEIHM/XEROX%20HE_CKLST.pdf (accessed on 1 January 2024).
- Khowaja, K.; Al-Thani, D. New Checklist for the Heuristic Evaluation of mHealth Apps (HE4EH): Development and Usability Study. JMIR MHealth UHealth 2020, 8, e20353. [Google Scholar] [CrossRef]
- Holey, R.H. Handbook of Structural Equation Modeling, 1st ed.; The Guilford Press: New York, NY, USA, 2012; pp. 3–16. [Google Scholar]
- Brodsky, S.L.; Lichtenstein, B. The Gold Standard and the Pyrite Principle: Toward a Supplemental Frame of Reference. Front. Psychol. 2020, 11, 562. [Google Scholar] [CrossRef]
- Williamson, K. Questionnaires, Individual Interviews and Focus Group Interviews; Elsevier: Amsterdam, The Netherlands, 2018; pp. 379–403. [Google Scholar] [CrossRef]
- Thiem, A.; Duşa, A. Qualitative Comparative Analysis with R; Springer: New York, NY, USA, 2013; Volume 5, p. 99. [Google Scholar] [CrossRef]
- Contreras-Pacheco, O.E.; Talero-Sarmiento, L.H.; Camacho-Pinto, J.C. Effects of Corporate Social Responsibility on Employee Organizational Identification: Authenticity or Fallacy. Contaduría Adm. 2019, 64, 1–22. [Google Scholar] [CrossRef]
- Leventhal, B.C.; Ames, A.J.; Thompson, K.N. Simulation Studies for Psychometrics. In International Encyclopedia of Education, 4th ed.; Elsevier: Amsterdam, The Netherlands, 2022. [Google Scholar] [CrossRef]
- Tanner, K. Survey Designs. In Research Methods: Information, Systems, and Contexts, 2nd ed.; Elsevier: Amsterdam, The Netherlands, 2018. [Google Scholar] [CrossRef]
- Bubaš, G.; Čižmešija, A.; Kovačić, A. Development of an Assessment Scale for Measurement of Usability and User Experience Characteristics of Bing Chat Conversational AI. Future Internet 2024, 16, 4. [Google Scholar] [CrossRef]
- van der Linden, W.J. Item Response Theory. In Encyclopedia of Social Measurement; Elsevier: Amsterdam, The Netherlands, 2004. [Google Scholar] [CrossRef]
- Gupta, K.; Roy, S.; Poonia, R.C.; Nayak, S.R.; Kumar, R.; Alzahrani, K.J.; Alnfiai, M.M.; Al-Wesabi, F.N. Evaluating the Usability of mHealth Applications on Type 2 Diabetes Mellitus Using Various MCDM Methods. Healthcare 2022, 10, 4. [Google Scholar] [CrossRef]
- Muhammad, A.; Siddique, A.; Naveed, Q.N.; Khaliq, U.; Aseere, A.M.; Hasan, M.A.; Qureshi, M.R.N.; Shahzad, B. Evaluating Usability of Academic Websites through a Fuzzy Analytical Hierarchical Process. Sustainability 2021, 13, 2040. [Google Scholar] [CrossRef]
- Iryanti, E.; Santosa, P.I.; Kusumawardani, S.S.; Hidayah, I. Inverse Trigonometric Fuzzy Preference Programming to Generate Weights with Optimal Solutions Implemented on Evaluation Criteria in E-Learning. Computers 2024, 13, 68. [Google Scholar] [CrossRef]
- Gulzar, K.; Tariq, O.; Mustafa, S.; Mohsin, S.M.; Kazmi, S.N.; Akber, S.M.A.; Abazeed, M.; Ali, M. A Fuzzy Analytic Hierarchy Process for Usability Requirements of Online Education Systems. IEEE Access 2023, 11, 146076–146089. [Google Scholar] [CrossRef]
- Munier, N.; Hontoria, E. Uses and Limitations of the AHP Method; Springer International Publishing: Cham, Switzerland, 2021. [Google Scholar] [CrossRef]
- Sakulin, S.; Alfimtsev, A. Multicriteria Decision Making in Tourism Industry Based on Visualization of Aggregation Operators. Appl. Syst. Innov. 2023, 6, 74. [Google Scholar] [CrossRef]
- Salanti, G.; Nikolakopoulou, A.; Efthimiou, O.; Mavridis, D.; Egger, M.; White, I.R. Introducing the Treatment Hierarchy Question in Network Meta-Analysis. Am. J. Epidemiol. 2022, 191, 930–938. [Google Scholar] [CrossRef]
- Marttunen, M.; Belton, V.; Lienert, J. Are objectives hierarchy related biases observed in practice? A meta-analysis of environmental and energy applications of Multi-Criteria Decision Analysis. Eur. J. Oper. Res. 2018, 265, 178–194. [Google Scholar] [CrossRef]
- Abrahantes, J.C.; Molenberghs, G.; Burzykowski, T.; Shkedy, Z.; Abad, A.A.; Renard, D. Choice of units of analysis and modeling strategies in multilevel hierarchical models. Comput. Stat. Data Anal. 2004, 47, 537–563. [Google Scholar] [CrossRef]
- Saaty, T.L. Decision making—The Analytic Hierarchy and Network Processes (AHP/ANP). J. Syst. Sci. Syst. Eng. 2004, 13, 1–35. [Google Scholar] [CrossRef]
- Ishizaka, A.; Labib, A. Review of the main developments in the analytic hierarchy process. Expert Syst. Appl. 2011, 38, 14336–14345. [Google Scholar] [CrossRef]
- Sluser, B.; Plavan, O.; Teodosiu, C. Environmental Impact and Risk Assessment; Elsevier: Amsterdam, The Netherlands, 2022; pp. 189–217. [Google Scholar] [CrossRef]
- Vaidya, O.S.; Kumar, S. Analytic hierarchy process: An overview of applications. Eur. J. Oper. Res. 2006, 169, 1–29. [Google Scholar] [CrossRef]
- Fishburn, P.C. Nontransitive preferences in decision theory. J. Risk Uncertain. 1991, 4, 113–134. [Google Scholar] [CrossRef]
- Wu, Z.; Tu, J. Managing transitivity and consistency of preferences in AHP group decision making based on minimum modifications. Inf. Fusion 2021, 67, 125–135. [Google Scholar] [CrossRef]
- Bevan, N. Measuring usability as quality of use. Softw. Qual. J. 1995, 4, 115–130. [Google Scholar] [CrossRef]
- Omar, K.; Rapp, B.; Gómez, J.M. Heuristic evaluation checklist for mobile ERP user interfaces. In Proceedings of the 2016 7th International Conference on Information and Communication Systems (ICICS), Irbid, Jordan, 5–7 April 2016; pp. 180–185. [Google Scholar]
- Aballay, L.; Lund, M.I.; Gonzalez Capdevila, M.; Granollers, T. Heurísticas de Usabilidad utilizando una Plataforma Abierta y Colaborativa Práctica Áulica Aplicada a Sitios e-commerce. In V Congreso Internacional de Ciencias de la Computación y Sistemas de Información 2021–CICCSI 2021; CICCSI: Mendoza, Argentina, 2021; Available online: https://www.researchgate.net/publication/358430089_Heuristicas_de_Usabilidad_utilizando_una_Plataforma_Abierta_y_Colaborativa_Practica_Aulica_Aplicada_a_Sitios_e-commerce (accessed on 1 January 2024).
- Yáñez Gómez, R.; Cascado Caballero, D.; Sevillano, J.L. Heuristic Evaluation on Mobile Interfaces: A New Checklist. Sci. World J. 2014, 2014, 434326. [Google Scholar] [CrossRef] [PubMed]
- Komarkova, J.; Visek, O.; Novak, M. Heuristic evaluation of usability of GeoWeb sites. In Web and Wireless Geographical Information Systems: 7th International Symposium, W2GIS 2007, Cardiff, UK, 28–29 November 2007; Proceedings 7; Springer: Berlin/Heidelberg, Germany, 2007; pp. 264–278. [Google Scholar]
- Almenara, A.P.; Humanes, J.; Granollers, T. MPIu+aX, User-Centered Design methodology that empathizes with the user and generates a better accessible experience (From theory to practice). In Proceedings of the XXIII International Conference on Human Computer Interaction, Copenhagen, Denmark, 23–28 July 2023; pp. 1–3. [Google Scholar] [CrossRef]
ID | Characteristic | SAW | AHP | TOP | VIK | PRO | MOO | ELE | ANP | LPr | SIM
---|---|---|---|---|---|---|---|---|---|---|---
1 | Simple scenario | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1
8 | Several DMs (group decision making) | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1
9 | Ease of changing the initial matrix values | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 1
10 | Large project involving consultations with people | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 1
11 | Linguistic initial matrix | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0
12 | Qualitative criteria | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1
14 | Using a particular normalization procedure | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0
16 | Independent alternatives | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0
19 | Many criteria | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 1
20 | Independent criteria (compensatory methods) | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0
22 | Necessity of knowing criteria's validity range | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 1 | 1 | 1
46 | Necessity to evaluate criteria's relative importance | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1
47 | Want to use subjective weights | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0
50 | Sensitivity analysis (SA) with weights | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0
54 | Not theoretically complex | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0
 | Requirement | 16 | 16 | 16 | 16 | 16 | 16 | 16 | 16 | 16 | 16
 | Match | 10 | 12 | 12 | 8 | 10 | 9 | 9 | 10 | 6 | 8
 | Best (minimum gap) | 6 | 4 | 4 | 8 | 6 | 7 | 7 | 6 | 10 | 8
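The Match and Best (minimum gap) rows follow directly from the 0/1 indicator columns: a method's match is the count of characteristics it satisfies, and its gap is the shortfall against the 16 requirements (the table lists a subset of the characteristic catalogue). A minimal sketch over the SAW and AHP columns above (variable names are ours):

```python
# 0/1 indicators per characteristic row of the table above, in row order.
REQUIREMENTS = 16
SAW_COL = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 1, 1]
AHP_COL = [1, 1, 0, 0, 1, 1, 1, 1, 0, 1, 1, 1, 1, 1, 1]

def match_and_gap(col, requirements=REQUIREMENTS):
    """Count satisfied characteristics and the gap against the requirements."""
    match = sum(col)
    return match, requirements - match

saw_match, saw_gap = match_and_gap(SAW_COL)  # 10, 6
ahp_match, ahp_gap = match_and_gap(AHP_COL)  # 12, 4
```

The smallest gap (4, shared by AHP and TOPSIS) identifies the best-fitting methods under this screening.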
Heuristic | Effectiveness | Efficiency | Satisfaction
---|---|---|---
Visibility and system state | Enhances task accuracy by informing users of system status. | Reduces time spent understanding system state. | Increases user confidence by providing clarity. |
Connection with the real world | Improves task understanding through familiar concepts. | Speeds up task performance by reducing cognitive load. | Enhances comfort using familiar metaphors. |
User control and freedom | Improves task accuracy by allowing error correction. | Reduces effort through efficient task recovery. | Enhances user satisfaction by providing control options. |
Consistency and standards | Maintains task accuracy through uniform interactions. | Increases efficiency by reducing learning time. | Provides a predictable and satisfying user experience. |
Recognition rather than memory | Minimizes reliance on memory, improving task accuracy. | Enhances efficiency by providing cues and reminders. | Reduces user frustration, enhancing satisfaction. |
Flexibility and efficiency | Supports accurate task performance for all user levels. | Provides experienced users with shortcuts to enhance efficiency. | Accommodates diverse user skills, increasing satisfaction. |
Help users recognize, diagnose, and recover from errors | Improves accuracy through effective error management. | Decreases time and effort in error recovery. | Provides a safety net that enhances user satisfaction. |
Error prevention | Prevents potential errors, improving accuracy. | Enhances efficiency by reducing error correction needs. | Improves user experience by minimizing disruptions. |
Aesthetic and minimalist design | Focuses user attention on essential tasks. | Streamlines interactions, improving efficiency. | Delivers an appealing environment that enhances satisfaction. |
Help and documentation | Assists users with completing tasks correctly. | Decreases time spent seeking assistance, boosting efficiency. | Increases user satisfaction with accessible support. |
Save the state and protect the work | Preserves user progress, enhancing accuracy. | Reduces time redoing tasks, improving efficiency. | Protects user effort, increasing satisfaction. |
Color and readability | Improves accuracy by enhancing content clarity. | Lowers effort required for content comprehension. | Provides a visually accessible interface that increases satisfaction. |
Autonomy | Allows personalization, improving task relevance and accuracy. | Enhances efficiency by tailoring the system to user preferences. | Enhances satisfaction through personalized interaction. |
Defaults | Provides reliable starting points, enhancing task accuracy. | Reduces initial setup time, improving efficiency. | Increases satisfaction with dependable system behaviors. |
Latency reduction | Provides immediate feedback, ensuring task accuracy. | Reduces waiting times, significantly improving efficiency. | Boosts satisfaction with a responsive system. |
Heuristic | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
Weight (%) | 8.33 | 6.67 | 5.00 | 10.00 | 8.33 | 10.00 | 6.67 | 5.00 | 6.67 | 8.33 | 5.00 | 6.67 | 5.00 | 5.00 | 3.33
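These values are each heuristic's share of the instrument's 60 questions, scaled to 100. A minimal reproduction, with the question counts taken from Section 3.1:

```python
# Questions per heuristic, in order H1..H15 (from Section 3.1).
QUESTION_COUNTS = [5, 4, 3, 6, 5, 6, 4, 3, 4, 5, 3, 4, 3, 3, 2]

total = sum(QUESTION_COUNTS)  # 60 questions in all
uniform_weights = [round(100 * q / total, 2) for q in QUESTION_COUNTS]
# -> [8.33, 6.67, 5.0, 10.0, 8.33, 10.0, 6.67, 5.0,
#     6.67, 8.33, 5.0, 6.67, 5.0, 5.0, 3.33]
```

Under this uniform scheme, every question carries the same weight, so a heuristic's importance is driven purely by how many questions probe it.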
Heuristic | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
1 | 1.00 | 3.00 | 5.00 | 3.00 | 1.00 | 5.00 | 5.00 | 5.00 | 3.00 | 7.00 | 7.00 | 3.00 | 3.00 | 7.00 | 7.00
2 | 0.33 | 1.00 | 3.00 | 3.00 | 0.33 | 3.00 | 3.00 | 3.00 | 1.00 | 5.00 | 5.00 | 1.00 | 1.00 | 5.00 | 5.00
3 | 0.20 | 0.33 | 1.00 | 1.00 | 0.14 | 3.00 | 3.00 | 3.00 | 1.00 | 5.00 | 5.00 | 1.00 | 1.00 | 5.00 | 5.00
4 | 0.33 | 0.33 | 1.00 | 1.00 | 0.33 | 1.00 | 1.00 | 1.00 | 1.00 | 3.00 | 3.00 | 1.00 | 1.00 | 3.00 | 3.00
5 | 1.00 | 3.00 | 7.00 | 3.00 | 1.00 | 5.00 | 5.00 | 5.00 | 3.00 | 7.00 | 7.00 | 3.00 | 3.00 | 7.00 | 7.00
6 | 0.20 | 0.33 | 0.33 | 1.00 | 0.20 | 1.00 | 1.00 | 1.00 | 0.33 | 5.00 | 5.00 | 0.33 | 0.33 | 5.00 | 5.00
7 | 0.20 | 0.33 | 0.33 | 1.00 | 0.20 | 1.00 | 1.00 | 1.00 | 0.33 | 5.00 | 5.00 | 0.33 | 0.33 | 5.00 | 5.00
8 | 0.20 | 0.33 | 0.33 | 1.00 | 0.20 | 1.00 | 1.00 | 1.00 | 0.33 | 5.00 | 5.00 | 0.33 | 0.33 | 5.00 | 5.00
9 | 0.33 | 1.00 | 1.00 | 1.00 | 0.33 | 3.00 | 3.00 | 3.00 | 1.00 | 5.00 | 5.00 | 1.00 | 1.00 | 5.00 | 5.00
10 | 0.14 | 0.20 | 0.20 | 0.33 | 0.14 | 0.20 | 0.20 | 0.20 | 0.20 | 1.00 | 1.00 | 0.20 | 0.20 | 1.00 | 1.00
11 | 0.14 | 0.20 | 0.20 | 0.33 | 0.14 | 0.20 | 0.20 | 0.20 | 0.20 | 1.00 | 1.00 | 0.20 | 0.20 | 1.00 | 1.00
12 | 0.33 | 1.00 | 1.00 | 1.00 | 0.33 | 3.00 | 3.00 | 3.00 | 1.00 | 5.00 | 5.00 | 1.00 | 1.00 | 5.00 | 5.00
13 | 0.33 | 1.00 | 1.00 | 1.00 | 0.33 | 3.00 | 3.00 | 3.00 | 1.00 | 5.00 | 5.00 | 1.00 | 1.00 | 5.00 | 5.00
14 | 0.14 | 0.20 | 0.20 | 0.33 | 0.14 | 0.20 | 0.20 | 0.20 | 0.20 | 1.00 | 1.00 | 0.20 | 0.20 | 1.00 | 1.00
15 | 0.14 | 0.20 | 0.20 | 0.33 | 0.14 | 0.20 | 0.20 | 0.20 | 0.20 | 1.00 | 1.00 | 0.20 | 0.20 | 1.00 | 1.00
Heuristic | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
Weight (%) | 18.06 | 9.26 | 6.99 | 5.06 | 18.96 | 4.21 | 4.21 | 4.21 | 7.74 | 1.45 | 1.45 | 7.75 | 7.75 | 1.45 | 1.45
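Priority weights like these are conventionally derived from a pairwise comparison matrix via Saaty's principal-eigenvector method. A minimal sketch of that computation, using a small illustrative 3×3 matrix rather than the full 15×15 matrix above:

```python
import numpy as np

# AHP priority weights via the principal right eigenvector (Saaty's method).
# Illustrative 3x3 pairwise matrix: a_ij gives the relative importance of
# criterion i over j, with reciprocal entries a_ji = 1 / a_ij.
A = np.array([[1.00, 2.00, 4.00],
              [0.50, 1.00, 2.00],
              [0.25, 0.50, 1.00]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)          # index of the principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()                      # normalize so the weights sum to 1
print(np.round(100 * w, 2))          # [57.14 28.57 14.29]
```

For a perfectly consistent matrix, as here, the principal eigenvalue equals the matrix order (3), and the gap between the two is the basis of AHP's consistency ratio.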
Heuristic | Evaluator 1 | Evaluator 2 | Evaluator 3 | Evaluator 4 | Evaluator 5 | Evaluator 6 | Evaluator 7
---|---|---|---|---|---|---|---
1 | 5.0 | 5.0 | 4.0 | 4.5 | 4.0 | 5.0 | 4.0
2 | 3.0 | 2.0 | 4.0 | 4.0 | 3.5 | 4.0 | 2.0
3 | 3.0 | 2.0 | 3.0 | 2.0 | 2.0 | 1.0 | 1.5
4 | 5.0 | 4.0 | 5.0 | 5.5 | 3.5 | 4.0 | 3.5
5 | 5.0 | 4.0 | 4.0 | 5.0 | 5.0 | 5.0 | 4.0
6 | 5.0 | 3.0 | 3.0 | 6.0 | 3.0 | 5.0 | 3.0
7 | 3.0 | 3.0 | 0.0 | 0.0 | 2.0 | 2.0 | 4.0
8 | 2.0 | 2.0 | 2.0 | 0.0 | 2.0 | 2.0 | 1.0
9 | 4.0 | 3.0 | 4.0 | 3.0 | 4.0 | 4.0 | 1.0
10 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5 | 0.0 | 0.0
11 | 2.0 | 0.0 | 0.0 | 1.0 | 1.0 | 0.0 | 0.0
12 | 4.0 | 3.0 | 2.5 | 4.0 | 4.0 | 2.0 | 2.0
13 | 2.0 | 2.5 | 2.0 | 3.0 | 3.0 | 3.0 | 2.0
14 | 0.0 | 0.0 | 0.0 | 2.0 | 1.0 | 0.0 | 2.0
15 | 1.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0
Statistic | H1 | H2 | H3 | H4 | H5 | H6 | H7 | H8 | H9 | H10 | H11 | H12 | H13 | H14 | H15
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
count | 7.00 | 7.00 | 7.00 | 7.00 | 7.00 | 7.00 | 7.00 | 7.00 | 7.00 | 7.00 | 7.00 | 7.00 | 7.00 | 7.00 | 7.00
mean | 4.50 | 3.21 | 2.07 | 4.36 | 4.57 | 4.00 | 2.00 | 1.57 | 3.29 | 0.07 | 0.57 | 3.07 | 2.50 | 0.71 | 0.43
std | 0.50 | 0.91 | 0.73 | 0.80 | 0.54 | 1.29 | 1.53 | 0.79 | 1.11 | 0.19 | 0.79 | 0.93 | 0.50 | 0.95 | 0.54
min | 4.00 | 2.00 | 1.00 | 3.50 | 4.00 | 3.00 | 0.00 | 0.00 | 1.00 | 0.00 | 0.00 | 2.00 | 2.00 | 0.00 | 0.00
25% | 4.00 | 2.50 | 1.75 | 3.75 | 4.00 | 3.00 | 1.00 | 1.50 | 3.00 | 0.00 | 0.00 | 2.25 | 2.00 | 0.00 | 0.00
50% | 4.50 | 3.50 | 2.00 | 4.00 | 5.00 | 3.00 | 2.00 | 2.00 | 4.00 | 0.00 | 0.00 | 3.00 | 2.50 | 0.00 | 0.00
75% | 5.00 | 4.00 | 2.50 | 5.00 | 5.00 | 5.00 | 3.00 | 2.00 | 4.00 | 0.00 | 1.00 | 4.00 | 3.00 | 1.50 | 1.00
max | 5.00 | 4.00 | 3.00 | 5.50 | 5.00 | 6.00 | 4.00 | 2.00 | 4.00 | 0.50 | 2.00 | 4.00 | 3.00 | 2.00 | 1.00
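Summary tables of this shape can be produced directly from the evaluator-by-heuristic scores; a minimal sketch using pandas' `describe()`, with the seven H1 scores taken from the evaluator table above:

```python
import pandas as pd

# Scores for heuristic 1 across the seven evaluators (from the table above).
scores = pd.DataFrame({"H1": [5.0, 5.0, 4.0, 4.5, 4.0, 5.0, 4.0]})
stats = scores.describe()  # count, mean, std, min, quartiles, max
print(stats.round(2))
# H1: count 7.00, mean 4.50, std 0.50, min 4.00,
#     25% 4.00, 50% 4.50, 75% 5.00, max 5.00
```

These values match the H1 column of the descriptive statistics table; `describe()` uses the sample standard deviation and linear interpolation for the quartiles.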
Statistic | H1 | H2 | H3 | H4 | H5 | H6 | H7 | H8 | H9 | H10 | H11 | H12 | H13 | H14 | H15
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
count | 7.00 | 7.00 | 7.00 | 7.00 | 7.00 | 7.00 | 7.00 | 7.00 | 7.00 | 7.00 | 7.00 | 7.00 | 7.00 | 7.00 | 7.00
mean | 0.50 | 0.61 | 0.54 | 0.43 | 0.57 | 0.33 | 0.50 | 0.79 | 0.76 | 0.14 | 0.29 | 0.54 | 0.50 | 0.36 | 0.43
std | 0.50 | 0.45 | 0.37 | 0.40 | 0.53 | 0.43 | 0.38 | 0.39 | 0.37 | 0.38 | 0.39 | 0.47 | 0.50 | 0.48 | 0.53
min | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00
25% | 0.00 | 0.25 | 0.38 | 0.13 | 0.00 | 0.00 | 0.25 | 0.75 | 0.67 | 0.00 | 0.00 | 0.13 | 0.00 | 0.00 | 0.00
50% | 0.50 | 0.75 | 0.50 | 0.25 | 1.00 | 0.00 | 0.50 | 1.00 | 1.00 | 0.00 | 0.00 | 0.50 | 0.50 | 0.00 | 0.00
75% | 1.00 | 1.00 | 0.75 | 0.75 | 1.00 | 0.67 | 0.75 | 1.00 | 1.00 | 0.00 | 0.50 | 1.00 | 1.00 | 0.75 | 1.00
max | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
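The 0–1 values in this second statistics table are consistent with per-heuristic min–max scaling of the raw scores across evaluators (an inference from the numbers, not a method stated in this section). A minimal sketch reproducing the H1 row, under that assumption:

```python
# Min-max normalization of one heuristic's scores across evaluators:
# x' = (x - min) / (max - min). Raw H1 scores from the earlier table.
raw = [5.0, 5.0, 4.0, 4.5, 4.0, 5.0, 4.0]
lo, hi = min(raw), max(raw)
norm = [(x - lo) / (hi - lo) for x in raw]
mean = sum(norm) / len(norm)                            # 0.50
var = sum((x - mean) ** 2 for x in norm) / (len(norm) - 1)
std = var ** 0.5                                        # 0.50 (sample std)
print([round(x, 2) for x in norm], round(mean, 2), round(std, 2))
```

The resulting mean (0.50) and standard deviation (0.50) match the H1 column of the normalized table above.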
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Talero-Sarmiento, L.; Gonzalez-Capdevila, M.; Granollers, A.; Lamos-Diaz, H.; Pistili-Rodrigues, K. Towards a Refined Heuristic Evaluation: Incorporating Hierarchical Analysis for Weighted Usability Assessment. Big Data Cogn. Comput. 2024, 8, 69. https://doi.org/10.3390/bdcc8060069