The Use of Business Intelligence Software to Monitor Key Performance Indicators (KPIs) for the Evaluation of a Computerized Maintenance Management System (CMMS)
Abstract
1. Introduction
- Institutionalizing and interconnecting data within a healthcare technology management program;
- Contributing to the organization and monitoring of inspections and preventive maintenance;
- Tracking repairs;
- Monitoring equipment performance indicators;
- Monitoring clinical engineering staff performance indicators;
- Generating reports for planning training programs;
- Providing libraries of regulatory requirements and safety information;
- Generating documentation suitable for accreditation by regulatory and standard organizations [15].
2. Literature Review
- Written in the English language;
- Full articles (excluding reviews, perspectives, and communications);
- Full text available;
- Published from 2014 to September 2023;
- Reporting the indicators;
- Focused on the management of electromedical equipment.
- Articles concerning the management of medical devices that were not electromedical equipment;
- Studies that define a maintenance prioritization index;
- Papers that report cases of management in critical contexts or situations.
3. Materials and Methods
3.1. Indicator Definition
- Administrative management: Related to the costs of activities. Examples of these indices are the costs related to preventive and corrective maintenance;
- Logistics management: Related to the CED’s activities such as purchase, service, rental, loan, donation, spare parts, and accessories moving in and out of the warehouse, both from healthcare facilities and from external companies. All the indices belonging to this category are time indices. Examples of logistics management indicators are the average arrival time of spare parts or the average call closure time;
- Technical management: Related to all activities carried out by technicians on biomedical equipment. They are useful indicators to assess the level of efficiency and coordination of technical staff. Examples of such indices may be “One-Day Action” or “Mean Time To Repair (MTTR)”;
- Training: Indicators measuring the level of staff training;
- Quality: Measures to assess that the management of biomedical equipment is appropriate, effective, safe, and economical;
- Equipment management: Related to all the equipment managed or used by the CI. Examples are downtime and uptime measures.
- Name: Name of the KPI, using a standard naming system. The name should be self-explanatory, such as “Mean Time to Repair”;
- Number and type: Number associated with the KPI. It is also necessary to indicate to which class it belongs (administrative, logistics, technical, training, quality, equipment management). The two pieces of information can be combined to create an alphanumeric abbreviation identifying the KPI;
- Short definition: Short description of the KPI, similar to a name, e.g., “Average time to repair a device”;
- Detailed definition: A more comprehensive description of the KPI, including sources, formulas, possible limitations, and applicability in the organization. The detailed definition should also include the rationale for choosing and adopting the KPI in the decision-making and review process;
- Formula: Mathematical equation of the KPI;
- Numerator: Description of the numerator, including inclusion and exclusion criteria;
- Denominator: Description of the denominator, including inclusion and exclusion criteria. Often, the denominator is the ‘total’ (to obtain percentage KPIs);
- KPI unit: Format of the KPI result (days, months, or percentage);
- Statistical adjustments: Illustration of statistical techniques used on the dataset to reduce the presence of confounding values (such as outliers often also caused by sampling or transcription errors);
- Reference values (benchmarks): KPI values obtained from external organizations that are similar to the object of study, often indicated as a model for best hospital practice, or from other internal departments that can provide a direct comparison. These are useful for setting a baseline and an ideal target for process improvement.
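To illustrate how such a descriptor card could be captured in software, the minimal sketch below models a KPI record as a small data structure. The field names mirror the list above; the populated Mean Time to Repair example, including the code “TEC-02” and the benchmark value, is purely hypothetical and is not taken from the CED’s actual records.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class KpiDescriptor:
    """Descriptor card for a KPI, mirroring the fields listed above."""
    name: str                       # e.g., "Mean Time to Repair"
    code: str                       # alphanumeric ID combining number and class, e.g., "TEC-02"
    short_definition: str
    detailed_definition: str
    formula: str                    # mathematical expression written as text
    numerator: str                  # inclusion/exclusion criteria for the numerator
    denominator: str                # inclusion/exclusion criteria for the denominator
    unit: str                       # days, months, hours, or percentage
    statistical_adjustments: str = "none"
    benchmarks: List[float] = field(default_factory=list)

# Hypothetical example of a filled-in descriptor (values are illustrative only).
mttr = KpiDescriptor(
    name="Mean Time to Repair",
    code="TEC-02",
    short_definition="Average time to repair a device",
    detailed_definition=(
        "Average elapsed time between the opening of a call and the completion "
        "of the repair, used to monitor the efficiency of the technical staff."
    ),
    formula="sum of repair durations / number of closed repairs",
    numerator="All closed maintenance requests",
    denominator="Number of closed maintenance requests",
    unit="hours",
    statistical_adjustments="outliers removed (e.g., transcription errors)",
    benchmarks=[24.0],
)
```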
3.2. Dashboard
- Using the Microsoft ecosystem: If an organization already uses Microsoft systems, integration with Power BI may be easier. Being a Microsoft product, the platform integrates seamlessly with other Microsoft applications, simplifying data sharing and collaboration within the Microsoft environment;
- Ease of use: Power BI’s intuitive and user-friendly interface makes the platform a particularly advantageous choice for users with limited data analysis experience;
- Price: In terms of cost, the Microsoft platform offers a free plan with limited functionality. This allows anyone to enjoy the benefits of business intelligence. Other platforms allow access to the service only through subscriptions of different durations and functionalities. Power BI also has different premium plans;
- Scalability: All platforms can manage large amounts of data. Compared to other platforms, Power BI is considered less scalable for advanced analysis and the needs of large companies;
- Customization and visualization: In some cases, visualization and customization play a key role in the creation of a dashboard or report. The possibility of having a wide range of visualization tools makes the product more dynamic and intuitive;
- Analysis capabilities: Beyond visualizations, advanced analysis capabilities that support complex and detailed analyses are essential for large companies;
- Community and support: BI platforms today have a solid base of users and online communities. The choice may depend on the availability of support resources and the possibility of finding answers to your questions.
- “Maintenance”: This file contains details of maintenance activities since January 1, 2022. It includes fields such as “Number”, “Description”, “Contact”, and more;
- “Equipment”: This file provides information on various equipment types and their attributes;
- “Criticality”: This file, containing ISO-related criticality assessments, offers insights into equipment descriptions, functions, damages, and more.
- In the “Equipment Table”, two empty columns were removed and inventory numbers were converted to integers for consistency;
- Similarly, in the “Maintenance Table”, an empty column was removed and data type consistency was ensured;
- For the “Criticality Table”, data types were verified and values were standardized for clarity.
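These cleaning steps were carried out in Power BI’s Power Query editor. As a rough equivalent, the pandas sketch below reproduces the same operations; the file and column names (“equipment.csv”, “Inventory Number”, “Opening Date”, “Closing Date”) are assumptions for illustration, since the exact export schema is not reproduced here.

```python
import pandas as pd

# Load the three exports (file names are assumptions for illustration).
equipment = pd.read_csv("equipment.csv")
maintenance = pd.read_csv("maintenance.csv")
criticality = pd.read_csv("criticality.csv")

# Equipment table: drop fully empty columns and cast inventory numbers to integers.
equipment = equipment.dropna(axis=1, how="all")
equipment["Inventory Number"] = pd.to_numeric(
    equipment["Inventory Number"], errors="coerce"
).astype("Int64")

# Maintenance table: drop the empty column and enforce consistent date types.
maintenance = maintenance.dropna(axis=1, how="all")
maintenance["Opening Date"] = pd.to_datetime(maintenance["Opening Date"])
maintenance["Closing Date"] = pd.to_datetime(maintenance["Closing Date"])

# Criticality table: verify text columns and standardize categorical values.
criticality["Criticality"] = criticality["Criticality"].astype(str).str.strip().str.upper()
```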
- “Average Request Closing Time”: This page calculates the average time to close maintenance requests, visualizing data through line charts and stacked column histograms with dynamic filters for enhanced usability;
- “One-Day Action”: This page identifies maintenance actions completed within a day, presenting data through pie and donut charts, accompanied by a detailed table for comprehensive analysis;
- “Mean Time To Repair”: Utilizing bar charts, this page evaluates repair times for technicians and equipment types, facilitating data interpretation through dynamic filters;
- “Supported Devices for Technical Personnel”: Visualizing maintenance assignments and device statuses, this page provides insights into technician workload and equipment conditions;
- “Negligent Action”: By categorizing open calls based on duration and status, this page highlights potential operational inefficiencies, aiding in proactive maintenance management;
- “Number of Requests vs. Installed Devices”: This page compares maintenance requests with installed devices per department, offering insights into resource allocation and operational efficiency;
- “Details of Department Requests”: Building on the previous page, this section provides detailed breakdowns of maintenance requests per department, facilitating deeper analysis;
- “Criticality”: This page focuses on equipment criticality, visualizing device distribution and criticality percentages per operating unit;
- “Obsolescence”: Utilizing device lifespan data, this section assesses equipment obsolescence and provides insights into equipment age and distribution.
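For concreteness, the sketch below shows how two of the page-level KPIs described above (Average Request Closing Time and One-Day Action) could be computed from the maintenance table. The published dashboard computes them with Power BI measures and dynamic filters; the pandas implementation and the column names used here (“Opening Date”, “Closing Date”) are assumptions consistent with the cleaning sketch above.

```python
import pandas as pd

def average_request_closing_time(maintenance: pd.DataFrame) -> float:
    """Average number of hours between the opening and closing of closed requests."""
    closed = maintenance.dropna(subset=["Closing Date"])
    durations = closed["Closing Date"] - closed["Opening Date"]
    return durations.dt.total_seconds().mean() / 3600.0

def one_day_action_percentage(maintenance: pd.DataFrame) -> float:
    """Percentage of closed requests resolved within 24 hours of being opened."""
    closed = maintenance.dropna(subset=["Closing Date"])
    within_one_day = (closed["Closing Date"] - closed["Opening Date"]) <= pd.Timedelta(hours=24)
    return 100.0 * within_one_day.mean()
```

In Power BI, the same logic would typically be expressed as DAX measures and sliced by the dynamic filters mentioned above.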
4. Results
4.1. Logistics Management KPIs
4.1.1. Average Downtime in the Warehouse
4.1.2. Receiving Spare Parts Meantime
4.1.3. Mean Arrival Time
4.1.4. Percentage of External Downtime
4.2. Technical Management KPIs
4.2.1. Average Time since First Intervention
4.2.2. Mean Time to Repair (MTTR)
4.2.3. Average Request Closing Time
4.2.4. Supported Devices for Technical Personnel
4.2.5. One-Day Action
4.2.6. Negligent Actions
4.3. Equipment Management KPIs
4.3.1. Average Failure Time
4.3.2. Uptime
4.3.3. Uptime for Life-Saving Equipment
4.3.4. Corrective Maintenance Downtime
4.3.5. Preventive Maintenance Downtime
4.3.6. Inventories That Generate Request
4.3.7. Number of Requests over Number of Devices per Hospital Wards
4.3.8. Failure Rate Category
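For reference, the standard textbook definitions of two of the indicators listed in this section are given below; these are general formulations and may differ in detail from the exact formulas adopted in the CED’s dashboard.

```latex
\[
\mathrm{MTTR} = \frac{\sum_{i=1}^{N}\left(t_i^{\mathrm{close}} - t_i^{\mathrm{open}}\right)}{N},
\qquad
\mathrm{Uptime}\,[\%] = \frac{T_{\mathrm{total}} - T_{\mathrm{down}}}{T_{\mathrm{total}}} \times 100
\]
```

where N is the number of closed repair requests, t_i^open and t_i^close are the opening and closing times of request i, T_total is the observation period, and T_down is the cumulative equipment downtime in that period.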
5. Discussion
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Appendix A. Description of the Indicators Used in the ASST GOM Niguarda’s CED
Process | No. | Indicator | Frequency of Analysis
---|---|---|---
Administrative management | 1 | (number of contracts approved within 5 months of signing budget sheet/total number of contracts) × 100 | Annually
Logistic management | 2 | (number of detection of articles under minimum stock/total number of articles in stock with minimum stock) × 100 | Annually
Logistic management | 3 | (number of articles different from actual stock/total number of articles in stock with minimum stock) × 100 | Annually
Technical management | 4 | (number of operating theatre and intensive care equipment on which safety checks were carried out/total number of operating theatre and intensive care equipment) × 100 | Annually
Technical management | 5 | number of intervention sheets assigned to technicians/number of technicians | Annually
Technical management | 6 | number of intervention sheets open for more than 7 days but less than 30 days/number of technicians | Annually
Technical management | 7 | number of action sheets open for more than 30 days/number of technicians | Annually
Technical management | 8 | average first response time of the technician for urgent calls | Annually
Training | 9 | (number of CI employees who have attended at least one course in a year/total number of employees) × 100 | Annually
Quality | 10 | (number of targets achieved in the year/total targets to be achieved in the year) × 100 | Annually
Appendix B. KPI Description
Name | Description | Scope of Measurement (Process, Outcome, Other) | Formula | Numerator—Inclusion Criteria | Numerator—Exclusion Criteria | Denominator—Inclusion Criteria | Denominator—Exclusion Criteria | Units of Measure
---|---|---|---|---|---|---|---|---
Logistic management | | | | | | | |
Average downtime in the warehouse | Analysis of the average idle time of a repaired instrument in the warehouse | Process | | All devices undergoing maintenance in the workshop | All devices not undergoing maintenance in the workshop | Number of devices that underwent maintenance and are idle in the warehouse | All devices in the warehouse that have not undergone maintenance (Inventory) | Days
Receiving spare parts meantime | Analysis of the average time elapsed between the request for spare parts and the arrival of the material | Process | | All orders requesting spare parts | All orders where spare parts are not requested | All orders where spare parts are requested | All orders where spare parts are not requested | Days
Mean arrival time | Analysis of the average time elapsed between the opening of the ticket and the arrival of the instrument in the warehouse | Process | | All devices requiring workshop maintenance for which the technician does not perform an on-site inspection | All devices that do not require workshop maintenance or for which the technician performs an on-site inspection | All maintenance calls for which the technician does not perform an on-site inspection | All maintenance calls not in the workshop or for which the technician performs an on-site inspection | Hours
Percentage of external downtime | Analysis of machine downtime due to external maintenance on total machine downtime | Process | | All devices requiring maintenance at the external company | All devices that are not sent to the external company | All devices requiring maintenance at the external company | All devices that are not sent to the external company | Percentage
Technical management | | | | | | | |
Average time since first intervention | Analysis of the average time elapsed between the opening of the call and the first intervention | Process | | All devices not under contract | All devices under contract | All tickets for devices not under contract | All tickets for devices under contract | Hours
Mean Time to Repair | Analysis of the average time elapsed between the opening of the call and the completion of the repair | Process | | All closed maintenance requests | All open maintenance requests | All closed maintenance requests | All open maintenance requests | Hours
Average request closing time | Analysis of the average call closure time | Process | | All closed maintenance requests that are not tests | All open maintenance requests | All closed maintenance requests that are not tests | All open maintenance requests | Hours
Supported devices for technical personnel | Number of devices supported by a single technician vs. total number of hospital devices | Process | | All requests assigned to a technician | All requests not assigned to a technician | All requests assigned to a technician | All requests not assigned to a technician | Percentage
One-Day Action | Calculation of the number of interventions carried out in less than 24 h | Process | | All open requests | All closed requests | All requests | - | Percentage
Negligent Actions | | | | | | | |
Equipment management | | | | | | | |
Average Failure Time | Average time elapsed between the resolution of one fault and the occurrence of a second fault | Other | | All equipment | - | All equipment | - | Percentage
Uptime | Actual device availability time vs. theoretical time of use | Process | | All equipment | - | All equipment | - | Percentage
Uptime for life-saving equipment | Actual device availability time vs. theoretical time of use, referred to life-saving equipment | Process | | All life-saving equipment | - | All life-saving equipment | - | Percentage
Corrective Maintenance Downtime | Analysis of device unavailability time caused by Corrective Maintenance interventions | Process | | All equipment | - | All equipment | - | Percentage
Preventive Maintenance Downtime | Analysis of device unavailability time caused by Preventive Maintenance interventions | Process | | All equipment | - | All equipment | - | Percentage
Inventories that generate request | Analysis of inventories (divided by cost centers) generating maintenance calls | Process | | All equipment | - | All equipment | - | Percentage
Number of requests over number of devices per hospital wards | Analysis of how many calls are opened in a given department compared to how many devices are installed | Other | | All equipment | - | All equipment | - | Percentage
Failure rate category | Failure analysis for each device class | Process | | All equipment | - | All equipment | - | Percentage
References
- Drechsler, K.; Grisold, T.; Gau, M.; Seidel, S. Digital Infrastructure Evolution: A Digital Trace Data Study. Available online: https://www.researchgate.net/publication/364199024 (accessed on 5 June 2024).
- Limna, P. The Digital Transformation of Healthcare in the Digital Economy: A Systematic Review. Int. J. Adv. Health Sci. Technol. 2023, 3, 127–132. [Google Scholar] [CrossRef]
- Stoumpos, A.I.; Kitsios, F.; Talias, M.A. Digital Transformation in Healthcare: Technology Acceptance and Its Applications. Int. J. Environ. Res. Public Health 2023, 20, 3407. [Google Scholar] [CrossRef] [PubMed]
- Expert Panel on Effective Ways of Investing in Health (EXPH). Assessing the Impact of Digital Transformation of Health Services. [CrossRef]
- Cancela, J.; Charlafti, I.; Colloud, S.; Wu, C. Digital health in the era of personalized healthcare: Opportunities and challenges for bringing research and patient care to a new level. In Digital Health: Mobile and Wearable Devices for Participatory Health Applications; Elsevier: Amsterdam, The Netherlands, 2021. [Google Scholar] [CrossRef]
- Junaid, S.B.; Imam, A.A.; Balogun, A.O.; De Silva, L.C.; Surakat, Y.A.; Kumar, G.; Abdulkarim, M.; Shuaibu, A.N.; Garba, A.; Sahalu, Y.; et al. Recent Advancements in Emerging Technologies for Healthcare Management Systems: A Survey. Healthcare 2022, 10, 1940. [Google Scholar] [CrossRef] [PubMed]
- Kruk, M.E.; Gage, A.D.; Arsenault, C.; Jordan, K.; Leslie, H.H.; Roder-DeWan, S.; Adeyi, O.; Barker, P.; Daelmans, B.; Doubova, S.V.; et al. High-quality health systems in the Sustainable Development Goals era: Time for a revolution. Lancet Glob. Health 2018, 6, e1196–e1252. [Google Scholar] [CrossRef] [PubMed]
- Abernethy, A.; Adams, L.; Barrett, M.; Bechtel, C.; Brennan, P. The Promise of Digital Health: Then, Now, and the Future Digital Health in the 21st Century. NAM Perspect. 2022, 2022, 1031478/202206e. [Google Scholar] [CrossRef]
- Iadanza, E. Clinical engineering. In Clinical Engineering Handbook, 2nd ed.; Academic Press: Cambridge, MA, USA, 2019. [Google Scholar] [CrossRef]
- World Health Organization (WHO). Human Resources for Medical Devices: The Role of Biomedical Engineers; WHO: Geneva, Switzerland, 2017; pp. 1–234. Available online: https://apps.who.int/iris/bitstream/handle/10665/255261/9789241565479-eng.pdf (accessed on 5 June 2024).
- Hossain, M.A.; Ahmad, M.; Islam, M.R.; David, Y. Evaluation of Performance Outcomes of Medical Equipment Technology Management and Patient Safety: Skilled Clinical Engineer’s Approach. Glob. Clin. Eng. J. 2019, 1, 4–16. [Google Scholar] [CrossRef]
- Iadanza, E.; Gonnelli, V.; Satta, F.; Gherardelli, M. Evidence-based medical equipment management: A convenient implementation. Med. Biol. Eng. Comput. 2019, 57, 2215–2230. [Google Scholar] [CrossRef]
- Bliznakov, Z.; Pappous, G.; Bliznakova, K.; Pallikarakis, N. Integrated software system for improving medical equipment management. Biomed. Instrum. Technol. 2003, 37, 25–33. [Google Scholar] [CrossRef]
- Almomani, H.; Alburaiesi, M.L. Using Computerized Maintenance Management System (CMMS) in Healthcare Equipments Maintenance Operations. 2020. Available online: https://www.researchgate.net/publication/342707011 (accessed on 5 June 2024).
- WHO. Computerized maintenance. In WHO Medical Device Technical Series; WHO: Geneva, Switzerland, 2011; Available online: https://www.who.int/publications-detail-redirect/9789241501415 (accessed on 5 June 2024).
- Abayazeed, S.A.; Hamza, A.O. Software applications in healthcare technology management: A review. J. Clin. Eng. 2010, 35, 25–33. [Google Scholar] [CrossRef]
- Mobarek, I.; Tarawneh, W.; Langevin, F.; Ibbini, M. Fully automated clinical engineering technical management system. J. Clin. Eng. 2006, 31, 46–60. [Google Scholar] [CrossRef]
- Neven, S.; Rahman, S.A. An Automated Medical Equipment Management System Proposed for Small-Scale Hospitals. J. Clin. Eng. 2017, 42, E1–E8. [Google Scholar] [CrossRef]
- Chen, H.; Chiang, R.H.L.; Storey, V.C.; Robinson, J.M. Special Issue: Business Intelligence Research. Business Intelligence and Analytics: From Big Data to Big Impact. Available online: www.freakonomics.com/2008/02/25/hal-varian-answers-your-questions/ (accessed on 5 June 2024).
- Rouhani, S.; Ashrafi, A.; Ravasan, A.Z.; Afshari, S. The impact model of business intelligence on decision support and organizational benefits. J. Enterp. Inf. Manag. 2016, 29, 19–50. [Google Scholar] [CrossRef]
- Sneed, H.; Verhoef, C. Reengineering the Corporation—A Manifesto for IT Evolution. 2001. Available online: https://www.researchgate.net/publication/2366189 (accessed on 5 June 2024).
- Yi, Q.; Hoskins, R.E.; Hillringhouse, E.A.; Sorensen, S.S.; Oberle, M.W.; Fuller, S.S.; Wallace, J.C. Integrating open-source technologies to build low-cost information systems for improved access to public health data. Int. J. Health Geogr. 2008, 7, 29. [Google Scholar] [CrossRef] [PubMed]
- Cucoranu, I.C.; Parwani, A.V.; West, A.J.; Romero-Lauro, G.; Nauman, K.; Carter, A.B.; Balis, U.J.; Tuthill, M.J.; Pantanowitz, L. Privacy and security of patient data in the pathology laboratory. J. Pathol. Inform. 2013, 4, 4. [Google Scholar] [CrossRef] [PubMed]
- Sabherwal, R.; Becerra-Fernandez, I. Business Intelligence Practices Technologies, and Management; Handbook; John Wiley & Sons: Hoboken, NJ, USA, 2013. [Google Scholar]
- ISO 9001:2015. Available online: https://www.iso.org/standard/62085.html (accessed on 5 June 2024).
- Bhardwaj, P.; Joshi, N.K.; Singh, P.; Suthar, P.; Joshi, V.; Jain, Y.K.; Charan, J.; Ameel, M.; Singh, K.; Patil, M.S.; et al. Competence-Based Assessment of Biomedical Equipment Management and Maintenance System (e-Upkaran) Using Benefit Evaluation Framework. Cureus 2022, 14, e30579. [Google Scholar] [CrossRef] [PubMed]
- Gonnelli, V.; Satta, F.; Frosini, F.; Iadanza, E. Evidence-based approach to medical equipment maintenance monitoring. In EMBEC-NBC; Springer: Singapore, 2017. [Google Scholar] [CrossRef]
- Camila, R.d.S.; William, C.A.C.; Renan, F.; Renato, G.O. Reliability indicators in the medical equipment management. IFMBE Proc. 2015, 51, 1566–1570. [Google Scholar] [CrossRef] [PubMed]
- Oshiyama, N.F.; Silveira, A.C.; Bassani, R.A.; Bassani, J.W.M. Medical equipment classification according to corrective maintenance data: A strategy based on the equipment age. Rev. Bras. Eng. Biomed. 2014, 30, 64–69. [Google Scholar] [CrossRef]
- Kumar, U.; Galar, D.; Parida, A.; Stenström, C.; Berges, L. Maintenance performance metrics: A state-of-the-art review. J. Qual. Maint. Eng. 2013, 19, 233–277. [Google Scholar] [CrossRef]
- Automation Systems and Integration-Key Performance Indicators (KPIs) for Manufacturing Operations Management-Part 2: Definitions and Descriptions. 2014. Available online: https://standards.iteh.ai/catalog/standards/sist/8a9efc01-6c74-42a2-ad8f-UNIEN15341:2019 (accessed on 5 June 2024).
- ControlAsset®. Available online: http://www.ellfsrl.it/i-nostri-prodotti/controlasset (accessed on 5 June 2024).
Source | Year | Indicator Used |
---|---|---|
Bhardwaj, P. et al. [26] | 2022 | Uptime; downtime; response time; mean time to repair. |
Iadanza, E. et al. [12] | 2019 | Downtime; uptime; mean time to restoration; mean time between failure; class failure ratio; global failure rate; age failure rate; negligent actions; “1 day” actions; scheduled maintenance (SM) with failure; scheduled maintenance coverage rate; percentage of no problems found in SM; number of devices per technician; cost of service ratio; internal maintenance cost (% with respect to the total maintenance cost); SM cost (% with respect to total cost); corrective maintenance cost (% with respect to total maintenance cost); cost of spare part (% with respect to the total maintenance cost). |
Gonnelli, V. et al. [27] | 2018 | Total CED expense as a percentage of total cost of acquisition (cost of acquisition ratio); CM (and SM) expense as a percentage of total CED expense; in-house (and external contracts) expense as a percentage of total CED expense; spare part (and supplies) costs; hourly cost of technicians (internal and external); repair time; uptime; downtime; class failure rate; age failure rate; number of technicians per number of capital devices; number of SM performed per number of capital devices; percentage of SM with problems (i.e., not coded as NPF); “delinquent work-orders” (i.e., not completed within 30 days). |
Camila, R. S. et al. [28] | 2015 | Mean time between failure; mean time to repair; availability. |
Oshiyama, N. F. et al. [29] | 2013 | Number of corrective maintenance events; total time spent on corrective maintenance; corrective maintenance costs. |
KPI | Formula | Indicator Type |
---|---|---|
Average downtime in the warehouse | | Logistics management
Receiving spare parts meantime | | Logistics management
Mean arrival time | | Logistics management
Percentage of external downtime | | Logistics management
Average time since first intervention | | Technical management
Mean time to repair | | Technical management
Average request closing time | | Technical management
Supported devices for technical personnel | | Technical management
One-day action | | Technical management
Negligent actions | | Technical management
Average failure time | | Equipment management
Uptime | | Equipment management
Uptime for life-saving equipment | | Equipment management
Corrective maintenance downtime | | Equipment management
Preventive maintenance downtime | | Equipment management
Inventories that generate request | | Equipment management
Number of requests over number of devices per hospital wards | | Equipment management
Failure rate category | | Equipment management