Article

Success Evaluation Index Model for Running Healthcare Projects in Hong Kong: A Delphi Approach

1 Department of Building and Real Estate, Hong Kong Polytechnic University, 11 Yuk Choi Road, Hung Hom, Kowloon, Hong Kong
2 Department of Construction Management, University of Washington, Seattle, WA 98195, USA
* Author to whom correspondence should be addressed.
Buildings 2025, 15(3), 332; https://doi.org/10.3390/buildings15030332
Submission received: 21 November 2024 / Revised: 8 January 2025 / Accepted: 17 January 2025 / Published: 22 January 2025
(This article belongs to the Section Construction Management, and Computers & Digitization)

Abstract
Hospital projects or healthcare projects (HPs) are major contributors to greenhouse gas emissions, high energy consumption, and environmental pollution. These problems serve as a clarion call for the development of a standardized list of metrics that define the triple bottom line of sustainability performance, track sustainability progress, and allow for essential comparisons or benchmarking of HPs. Through a comprehensive literature review, a Delphi survey with experts, and a fuzzy synthetic evaluation, the ten most suitable key performance indicators (KPIs) were identified, categorized, and modeled into a normalized HP success index (HPSI). The HPSI comprises relatively weighted (in brackets) KPI categories, namely, ‘project prosecution performance’ (0.287), ‘project purpose performance’ (0.353), and ‘project people performance’ (0.360), for evaluating and comparing the success levels of HPs. The HPSI clarifies the relative contributions of the standardized KPIs to achieving predictable life cycle success levels of HPs. Ultimately, it can be used by policymakers and practitioners to inform life cycle decision-making (e.g., resource/effort allocation toward important contributors to success) in HPs. Future studies should seek to develop a computerized HPSI system, by adding quantitative indicators and ranges of KPIs to the current findings, to objectively and practically assess, monitor, benchmark, and improve HP success across the life cycle.

1. Introduction

It is the goal of global organizations and governments to establish effective healthcare systems that will culminate in healthy societies over time. Good health and wellbeing is one of the pillars of the United Nations’ 2030 Agenda for Sustainable Development, which seeks to expand healthcare coverage worldwide. The world population is projected to expand to 8.5 billion by 2030 and to 9.7 billion by 2050, a trend likely to increase the aged/elderly population from about 1 in 11 persons today to about 1 in 6 persons by 2050 [1]. Although the need for healthcare services is expected to increase sharply in the near future, basic healthcare services are still inaccessible to much of the world’s population [2]. The Organisation for Economic Co-operation and Development acknowledges that insufficient investment leaves the current capacity of the global healthcare system well short of actual healthcare needs [3]. The global experience of the COVID-19 pandemic and its immediate impacts on human lives underscore the wide gaps between effective healthcare systems and actual healthcare needs, undermining the sustainable provision of adequate healthcare services. In addition to the impact of the COVID-19 pandemic, demographic change has exacerbated the need for healthcare facilities in the case of Hong Kong.
With close to 7.5 million people, Hong Kong is currently among the top five most densely populated jurisdictions in the world [4]. This implies a growing need for healthcare, particularly for the expanding elderly demographic. Hong Kong has responded over the years by regularly expanding its base of healthcare facilities. The Hospital Authority, a statutory body established in 1990 under the Hospital Authority Ordinance, has overseen and managed all public healthcare services in Hong Kong since December 1991. As of March 2023, the Hospital Authority manages 43 hospitals and institutions, 49 specialist out-patient clinics, and 74 general out-patient clinics, with a workforce of about 90,000 and over 30,000 beds. Records for the 2022/23 year show a total of 21.88 million patient attendances, outreach visits, and discharges [5]. Additionally, 14 major private healthcare facilities operate under the Hong Kong Private Hospitals Association to complement the efforts of the Hong Kong government in providing healthcare services. However, waiting times remain unfavorable to patients because healthcare facilities and services are limited.
Hong Kong’s elderly demographic is expected to surpass a quarter of the population (i.e., 26.4%) by 2036 [6]. To address this and other healthcare needs, the Hong Kong government is undertaking two consecutive hospital development plans spanning 2016 to 2036, with an estimated total value of USD 64 billion, to upgrade, expand, and develop several healthcare facilities [7]. The first 10-year plan entails the construction of a new acute hospital, the redevelopment or expansion of eleven existing hospitals, the construction of three new community health centers, and the construction of one new supporting services center. The second 10-year plan covers works on nineteen HPs. Together, both plans are expected to provide fifteen thousand additional beds, more than 90 new operating theatres, and other facilities to address the forecasted service demand by 2036 [6,8]. Healthcare projects (HPs) entail the planning, construction, and operation of healthcare facilities and infrastructure for delivering diverse healthcare services. HPs are the most essential components of any effective healthcare system because they serve as the primary instruments for coordinating and integrating healthcare services to reach people [9]. Global governments and organizations are showing commitment by making significant investments in HPs. As of mid-2024, notable HPs in the global pipeline amount to USD 636.8 billion: North America accounts for USD 249.9 billion, western Europe for USD 138.3 billion, and north-east Asia for USD 62.1 billion [10]. HPs have unique features and characteristics, including alignment with rapidly changing healthcare legislation, consistency with best-standard value delivery, a dynamic and complex implementation process, large-scale design and planning requirements, technological sophistication, a competitive marketplace, and numerous changing implementation requirements [11,12].
Owing to the abovementioned issues, HPs are among the most challenging projects to plan, construct, and operate. Consequently, many performance considerations must be addressed to make HPs successful in the construction industry.
Performance evaluation is relevant in the planning, construction, and operation of HPs for assessing the effectiveness and efficiency of management efforts and identifying areas for potential improvement. Amid several available evaluation systems such as post-occupancy evaluation, balanced scorecard, and benchmarking, the key performance indicator (KPI) system is probably the most prominent in construction research, particularly for measuring different aspects of HPs [13,14,15,16,17,18,19,20,21]. Performance evaluation requires responsible persons to set objectives, identify relevant KPIs, collect and analyze performance data, and forward feedback to the appropriate authorities or persons [22].
Performance evaluation systems in HPs specifically have a number of limitations. First, the several frameworks of KPIs proposed for evaluating different aspects and segments of HP performance do not necessarily enjoy universal acceptance [23,24]. The development of performance evaluation systems depends on several factors, including the underlying purpose, the target users of results, the nature of the organization, and up-to-date industry trends [25]. Additionally, the performance expectations of HPs may change over time based on other factors such as applicable legislation. Thus, the available fragmentary frameworks of KPIs require consolidation, updating, modification, and redefinition to best meet the needs of organizations in the present industry. Second, numerous scholarly works have identified suitable KPIs for measuring HP performance [21,26,27]. However, there is limited work on how the proposed KPIs should be evaluated in an objective and reliable manner, rather than depending on the subjective semantic interpretations of individual practitioners. Although knowledge of “what to measure” (i.e., KPIs) is abundant in the literature, understanding of “how to measure” remains limited, restricting the comprehensiveness and practicality of previously proposed models. Finally, past studies are generally limited to independent phases of the HPs investigated [28,29,30]. The successive phases of HPs (e.g., planning, construction, and operation) are interlinked in complex ways, such that performance issues experienced at an earlier phase can be transmitted to subsequent phases. Thus, the KPIs do not exist in isolation. KPIs that promote social sustainability could influence KPIs that promote economic and environmental sustainability.
For example, neglecting to incorporate maintainability principles in project designs (i.e., for social sustainability and longevity of the facility) may create many maintenance problems, which could negatively affect environmental sustainability (i.e., more resources for maintenance), economic sustainability (i.e., high cost of maintenance), and circular economy (i.e., less intense use of the facility, early retirement of the facility) [31,32]. Moreover, there could be service delivery complications during facility operation, affecting functional capability and social sustainability. Therefore, failure to treat the whole life cycle of HPs as a seamless chain of value creating activities is detrimental to their success.
This study is a component of broader research aimed at developing a computer-assisted project success index system to measure, monitor, control, enhance, and benchmark the performance of HPs. The present objectives are to (1) identify the most appropriate KPIs for measuring the life cycle success of HPs and (2) develop a composite HP success index (HPSI) for the construction industry. In separate studies, quantitative indicators will be developed as practical interpretations or definitions of the identified KPIs, and quantitative ranges will be established to grade different performance levels of KPIs according to the expectations of Delphi experts. Finally, the broad research will conclude with the consolidation of all findings into a computerized system to assist practitioners in evaluating, monitoring, benchmarking, and improving HP performance in Hong Kong. Theoretically, the study contributes to research efforts toward reaching consensus on pragmatic KPIs for measuring performance across the life cycle of HPs. The objectivity and practicality of the outcomes will enhance the understanding of organizations and practitioners of what constitutes a successful HP, in terms of effectiveness, efficiency, efficacy, and experience, helping owners and practitioners to design, construct, operate, and manage more successful HPs in Hong Kong. The paper is structured as follows. Following this introduction (Section 1), the research methods are presented (Section 2). Section 3 presents the results, followed by a discussion of the results (Section 4). Section 5 shows an integration of the various underlying KPIs, and Section 6 demonstrates the model application. Finally, Section 7 presents the conclusion and recommendations for further research.

2. Research Methodology

A blended philosophical approach was employed: interpretivism was used to develop a thorough quantitative data collection tool, and positivism was used to refine this tool through a pilot survey before a cross-sectional questionnaire survey collected data for later analysis [32]. The literature review applied interpretivism to pinpoint the KPIs essential for hospital projects. The literature’s findings on KPIs then informed the questionnaire design. Subsequently, a postpositivist philosophy was applied through a pilot survey of experts to evaluate the questionnaire on the KPIs, which were further refined by rephrasing, removing, or adding KPIs. The refined questionnaire was used for quantitative data collection. These steps are detailed subsequently.

2.1. Systematic Literature Search

A systematic review was carried out by searching, screening, and synthesizing pertinent performance or success studies on HPs. The synonymous terms “healthcare project, healthcare center, health center, healthcare facility, hospital, clinic, infirmary, sanatorium, medical center, medical facility, convalescent home, and convalescent facility”, “success, failure, performance, KPI, benchmark, efficiency, and effectiveness”, and “building project, construction project, infrastructure project, engineering project, and construction industry” were combined and searched in the Scopus, Web of Science, and Google Scholar databases. In the Scopus database, the Boolean string of synonymous terms was searched in the title, abstract, and keyword domains of publications and returned 291 results. The same Boolean string was searched in the Web of Science database in the title, abstract, and keyword domains of publications, but this extracted only a handful of results. To obtain better and more manageable results, the search was reconducted in all fields/domains of publications and returned 623 documents. These results were complemented by searching different combinations of synonymous terms in the search pane of Google Scholar. The top twenty (20) publications generated and “sorted by relevance” were selected from each list of results. In total, 419 unique publications were compiled from the Google Scholar search. Altogether, 1333 publications were obtained from the Scopus, Web of Science, and Google Scholar searches conducted around mid-March 2023 (Table 1). These publications were reduced to 806 after eliminating duplicates. The titles and abstracts of the remaining publications were carefully reviewed, and unrelated publications were excluded. The unrelated publications focused on other fields and non-performance-related subjects, e.g., the effectiveness of hospital care delivery.
After this screening process, 85 publications were selected for further consideration. Next, the key sections of these selected publications (including the introduction, results, discussion, and conclusions) were critically reviewed. Publications that did not identify, propose, or evaluate some success criteria, indicators, measures, parameters, or yardsticks at some life cycle phases (e.g., planning, construction, or operation) of HPs were regarded as less relevant and filtered out of the sample. Ultimately, 39 publications that centered on success or performance in the context of HPs were adjudged suitable, ensuring that the findings would be appropriate for empirical research and application in the HP sector. A detailed review of the final sample and consolidation and synthesis of the findings led to the identification and categorization of KPIs. The research phases, processes, methods, and outcomes of the study are shown in Figure 1.

Healthcare Project Success

Success criteria or KPIs are the performance dimensions that are measured to determine the success level of HPs. Studies have proposed different sets of KPIs for evaluating diverse aspects and phases of HPs. The review follows different categorizations of KPIs at the planning and construction phases, as well as the post-construction phase of HPs. In the planning and construction phases, the most popular KPIs are time performance, cost performance, and quality performance [17,18,33,34,35,36,37,38]. Historically, these three KPIs have developed into the basic criteria for measuring success in the project management field in general. Other classical criteria have been introduced to deepen the definition of success, including safety performance, environmental performance, productivity, and resource management [11,20,39]. HPs are naturally complex and uncertain, and they are planned and executed amidst several unpredictable factors. These unavoidable conditions create both positive and negative risks for projects. The related KPIs entail planning and risk management effectiveness, rework scope, and the occurrence and magnitude of litigation, claims, and changes [40,41,42,43].
Project participants play important roles in designing, planning, and constructing HPs by contributing expertise and experience and receiving corresponding rewards in the process. To ensure that HPs engage suitable participants who make the required inputs and obtain fulfilling outcomes, criteria comprising human resource management, participant profitability, client/participant satisfaction, and participant professionalism, competency, and reputation have been proposed in the literature [39,42,44]. The absence of effective networks of relationships creates many problems for HPs, which require a joint effort and a smooth working environment to enable participants to deliver on complex and complicated requirements. Accordingly, KPIs including trust and respect, teamwork and collaboration, communication effectiveness, conflict/dispute occurrence and magnitude, harmonious working relationships, and long-term business relationships have been developed to underlie the measurement and improvement of such relationship-based performance areas [18,30,39]. The implementation of HPs undergoes regular changes because of the rapid update of technology and binding legislation [11,12]. As such, HPs ought to apply continual improvement methodologies to align with contemporary developments in the sector. The associated success criteria are learning and development, innovation and improvement, and building code adherence [11,39,45].
In the post-construction phase, a key consideration is how the constructed facilities can meet the requirements of end-users and the objectives of the owners. The fulfilment derived from the operation of facilities is the “ultimate verdict” on whether HPs have achieved their development purposes. The KPIs pertaining to facility usage and satisfaction comprise long-term community/societal benefits, service lifespan, flexibility and adaptability, commercial profitability/value, service performance, maintenance interruptions to operations, stakeholder/end-user satisfaction, and functional suitability, capacity, and utilization [24,46,47,48,49]. In the course of the operation of healthcare facilities, there arises the need to regularly restore the condition of facilities to commendable standards. The restoration works could concern the architecture (e.g., painting), engineering (e.g., fixing structural defects), installations (e.g., replacing systems and equipment), and environment (e.g., upgrading the therapeutic experience). The effectiveness of facility management practices is likely to contribute to the quality of the healing process. A series of KPIs has been suggested to evaluate the facility management performance of HPs, including maintenance effectiveness, maintenance efficiency, operation and maintenance (O&M) expenditure, O&M safety performance, O&M statutory compliance, maintenance time performance, facility condition, and spare parts management [13,23,26,48,50,51,52].
The organizations operating and maintaining healthcare facilities play significant roles in overall project success across the life cycle. The better these organizations perform, the better shape healthcare facilities are in to deliver the required healthcare services, and vice versa. The success criteria in this category comprise O&M policy/guideline deployment, O&M information management/sharing, and O&M organization/management effectiveness [15,27,51,53]. Sustainability is a critical topic in the global construction industry because the built environment makes notable negative impacts on the environment through waste emissions and resource utilization, while also suffering from disasters such as floods, earthquakes, fires, windstorms, etc. Scholarly experts believe that HPs should be executed in a way that minimizes potential harms and maximizes the resulting benefits over the life cycle. The sustainability-related KPIs are water and waste management, energy utilization, resilience and sustainability, and the current replacement value of the facility [14,16,28,54,55]. The surrounding environment is as important as the actual healthcare facilities in delivering result-oriented healthcare services due to its impacts on people’s perception and adjustment. For instance, a noisy surrounding environment can hinder or prolong the healing process of patients. KPIs such as site/location optimization, visual appearance and appeal, facility integration into the locality, and healthcare culture/image embeddedness have been established to assess the quality of the surrounding environment of healthcare facilities [14,19,21,29,45].
In summary, 54 KPIs were identified from global scholarly works on HP success, performance, or evaluation. By broadly categorizing them, one set of 27 KPIs is suitable for measuring success at the planning and construction phases and another set of 27 KPIs is appropriate for evaluating success at the post-construction phase of HPs. The KPIs at the planning and construction phases were sub-categorized into classical measures, uncertainty and risk, project relationships, project participants, and project improvements groups. Also, the KPIs relevant for the post-construction phase were sub-categorized into facility usage and satisfaction, facility management, organization, sustainability, and surrounding environment groups. This success evaluation framework is comprehensive and versatile because the selected KPIs have been applied to HPs in different jurisdictions including Hong Kong, Ghana, Malaysia, Indonesia, Australia, USA, China, Singapore, Israel, Canada, Spain, UK, and others. Practically, the success evaluation framework will support the decision-making of organizations across the life cycle of HPs by underlying the measurement and improvement of relevant performance areas.

2.2. Data Collection Technique

Presently, there is limited historical and empirical information on using KPIs to assess and monitor the success of HPs over the life cycle. It is necessary to adopt an approach that draws factual data from the rich expertise and experience of qualified respondents. Hence, the Delphi method is the most appropriate choice over other methods such as staticised groups, the nominal group technique (NGT), and focus groups [56,57]. This method solves complicated problems using experts’ opinions, self-validates outcomes through consecutive rounds, eliminates the bias effect and pressure to conform, enhances confidentiality through avoidance of direct communication among experts, preserves the heterogeneity of experts for valid outcomes, and allows objective analysis using different statistical techniques [56,57,58]. In terms of limitations, the Delphi method may not be appropriate for empirical problem solving with a small budget and limited resources, a short time requirement, large sample sizes of respondents, less experienced and qualified respondents, and high attrition of respondents over time. Nevertheless, the Delphi method has gained popularity in complicated construction engineering and management (CEM) research, including sustainable development [59], team integration [60], procurement selection [61], performance evaluation [62,63], and risk assessment and allocation [57]. The process followed in this research study is presented in Figure 1.

2.3. Design and Pilot Testing of Survey Instruments

The KPIs (with definitions) identified from the literature were used to design templates of questionnaires for the Delphi rounds in line with Yeung et al. [62]. The initially prepared questionnaire templates were piloted on two international academics and one local practitioner with experience in HP development. The comments received were considered in revising and finalizing the questionnaires.

2.4. Data Collection Process

The main steps involved in the Delphi process include selecting pre-defined experts, setting the number of survey rounds, and structuring the questionnaires for every round [59]. Decisions on the number of survey rounds depend on the targeted convergence level and improvement of accuracy [56,64] to mitigate participatory fatigue, time and resource constraints, and attrition rate [65]. This study used four survey rounds to obtain information from the experts in line with the CEM norm of two to six rounds [66].
The scale used for the survey was a five-point Likert scale described as 1 = least important to 5 = most important. The scale was chosen because of its relative brevity in collecting responses and its suitability for evaluating unipolar dimensions [60,62,63]. Given the expertise requirement, it is believed that the respondents understood the scale labels and assigned scores to the KPIs appropriately.
To exercise proper control over the Delphi process, the experts were broadly educated on what the study was about, the expected benefits, and their required commitment [57]. Again, the experts were kept in close contact, communication with experts was clear and effective, survey instruments were designed to be simple with a completion duration of about 20 min, surveys were administered in person and by email, and follow-ups were conducted through email and personal visits before the completion deadlines [57,61,62]. These measures somewhat sustained the interest of experts, minimized attrition, and boosted response rates and promptness [56].

2.5. Panel Experts’ Selection

The authenticity of a Delphi study depends greatly on the careful and objective selection of qualified experts [61,64]. Experts should be willing and readily available to review previous opinions so that the consensus level can be boosted [64]. Purposive and snowball sampling approaches were adopted to select qualified experts in this study [57]. Initially, formal invitation letters were delivered to targeted organizations involved in HP development, e.g., the Hospital Authority, the Architectural Services Department, and construction firms. The recipients were requested to nominate suitable experts who work within or with the organizations based on the set criteria [59,61,62]. Snowball sampling was used complementarily to request identified experts to recommend other known experts [57]. Apart from their roles in the organizations, the experts were required to meet the following criteria to be eligible for the panel [61,62,66]:
  • Knowledge and in-depth understanding of the planning, construction, and/or operation of HPs;
  • Recent hands-on experience in planning, constructing, and/or operating HPs; and
  • Played leading roles in the construction industry.
Based on availability and readiness, 19 experts performing significant roles in their organizations were qualified to join the panel and serve as Delphi respondents. This is adequate and in conformance with CEM research because typical panel sizes range from 3 to 93 experts [66]. The background information revealed that the responses represented a balanced view of experienced construction professionals and key stakeholders involved in HP implementation and operation (Table 2). Proper control was exercised over the Delphi process to extract substantial information from the experts within time and resource constraints.

2.6. Formats of the Delphi Survey Rounds

The Round 1 questionnaire was issued to the 19 qualified experts through email around mid-May 2023. The experts were requested to select a maximum of 6 KPIs separately from the planning and construction phase and post-construction phase perceived as being the most representative for measuring HP success. They were encouraged to inclusively suggest and select new KPIs that were not already covered in the checklist and yet were applicable in the Hong Kong context. Follow-ups were conducted until all experts returned the completed questionnaires by the end of July 2023.
The Round 2 questionnaire was issued to panelists in early August 2023. The essence of this round was for experts to affirm or change the selection of KPIs from Round 1. The consolidated feedback (i.e., frequency percentages of KPIs) from Round 1 was provided for experts’ reference. Necessary follow-ups were conducted on experts who could not complete the questionnaire early. By mid-September 2023, all 19 experts had returned duly filled questionnaires.
The Round 3 questionnaire was issued to the panel experts around mid-September 2023. The experts were asked to rate the importance levels of the ten shortlisted KPIs by using the five-point Likert scale. The consolidated feedback from Rounds 1 and 2 was provided for experts’ reference. Upon numerous follow-ups, only 15 questionnaires were completed and returned by late October 2023. Four experts withdrew due to commitments and workload.
In Round 4, the questionnaire was issued to the remaining experts, and they were given the opportunity to confirm or change their previous ratings of the ten shortlisted KPIs in light of the feedback information provided. The survey duration extended across November 2023, allowing the 15 remaining experts to make contributions upon several follow-ups. Thus, the response rate was 79% minimum across the four rounds. This is comparable to Yeung et al. [62] and Yeung et al. [63] with response rates of 79.5% and 35.56%, respectively. Questionnaires for the four survey rounds are attached as Supplementary Materials.

2.7. Analysis Methods

IBM SPSS 20.0 and Excel 2021 were used to perform statistical analyses on the survey data, including Cronbach’s reliability analysis (α), Kendall’s coefficient of concordance (W), frequency analysis, mean score ranking (MS), factor analysis (FA), and fuzzy synthetic evaluation (FSE).

2.7.1. Frequency Analysis

The proportions of experts selecting the KPIs formed the basis for computing frequency percentages and selecting appropriate KPIs. Only KPIs preferred by at least 50% of the panelists were considered significant and shortlisted [61]. This approach is reasonable because the essence of the Delphi process is to ensure adequate consistency among the experts’ solutions or perceptions.
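As a minimal sketch of this shortlisting rule, the snippet below applies the 50% cut-off to hypothetical selection counts; the KPI names and counts are illustrative, not the study's data:

```python
# Hypothetical Round 1/2 selection counts (illustrative, not the study's data):
# a KPI is shortlisted only if at least 50% of the 19 panelists selected it.
selections = {
    "Time performance": 17,
    "Cost performance": 16,
    "Safety performance": 9,
    "Rework scope": 5,
}
n_experts = 19

def shortlist(selections, n_experts, threshold=0.5):
    """Return shortlisted KPIs with their selection frequency percentages."""
    return {
        kpi: round(count / n_experts * 100, 1)
        for kpi, count in selections.items()
        if count / n_experts >= threshold
    }

print(shortlist(selections, n_experts))
```

Here only time and cost performance clear the 50% threshold; the other two KPIs are dropped.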

2.7.2. Kendall’s (W) Analysis

Kendall’s (W) was computed to statistically test the significance of the null hypothesis that experts’ ratings of KPIs were totally unrelated. An agreement level of 0 means that the experts’ ratings are totally unrelated, whereas an agreement level of 1 means that the experts’ ratings are completely identical [66].
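Kendall's W for an experts-by-KPIs rating matrix can be computed as sketched below; this is the standard formulation without a tie-correction term, shown with illustrative data rather than the study's responses:

```python
import numpy as np
from scipy.stats import rankdata

def kendalls_w(scores):
    """Kendall's coefficient of concordance for an (experts x items) score matrix.

    Each expert's scores are first converted to within-expert ranks (ties get
    average ranks); no tie-correction term is applied in this sketch.
    """
    ranks = np.apply_along_axis(rankdata, 1, np.asarray(scores, dtype=float))
    m, n = ranks.shape
    rank_sums = ranks.sum(axis=0)                    # total rank per item
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()  # sum of squared deviations
    return 12.0 * s / (m ** 2 * (n ** 3 - n))

# Identical rankings across experts -> perfect concordance (W = 1).
identical = [[1, 2, 3, 4]] * 3
# Perfectly opposed rankings -> no concordance (W = 0).
opposed = [[1, 2, 3], [3, 2, 1]]
```

With identical expert rankings the function returns 1 and with perfectly opposed rankings it returns 0, matching the interpretation of the agreement levels in the text.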

2.7.3. Mean Score Ranking

MS is a commonly used technique in construction management research to establish the relative importance or criticality of factors [67]. In this study, it was adopted to rank the shortlisted KPIs for evaluating HP success based on importance levels.
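A minimal illustration of mean score ranking follows, using hypothetical ratings rather than the survey data; the KPI names are placeholders:

```python
# Hypothetical 1-5 Likert ratings from a few experts per KPI
# (illustrative only; not the study's Round 3/4 responses).
ratings = {
    "Stakeholder/end-user satisfaction": [5, 4, 5, 4, 5],
    "Cost performance": [4, 4, 3, 5, 4],
    "Rework scope": [3, 2, 3, 3, 2],
}

def rank_by_mean(ratings):
    """Rank KPIs by descending mean importance score."""
    means = {kpi: sum(v) / len(v) for kpi, v in ratings.items()}
    return sorted(means.items(), key=lambda kv: kv[1], reverse=True)

for rank, (kpi, mean) in enumerate(rank_by_mean(ratings), start=1):
    print(rank, kpi, round(mean, 2))
```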

2.7.4. Factor Analysis

Given the list of interrelated KPIs, FA was performed to reduce them into a manageable set of KPI components. Specifically, principal component factor analysis (PCFA) with varimax rotation and Kaiser normalization was used to extract the factors [68]. Prior to the PCFA, statistical tests, including the reliability of the dataset, the correlation matrix, Bartlett’s test of sphericity, and the Kaiser–Meyer–Olkin (KMO) measure of sampling adequacy, were computed to check the appropriateness of the factor model [67]. An eigenvalue benchmark of 1.0 was used to determine the principal factors to retain. Moreover, to adequately represent significant relationships among the extracted components, only factor loadings greater than 0.5 were considered [69].
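The extraction step can be sketched with NumPy under the Kaiser eigenvalue-one criterion; varimax rotation and the preliminary adequacy tests are omitted for brevity, and the toy dataset is an assumption for illustration:

```python
import numpy as np

def pcfa_loadings(X):
    """Principal-component extraction on the correlation matrix,
    retaining components with eigenvalue >= 1.0 (Kaiser criterion)."""
    corr = np.corrcoef(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)
    order = np.argsort(eigvals)[::-1]                # descending eigenvalues
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    keep = eigvals >= 1.0
    loadings = eigvecs[:, keep] * np.sqrt(eigvals[keep])
    explained = eigvals[keep] / eigvals.sum()        # variance proportions
    return loadings, explained

# Toy dataset: two perfectly correlated pairs -> two retained components
X = np.column_stack([[1, 2, 3, 4, 5], [2, 4, 6, 8, 10],
                     [4, 1, 3, 5, 2], [8, 2, 6, 10, 4]])
loadings, explained = pcfa_loadings(X)
```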

2.7.5. Fuzzy Synthetic Evaluation (FSE)

FSE is an aspect of fuzzy set theory (FST) that was introduced by Zadeh [70] “for representing and manipulating ‘fuzzy’ terms… [and] uses degrees of membership in sets rather than strict true/false membership” [71], p. 494. With this modeling technique, multi-evaluations and multi-attributes could be quantified appropriately [72]. Evaluating HP success is a multi-criteria decision-making process because there may be many decision makers, and uncertainty, imprecision, and incomplete information surround the decision-making process [73]. Given the unavoidable linguistic terms such as poor, good, and excellent performance in the fuzzy environment, FSE could be utilized to reach credible decisions from vague facts by defining them linguistically [74]. FSE has diverse applications in construction research, including risk evaluation [66], critical success factors evaluation [75], and PPP implementation evaluation [76].
FSE was used to model the role of KPIs in evaluating HP success. The fuzzy modeling method is preferable to other probabilistic modeling methods in terms of practicality and the complexity of the algorithms [75]. For instance, compared with the normal weighted method, FSE is appropriate because it can better objectify and handle the subjective judgment prevalent in the natural human thinking process [77]. Additionally, the FSE technique has been used for modeling in research where sample sizes are relatively small, i.e., fewer than ten sample points [78,79]. There is no universally agreed sample size criterion for conducting FSE. Hence, coupled with the self-validating Delphi process, the FSE technique is an appropriate choice for this study, which is based on 15 expert responses. The steps below were followed for the FSE modeling [67]:
1. Establish the basic criteria set $F = \{ f_1, f_2, f_3, f_4, \ldots, f_n \}$, where $n$ represents the number of criteria.
2. Label the set of grade choices as $L = \{ L_1, L_2, L_3, L_4, L_5 \}$. The set of grade choices represents the points on the scale of measurement. Thus, the five-point scale is represented as: $L_1$ = least important, $L_2$ = fairly important, $L_3$ = important, $L_4$ = very important, and $L_5$ = most important.
3. Compute the weighting for each criterion or factor component. Using the survey results, the weighting ($W$) is given by:
$$W_i = \frac{M_i}{\sum_{i=1}^{n} M_i}, \quad 0 \le W_i \le 1, \quad \sum W_i = 1 \quad (1)$$
where $W_i$ = the weighting, $M_i$ = the mean score of a specific criterion or factor component, and $\sum M_i$ = the sum of the corresponding mean ratings.
4. Apply the fuzzy evaluation matrix to each factor component. The evaluation matrix is represented as $R_i = (r_{ij})_{m \times n}$, where $r_{ij}$ is the extent to which grade $L_j$ satisfies criterion $f_i$.
5. Derive the final FSE results from the weighting vector and the fuzzy evaluation matrix:
$$D = W_i \circ R_i \quad (2)$$
where $D$ = the final FSE matrix and $\circ$ = the fuzzy composite operator.
6. Normalize the final FSE matrix and compute the HPSI for a specific factor component:
$$HPSI = \sum_{i=1}^{5} D_i \times L_i \quad (3)$$
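The steps above can be reproduced numerically with the KPIG 3 figures reported later in the paper (weights 0.497 and 0.503, and the membership rows from Table 5); the plain weighted-average matrix product is used here as one common choice of fuzzy composite operator:

```python
import numpy as np

# Weights and membership functions for KPIG 3 (from Table 5);
# the matrix product stands in for the fuzzy composite operator.
weights = np.array([0.497, 0.503])           # Eq. (1) weightings of the two KPIs
R = np.array([                               # membership rows over L1..L5
    [0.00, 0.00, 0.07, 0.13, 0.80],          # stakeholder/end-user satisfaction
    [0.00, 0.00, 0.00, 0.20, 0.80],          # construction safety performance
])
grades = np.array([1, 2, 3, 4, 5])

D = weights @ R                              # Eq. (2): final FSE matrix
hpsi = float(D @ grades)                     # Eq. (3): de-fuzzified index, ~4.77
```

The result matches the paper's value of 4.77 for KPIG 3 to two decimal places.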

3. Results

3.1. Selecting the Most Relevant KPIs: Delphi Survey Rounds 1 and 2

Table 3 presents the prioritization of KPIs among the panel experts. Only the KPIs selected by the majority of the panelists (i.e., meeting the 50% shortlisting criterion) were considered significant for further consideration. The ranking follows the percentage of experts that selected each KPI. In both rounds, the traditional iron triangle of construction quality performance, construction time performance, and construction cost performance formed the significant KPIs for evaluating HP success at the planning and construction phase, together with construction safety performance and innovation and improvement. The results of Rounds 1 and 2 both suggest that stakeholder/end-user satisfaction, functional suitability, maintenance effectiveness and efficiency, and functional capacity and utilization are the significant KPIs for measuring HP success at the post-construction phase, in addition to flexibility and adaptability of facility, which emerged only in Round 2. Meanwhile, a new factor, ‘defects rectifications and improvement extent owing to design afterthought’, was suggested but did not meet the shortlisting criterion.

3.2. Rating the Shortlisted KPIs: Delphi Survey Rounds 3 and 4

The ranking of the ten shortlisted KPIs based on mean ratings is indicated in Table 4. The top-ranked KPIs of HP success were construction quality performance (R1 rank = 1; R2 rank = 1), construction safety performance (R1 rank = 3; R2 rank = 2), and stakeholder/end-user satisfaction (R1 rank = 1; R2 rank = 3). Largely, the mean scores of the top-ranked KPIs improved in Round 4, indicating that the experts reconfirmed these KPIs as highly significant for the purpose. Additionally, the high ranking of KPIs from both the planning and construction phase and the post-construction phase reflects the relevance of the entire life cycle in comprehensively assessing HP success. This makes sense because HPs are social projects by nature, and their true value is witnessed during operation in improving health and sustaining the lives of citizens; for example, governments worldwide depended heavily on healthcare facilities to mitigate the impact of the COVID-19 pandemic. Although all KPIs are important, focusing on these ten shortlisted KPIs could help benchmark, track, and improve HP success significantly in Hong Kong.
The Kendall’s (W) values obtained for the panel in Rounds 3 and 4 were 0.395 and 0.594, respectively, and both were significant at the 1% level (Table 4). Accordingly, the null hypothesis that no significant agreement occurs among the experts’ ratings was not supported. The results compare favorably with Yeung et al. [63], who obtained Kendall’s (W) values of 0.123 and 0.253 in similar Rounds 3 and 4, respectively. Although the experts’ agreement level was likely to increase further from the initial 50.38% with extended survey rounds, this study was limited to four rounds to reduce the attrition rate and boost response quality.

3.3. Identification of KPI Groupings for Healthcare Project Success

The 10 shortlisted KPIs were grouped using the PCFA technique. Statistical tests were conducted on the data prior to the PCFA for appropriateness checks. The Cronbach’s (α) value obtained for the 10 selected KPIs was 0.858, exceeding the recommended 0.70 benchmark [80]. This shows that the responses of the panel experts are uniform and consistent and that the adopted scale is reliable. Also, the correlation matrix reveals strong relationships among the 10 selected KPIs, as most of the partial correlation coefficients were above 0.30 [68]. Sampling adequacy was determined by the KMO statistic, which returned an acceptable value of 0.589, above the recommended 0.50 level [68]. Hence, the sample was considered adequate for a satisfactory PCFA. Lastly, Bartlett’s test of sphericity yielded a significant chi-square (χ²) value of 95.87 (p < 0.01), from which it can be inferred that the correlation matrix is not an identity matrix [68]. All the tests confirm that PCFA was appropriate for this study.
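Of the appropriateness checks, the reliability test is simple to sketch from first principles; the rating matrix below is illustrative, not the study's data:

```python
import numpy as np

def cronbach_alpha(ratings):
    """Cronbach's alpha for a (respondents x items) rating matrix."""
    X = np.asarray(ratings, dtype=float)
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = X.sum(axis=1).var(ddof=1)     # variance of summed scores
    return k / (k - 1) * (1.0 - item_vars / total_var)
```

Highly consistent respondents (items rising and falling together) push alpha toward 1.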
The principal component extraction and varimax rotation options were used to generate a three-factor solution. The three factors each had eigenvalues of at least 1.0 and together explained 76.25% of the total variance, and all item loadings had absolute values above 0.50. The resultant three-factor model was thus an adequate representation of the dataset. The factor groupings were subjectively labeled as follows (Table 5):
  • KPIG 1—Project prosecution performance;
  • KPIG 2—Project purpose performance; and
  • KPIG 3—Project people performance.
These three groupings are believed to sufficiently explain and evaluate HP success.

3.4. Deriving the HPSI for Each KPIG of Healthcare Projects

To generate the HPSIs for the respective KPIGs, two levels were established prior to using the FSE technique for the modeling. The first level was the groups (KPIGs) and the second level was the individual factors (KPIs). Overall, three KPIGs and ten KPIs were established for the modeling process. The FSE procedure for evaluating HP success is demonstrated in the subsequent sections.
Step 1: Calculate the weightings of KPIs and KPIGs
The appropriate weightings of the KPIs and KPIGs were calculated with a formula (Equation (1)) by using the mean scores from Table 4. An illustration of the computations from Table 5 is demonstrated subsequently. For instance, the KPIG 1 (project prosecution performance) was made up of six KPIs with a total mean score of 22.47. Hence, the appropriate weighting for construction time performance (KPI5) was computed as:
$$W_{KPI5} = \frac{4.33}{3.13 + 3.13 + 4.13 + 4.33 + 4.00 + 3.73} = \frac{4.33}{22.47} = 0.193$$
Accordingly, the appropriate weightings of all the other KPIs and the KPIGs for construction projects were calculated (see Table 5).
Step 2: Establish the membership functions for the KPIs and KPIGs
In applying the FSE technique, two levels of membership functions (MFs) were determined, from level 2 to level 1. The fuzzy MF, which shows the degree to which an element belongs to a fuzzy set, ranges from 0 to 1: a value of 1 denotes full membership, whereas 0 denotes no membership of the corresponding element in the fuzzy set [77]. To determine the MFs of the KPIGs, the MFs of the KPIs were first established, similar to the procedure for computing the weightings. The MF of a KPI is established from the percentages of respondents who selected the respective five defined grades, i.e., $L_1$ = least important to $L_5$ = most important. Taking ‘construction quality performance’ (KPI1) as an example, 13% and 87% of the panel experts rated it very important and most important, respectively. Hence, the MF for KPI1 was expressed as:
$$MF_{KPI1} = \frac{0.00}{L_1} + \frac{0.00}{L_2} + \frac{0.00}{L_3} + \frac{0.13}{L_4} + \frac{0.87}{L_5} \quad (4)$$
In a simpler format, the MF of KPI1 may be expressed as $(0.00, 0.00, 0.00, 0.13, 0.87)$. The MFs of all KPIs were expressed following the same procedure (Table 5). At level 1, the MFs of the KPIs formed the basis for computing the MFs of the KPIGs using Equation (2). For instance, the MF of KPIG 3 (project people performance) was expressed in computation form as:
$$D_{KPIG\,3} = (0.497,\ 0.503) \circ \begin{pmatrix} 0.00 & 0.00 & 0.07 & 0.13 & 0.80 \\ 0.00 & 0.00 & 0.00 & 0.20 & 0.80 \end{pmatrix} = (0.00,\ 0.00,\ 0.03,\ 0.17,\ 0.80)$$
Similarly, the MFs of KPIGs 1 and 2 were computed and all are presented in Table 5.
The final step was to derive the HPSI for each of the KPIGs. The MFs of the KPIGs at level 1 were used to derive the respective HPSIs with the help of Equation (3). The computations of the HPSIs for KPIG 1, KPIG 2, and KPIG 3 are illustrated below.
$$HPSI_{KPIG\,1} = (0.05,\ 0.05,\ 0.18,\ 0.50,\ 0.22) \times (1,\ 2,\ 3,\ 4,\ 5) = 3.80$$
$$HPSI_{KPIG\,2} = (0.00,\ 0.00,\ 0.00,\ 0.32,\ 0.68) \times (1,\ 2,\ 3,\ 4,\ 5) = 4.68$$
$$HPSI_{KPIG\,3} = (0.00,\ 0.00,\ 0.03,\ 0.17,\ 0.80) \times (1,\ 2,\ 3,\ 4,\ 5) = 4.77$$
Step 3: Developing an overall HPSI model
Since the KPIGs were not significantly correlated with one another, a linear and additive model was adopted to develop the composite HPSI for evaluating HP success [63]. To formulate the linear and additive model, all the KPIGs were first normalized. This is logical and valid because practitioners can easily compare the relative contributions of the variables forming the linear equation and can focus relative attention on the components of the model to improve HP success predictably. Moreover, normalization provides flexibility for practitioners in choosing the most fitting measurement scales for accurate evaluation [60,64]. The derivations of the coefficients are demonstrated below (Equation (5)):
$$Coefficient_{KPIG\,1} = 3.80 \div 13.25 = 0.287$$
$$Coefficient_{KPIG\,2} = 4.68 \div 13.25 = 0.353$$
$$Coefficient_{KPIG\,3} = 4.77 \div 13.25 = 0.360 \quad (5)$$
In essence, the composite HPSI was expressed in the linear equation as Equation (6):
$$HPSI = (0.287 \times \text{project prosecution performance}) + (0.353 \times \text{project purpose performance}) + (0.360 \times \text{project people performance}) \quad (6)$$
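The normalization of Equation (5) can be checked in a few lines; the dictionary keys are the paper's three KPIG labels, and the indices are the Step 2 HPSIs:

```python
# Normalize the three KPIG indices into the Equation (6) coefficients.
indices = {
    "project prosecution performance": 3.80,
    "project purpose performance": 4.68,
    "project people performance": 4.77,
}
total = sum(indices.values())                      # 3.80 + 4.68 + 4.77 = 13.25
coefficients = {k: v / total for k, v in indices.items()}
```

Rounding to three decimals recovers the published coefficients 0.287, 0.353, and 0.360.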

4. Discussion of Results

The HP success evaluation model (Equation (6)) reveals that “project people performance” had the highest coefficient (0.360), closely followed by “project purpose performance” (0.353) and “project prosecution performance” (0.287) (Figure 2). The ranking of the KPI categories, as revealed in Figure 2, shows a progression from short-term or fundamental KPI categories to long-term or ultimate KPI categories. The indices were combined into a linear model, which offers a more objective and reliable approach to aid practitioners in evaluating and comparing HP success levels; practitioners can further benchmark, monitor, and improve HP success levels predictably.

4.1. Project Prosecution Performance (KPIG 1)

The KPIG 1 accounted for 43.91% of the total variance explained, and its factor loadings ranged from 0.724 to 0.895 in the PCFA. It was assigned the smallest coefficient (0.287) in the linear model upon obtaining an HPSI of 3.80. Its underlying KPIs, in descending order, included ‘construction time performance’, ‘construction cost performance’, ‘maintenance effectiveness and efficiency’, ‘functional capacity and utilization’, ‘innovation and improvement’, and ‘flexibility and adaptability of facility’. These KPIs mainly relate to the processes and actions required to implement and operate HPs successfully. For instance, ‘construction time performance’ and ‘construction cost performance’ are KPIs for assessing whether the time and cost of project execution are within or exceed the schedule and budget, respectively. Such assessments are mostly conducted quantitatively through variance analyses. Cost variance, for instance, is the difference between the earned value amount and the cumulative actual costs of a project [81], p. 258. This metric can also be expressed as a ratio, the cost performance index, which is the earned value amount divided by the cumulative actual cost of a project. Schedule variance, in turn, is the difference between the earned value and the planned value, and can likewise be expressed as the schedule performance index, the ratio of the earned value to the planned value [81], p. 259. These variance analyses are key to assessing project progress and performance, and they can be used to assess performance regarding the triple bottom line of sustainability, specifically economic sustainability. The other KPIs, such as ‘maintenance effectiveness and efficiency’, ‘functional capacity and utilization’, ‘innovation and improvement’, and ‘flexibility and adaptability of facility’, are suitable for assessing the project before or during its use. Tushar et al. [82] identified maintenance effectiveness as an important factor that can enhance the service quality of hospitals, ensure affordable and reliable service, and maintain optimal equipment functionality. Similarly, Ebekozien [83], p. 32, stated that “maintenance is a massive investment if functionality and quality are to be sustained”. Fotovatfard and Heravi [84] emphasized the impact of maintenance on saving energy, concluding that the maintenance approach most conducive to energy saving is condition-based maintenance, which entails inspecting equipment at fixed time intervals and replacing outdated equipment to improve functionality and reduce maintenance costs.
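The earned-value relationships described above (CV = EV − AC, CPI = EV/AC, SV = EV − PV, SPI = EV/PV) can be sketched as a small helper; the function name and dictionary keys are illustrative:

```python
def evm_indices(ev, ac, pv):
    """Earned-value variance analysis.
    ev: earned value, ac: cumulative actual cost, pv: planned value."""
    return {
        "cost_variance": ev - ac,       # CV = EV - AC (negative = over budget)
        "cpi": ev / ac,                 # cost performance index
        "schedule_variance": ev - pv,   # SV = EV - PV (negative = behind schedule)
        "spi": ev / pv,                 # schedule performance index
    }
```

For example, a project that has earned 100 units of value at an actual cost of 125 has a CPI of 0.8, i.e., it is over budget.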
Quantitative metrics can be deployed to assess these indicators. For example, ‘flexibility and adaptability of facility’ can be measured by assessing how easily the facility and its installation systems can be adapted to accommodate additional demands from end-users, expressed on a Likert scale. Similarly, ‘maintenance effectiveness and efficiency’ can be assessed based on the frequency or quality of maintenance activities, e.g., zero, less, or more maintenance backlog (expressed on a Likert scale). Regarding the triple bottom line of sustainability, these indicators are essential for assessing the level of social sustainability attained, and they could also serve as indicators for environmental sustainability assessments. For instance, ‘flexibility and adaptability’ assesses how easily a healthcare facility can adapt to accommodate additional user demands; this could prevent building appendages and ensure longer use of the facility, thus promoting a circular economy. Moreover, with the embrace of environmental sustainability as the de facto standard in the healthcare sector [85], ‘innovation and improvement’ is an essential, albeit multifaceted, KPI for assessing HPs. Thus, innovation regarding reuse, recycling, and reduction of resource consumption is an essential KPI for assessing the environmental sustainability performance of HPs; it also entails assessments of carbon emissions to track greenhouse gases from HPs. ‘Innovation and improvement’ could be assessed by the number of environmental excellence awards received by the HP [85].

4.2. Project Purpose Performance (KPIG 2)

The KPIG 2 explained about 17.99% of the total variance, and its two item loadings were −0.802 and 0.856. It had a higher HPSI of 4.68 and a corresponding linear equation coefficient of 0.353. The underlying KPIs included ‘construction quality performance’ and ‘functional suitability’. This set of KPIs relates to how the implementation of HPs meets the defined purpose and objectives. ‘Construction quality performance’ can be measured as the cost of rectifying major defects or nonconformances over the total project cost, expressed as a percentage by multiplying by 100. Based on the estimated percentage value, a Likert scale (i.e., poor, average, good, very good, or excellent performance expectation) could be deployed to rate the ‘construction quality performance’ of a project. To improve this KPI, training workers to enhance their expertise and compliance, and purchasing and using the right materials and equipment, are all necessary to contribute to the required level of quality. These activities have cost implications and are, therefore, referred to as the cost of conformance to quality [81]. Conversely, the cost of nonconformance to quality, or cost of failure, is the cost incurred for not satisfying the quality expectations of the project; such a cost could result from not using the right materials for construction or from a lack of adequate training of the workforce [81]. ‘Functional suitability’, in turn, can be measured as the cost of modifying the facilities (i.e., buildings, components, installations, machinery, systems, etc.) to meet relevant functional requirements as part of the current plan, expressed as a percentage. Likewise, a Likert scale (i.e., poor, average, good, very good, or excellent performance expectation) could be used to rate the ‘functional suitability’ of a facility. Both ‘construction quality performance’ and ‘functional suitability’ are suitable for achieving social sustainability, as these KPIs contribute to wellbeing and satisfaction. As such, they can also be assessed on the level of satisfaction (i.e., very satisfied, satisfied, neutral, not satisfied, very dissatisfied) using a Likert scale [86]. ‘Construction quality performance’ also supports economic sustainability, since quality performance prevents the cost implications of rework due to failure or nonconformance.
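A hedged sketch of the defect-cost metric described above: the percentage computation follows the text, but the band thresholds mapping percentages to performance labels are illustrative assumptions, not values from the paper:

```python
def quality_rating(defect_rectification_cost, total_project_cost,
                   bands=((1, "excellent"), (3, "very good"), (5, "good"),
                          (10, "average"), (float("inf"), "poor"))):
    """Cost-of-nonconformance metric: rectification cost as a share of
    total project cost, mapped to a band. Thresholds are hypothetical."""
    pct = 100.0 * defect_rectification_cost / total_project_cost
    for limit, label in bands:          # first band whose limit covers pct
        if pct <= limit:
            return pct, label
```

Under these assumed thresholds, a project spending 2% of its budget on defect rectification would be rated 'very good'.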

4.3. Project People Performance (KPIG 3)

The KPIG 3 accounted for 14.36% of the total variance explained, with two KPIs having factor loadings of −0.717 and 0.820, respectively. Being the highest ranked, the HPSI for the KPIG 3 was 4.77 and the associated coefficient in the linear model was 0.360. The underlying KPIs were ‘stakeholder/end-user satisfaction’ and ‘construction safety performance’. This group of KPIs concerns how the implementation and operation of projects bring the best experiences to stakeholders. ‘Stakeholder/end-user satisfaction’ can be measured as the usable floor area of facility spaces categorized as satisfactory regarding amenity and comfort engineering over the total usable floor area of a facility; this can be expressed as a percentage and rated using a Likert scale. ‘Construction safety performance’ can be measured as the number of accidents (injuries or casualties) per man-hours worked in an assessment period, and a Likert scale could similarly be employed to rate the level of safety (i.e., from low safety to high safety). These KPIs serve as a direct assessment of social sustainability for the sustainable development of HPs. The social sustainability of a hospital is reflected in its capacity to maintain people’s health and offer them a healthy life [82].

5. Integration of the Various Underlying KPIs

Although the three groups are distinct, they do not operate in isolation; rather, each group or category influences the others through its underlying KPIs. The interactions among the underlying KPIs were qualitatively depicted based on the consolidation of findings from the literature review (see Figure 3). Within the category ‘project prosecution performance’, the underlying KPIs influence one another, in addition to influencing underlying KPIs in the other two categories. For example, delays in ‘construction time performance’ could affect ‘construction cost performance’ through cost overruns. Similarly, ‘flexibility and adaptability of facility’ could influence the ‘maintenance effectiveness and efficiency’ of a facility [87]. Furthermore, ‘innovation and improvement’ of a facility will enhance ‘functional capacity and utilization’ and ‘maintenance effectiveness and efficiency’ [87] (see Figure 3). Moreover, ‘maintenance effectiveness and efficiency’ influences ‘functional capacity and utilization’, since keeping a healthcare facility in a functional state and good condition demands effective and efficient maintenance [83].
Concerning the interaction among categories, the underlying KPIs of ‘project prosecution performance’ could influence the underlying KPIs of the category ‘project people performance’; for instance, ‘functional capacity and utilization’ could affect ‘stakeholder/end-user satisfaction’ [87,88]. Likewise, an underlying KPI such as ‘construction safety performance’ could influence KPIs such as ‘construction time performance’ and ‘construction cost performance’. Accidents on construction sites could lead to loss of life or injuries, which in turn could lead to loss of productive labor and job site closure by safety authorities for investigation; consequently, such safety issues affect ‘construction time performance’ and ‘construction cost performance’. Moreover, ‘project purpose performance’ could influence ‘project prosecution performance’ and ‘project people performance’ via their underlying KPIs. ‘Construction quality performance’, for instance, influences ‘construction cost performance’ and ‘construction time performance’: poor-quality work could incur a cost of nonconformance to quality, i.e., the cost associated with not satisfying the quality expectations of the project, and nonconformance could result in rework that costs the project additional time and money, producing cost and time overruns. Additionally, ‘construction quality performance’ and ‘functional suitability’ in the category ‘project purpose performance’ could both influence ‘stakeholder/end-user satisfaction’ in the category ‘project people performance’ [89] (see Figure 3).
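An influence network of this kind can be encoded as a simple adjacency list; the subset of links below is an illustrative reading of the relationships discussed in this section (abbreviated names, not the full Figure 3):

```python
# Illustrative subset of the KPI influence links discussed above.
influences = {
    "safety": ["time", "cost"],
    "quality": ["time", "cost", "satisfaction"],
    "time": ["cost"],
    "flexibility": ["maintenance"],
    "innovation": ["capacity", "maintenance"],
    "maintenance": ["capacity"],
    "capacity": ["satisfaction"],
    "functional_suitability": ["satisfaction"],
}

def downstream(kpi, graph):
    """All KPIs transitively influenced by `kpi` (depth-first traversal)."""
    seen, stack = set(), [kpi]
    while stack:
        for nxt in graph.get(stack.pop(), []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen
```

Such a representation makes it easy to ask, for example, which downstream KPIs an improvement in 'innovation and improvement' could ultimately reach.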

6. Demonstration of Model Application

A few steps must be followed to compute the composite HPSI for a particular HP using Equation (6). The linear and additive model enables the choice of different units of measurement for the KPIGs in a single assessment process. In this demonstration, the adopted unit of measurement for the KPIGs and KPIs was a seven-point Likert scale defined as 1 = very dissatisfied, 2 = dissatisfied, 3 = slightly dissatisfied, 4 = neutral, 5 = slightly satisfied, 6 = satisfied, and 7 = very satisfied [40]. Although this scale may be sub-optimal because of the inherent subjectivity of the assessment, it is useful for demonstration purposes, pending the establishment of applicable objective scales in future research. The subjectivity could nevertheless be reduced by the choice of appropriate respondents, data collection methods (multi-round surveys, e.g., Delphi, focus groups, etc.), and fuzzy analysis methods that account for the limitations of human cognition. It is assumed that a multi-round Delphi survey was conducted among relevant stakeholders about their perceived (dis-)satisfaction levels with the 10 KPIs in two projects, A and B.
The score of each KPIG was derived from the average of the mean ratings of the comprising KPIs. For project A, the scores of KPIG 1, KPIG 2, and KPIG 3 were assumed to be 3, 4, and 5, respectively. For project B, the scores of KPIG 1, KPIG 2, and KPIG 3 were assumed to be 6, 7, and 6, respectively. These sets of scores were then separately substituted into the model to derive the overall HPSI values. For project A, the HPSI value was $0.287 \times 3 + 0.353 \times 4 + 0.360 \times 5 = 4.07$, approximately 4 out of 7. The result means that the surveyed stakeholders are neither dissatisfied nor satisfied with the success of project A. Regarding project B, the HPSI value was computed as $0.287 \times 6 + 0.353 \times 7 + 0.360 \times 6 = 6.35$, approximately 6 out of 7. Thus, the stakeholders are generally satisfied with the success of project B. The computations allow comparison and benchmarking of the success levels of HPs.
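The two worked examples can be reproduced directly from Equation (6); the function name is illustrative:

```python
def composite_hpsi(kpig1, kpig2, kpig3):
    """Composite HPSI per Equation (6), given the three KPIG scores."""
    return 0.287 * kpig1 + 0.353 * kpig2 + 0.360 * kpig3

project_a = composite_hpsi(3, 4, 5)   # ~4.07 of 7: neither dissatisfied nor satisfied
project_b = composite_hpsi(6, 7, 6)   # ~6.35 of 7: generally satisfied
```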

7. Conclusions and Recommendations

In this study, KPIs were identified toward developing a composite HPSI model to evaluate and compare life cycle success levels of HPs. A comprehensive literature review was first conducted to identify KPIs across the life cycle of HPs. Then, a Delphi survey was conducted among experts and a consensus was reached on the ten most essential KPIs for representing the life cycle of HPs. These included: ‘construction safety performance’, ‘stakeholder/end-user satisfaction’, ‘construction quality performance’, ‘functional suitability’, ‘construction cost performance’, ‘flexibility and adaptability of facility’, ‘construction time performance’, ‘maintenance effectiveness and efficiency’, ‘innovation and improvement’, and ‘functional capacity and utilization’. Three categories were developed from the KPIs, viz.: ‘project prosecution performance’, ‘project purpose performance’, and ‘project people performance’. Through a fuzzy synthetic evaluation technique, a normalized composite HPSI was developed. The normalized indices of the three categories of indicators (in brackets) revealed their relative importance as follows: ‘project prosecution performance’ (0.287), ‘project purpose performance’ (0.353), and ‘project people performance’ (0.360). These indices were combined through a linear additive model to develop a composite HPSI evaluation model. The HPSI has both practical and theoretical implications. The HPSI can be used to track HP performance regarding the triple bottom line of sustainable development. Moreover, it can be deployed for comparison or benchmarking of HPs for performance improvement, and can therefore be used to inform decision-making around HPs by policymakers and practitioners. Furthermore, through the relative importance of the indices, the HPSI seeks to inform policymakers and practitioners working on HPs on the allocation of project resources/efforts among the three categories of KPIs.
For instance, among the three categories of KPIs, more resources (i.e., financial resources, material resources, and human resources) should be allocated to attain ‘project people performance’, followed by ‘project purpose performance’ and ‘project prosecution performance’. This prioritization concerning the allocation of resources among the KPI categories is based on the relative weightings.
Notwithstanding the relevance of the study, there are limitations worth noting. The interactions among the underlying KPIs were depicted based on a synthesis of findings from the literature; future research could empirically assess the cause–effect relationships among the KPIs. The fuzzy synthetic evaluation technique can also be laborious regarding the computation of indices. Accordingly, future studies should seek to add other variables (i.e., quantitative indicators and ranges of KPIs) to the current findings to develop a computerized success evaluation system for evaluating, monitoring, improving, and benchmarking the life cycle success of HPs in an objective, reliable, and practical manner. As this study demonstrated the model with hypothetical projects, it is recommended that future research incorporate case studies of HPs that have implemented the selected KPIs and achieved sustainability goals. Finally, this study focused on the perspectives of construction experts concerning the KPIs. Future studies could focus on the perspectives of other stakeholders, such as patients, healthcare providers, and community members, on the importance of sustainability indicators or KPIs in HP success.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/buildings15030332/s1, Questionnaire Appendix S1: Hospital Project Delphi Questionnaires Appendix; Questionnaire Round 1 S2: Hospital Project Delphi Questionnaires Round 1; Questionnaire Round 2 S3: Hospital Project Delphi Questionnaires Round 2; Questionnaire Round 3 S4: Hospital Project Delphi Questionnaires Round 3; Questionnaire Round 4 S5: Hospital Project Delphi Questionnaires Round 4.

Author Contributions

Conceptualization, A.P.-C.C., M.-W.C. and A.D.; data curation, G.D.O.; formal analysis, G.D.O.; funding acquisition, A.P.-C.C., M.-W.C. and A.D.; investigation, G.D.O.; methodology, A.P.-C.C., M.-W.C. and A.D.; project administration, M.-W.C. and A.D.; supervision, A.P.-C.C.; validation, A.P.-C.C., M.-W.C. and A.D.; visualization, G.D.O. and M.A.A.; writing—original draft, G.D.O. and M.A.A.; writing—review and editing, A.P.-C.C. and A.D. All authors have read and agreed to the published version of the manuscript.

Funding

This paper is a component of the Research Grants Council (RGC)-funded study “Developing a computerized project success index system to monitor and benchmark the performance of hospital projects” (RGC General Research Fund: 15205421). Accordingly, this paper shares a similar background and literature, but dissimilar scope, objectives, and outcomes, with other papers that have been/may be published elsewhere.

Data Availability Statement

The study’s dataset will be made available to interested persons upon request.

Acknowledgments

The authors would like to thank the Delphi experts who contributed their HP experiences to the study.

Conflicts of Interest

The authors declare no conflicts of interest.

Figure 1. Overview of the research framework.
Figure 2. Framework of KPI categories for performance assessment of healthcare projects.
Figure 3. Integration of the underlying KPIs of healthcare project success.
Table 1. Search of publications in selected academic databases.
Academic Database | Search Domain | Number of Results
Scopus | Title, abstract, and keywords | 291
Web of Science | All fields | 623
Google Scholar | All fields | 419
Search string (Scopus and Web of Science): (“healthcare project” OR “healthcare center” OR “health center” OR “healthcare facility” OR “hospital” OR “clinic” OR “infirmary” OR “sanatorium” OR “medical center” OR “medical facility” OR “convalescent home” OR “convalescent facility”) AND (“success” OR “failure” OR “performance” OR “KPI” OR “benchmark” OR “efficiency” OR “effectiveness”) AND (“building project” OR “construction project” OR “infrastructure project” OR “engineering project” OR “construction industry”)
Search string (Google Scholar): (“healthcare project”, “healthcare center”, “health center”, “healthcare facility”, “hospital”, “clinic”, “infirmary”, “sanatorium”, “medical center”, “medical facility”, “convalescent home”, “convalescent facility”) AND (“success”, “failure”, “performance”, “KPI”, “benchmark”, “efficiency”, “effectiveness”) AND (“building project”, “construction project”, “infrastructure project”, “engineering project”, “construction industry”)
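The near-identical Boolean strings in Table 1 can be assembled programmatically from the three synonym groups rather than maintained by hand, which reduces transcription errors across databases. A small sketch (the function and variable names are illustrative, not from the paper):

```python
# Hedged sketch: building the Table 1 Boolean search string from the three
# synonym groups (Scopus/Web of Science syntax with explicit OR operators).

FACILITY_TERMS = ["healthcare project", "healthcare center", "health center",
                  "healthcare facility", "hospital", "clinic", "infirmary",
                  "sanatorium", "medical center", "medical facility",
                  "convalescent home", "convalescent facility"]
PERFORMANCE_TERMS = ["success", "failure", "performance", "KPI",
                     "benchmark", "efficiency", "effectiveness"]
PROJECT_TERMS = ["building project", "construction project",
                 "infrastructure project", "engineering project",
                 "construction industry"]

def or_group(terms):
    """Wrap a synonym group as a quoted, OR-joined parenthesized clause."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

# The three clauses are intersected, mirroring the Table 1 search logic.
query = " AND ".join(or_group(group) for group in
                     (FACILITY_TERMS, PERFORMANCE_TERMS, PROJECT_TERMS))
```

The Google Scholar variant in Table 1 differs only in using commas within groups, so a second join character parameter would cover it.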
Table 2. Demographic information of the panel experts.
Professional background: Project/Construction Manager, 5 (26.32%); Quantity Surveyor, 4 (21.05%); Architect, 4 (21.05%); Facility/Property Manager, 1 (5.26%); Engineer, 3 (15.79%); Hospital Administrator, 1 (5.26%); Medical Professional, 1 (5.26%); Total: 19 (100%)
Level of experience: 1–5 years, 9 (47.37%); 6–10 years, 5 (26.32%); 11–15 years, 1 (5.26%); above 15 years, 4 (21.05%); Total: 19 (100%)
Number of healthcare projects: 1–2, 6 (31.58%); 3–4, 3 (15.79%); 5–6, 3 (15.79%); ≥6, 7 (36.84%); Total: 19 (100%)
Sector of client (multiple responses possible): Public, 15 (78.95%); Private, 5 (26.32%); Quasi-public, 3 (15.79%)
Phase of healthcare project (multiple responses possible): Planning phase, 15 (78.95%); Construction phase, 17 (89.47%); Post-construction phase, 10 (52.63%)
Table 3. Results on the selection of the most representative KPIs.
Key Performance Indicator (KPI) | Round 1 Count (%), Rank | Round 2 Count (%), Rank
Planning and Construction Phases
Construction quality performance | 17 (89.47%), 1 | 16 (84.21%), 1
Construction time performance | 16 (84.21%), 2 | 16 (84.21%), 1
Construction safety performance | 13 (68.42%), 4 | 16 (84.21%), 1
Construction cost performance | 15 (78.95%), 3 | 15 (78.95%), 4
Innovation and improvement | 11 (57.89%), 5 | 13 (68.42%), 5
Risk management effectiveness | 5 (26.32%), 6 | 8 (42.11%), 6
Teamwork and collaboration | 5 (26.32%), 6 | 6 (31.58%), 7
Change occurrence and magnitude | 4 (21.05%), 8 | 5 (26.32%), 8
Environmental performance | 3 (15.79%), 10 | 4 (21.05%), 9
Planning effectiveness | 4 (21.05%), 8 | 4 (21.05%), 9
Building codes adherence | 3 (15.79%), 10 | 3 (15.79%), 11
Conflict/dispute occurrence and magnitude | 3 (15.79%), 10 | 2 (10.53%), 12
Participant professionalism and competency | 3 (15.79%), 10 | 2 (10.53%), 12
Construction productivity | 1 (5.26%), 18 | 1 (5.26%), 14
Construction resource management | 1 (5.26%), 18 | 1 (5.26%), 14
Communication effectiveness | 2 (10.53%), 16 | 1 (5.26%), 14
Client/participant satisfaction | 3 (15.79%), 10 | 1 (5.26%), 14
* Defects rectifications and improvement extent owing to design afterthought | 1 (5.26%), 18 | 1 (5.26%), 14
Litigation occurrence and magnitude | 3 (15.79%), 10 | 0 (0.00%), 19
Claim occurrence and magnitude | 2 (10.53%), 16 | 0 (0.00%), 19
Scope of rework | 0 (0.00%), 21 | 0 (0.00%), 19
Long-term business relationships | 0 (0.00%), 21 | 0 (0.00%), 19
Harmonious working relationships | 0 (0.00%), 21 | 0 (0.00%), 19
Trust and respect | 0 (0.00%), 21 | 0 (0.00%), 19
Participant profitability | 0 (0.00%), 21 | 0 (0.00%), 19
Professional reputation/image attainment | 0 (0.00%), 21 | 0 (0.00%), 19
Human resource management | 0 (0.00%), 21 | 0 (0.00%), 19
Learning and development | 0 (0.00%), 21 | 0 (0.00%), 19
Post-Construction Phase
Stakeholder/end-user satisfaction | 15 (78.95%), 1 | 17 (89.47%), 1
Functional suitability | 13 (68.42%), 2 | 14 (73.68%), 2
Maintenance effectiveness and efficiency | 10 (52.63%), 3 | 14 (73.68%), 2
Functional capacity and utilization | 10 (52.63%), 3 | 12 (63.16%), 4
Flexibility and adaptability of facility | 8 (42.11%), 5 | 10 (52.63%), 5
Service performance | 8 (42.11%), 5 | 8 (42.11%), 6
Resilience and sustainability of facility | 7 (36.84%), 7 | 8 (42.11%), 6
Energy utilization | 6 (31.58%), 8 | 6 (31.58%), 8
Long-term community/societal benefits | 4 (21.05%), 10 | 4 (21.05%), 9
Service lifespan of facility | 5 (26.32%), 9 | 3 (15.79%), 10
Maintenance interruptions to operations | 4 (21.05%), 10 | 3 (15.79%), 10
Operation and maintenance (O&M) expenditure | 3 (15.79%), 13 | 2 (10.53%), 12
O&M safety performance | 1 (5.26%), 17 | 2 (10.53%), 12
O&M organization/management effectiveness | 2 (10.53%), 15 | 2 (10.53%), 12
Healthcare culture/image embeddedness | 4 (21.05%), 10 | 2 (10.53%), 12
Maintenance time performance | 1 (5.26%), 17 | 1 (5.26%), 16
Facility condition | 3 (15.79%), 13 | 1 (5.26%), 16
Site/location optimization | 1 (5.26%), 17 | 1 (5.26%), 16
Facility integration into locality | 1 (5.26%), 17 | 1 (5.26%), 16
Commercial profitability/value | 2 (10.53%), 15 | 0 (0.00%), 20
O&M statutory compliance | 0 (0.00%), 21 | 0 (0.00%), 20
Spare parts management | 0 (0.00%), 21 | 0 (0.00%), 20
O&M policy/guideline deployment | 0 (0.00%), 21 | 0 (0.00%), 20
O&M information management/sharing | 0 (0.00%), 21 | 0 (0.00%), 20
Water and waste management | 0 (0.00%), 21 | 0 (0.00%), 20
Current replacement value of facility | 0 (0.00%), 21 | 0 (0.00%), 20
Visual appearance and appeal | 0 (0.00%), 21 | 0 (0.00%), 20
* Newly suggested KPI.
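The shortlisting pattern implied by Table 3 (our reading, not a rule stated verbatim by the authors) is that a KPI advanced to the rating rounds when more than half of the 19 experts selected it in Round 2; every KPI at or above 52.63% was retained and every KPI at or below 42.11% was dropped. A sketch of that cut-off logic:

```python
# Illustrative shortlisting rule inferred from Table 3: retain KPIs that a
# majority of the 19-expert panel selected in Round 2 (counts from Table 3;
# KPIs with fewer than 8 selections omitted here for brevity).

round2_counts = {
    "Construction quality performance": 16,
    "Construction time performance": 16,
    "Construction safety performance": 16,
    "Construction cost performance": 15,
    "Innovation and improvement": 13,
    "Risk management effectiveness": 8,
    "Stakeholder/end-user satisfaction": 17,
    "Functional suitability": 14,
    "Maintenance effectiveness and efficiency": 14,
    "Functional capacity and utilization": 12,
    "Flexibility and adaptability of facility": 10,
    "Service performance": 8,
}

N_EXPERTS = 19

def shortlist(counts, panel_size):
    """Keep KPIs selected by more than half of the panel."""
    return sorted(kpi for kpi, n in counts.items() if n / panel_size > 0.5)

selected = shortlist(round2_counts, N_EXPERTS)
# Yields the ten KPIs carried forward into the rating rounds (Table 4).
```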
Table 4. Results on the rating of the shortlisted KPIs.
S/N | Shortlisted Key Performance Indicator (KPI) | Round 3 Mean, Rank | Round 4 Mean, Rank
KPI1 | Construction quality performance | 4.67, 1 | 4.87, 1
KPI2 | Construction safety performance | 4.60, 3 | 4.80, 2
KPI3 | Stakeholder/end-user satisfaction | 4.67, 1 | 4.73, 3
KPI4 | Functional suitability | 4.20, 5 | 4.47, 4
KPI5 | Construction time performance | 4.40, 4 | 4.33, 5
KPI6 | Construction cost performance | 4.00, 7 | 4.13, 6
KPI7 | Maintenance effectiveness and efficiency | 4.13, 6 | 4.00, 7
KPI8 | Functional capacity and utilization | 3.67, 8 | 3.73, 8
KPI9 | Innovation and improvement | 3.33, 10 | 3.13, 9
KPI10 | Flexibility and adaptability of facility | 3.53, 9 | 3.13, 9
N = 15
Agreement level (Kendall’s W): 0.395 (Round 3); 0.594 (Round 4)
Asymp. sig.: 0.000 (both rounds)
Improvement in agreement level: 50.38%
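Table 4 reports Kendall’s coefficient of concordance (W) rising from 0.395 in Round 3 to 0.594 in Round 4, a 50.38% improvement in panel agreement. For reference, W is computed from a raters-by-items rank matrix as W = 12S / (m²(n³ − n)), where S is the sum of squared deviations of the item rank sums from their mean, m is the number of raters, and n the number of items. A simplified sketch without tie correction (illustrative only, not a reproduction of the reported values):

```python
def kendalls_w(ranks):
    """Kendall's coefficient of concordance for a raters-by-items rank
    matrix (no tie correction, for illustration)."""
    m = len(ranks)        # number of raters (experts)
    n = len(ranks[0])     # number of items (KPIs) ranked
    rank_sums = [sum(rater[i] for rater in ranks) for i in range(n)]
    mean = sum(rank_sums) / n
    s = sum((rs - mean) ** 2 for rs in rank_sums)  # squared deviations
    return 12 * s / (m ** 2 * (n ** 3 - n))

# Three hypothetical raters in perfect agreement give W = 1.0; fully
# opposed rankings drive W toward 0.
identical = [[1, 2, 3, 4, 5]] * 3
```

W close to 1 indicates strong consensus, which is why the rise across rounds justified terminating the Delphi process after Round 4.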
Table 5. Summary of fuzzy analysis results.
KPIG1: eigenvalue 4.39 (4.67); % of variance explained 43.91 (46.74); cumulative % 43.91 (46.74); group mean 22.467; weighting 0.544; level-1 MF (0.05, 0.05, 0.18, 0.50, 0.22); index value 3.803; normalized value 0.287; rank 3rd (V. Imp.)
    KPI9: loading 0.895; mean 3.133; weighting 0.139; level-2 MF (0.13, 0.00, 0.47, 0.40, 0.00)
    KPI10: loading 0.868; mean 3.133; weighting 0.139; level-2 MF (0.20, 0.00, 0.27, 0.53, 0.00)
    KPI6: loading 0.848; mean 4.133; weighting 0.184; level-2 MF (0.00, 0.07, 0.07, 0.53, 0.33)
    KPI5: loading 0.822; mean 4.333; weighting 0.193; level-2 MF (0.00, 0.07, 0.07, 0.33, 0.53)
    KPI7: loading 0.817; mean 4.000; weighting 0.178; level-2 MF (0.00, 0.07, 0.13, 0.53, 0.27)
    KPI8: loading 0.724; mean 3.733; weighting 0.166; level-2 MF (0.00, 0.07, 0.20, 0.67, 0.07)
KPIG2: eigenvalue 1.80 (1.78); % of variance explained 17.99 (17.76); cumulative % 61.89 (64.50); group mean 9.333; weighting 0.226; level-1 MF (0.00, 0.00, 0.00, 0.32, 0.68); index value 4.675; normalized value 0.353; rank 2nd (M. Imp.)
    KPI1: loading 0.856; mean 4.867; weighting 0.521; level-2 MF (0.00, 0.00, 0.00, 0.13, 0.87)
    KPI4: loading −0.802; mean 4.467; weighting 0.479; level-2 MF (0.00, 0.00, 0.00, 0.53, 0.47)
KPIG3: eigenvalue 1.44 (1.17); % of variance explained 14.36 (11.75); cumulative % 76.25 (76.25); group mean 9.533; weighting 0.231; level-1 MF (0.00, 0.00, 0.03, 0.17, 0.80); index value 4.767; normalized value 0.360; rank 1st (M. Imp.)
    KPI3: loading 0.820; mean 4.733; weighting 0.497; level-2 MF (0.00, 0.00, 0.07, 0.13, 0.80)
    KPI2: loading −0.717; mean 4.800; weighting 0.503; level-2 MF (0.00, 0.00, 0.00, 0.20, 0.80)
Total mean for KPIGs: 41.333
Note: initial values in parentheses; extraction method: principal component analysis; rotation method: varimax with Kaiser normalization; rotation converged in five iterations; MF = estimated membership function over the five-point grade scale; v. imp. = very important; m. imp. = most important.
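The fuzzy synthetic evaluation behind Table 5 can be illustrated with the KPIG2 figures: the level-1 membership function (MF) of a KPI category is the weighted combination of its KPIs’ level-2 MFs, and the index value is the aggregated MF defuzzified against the five-point grade scale. A sketch under those assumptions (variable and function names are ours):

```python
# Illustrative fuzzy synthetic evaluation step using the KPIG2 data in
# Table 5 (KPI1 and KPI4), not the authors' code.

GRADES = (1, 2, 3, 4, 5)  # five-point importance grade scale

def aggregate_mf(weights, mfs):
    """Level-1 MF = weighted sum of level-2 membership functions."""
    return [sum(w * mf[g] for w, mf in zip(weights, mfs))
            for g in range(len(GRADES))]

def index_value(mf):
    """Defuzzify a membership function against the grade scale."""
    return sum(m * g for m, g in zip(mf, GRADES))

weights = [0.521, 0.479]                  # KPI1, KPI4 weightings
mfs = [[0.00, 0.00, 0.00, 0.13, 0.87],   # KPI1 level-2 MF
       [0.00, 0.00, 0.00, 0.53, 0.47]]   # KPI4 level-2 MF

level1_mf = aggregate_mf(weights, mfs)
index = index_value(level1_mf)
# Rounds to the reported level-1 MF (0.00, 0.00, 0.00, 0.32, 0.68); the
# index agrees with the reported 4.675 up to the table's rounding.
```

Running the same two steps for KPIG1 and KPIG3 and normalizing the three indices reproduces the 0.287/0.353/0.360 HPSI weights.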
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
