Article

Evaluation of Instructor Capability: Perspective from Building Information Modeling Competition Students in Mainland China

1 Department of Civil Engineering, College of Civil Engineering and Mechanics, Yanshan University, Qinhuangdao 066104, China
2 Hebei Key Laboratory of Green Construction and Intelligent Maintenance for Civil Engineering, Yanshan University, Qinhuangdao 066104, China
* Author to whom correspondence should be addressed.
Buildings 2024, 14(11), 3598; https://doi.org/10.3390/buildings14113598
Submission received: 10 October 2024 / Revised: 7 November 2024 / Accepted: 11 November 2024 / Published: 12 November 2024

Abstract
Building information modeling (BIM) technology has developed rapidly and become a focus of digital education in architecture, engineering, and construction (AEC) related disciplines. However, BIM education still suffers from a disconnect between theoretical teaching and engineering practice. In mainland China, BIM competitions are an important digital platform for practice-oriented teaching in higher education. As the organizers and leaders of student participants, BIM instructors, and in particular their BIM capability, have an important impact on digital education. Previous research on BIM capability has not addressed the field of BIM education, and existing BIM capability frameworks are not entirely applicable to evaluating the BIM capability of instructors. Semi-structured interviews based on grounded theory (GT) and structural equation modeling (SEM) were used to construct a five-dimensional model containing 23 capability indicators. The findings highlight the multi-dimensional nature of instructors' BIM capability and indicate that personnel capability is the most important dimension for instructors, while process capability is considered the most dispensable. This differs significantly from the emphasis on technical and process capability in BIM capability research in the traditional AEC industry. It was also found that students from different levels of universities and different educational backgrounds have different demands for the BIM capability of instructors. The results of this study will help universities select excellent instructors, improve the quality of BIM education in universities, and cultivate more outstanding BIM talents for the development of BIM in the AEC industry, thereby promoting the digital practice of BIM education in universities.

1. Introduction

The interdependence between different stakeholders is becoming increasingly complex, and construction projects are becoming more difficult to manage [1,2,3]. To cope with the increasingly complex situation of projects, information and communication technology (ICT) has been developing at a very fast pace, and building information modeling (BIM) has emerged [4]. BIM is a new method of approaching the design, construction, and maintenance of buildings. It is defined as “a set of interacting policies, processes, and technologies that generate a method for managing basic architectural design and project data in digital format throughout the entire lifecycle of a building” [4]. BIM is also regarded as a digitalized technology under a collaborative working platform [5].
As a revolutionary technology and process, BIM has received enormous attention from both academia and industry [6] and has been widely regarded as a significant opportunity for the architecture, engineering, and construction (AEC) industry [7]. BIM technology offers enormous advantages in the modern construction industry. It not only brings technical benefits to the development process but also delivers an innovative, integrated working platform that improves productivity and sustainability throughout the project life cycle [8]. The virtual construction capabilities of BIM provide project teams with a powerful tool to assess the feasibility and constructability of buildings during the pre-construction phase. BIM enables project owners to review designs and provide feedback through the visualization of three-dimensional (3D) building information models before facility construction. By leveraging this technology, uncertainties are eliminated, designs are optimized, resource waste is minimized, and passive design strategies are effectively implemented. This comprehensive approach ensures seamless project progression, significantly elevates building quality, and promotes sustainability throughout the construction industry [9].
BIM promotes project constructability, improves productivity, and saves time and money. A study by Stanford University's Center for Integrated Facilities Engineering (CIFE) showed that using BIM can cut the time needed for project budget estimation by up to 80%, shorten the total project duration by up to 7%, save up to 10% of the contract value, and keep cost estimates within about 3% of actual costs [10]. Owing to its practicality and economic value, BIM has been widely recognized in the AEC industry [7]. Companies and educational institutions related to the AEC industry have widely adopted BIM technology to meet market demands [11,12,13]. China is a huge AEC market, accounting for 47.9% of the construction industry in the Asia Pacific region, and its construction industry was expected to grow at an average rate of 12.6% from 2013 to 2018 [14]. The Chinese AEC market's demand for BIM applications is also growing steadily [15,16].
As more and more companies use BIM, market demand for BIM talent is increasing steadily [6,17]. However, improving the BIM utilization rate still faces numerous difficulties. Research has shown that the lack of knowledgeable and experienced professionals in the AEC industry is a key obstacle to adopting BIM in projects [18,19,20]. In the past few years, countries such as the United States, the United Kingdom, Singapore, and Australia have carried out BIM standardization and policy-making activities, thereby increasing BIM adoption rates [21]. Both industry and academia recognize that BIM education in university courses is an important requirement for meeting industry and educational needs [22]. Driven by the expectations of the AEC industry and strong government efforts, many educational institutions are working to establish BIM courses in higher education systems [23,24]. Globally active BIM educators and researchers are also striving to provide BIM education frameworks and design BIM courses [25]. After several years of development, the widespread presence of BIM in today's university curricula has led to significant progress in BIM education [25,26]. This is considered a win-win situation, which not only reduces the cost of BIM adoption but also significantly improves employment opportunities for AEC graduates [27,28,29].
However, BIM education still faces a shortage of BIM learning resources, and BIM education in universities mostly focuses on teaching basic BIM skills and theories. There is still a lack of interaction between theoretical education and engineering practice in BIM learning [30,31], and students' practical BIM capabilities cannot meet the requirements of industry. Consequently, the focus of BIM teaching has shifted from technical aspects to emphasizing the implementation process and value proposition of BIM [32]. Project-based learning (PBL) is an effective method for improving students' practical abilities [33]; in essence, it means learning through practice. Borkowski also emphasized the importance of experiential learning in BIM education: solving practical problems in real projects can more effectively enhance students' BIM capabilities [34]. Team members work together to solve real-world challenges and thereby acquire knowledge and skills [35].
BIM competitions are among the most common PBL activities: participating teams solve BIM challenges designed by the organizers within a set period, such as designing a building (BIM Valladolid), designing an energy-efficient construction process, or urbanizing an urban space [36]. There are various types of BIM competitions in China, including the National BIM Graduation Design Innovation Competition and the BIM Technology Application Competition. These competitions have attracted active participation from students at numerous universities, such as Yanshan University. Participants usually form a team, use Revit 2023 to complete designated architectural design tasks, and submit corresponding design proposals and technical reports with the help of instructors. Ao et al. [37] argued that the performance of students participating in competitions is affected by their instructors. Within such a team, instructors, as the organizers and leaders of the participating students, play a crucial role: they not only provide technical support and knowledge transfer but also guide students' thinking patterns, stimulate innovative thinking, cultivate teamwork skills, and enhance project management capabilities. Hence, the selection of instructors has a significant impact on the BIM practice and competition results of participating teams. However, there has been no research on selection criteria for excellent instructors. Given the diversity of team roles played by instructors, this article quantifies these roles as BIM capability indicators based on BIM capability theory.
Few studies provide a precise definition of BIM capability, and the small number that do build on Succar's concept: "BIM capability is the basic capability required to perform tasks or provide BIM services and products. BIM capability is a direct reflection of BIM requirements and deliverables" [38]. On the basis of this definition, Munianday et al. [39] defined individual BIM capability: "A person's personal BIM capability depends on their personal qualities, professional knowledge, and the technical ability required to integrate BIM activities or generate BIM related outputs". Yilmaz et al. [40] regarded BIM capability as the ability to achieve defined BIM results and BIM attribute outcomes. Many scholars have studied and developed the theory of BIM capability, proposing various BIM capability frameworks and evaluation indicators for designers [41], contractors [42], and owners [43]. In construction projects, managers with excellent BIM capabilities can effectively improve project productivity and performance [44,45]. In BIM competitions, participants usually need to use BIM technology for modeling, analysis, and optimization tasks to solve practical engineering problems or propose innovative solutions. From an academic perspective, BIM competitions are therefore a typical BIM activity. However, the BIM capability of BIM competition instructors is not directly equivalent to the BIM capability of any single party in a construction project. The capabilities of instructors in BIM competitions are more comprehensive and can be seen as a combination of partial BIM capabilities of various roles in actual engineering projects, such as owners, project managers, and supervisors. This study compared these capabilities across multiple dimensions to establish preliminary indicators for evaluating the capabilities of instructors in the BIM competition scenario.
To achieve the research objectives, this paper is structured as follows. The existing literature on BIM capability was analyzed and mapped to the role of instructors in digital teaching practice, yielding a preliminary list (V1) of BIM capability indicators. Subsequently, semi-structured interviews based on GT were employed to validate the content and completeness of the indicators (V1), leading to a refined list of instructor BIM capability indicators (V2). Afterward, using data collected from a questionnaire survey, structural equation modeling (SEM) was applied to verify the indicators (V2) and determine the weight coefficients of the dimensions and secondary capability indicators. Ultimately, five primary dimensions and 23 key BIM capability indicators for instructors were identified. The theoretical contributions and practical implications are presented in the discussion section. This study aimed to construct an evaluation system for instructors' BIM capability, providing guidance for enhancing that capability and thus promoting digital education practices in AEC-related disciplines in universities.
Various BIM capability and maturity models have been developed to assist architecture, engineering, construction, and facility management (AEC/FM) organizations in measuring their performance in BIM utilization. However, these models differ in applicability and focus, and there is no universally accepted or widely used model [46]. Existing BIM capability models are not suited to BIM education scenarios, and there is currently no research on a BIM capability model for instructors. It is therefore necessary to evaluate instructor BIM capability. This article constructed a BIM capability evaluation system for competition instructors from the perspective of participating students using a questionnaire and SEM, filling this gap in the field of BIM education. The results indicate that the personnel capability of instructors is the most important, followed by their support capability, while process capability is the least important. This differs significantly from the emphasis on technical and process capabilities in BIM capability research in the traditional AEC industry and confirms that the existing BIM capability frameworks are not suitable for screening excellent instructors. These findings will help universities select better competition instructors, improve the quality of BIM education, and cultivate more outstanding BIM talents for the development of BIM in the AEC industry. Moreover, they lay a theoretical foundation for research on BIM capabilities in the field of BIM education, thereby promoting the digital practice of BIM education in universities. The research framework of this article is shown in Figure 1.

2. Literature Review

BIM is a set of interacting policies, processes, and technologies that enable the construction industry to generate, manage, and store project data in digital format for lifecycle management [47]. BIM can effectively address communication bottlenecks and the lack of collaboration in the traditional construction industry, thereby improving industry performance [48]. Communication gaps and low utilization of lifecycle data are core causes of process inefficiency in the industry, and BIM is expected to solve these problems [4]. Although BIM has significant advantages over traditional building technologies, organizations' lack of BIM experience has hindered its widespread use [49]. It is therefore necessary for organizations to evaluate and understand their current BIM capability in order to improve project performance [50]. In recent years, the theory of BIM capability has emerged, but few studies provide a precise definition of it. The definition of BIM capability in most of the literature originates from Succar's concept: "BIM capability is the basic capability required to perform tasks or provide BIM services and products. BIM capability is a direct reflection of BIM requirements and deliverables" [38]. On this basis, scholars have defined the BIM capability of individuals and various organizations. Munianday et al. [39] argued that individual BIM capability depends on personal qualities, professional knowledge, and the technical abilities required to integrate BIM activities or generate BIM-related outputs. Ahuja et al. [42] held that organizational BIM capability refers to an organization's ability to use technical expertise, manage BIM implementation, and transform BIM technical knowledge into expected project outcomes. To meet the different evaluation objectives of BIM users, various BIM capability and maturity models have been developed for owners [43,51,52], consultant companies [53], assessors [54], and contractors [42]. However, there have been few studies on the capability assessment of instructors in the education field. In the following, based on the role played by instructors in competitions, we analyze the previous literature and preliminarily establish the BIM capability indicator system V1.
Organizational BIM maturity has emerged as a core competitive criterion for pre-qualification and selection [55]. This is also why some BIM standards and national implementation agendas require evaluating the BIM capability of project teams before selection. In the UK, for example, the BIM execution plan must include a Supply Chain Capability Summary (SCCS), which indicates the BIM capability of all companies in the supply chains of major suppliers and contractors. Scotland has also proposed a similar framework for evaluating an organization's BIM capability before project selection or commencement [56]. These emerging standards, frameworks, and tools laid the foundation for BIM capability pre-qualification and selection criteria [4,51,57].
Succar et al. [38] proposed a BIM capability framework with three sets of criteria: technology, process, and policy. The technology dimension refers to physical artifacts, including software, hardware, and network capability; the process dimension encompasses attributes such as BIM resources, activities, and workflows; and the policy dimension covers procedures related to contracts, benchmarks, and guidance documents that support BIM implementation. In the BIM competition scenario, the policy dimension and its subordinate indicators are clearly not applicable to evaluating the abilities of competition instructors. Dib et al. [58] classified BIM capability into planning, management, process, team structure, hardware, process definition, and information management using Principal Component Factor Analysis (PCFA). Mahamadu et al. [57], on the other hand, classified BIM capability attributes for pre-qualification and selection activities into competence, resources, culture, attitude, and cost. Obviously, attitude and cost are not applicable as capability evaluation indicators for instructors.
To assess the BIM capability of design, construction, and facilities management processes in the construction industry, Yilmaz et al. [40] developed the BIM Capability Assessment Reference Model (BIM-CAREM) and demonstrated its usability. This model divides BIM capability into five categories (technical, process, organizational, human factors, and BIM standards) and decomposes them into 48 capability assessment indicators. BIM QuickScan is a tool that allows companies in the Netherlands to quickly assess the BIM level of their construction projects. It contains an online survey with 44 questions that divides BIM capability into four categories: organization and management, mentality and culture, information structure, and information flow. Every category is composed of weighted key performance indicators (KPIs) [59]. Mahamadu et al. [60] divided BIM capability into 11 criteria grouped into four main dimensions: competence, capability and resources, culture and attitude, and cost. These criteria included qualifications, specific BIM modeling capability, organizational experience, administrative and strategic capability, technical (physical) resources, employee experience, suggested methods, organizational structure, technical readiness, organizational reputation, and costs. They emphasized the importance of BIM software and software availability for modeling.
In the framework of Messner [61], strategic and personnel capabilities were considered the most important criteria in capability assessment. Giel and Issa [51] argued that operational capability was the most important BIM capability in owner organizations, followed by strategic capability. Van Berlo et al. [59], on the other hand, argued that mindset and BIM cultural attributes were the most important, followed by information-processing abilities. Although these frameworks were proposed for different assessment environments, they highlight the significant differences among BIM capability assessment standards and their importance, which also underscores the necessity and value of this study. Yilmaz et al. [40] concluded that BIM standards and human factors were unnecessary factors for evaluating BIM capability. Completing the modeling tasks of actual projects is usually complex and time-consuming. Siebelink et al. [49] explained the importance of goals and visions, which provide employees with motivation and belief in promotion and salary increases. Juul and Jensen [62] emphasized the important role of teachers' relationship capability in the educational environment. In the context of BIM competitions, the relationship capability of instructors is reflected mainly in their teamwork and communication with participating students.
However, none of this research addresses selection criteria for instructors. Therefore, this study aimed to construct a framework for evaluating the BIM capability indicators of instructors from the perspective of participating students and to identify the most important capabilities of instructors, which also explains the necessity and practical value of such a framework. The instructor BIM capability indicators V1 obtained from the literature review are shown in Appendix A.

3. Research Methodology

The research methodology is schematically outlined in Figure 2. It started with the identification of 19 instructor BIM capability indicators and their attributes through a literature review. It is worth noting that some of the BIM capability frameworks in Appendix A were designed to evaluate organizations such as owners and contractors; they were included to expand the scope of indicators, on the understanding that certain aspects of these models are relevant and applicable to instructor BIM capability. Then, semi-structured interviews based on GT were conducted with award-winning participating students to collect their opinions on the definition and modification of the instructor BIM capability indicators and to examine the content validity and saturation of the indicators. Using the identified indicators, an online survey questionnaire was designed and distributed to participating students of the 10th National BIM Graduation Design Competition in China. Confirmatory factor analysis (CFA) was used to assess the validity of the factor structure. Finally, path analysis using structural equation modeling (SEM) was conducted to establish the weights assigned to every dimension and indicator.

3.1. Instructor BIM Capability Indicators V2

Through the literature review, the BIM capability indicators V1 for instructors were obtained, as shown in Appendix A. Semi-structured interviews were chosen as a complementary qualitative data collection technique because interviews are a fundamental source of evidence [63] and an important mechanism for establishing structure in the data collection process [64]; their results help obtain in-depth information on the research content [65]. To test the validity and saturation of the capability indicators V1, semi-structured interviews based on GT were conducted until no differing opinions emerged. GT is a general methodology for developing theory from empirical data that are systematically gathered and analyzed [66]. Content analysis is a research tool used to determine the existence of concepts and includes concept analysis and concept mapping [67]. The target audience was award-winning students from universities at different levels. After every interview, we provided a summary to the interviewee to help validate the results. Saturation was reached by the eighth interview, consistent with the guidance of Guest et al. [68]; nevertheless, 12 online semi-structured interviews were conducted in total for greater rigor.
The information of the 12 respondents is shown in Table 1. Although the number of interviews was limited, they covered award-winning students from different levels of universities, and they all provided valuable feedback on the BIM capability indicators of instructors [69].
To promote outstanding research, the Chinese government has been providing significant funding to support a small group of selected universities through three national research projects (Project 211, Project 985, and Double First Class). The “Double First Class” initiative was launched by the Chinese government in 2017, aiming to establish an outstanding higher education system encompassing “First-Class” universities and “First-Class” disciplines, following the official conclusion of Project 985 and Project 211. However, the concepts of Project 985 and Project 211 still remain in use, not only in academia but also in the industry, where graduation from universities under these projects was often considered a threshold for recruitment. These universities had higher financial support and more academic achievements compared to ordinary colleges, but it was more difficult for students to obtain admission qualifications [70].
Based on the opinions of the award-winning students during the interviews, the definitions of the BIM capability indicators for instructors were finalized, and four new capability indicators were added: psychological counseling capability, civil engineering professional knowledge, financial support, and technical assistance. Ultimately, the instructor BIM capability indicators V2 were formed, as shown in Appendix B.
Munianday et al. [39] explained the importance of financial support provided by organizational management in BIM activities, which benefits the sustainable operation of an organization. In the interviews, participating students from Project 985 universities emphasized the importance of instructors applying for funding support, and this was widely recognized by the other interviewees. They believed that financial support allowed them to participate in professional training or seminars and thereby master more BIM skills and knowledge, which enhanced their competitiveness and gave them an advantage over other teams. In addition, it reduced their financial burden, allowing them to focus more on the competition itself and improve their performance. Respondents from vocational colleges explained the necessity of a supervising teacher having solid knowledge of civil engineering. In follow-up interviews, other award-winning students also explained that they sought advice from their supervising teachers on civil engineering knowledge during the competition. Therefore, knowledge of civil engineering was also an effective indicator for evaluating the capability of instructors.
Maintaining close and trustworthy connections with other organizations rich in BIM-related knowledge can provide more professional expertise in BIM applications [71]. Respondents from Project 211 universities mentioned the significance of inviting technical assistance for competition results. They believed that inviting technical assistance played an important role in enhancing the team's technical strength, broadening students' perspectives, improving team collaboration, and improving competition results. Respondents from other universities also stated that instructors inviting technical assistance would benefit competition performance. On the other hand, because of the tight competition schedule and heavy workload, participating students face great pressure. Respondents from regular undergraduate universities hoped that their mentors could provide psychological counseling to alleviate stress, and follow-up visits showed that the other interviewees faced similar issues. Therefore, psychological counseling capability was also an important capability for instructors to possess.

3.2. Data Collection and Screening

3.2.1. Design of Questionnaire

Based on the preliminary research and the tracking survey of the 10th National BIM Graduation Design Innovation Competition, the questionnaire for this study comprised two parts: the first part collected basic information about participating students, including university level, education level, major, competition track, years of exposure to BIM, and the number of BIM-related competitions entered; the second part was an evaluation table for the importance of the instructor BIM capability indicators. Respondents were asked to rate the importance of the 23 identified indicators on a five-point Likert scale ranging from 1 (disagree), 2 (slightly disagree), 3 (neutral), and 4 (slightly agree) to 5 (strongly agree). The 5-point Likert scale is widely used because it provides clear results [72] and is more popular than the 7-point scale because it improves response rates and quality while reducing respondent frustration [73]; it is also simpler for an interviewer to read out the complete scale [74]. At the end of the questionnaire, respondents were given space to describe and evaluate any other factors related to the instructors' capability.
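For transparency about how the two-part instrument translates into analyzable data, the sketch below shows one plausible way to encode responses in Python; the column names and example rows are hypothetical, and the C-codes follow the indicator codes in Appendix B.

```python
import pandas as pd

# Hypothetical encoding of the two-part questionnaire described above. Column names
# (university_level, education, C1..C23) are illustrative; C-codes follow Appendix B.
LIKERT = {"disagree": 1, "slightly disagree": 2, "neutral": 3,
          "slightly agree": 4, "strongly agree": 5}

responses = pd.DataFrame([
    {"university_level": "Double First-Class", "education": "undergraduate",
     "C1": "slightly agree", "C2": "strongly agree", "C3": "neutral"},
    {"university_level": "vocational college", "education": "junior college",
     "C1": "strongly agree", "C2": "neutral", "C3": "slightly agree"},
])

item_cols = [c for c in responses.columns if c.startswith("C")]
# Map the verbal ratings to the 1-5 Likert scores used in the analysis.
responses[item_cols] = responses[item_cols].apply(lambda s: s.map(LIKERT))
print(responses)
```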

3.2.2. Questionnaire Distribution

The questionnaire data were collected mainly online. The target audience included participating students from different universities, educational backgrounds, majors, and competition tracks. Because no sampling frame was available, this study used non-probability samples [75] intended to be as representative as possible; respondents might choose to participate based on their interest in the questionnaire [76]. Therefore, snowball sampling was used to build up the sample, which has the advantage of collecting and sharing data and responses through recommendations or social networks [77]. In the end, a total of 628 questionnaires were distributed online.

3.2.3. Collection and Screening

Participating students' evaluations of the importance of the BIM capability indicators for instructors were collected through the online questionnaire. Of the 628 questionnaires distributed, 358 responses were received, a response rate of 57%, which was considered acceptable [78]. Before data analysis, the raw data were screened for potential issues, including missing data, low data reliability, and collinearity. The screening criteria were as follows: (1) questionnaires with missing answers were excluded [79]; (2) questionnaires completed in less than 120 seconds (allowing at least 3 seconds per question) were excluded, to ensure that participants had enough time to provide thoughtful and meaningful responses; (3) questionnaires in which participants consistently followed a fixed response pattern were considered invalid, since such patterns can introduce bias and affect the accuracy of the study. In total, 118 invalid questionnaires were screened out of the 358 received, leaving 240 valid questionnaires. To ensure a sufficient sample size, it is generally recommended that the ratio of sample size to scale items be greater than 5:1 [80,81]; the sample size in this study met this requirement. The profile of the 240 respondents is shown in Figure 3.
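These screening rules can be expressed as a reproducible filter. The following Python sketch assumes a recorded completion-time column named duration_sec (hypothetical) alongside the 23 indicator columns; criterion (3) is interpreted here as flagging respondents who selected one identical value for every item.

```python
import pandas as pd

def screen_responses(raw: pd.DataFrame, item_cols: list[str],
                     min_seconds: int = 120) -> pd.DataFrame:
    """Apply the three screening criteria described above to the raw questionnaires."""
    # (1) Exclude questionnaires with missing item responses.
    df = raw.dropna(subset=item_cols)
    # (2) Exclude questionnaires completed in under 120 seconds.
    df = df[df["duration_sec"] >= min_seconds]
    # (3) Exclude straight-lining respondents who gave the same rating to every item.
    df = df[df[item_cols].nunique(axis=1) > 1]
    return df

# Usage (hypothetical data): 'raw' holds the 358 responses with columns C1..C23 and a
# recorded completion time 'duration_sec'; screening should leave the 240 valid rows.
# valid = screen_responses(raw, [f"C{i}" for i in range(1, 24)])
# assert len(valid) / 23 >= 5          # 5:1 sample-to-item rule of thumb
```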

3.3. Data Analysis and Conclusions

Data analysis consisted of multivariate analysis of the collected questionnaire data to determine the key BIM capabilities of instructors. The instructor BIM capability framework was developed through two sequential multivariate steps: confirmatory factor analysis (CFA) and SEM. Finally, path analysis was used to calculate the weights of each level of instructor BIM capability indicators. A similar process has been used to develop knowledge management maturity models [82] and to study construction project success [83]. The following sections explain each analysis in more detail.
The Kaiser–Meyer–Olkin (KMO) test and Bartlett's test of sphericity were first used to check whether the instructor BIM capability scale data were suitable for factor analysis. The KMO test evaluates sampling adequacy; KMO values range from 0 to 1, with higher values indicating greater suitability for factor analysis. The data had a KMO of 0.938, indicating that they were very suitable for factor analysis. Bartlett's test of sphericity evaluates whether the correlation matrix is significant, with the significance level conventionally set at 0.05. The obtained significance value was 0.000, well below 0.05. These results indicated significant correlations between variables, and the dataset supported the use of factor analysis.
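The same adequacy checks can be reproduced outside SPSS with the open-source factor_analyzer package; a minimal sketch, assuming the 240 valid responses are held in a DataFrame of the 23 indicator columns:

```python
import pandas as pd
from factor_analyzer.factor_analyzer import (calculate_bartlett_sphericity,
                                              calculate_kmo)

def sampling_adequacy(items: pd.DataFrame) -> None:
    """Report the KMO measure and Bartlett's test of sphericity for the indicator data."""
    _, kmo_total = calculate_kmo(items)                       # per-item KMOs ignored here
    chi_square, p_value = calculate_bartlett_sphericity(items)
    print(f"Overall KMO = {kmo_total:.3f} (the study reports 0.938)")
    print(f"Bartlett chi-square = {chi_square:.1f}, p = {p_value:.4f} (must be < 0.05)")

# Usage: sampling_adequacy(valid[[f"C{i}" for i in range(1, 24)]])
```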
In this study, the main factors were measured with scales, so testing the quality of the measurement data was an important prerequisite for meaningful subsequent analysis. To ensure the internal consistency of the attributes grouped under each factor and the reliability of the data, the most commonly used reliability measure, Cronbach's alpha, was computed using SPSS 27.0 [84]. Internal consistency is expressed by the reliability coefficient, which is based on the average correlation between the attributes and the total number of attributes in the factor. Cronbach's alpha varies from 0 to 1, with higher values indicating greater internal consistency. The overall Cronbach's alpha for the questionnaire data was 0.941, greater than the required minimum of 0.90 [80]. Moreover, the indicators within every dimension exhibited values exceeding 0.6 [85], indicating strong internal consistency within their respective dimensions. Table 2 summarizes the Cronbach's alpha coefficients.
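Cronbach's alpha follows directly from the item variances, so the SPSS result can be cross-checked with a few lines of code; the dimension grouping shown in the usage comment is the technical-capability set C1 to C5 from Appendix B.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: (k/(k-1)) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var / total_var)

# Usage:
# cronbach_alpha(valid[[f"C{i}" for i in range(1, 24)]])   # overall scale, reported as 0.941
# cronbach_alpha(valid[["C1", "C2", "C3", "C4", "C5"]])    # technical capability dimension
```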

3.3.1. Confirmatory Factor Analysis (CFA)

To advance the theoretical understanding of the multidimensionality of instructor BIM capability, confirmatory factor analysis was conducted in AMOS 28 to construct the CFA model shown in Figure 4. In this measurement model, instructor BIM capability is the endogenous latent variable, while technical capability, organizational capability, personnel capability, process capability, and support capability are exogenous latent variables. Different goodness-of-fit (GOF) indices measure different aspects of a model's ability to represent the data. As shown in Table 3, χ2/df = 1.156 < 3, GFI = 0.918 > 0.9, AGFI = 0.897 > 0.8, CFI = 0.992 > 0.9, IFI = 0.992 > 0.9, TLI = 0.990 > 0.9, and RMSEA = 0.026 < 0.05 [86,87]. The GOF test results offered good support for the fit of the model.
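AMOS is a graphical tool; for a scripted equivalent, the semopy package in Python fits the same kind of measurement model and reports comparable fit indices. The sketch below is deliberately abbreviated: only the technical-capability indicators (C1 to C5, Appendix B) are specified explicitly, because the excerpt shown here does not list the codes for the other four dimensions.

```python
import pandas as pd
import semopy

# Abbreviated measurement model in lavaan-style syntax; the remaining four dimensions
# would be specified with their own indicator codes following the same pattern.
CFA_DESC = """
technical =~ C1 + C2 + C3 + C4 + C5
"""

def fit_cfa(items: pd.DataFrame, description: str = CFA_DESC):
    model = semopy.Model(description)
    model.fit(items)
    fit_stats = semopy.calc_stats(model)        # chi2, CFI, TLI, GFI, AGFI, RMSEA, ...
    estimates = model.inspect(std_est=True)     # standardized factor loadings
    return fit_stats, estimates

# Usage: stats, loadings = fit_cfa(valid[["C1", "C2", "C3", "C4", "C5"]])
```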
To ensure the accuracy of the structural model testing, it was crucial to establish a valid measurement model and comprehensively evaluate its validity and reliability. Given that the CFA model of the instructor BIM capability scale showed good fit, the Average Variance Extracted (AVE) and Composite Reliability (CR) of every dimension were examined further. The standardized factor loading of every indicator on its corresponding dimension was obtained from the established CFA model, and the AVE and CR values of every dimension were calculated from these loadings. As shown in Table 4, the AVE values for all five dimensions exceeded the recommended threshold of 0.5, indicating satisfactory convergent validity of the indicators and constructs. Moreover, the CR values for all dimensions were higher than 0.7, indicating excellent internal consistency and reliability [80].
Discriminant validity refers to the degree to which a given construct differs from other constructs [88]. Adequate discriminant validity requires that the square root of the AVE of each construct be higher than the inter-construct correlations and that each measurement item load more strongly on its own construct than on the others [89]. The analysis results in Table 5 show that the standardized correlation coefficients between dimensions were all less than the square root of the AVE of the corresponding dimension, indicating good discriminant validity between the dimensions.
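Both convergent and discriminant validity reduce to simple functions of the standardized loadings and inter-construct correlations, so the checks behind Tables 4 and 5 can be summarized as follows (the numeric loadings in the usage comment are hypothetical):

```python
import numpy as np

def ave(loadings) -> float:
    """Average Variance Extracted: mean of the squared standardized loadings."""
    lam = np.asarray(loadings, dtype=float)
    return float(np.mean(lam ** 2))

def composite_reliability(loadings) -> float:
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    lam = np.asarray(loadings, dtype=float)
    return float(lam.sum() ** 2 / (lam.sum() ** 2 + np.sum(1.0 - lam ** 2)))

def discriminant_ok(ave_value: float, correlations_with_others) -> bool:
    """Fornell-Larcker criterion: sqrt(AVE) must exceed every inter-construct correlation."""
    return bool(np.sqrt(ave_value) > np.max(np.abs(correlations_with_others)))

# Usage with hypothetical standardized loadings for one dimension:
# lam = [0.78, 0.81, 0.74, 0.69, 0.72]
# ave(lam) > 0.5, composite_reliability(lam) > 0.7
# discriminant_ok(ave(lam), [0.55, 0.61, 0.48, 0.52])
```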

3.3.2. Structural Equation Modeling (SEM)

SEM is a multivariate statistical technique comprising two types of models: measurement models (confirmatory factor analysis) and structural models (regression or path analysis) [90]. It allows researchers to explore complex relationships between variables by analyzing covariance matrices. AMOS 28 was used for variable specification and analysis. After the CFA of the instructor BIM capability measurement model described above, the next step was path analysis to determine the weight coefficients of each latent variable and observed variable in the model. Figure 5 shows the SEM model of instructor BIM capability.
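As one plausible reading of the model in Figure 5, the five capability dimensions and the overall instructor BIM capability construct can be written in lavaan-style syntax and fitted with semopy. The group sizes (5, 3, 3, 3, 9) follow Appendices A and B, but the indicator codes beyond C1 to C5 are placeholders, and the higher-order specification is an assumption rather than a transcription of the authors' AMOS model.

```python
import semopy

# Assumed second-order structure: the five first-order dimensions load on an overall
# instructor BIM capability factor. Indicator codes C6..C23 are placeholders.
SEM_DESC = """
technical      =~ C1 + C2 + C3 + C4 + C5
organizational =~ C6 + C7 + C8
personnel      =~ C9 + C10 + C11
process        =~ C12 + C13 + C14
support        =~ C15 + C16 + C17 + C18 + C19 + C20 + C21 + C22 + C23
BIM_capability =~ technical + organizational + personnel + process + support
"""

model = semopy.Model(SEM_DESC)
# model.fit(valid[[f"C{i}" for i in range(1, 24)]])
# path_estimates = model.inspect(std_est=True)   # standardized path coefficients (Table 7)
```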
AMOS 28 was likewise used to test the model fit and the hypothesized path relationships of the SEM model. The test results, shown in Table 6 and Table 7, indicate that the model has good fit and significant path relationships.
A path coefficient quantifies the relationship between latent variables, or between a latent variable and a measured variable; the larger its value, the stronger the effect. Based on the standardized path coefficients, the coefficients for the five dimensions of instructor BIM capability (technical capability, organizational capability, personnel capability, process capability, and support capability) were 0.69, 0.63, 0.77, 0.54, and 0.75, respectively. These coefficients represent the strength and direction of the relationships between variables. Table 7 shows the modified model path coefficients and significance tests.
Finally, the standardized path coefficients obtained from the structural equation model were used to determine the weight coefficients of the latent and observed variables in the model, as shown in Table 7. The weighting coefficients for every indicator level are shown in Table 8.
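The text does not state the exact formula used to convert standardized coefficients into weights; a common convention, shown below purely as an assumption, is to normalize each dimension's standardized path coefficient by the sum across dimensions (the same logic would apply to indicators within a dimension).

```python
# Assumed weighting convention: normalize each standardized path coefficient by the
# column total. This reproduces the ranking reported in the text but is not necessarily
# the exact procedure behind Table 8.
dimension_coeffs = {
    "technical": 0.69, "organizational": 0.63, "personnel": 0.77,
    "process": 0.54, "support": 0.75,
}
total = sum(dimension_coeffs.values())
weights = {dim: round(coef / total, 3) for dim, coef in dimension_coeffs.items()}
print(weights)   # personnel gets the largest weight, process the smallest
```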

4. Discussion

This article constructed a BIM capability evaluation framework for competition instructors from the perspective of participating students. The framework includes five dimensions (technical capability, organizational capability, personnel capability, process capability, and support capability) with a total of 23 capability indicators, further demonstrating the multidimensionality of instructors' BIM capability and the importance of instructors in BIM competitions. In the weighted ranking of the five dimensions, personnel capability received the highest weight, followed by support capability, while process capability was the least important. This differs significantly from the emphasis placed on technical and process capability in BIM capability research in the traditional AEC industry [52,91]. The existing BIM capability frameworks are therefore not suitable for screening excellent instructors, which further underscores the necessity of this study.
Another important finding of this study was that participating students from different levels of universities and different educational backgrounds have diverse requirements for the BIM capability of their instructors. Personnel capability was the dimension with the highest weight coefficient, including encouragement and support, vision and objectives, and psychological counseling. BIM capability models constructed in different contexts have emphasized the importance of personnel capability. Although different research subjects are targeted, this highlights the necessity of personnel capability in project management [40,49]. Based on the importance and support of Chinese education for similar innovative practice activities such as BIM competitions, most participating students recognized the importance of instructors’ personnel capability, especially their vision and objectives.
Different types of students participate in BIM competitions for different purposes. Students from Double First-Class universities had multiple goals: first, they hoped to master BIM technology through participation and enhance their competitiveness; second, obtaining exemption from the postgraduate entrance examination was an important objective; third, because Chinese universities emphasize strengthening students' innovative practical abilities, award-winning students receive grade-point bonuses, which help them obtain scholarships. The last of these was also an important goal for students from junior colleges and ordinary undergraduate colleges. With the increasing demand for BIM talent in the AEC industry, gaining more employment opportunities was another important motivation. Setting clear goals and clarifying the significance of winning awards could therefore enhance students' motivation and drive.
The support capability of instructors was also highly valued by participating students; it includes information collection and integration capability, training and education, acquisition of competition venues, hardware configuration, network environment, adaptability, software usage authorization, financial support, and technical assistance. In other contexts, these capabilities are treated as resources and hardware. Dib et al. [58] classified BIM capability into planning and management, process, team structure, hardware, process definition, and information management. Mahamadu et al. [57] classified BIM capability attributes for pre-qualification and selection into competence and resources, culture and attitude, and cost. Evidently, BIM competitions place a higher demand on the support capability of instructors.
Different types of students attached varying degrees of importance to these capability indicators. Students from Double First-Class universities enjoyed good competition conditions thanks to the ample funding of their universities; they therefore regarded the adaptability and technical assistance of instructors as most important. Instructors' adaptability helped resolve unexpected problems during the competition, such as personnel changes or changes to competition rules, enabling teams to complete the competition efficiently. Although Double First-Class universities attached great importance to BIM, some advanced techniques were still lacking, and invited technical assistance could provide cutting-edge BIM knowledge and technical guidance that kept these teams competitive.
Participating students from vocational colleges placed greater emphasis on instructors providing support for basic competition facilities, such as funding, venues, the network environment, and software usage authorization, because their colleges have limited educational funding and limited support for innovative practical activities such as BIM competitions. Participating students from regular undergraduate colleges, by contrast, believed that the training and education provided by instructors was the most important aspect of competition preparation and support capability. Compared with vocational colleges, undergraduate colleges have more funds but do not reach the level of BIM education found at Double First-Class universities; participating students from undergraduate colleges therefore attached more importance to instructors' BIM technology training and education, which they considered an effective way to improve their BIM proficiency.
Surprisingly, the technical capability of the instructors was the factor with the lowest weighting coefficient, which was different from the existing BIM capability frameworks’ emphasis on technical and process capability [38]. The main reason for this was China’s emphasis on and popularization of BIM technology. Most universities had launched BIM related courses, and most participating students had a certain basic knowledge of BIM technology. In addition, as BIM technology was operated on computer software, students generally had strong operational and learning abilities. However, students from different levels of schools also had different opinions on technical capability. The technical capability of instructors included BIM theoretical knowledge, software application capability, problem-solving capability, innovation capability, and professional knowledge of civil engineering. Participating students from the Double First-Class Initiative placed greater emphasis on the innovative capability of their instructors, expecting that instructors would cultivate their innovative thinking and practical abilities to achieve innovative results in BIM competitions. Students from undergraduate institutions believed that instructors’ capability to solve practical problems was more helpful to them. However, vocational students, due to their relatively weak professional knowledge, preferred to receive guidance from instructors with civil engineering knowledge during the competition process.
In addition, graduate students held different views on instructor BIM capability from undergraduates. Graduate students possessed good self-discipline and relatively strong practical capability, and they participated in innovative practical activities such as BIM competitions to be exposed to the forefront of civil engineering and professional technology. They therefore placed greater value on instructors' capability to invite technical assistance. Through contact with such technical experts, graduate students could access cutting-edge BIM technologies, which not only opened up new ideas for their research but also enhanced their competitiveness in the digital technology field.
In summary, BIM capability assessment models are not built on a single established standard but draw on and inspire one another, so there is no universally accepted and widely used model [46]. Moreover, students from different educational backgrounds have varying demands on the BIM capability of their instructors. Evaluating the BIM capabilities of BIM competition instructors can help improve the teaching effectiveness of BIM courses in universities and spur innovation in course content and teaching methods. It also helps to collect timely feedback from students, peers, and industry experts to evaluate and improve BIM education programs; adjusting teaching methods and content based on such feedback is necessary to ensure that educational programs keep pace with industry development. BIM education is an important means of meeting the AEC industry's demand for BIM talent, and this research will help cultivate more outstanding BIM talents and alleviate the talent shortage in the AEC industry. BIM competitions are an important carrier for digital education practice in AEC-related disciplines, and instructors are the organizers and leaders of that practice; by constructing a BIM capability evaluation model for instructors from the perspective of participating students, this article effectively promotes the digital practice and sustainable development of AEC-related disciplines in universities.
Because there is no existing framework for researching the BIM capabilities of educators in the field of BIM education, this article could only compare its theoretical results with BIM capability research on stakeholders in the AEC industry (such as owners and contractors) [42,43]. We found differences between the BIM capability framework for instructors and those for AEC industry stakeholders; the psychological counseling capability of instructors is a typical example. This difference arises from the different purposes behind the respective indicator systems. For example, the goal of building a BIM capability framework for facility owners is to ensure the performance, efficiency, and profitability of assets in commercial activities, whereas the framework for instructors aims to improve students' professional skills and competitiveness in educational activities. In other words, the ultimate goal of instructors is to cultivate talent through BIM competitions; psychological counseling is part of competition guidance and also a responsibility of Chinese teachers. Capability frameworks established in the AEC industry are mostly built for commercial activities, in which owners have no responsibility to consider the psychological well-being of participants. In addition, in terms of indicator selection, facility owners' BIM capability indicators are chosen on the basis of long-term investment returns, risk management, and asset performance, whereas the selection of instructor BIM capability indicators is based more on educational effectiveness, student participation, and competition results.
However, whether in the AEC industry or in BIM competitions, BIM capability frameworks emphasize the importance of effective collaboration. In the AEC industry, this ensures close communication and cooperation among all parties throughout the project, achieving a win-win situation; in BIM competitions, it helps improve the overall performance of the participating teams. Both also attach great importance to the indicator of "innovation capability": evaluating BIM capabilities can motivate all parties to continuously learn and master new BIM technologies and promote technological innovation and progress, which is crucial for the long-term development of both the AEC industry and BIM education.

5. Conclusions and Recommendations

BIM is an important carrier for the digitalization and sustainable development of disciplines related to the AEC industry. With the popularization of BIM, the AEC industry's demand for BIM talent is increasing [19], and BIM education is considered an important way to cultivate that talent [22]. BIM competitions are an important digital platform for combining theory with practice in BIM education. As the direct organizers and leaders of competition teams, instructors and their BIM capability have a significant impact on competition results and learning outcomes. However, previous studies have not focused on the BIM capability of instructors. From the perspective of participating students, this article constructed a BIM capability evaluation framework for competition instructors using survey questionnaires and SEM, filling this research gap.
The framework proposed in this article includes five main dimensions (technical capability, organizational capability, personnel capability, process capability, and support capability) with a total of 23 secondary capability indicators, confirming the multidimensionality of instructors' BIM capability. Through path analysis, the weight of every capability dimension was determined. The results showed that personnel capability was the most important BIM capability for instructors, followed by support capability, and process capability was the least important, which differs significantly from the emphasis on BIM technology and process competence in the AEC industry. The weights of the secondary capability indicators varied across dimensions; notably, "vision and objectives" was the most important indicator within the personnel capability dimension. Chinese universities attach great importance to innovation and practice activities such as BIM competitions, and award-winning students can receive GPA bonus points or exemption from postgraduate entrance exams. Students who master higher levels of BIM knowledge and practical ability not only have more career choices but are also more competitive in the job market; given fierce market competition, possessing BIM skills also contributes to future career development. With the increasing demand for BIM talent in the AEC industry, obtaining more employment opportunities was another important motivation. Setting goals and clarifying the benefits of winning awards therefore greatly enhanced participating students' motivation, reflecting the emphasis of Chinese higher education on innovation and practice.
This article also enriches the theoretical knowledge of BIM education and introduces the concept of BIM capability indicators for instructors. The proposed evaluation framework provides a reference for universities selecting excellent instructors and points instructors toward improving their own BIM capability. In addition, this study lays a foundation for research on the capabilities of instructors in other disciplines and in BIM education, providing strong support for future research and practical work related to BIM education.

6. Limitations and Future Research

However, some limitations of this research need to be acknowledged. One issue is that the data collected in this study came only from China; the BIM capability evaluation framework for instructors might vary with different compositions of participants with distinct experiences and backgrounds, so more extensive data collection in other countries and regions could enhance the framework's applicability. Another limitation is that the results were based on current competition-based education scenarios; as BIM education in universities improves and BIM technology develops, the framework for evaluating the BIM capability of instructors might change. The use of snowball sampling may also have limitations and effects on the results, and it is recommended that future studies use broader sampling methods to validate our findings.
Future research can first expand the coverage of the study, explore the adaptability of the framework in different countries, and conduct a more comprehensive study of the BIM capabilities of instructors from the perspectives of multiple stakeholders, including instructors, event organizers, and BIM training institutions. Furthermore, it is also possible to explore how to improve the quality of BIM education in different types and levels of universities through measures such as optimizing personnel and resource allocation. This includes comparing the differences in supporting BIM education among different higher education institutions, establishing and implementing mechanisms to enhance the BIM capabilities of instructors, and developing corresponding improvement measures. It also includes exploring the costs and challenges involved in implementing these research results in universities to ensure that they can be effectively translated into educational practice. Through continuous in-depth research and accumulation, we hope to provide a solid theoretical and practical foundation for the development of BIM education, thereby promoting the progress of the entire industry.

Author Contributions

Conceptualization, P.P.; Methodology, P.P., M.L. and C.L.; Software, X.Z.; Validation, M.L. and P.P.; Data curation, M.L., C.L. and X.Z.; Writing—original draft, M.L.; Writing—review & editing, P.P.; Funding acquisition, P.P. All authors have read and agreed to the published version of the manuscript.

Funding

This study is jointly supported by the University–Industry Collaborative Education Program of China’s Ministry of Education [Construction of Civil Engineering Training Conditions Based on BIM Technology] (Grant Number: 202002256025), Innovation and Entrepreneurship Training Project for College Students of Yanshan University [Construction of Capability System for BIM Competition Instructors Based on the Perspective of Students] (Grant Number: CXXL20240441) and Innovation and Entrepreneurship Courses of Hebei Province in 2023 (Course Name: BIM Modeling Technology).

Data Availability Statement

Materials and data designed and/or generated in the study are available from the corresponding author on reasonable request.

Acknowledgments

We acknowledge the Academic Affairs Office, Innovation and Entrepreneurship Education Guidance Center and undergraduate students (Jiabin Zong and Jing Xu) of Yanshan University for their support in distributing questionnaires and collecting data.

Conflicts of Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Appendix A. Instructor BIM Capability Indicators V1

| Dimension | Number | Indicator | Sources |
|---|---|---|---|
| Technical capability | 1 | BIM theoretical knowledge | [6,38] |
| | 2 | Software application capability | [43,60] |
| | 3 | Capability of solving practical problems | [4,60] |
| | 4 | Innovation capability | [92,93] |
| Organization capability | 5 | Teamwork capability | [94] |
| | 6 | Communication capability | [62] |
| | 7 | Coordination capability | [42,95] |
| Personnel capability | 8 | Encouragement and support | [49] |
| | 9 | Vision and objectives | [96,97] |
| Process capability | 10 | Time management capability | [42] |
| | 11 | Leadership capability | [49,98] |
| | 12 | Project management capability | [38,99] |
| Competition preparation and support | 13 | Information collection and integration capability | [57,58,59] |
| | 14 | Training and education | [40,100,101] |
| | 15 | Acquisition of competition venues | [49,98] |
| | 16 | Hardware configuration | [58,102] |
| | 17 | Network environment | [40,97] |
| | 18 | Adaptability | [103,104] |
| | 19 | Obtaining software usage authorization | [60] |

Appendix B. Instructor BIM Capability Indicators V2

| Dimension | Code | Indicator | Description |
|---|---|---|---|
| Technical capability | C1 | BIM theoretical knowledge | Instructors possess comprehensive BIM theoretical knowledge, which enables them to better guide students in practical operations. |
| | C2 | Software application capability | Instructors are proficient in commonly used BIM software and can guide students in modeling, collaboration, and simulation analysis within the software. |
| | C3 | Capability of solving practical problems | BIM competitions often involve practical engineering projects, so instructors need rich practical experience and problem-solving capability and must provide effective guidance on the problems students encounter in practice. |
| | C4 | Innovation capability | BIM competitions place high demands on the innovative capability of participating students, which requires instructors to propose innovative ideas and solutions in project design and competition strategy. |
| | C5 | Professional knowledge in civil engineering | The instructor's civil engineering knowledge helps students accurately understand and complete tasks and plays a crucial role in solving technical problems and improving students' overall quality. |
| Organization capability | C6 | Teamwork capability | BIM competitions require group members to collaborate to complete tasks, so instructors need the ability to collaborate effectively and to organize students to complete tasks together. |
| | C7 | Communication capability | Instructors are able to clearly express their thoughts and ideas and communicate well with the event organizers and universities. |
| | C8 | Coordination capability | Instructors have the ability to coordinate possible conflicts and contradictions between students to ensure the smooth progress of the project. |
| Personnel capability | C9 | Encouragement and support | Encouragement and support from instructors can enhance students' confidence, stimulate their enthusiasm, help them better cope with challenges, and lead to excellent results. |
| | C10 | Vision and objectives | By establishing a clear and challenging shared vision for students, instructors can stimulate their initiative and creativity and promote cooperation and effort among students. |
| | C11 | Psychological counseling capability | The psychological counseling capability of instructors can not only help students relieve stress and build confidence, but also stimulate their potential, enhance team cohesion, and help the team jointly cope with challenges and achieve excellent results. |
| Process capability | C12 | Time management capability | Instructors can arrange time reasonably to ensure the smooth progress of the project and of competition preparation. |
| | C13 | Leadership capability | Instructors lead the team to clarify goals and develop strategies, provide strong support to students, and ensure efficient collaboration within the team. |
| | C14 | Project management capability | The instructor is the actual manager of the entire project and therefore needs to plan the project schedule, allocate tasks and resources reasonably, and ensure that the project is completed on time. |
| Support capability | C15 | Information collection and integration capability | Instructors can obtain competition-related information in a timely manner and organize it into useful materials for students. |
| | C16 | Training and education | Through training and education, instructors help students deeply grasp the principles and applications of BIM, enhance their practical and problem-solving abilities, and prepare them for competitions and the workplace. |
| | C17 | Acquisition of competition venues | Instructors secure dedicated competition venues for participating students, offering a stable, safe, and professional competition environment together with the necessary training and communication opportunities. |
| | C18 | Hardware configuration | The operation of BIM software requires high-performance hardware. Providing good hardware configuration helps improve the efficiency with which students complete their tasks. |
| | C19 | Network environment | Providing a good network environment for participating students ensures smooth data transmission and real-time collaboration, helps students obtain competition information and resources in a timely manner, and improves competition efficiency and quality. |
| | C20 | Adaptability | The themes and requirements of different BIM competition tracks may change constantly, and team members may also change; instructors therefore need the adaptability to adjust their guidance strategies and methods in a timely manner to meet new competition requirements. |
| | C21 | Obtaining software usage authorization | Instructors can apply to software companies that cooperate with the competition organizers for software usage authorization on behalf of participating students, ensuring that submitted entries are in the correct format and that students receive better software support and services. |
| | C22 | Financial support | Applying for financial support ensures that student teams have sufficient resources, enhances their competitiveness, reduces their economic burden, and increases student motivation. |
| | C23 | Technical assistance | Arranging technical assistance for students in BIM competitions can significantly enhance the technical level and innovation ability of student teams, broaden their horizons, strengthen team collaboration, and ultimately improve competition results. |

References

  1. Sears, S.K.; Clough, R.H.; Sears, G.A. Construction Project Management: A Practical Guide to Field Construction Management; John Wiley & Sons: Hoboken, NJ, USA, 2008. [Google Scholar]
  2. Alshawi, M.; Ingirige, B. Web-enabled project management: An emerging paradigm in construction. Autom. Constr. 2003, 12, 349–364. [Google Scholar] [CrossRef]
  3. Seyis, S. Pros and Cons of Using Building Information Modeling in the AEC Industry. J. Constr. Eng. Manag. 2019, 145, 04019046. [Google Scholar] [CrossRef]
  4. Succar, B. Building information modelling framework: A research and delivery foundation for industry stakeholders. Autom. Constr. 2009, 18, 357–375. [Google Scholar] [CrossRef]
  5. Chong, H.-Y.; Lee, C.-Y.; Wang, X. A mixed review of the adoption of Building Information Modelling (BIM) for sustainability. J. Clean. Prod. 2017, 142, 4114–4126. [Google Scholar] [CrossRef]
  6. Eastman, C.M. BIM Handbook: A Guide to Building Information Modeling for Owners, Managers, Designers, Engineers and Contractors; John Wiley & Sons: Hoboken, NJ, USA, 2011. [Google Scholar]
  7. Lu, Y.; Wu, Z.; Chang, R.; Li, Y. Building Information Modeling (BIM) for green buildings: A critical review and future directions. Autom. Constr. 2017, 83, 134–148. [Google Scholar] [CrossRef]
  8. Elmualim, A.; Gilder, J. BIM: Innovation in design management, influence and challenges of implementation. Archit. Eng. Des. Manag. 2014, 10, 183–199. [Google Scholar] [CrossRef]
  9. Bynum, P.; Issa, R.R.A.; Olbina, S. Building Information Modeling in Support of Sustainable Design and Construction. J. Constr. Eng. Manag. 2013, 139, 24–34. [Google Scholar] [CrossRef]
  10. Azhar, S. Building information modeling (BIM): Trends, benefits, risks, and challenges for the AEC industry. Leadersh. Manag. Eng. 2011, 11, 241–252. [Google Scholar] [CrossRef]
  11. Wong, J.K.-W.; Kuan, K.-L. Implementing ‘BEAM Plus’ for BIM-based sustainability analysis. Autom. Constr. 2014, 44, 163–175. [Google Scholar] [CrossRef]
  12. Gholizadeh, P.; Esmaeili, B.; Goodrum, P. Diffusion of Building Information Modeling Functions in the Construction Industry. J. Manag. Eng. 2018, 34, 04017060. [Google Scholar] [CrossRef]
  13. Wang, H.; Meng, X. BIM-Supported Knowledge Management: Potentials and Expectations. J. Manag. Eng. 2021, 37, 04021032. [Google Scholar] [CrossRef]
  14. MarketLine Industry Profile. Construction in China; MarketLine: London, UK, 2014; Reference Code: 0099-2801. [Google Scholar]
  15. Azhar, S.; Khalfan, M.; Maqsood, T. Building information modeling (BIM): Now and beyond. Australas. J. Constr. Econ. Build. 2012, 12, 15–28. [Google Scholar]
  16. Francom, T.C.; El Asmar, M. Project quality and change performance differences associated with the use of building information modeling in design and construction projects: Univariate and multivariate analyses. J. Constr. Eng. Manag. 2015, 141, 04015028. [Google Scholar] [CrossRef]
  17. Hardin, B.; McCool, D. BIM and Construction Management: Proven Tools, Methods, and Workflows; John Wiley & Sons: Hoboken, NJ, USA, 2015. [Google Scholar]
  18. Sacks, R.; Barak, R. Teaching building information modeling as an integral part of freshman year civil engineering education. J. Prof. Issues Eng. Educ. Pract. 2010, 136, 30–38. [Google Scholar] [CrossRef]
  19. Chien, K.-F.; Wu, Z.-H.; Huang, S.-C. Identifying and assessing critical risk factors for BIM projects: Empirical study. Autom. Constr. 2014, 45, 1–15. [Google Scholar] [CrossRef]
  20. Turk, Ž. Ten questions concerning building information modelling. Build. Environ. 2016, 107, 274–284. [Google Scholar] [CrossRef]
  21. Edirisinghe, R.; London, K. Comparative analysis of international and national level BIM standardization efforts and BIM adoption. In Proceedings of the 32nd CIB W78 Conference, Eindhoven, The Netherlands, 26–29 October 2015; pp. 149–158. [Google Scholar]
  22. Wu, W.; Mayo, G.; McCuen, T.L.; Issa, R.R.A.; Smith, D.K. Building Information Modeling Body of Knowledge. I: Background, Framework, and Initial Development. J. Constr. Eng. Manag. 2018, 144, 04018065. [Google Scholar] [CrossRef]
  23. Salman, H. Preparing Architectural Technology Students for BIM 2016 Mandate; Robert Gordon University: Aberdeen, UK, 2013. [Google Scholar]
  24. Rooney, K. BIM Education-Global Summary 2015 Update Report; NATSPEC Construction Information: Sydney, Australia, 2015. [Google Scholar]
  25. Chegu Badrinath, A.; Chang, Y.T.; Hsieh, S.H. A review of tertiary BIM education for advanced engineering communication with visualization. Vis. Eng. 2016, 4, 9. [Google Scholar] [CrossRef]
  26. Abdirad, H.; Dossick, C.S. BIM curriculum design in architecture, engineering, and construction education: A systematic review. J. Inf. Technol. Constr. 2016, 21, 250–271. [Google Scholar]
  27. Ganah, A.; John, G.A. Achieving Level 2 BIM by 2016 in the UK. In Computing in Civil and Building Engineering (2014); ASCE: Reston, VA, USA, 2014; pp. 143–150. [Google Scholar]
  28. Wu, W.; Issa, R.R. BIM education and recruiting: Survey-based comparative analysis of issues, perceptions, and collaboration opportunities. J. Prof. Issues Eng. Educ. Pract. 2014, 140, 04013014. [Google Scholar] [CrossRef]
  29. Russell, D.; Cho, Y.K.; Cylwik, E. Learning opportunities and career implications of experience with BIM/VDC. Pract. Period. Struct. Des. Constr. 2014, 19, 111–121. [Google Scholar] [CrossRef]
  30. AboWardah, E. Bridging the gap between research and schematic design phases in teaching architectural graduation projects. Front. Archit. Res. 2019, 9, 82–105. [Google Scholar] [CrossRef]
  31. Puolitaival, T.; Forsythe, P. Practical challenges of BIM education. Struct. Surv. 2016, 34, 351–366. [Google Scholar] [CrossRef]
  32. Sacks, R.; Pikas, E. Building information modeling education for construction engineering and management. I: Industry requirements, state of the art, and gap analysis. J. Constr. Eng. Manag. 2013, 139, 04013016. [Google Scholar] [CrossRef]
  33. Wu, W.; Hyatt, B. Experiential and project-based learning in BIM for sustainable living with tiny solar houses. Procedia Eng. 2016, 145, 579–586. [Google Scholar] [CrossRef]
  34. Borkowski, A.S. Experiential learning in the context of BIM. STEM Educ. 2023, 3, 190–204. [Google Scholar] [CrossRef]
  35. López-Querol, S.; Sánchez-Cambronero, S.; Rivas, A.; Garmendia, M. Improving civil engineering education: Transportation geotechnics taught through project-based learning methodologies. J. Prof. Issues Eng. Educ. Pract. 2015, 141, 04014007. [Google Scholar] [CrossRef]
  36. Bellido-Montesinos, P.; Lozano-Galant, F.; Castilla, F.J.; Lozano-Galant, J.A. Experiences learned from an international BIM contest: Software use and information workflow analysis. J. Build. Eng. 2019, 21, 149–157. [Google Scholar] [CrossRef]
  37. Ao, Y.; Peng, P.; Li, J.; Li, M.; Bahmani, H.; Wang, T. What Determines BIM Competition Results of Undergraduate Students in the Architecture, Engineering and Construction Industry? Behav. Sci. 2022, 12, 360. [Google Scholar] [CrossRef]
  38. Succar, B.; Sher, W.; Williams, A. Measuring BIM performance: Five metrics. Archit. Eng. Des. Manag. 2012, 8, 120–142. [Google Scholar] [CrossRef]
  39. Munianday, P.; Radzi, A.R.; Esa, M.; Rahman, R.A. Optimal strategies for improving organizational BIM capabilities: PLS-SEM approach. J. Manag. Eng. 2022, 38, 04022015. [Google Scholar] [CrossRef]
  40. Yilmaz, G.; Akcamete, A.; Demirors, O. BIM-CAREM: Assessing the BIM capabilities of design, construction and facilities management processes in the construction industry. Comput. Ind. 2023, 147, 103861. [Google Scholar] [CrossRef]
  41. Koo, H.J.; O’Connor, J.T. A strategy for building design quality improvement through BIM capability analysis. J. Constr. Eng. Manag. 2022, 148, 04022066. [Google Scholar] [CrossRef]
  42. Ahuja, R.; Sawhney, A.; Arif, M. Developing organizational capabilities to deliver lean and green project outcomes using BIM. Eng. Constr. Archit. Manag. 2018, 25, 1255–1276. [Google Scholar] [CrossRef]
  43. Giel, B.; Issa, R.R. Framework for evaluating the BIM competencies of facility owners. J. Manag. Eng. 2016, 32, 04015024. [Google Scholar] [CrossRef]
  44. Al-Ashmori, Y.Y.; Othman, I.; Rahmawati, Y.; Amran, Y.M.; Sabah, S.A.; Rafindadi, A.D.U.; Mikić, M. BIM benefits and its influence on the BIM implementation in Malaysia. Ain Shams Eng. J. 2020, 11, 1013–1019. [Google Scholar] [CrossRef]
  45. He, Q.; Wang, G.; Luo, L.; Shi, Q.; Xie, J.; Meng, X. Mapping the managerial areas of Building Information Modeling (BIM) using scientometric analysis. Int. J. Proj. Manag. 2017, 35, 670–685. [Google Scholar] [CrossRef]
  46. Wu, C.; Xu, B.; Mao, C.; Li, X. Overview of BIM maturity measurement tools. J. Inf. Technol. Constr. 2017, 22, 34–62. [Google Scholar]
  47. Eastman, C.; Teicholz, P.; Sacks, R.; Liston, K. Managing BIM technology in the building industry. AECbytes Feb 2008, 12, 2008. [Google Scholar]
  48. Gu, N.; London, K. Understanding and facilitating BIM adoption in the AEC industry. Autom. Constr. 2010, 19, 988–999. [Google Scholar] [CrossRef]
  49. Siebelink, S.; Voordijk, J.T.; Adriaanse, A. Developing and testing a tool to evaluate BIM maturity: Sectoral analysis in the Dutch construction industry. J. Constr. Eng. Manag. 2018, 144, 05018007. [Google Scholar] [CrossRef]
  50. Smits, W.; van Buiten, M.; Hartmann, T. Yield-to-BIM: Impacts of BIM maturity on project performance. Build. Res. Inf. 2017, 45, 336–346. [Google Scholar] [CrossRef]
  51. Giel, B.; Issa, R. Framework for evaluating the BIM competencies of building owners. In Computing in Civil and Building Engineering (2014); ASCE: Reston, VA, USA, 2014; pp. 552–559. [Google Scholar]
  52. Pan, P.; Wang, Y.; Yang, Y.; Zhang, S. Conceptualization and measurement of owner BIM capabilities: From a project owner organization perspective. In Engineering, Construction and Architectural Management; Emerald Publishing Limited: Leeds, UK, 2024. [Google Scholar]
  53. Abbasianjahromi, H.; Ahangar, M.; Ghahremani, F. A maturity assessment framework for applying BIM in consultant companies. Iran. J. Sci. Technol. Trans. Civ. Eng. 2019, 43, 637–649. [Google Scholar] [CrossRef]
  54. Nonirit, E.; Poirier, É.A.; Forgues, D. Assessing the assessor: A framework for BIM maturity, capacity, and competency evaluation at the organizational level. Can. J. Civ. Eng. 2022, 50, 143–156. [Google Scholar] [CrossRef]
  55. Mahamadu, A.-M. Development of a decision support framework to aid selection of construction supply chain organisations for BIM-enabled projects. In Faculty of Environment and Technology; University of the West of England: Bristol, UK, 2016. [Google Scholar]
  56. Fenby-Taylor, H.; Thompson, N.; Philp, D.; MacLaren, A.; Rossiter, D.; Bartley, T. Scotland Global BIM Study; Elsevier: Amsterdam, The Netherlands, 2016. [Google Scholar]
  57. Mahamadu, A.-M.; Mahdjoubi, L.; Booth, C.A. Critical BIM qualification criteria for construction pre-qualification and selection. Archit. Eng. Des. Manag. 2017, 13, 326–343. [Google Scholar] [CrossRef]
  58. Dib, H.; Chen, Y.; Cox, R. A framework for measuring building information modeling maturity based on perception of practitioners and academics outside the USA. In Proceedings of the CIB W78, Beirut, Lebanon, 17–19 October 2012; p. 2012. [Google Scholar]
  59. Van Berlo, L.; Dijkmans, T.; Hendriks, H.; Spekkink, D.; Pel, W. BIM QuickScan: Benchmark of BIM performance in the Netherlands. In Proceedings of the CIB W78, Beirut, Lebanon, 17–19 October 2012; p. 2012. [Google Scholar]
  60. Mahamadu, A.-M.; Manu, P.; Mahdjoubi, L.; Booth, C.; Aigbavboa, C.; Abanda, F.H. The importance of BIM capability assessment. Eng. Constr. Archit. Manag. 2019, 27, 24–48. [Google Scholar] [CrossRef]
  61. Messner, J. BIM Planning Guide for Facility Owners; CIC Research Group, Penn State University Park: University Park, PA, USA, 2013. [Google Scholar]
  62. Juul, J.; Jensen, H. Relational Competence: Towards a New Culture of Education; Mathias Voelchert GmbH Verlag: Windberg, Germany, 2017. [Google Scholar]
  63. Yin, R.K. Case Study Research: Design and Methods; Sage: New York, NY, USA, 2009; Volume 5. [Google Scholar]
  64. Bidart, C.; Longo, M.E.; Mendez, A. Time and process: An operational framework for processual analysis. Eur. Sociol. Rev. 2013, 29, 743–751. [Google Scholar] [CrossRef]
  65. Fellows, R.F.; Liu, A.M. Research Methods for Construction; John Wiley & Sons: Hoboken, NJ, USA, 2021. [Google Scholar]
  66. Denzin, N.K.; Lincoln, Y.S. The Sage handbook of Qualitative Research; Sage: New York, NY, USA, 2011. [Google Scholar]
  67. Carley, K.; Palmquist, M. Extracting, representing, and analyzing mental models. Soc. Forces 1992, 70, 601–636. [Google Scholar] [CrossRef]
  68. Guest, G.; Bunce, A.; Johnson, L. How many interviews are enough? An experiment with data saturation and variability. Field Methods 2006, 18, 59–82. [Google Scholar] [CrossRef]
  69. Patton, M.Q. Qualitative Research & Evaluation Methods: Integrating Theory and Practice; Sage Publications: New York, NY, USA, 2014. [Google Scholar]
  70. Shu, F.; Sugimoto, C.R.; Larivière, V. The institutionalized stratification of the Chinese higher education system. Quant. Sci. Stud. 2021, 2, 327–334. [Google Scholar] [CrossRef]
  71. Abbasnejad, B.; Nepal, M.P.; Ahankoob, A.; Nasirian, A.; Drogemuller, R. Building Information Modelling (BIM) adoption and implementation enablers in AEC firms: A systematic literature review. Archit. Eng. Des. Manag. 2021, 17, 411–433. [Google Scholar] [CrossRef]
  72. Zhang, X.; Shen, L.; Wu, Y. Green strategy for gaining competitive advantage in housing development: A China study. J. Clean. Prod. 2011, 19, 157–167. [Google Scholar] [CrossRef]
  73. Buttle, F. SERVQUAL: Review, critique, research agenda. Eur. J. Mark. 1996, 30, 8–32. [Google Scholar] [CrossRef]
  74. Dawes, J. Do data characteristics change according to the number of scale points used? An experiment using 5-point, 7-point and 10-point scales. Int. J. Mark. Res. 2008, 50, 61–104. [Google Scholar] [CrossRef]
  75. Zhao, X.; Hwang, B.-G.; Pheng Low, S.; Wu, P. Reducing hindrances to enterprise risk management implementation in construction firms. J. Constr. Eng. Manag. 2015, 141, 04014083. [Google Scholar] [CrossRef]
  76. Wilkins, J.R. Construction workers’ perceptions of health and safety training programmes. Constr. Manag. Econ. 2011, 29, 1017–1026. [Google Scholar] [CrossRef]
  77. Mao, C.; Shen, Q.; Pan, W.; Ye, K. Major barriers to off-site construction: The developer’s perspective in China. J. Manag. Eng. 2015, 31, 04014043. [Google Scholar] [CrossRef]
  78. Sekaran, U.; Bougie, R. Research Methods for Business: A Skill Building Approach; John Wiley & Sons: Hoboken, NJ, USA, 2016. [Google Scholar]
  79. Jiang, W.; Lu, Y.; Le, Y. Trust and project success: A twofold perspective between owners and contractors. J. Manag. Eng. 2016, 32, 04016022. [Google Scholar] [CrossRef]
  80. Hair, J.F. Multivariate Data Analysis; Kennesaw State University: Kennesaw, GA, USA, 2009. [Google Scholar]
  81. Gorsuch, R.L. Factor Analysis. Handb. Psychol. 2003, 2, 143–164. [Google Scholar]
  82. Chen, L.; Fong, P.S. Revealing performance heterogeneity through knowledge management maturity evaluation: A capability-based approach. Expert Syst. Appl. 2012, 39, 13523–13539. [Google Scholar] [CrossRef]
  83. Tabish, S.Z.S.; Jha, K.N. Success traits for a construction project. J. Constr. Eng. Manag. 2012, 138, 1131–1138. [Google Scholar] [CrossRef]
  84. Hair, J.F.; Ringle, C.M.; Sarstedt, M. PLS-SEM: Indeed a silver bullet. J. Mark. Theory Pract. 2011, 19, 139–152. [Google Scholar] [CrossRef]
  85. Nunnally, J.; Bernstein, I. Psychometric Theory, 3rd ed.; MacGraw-Hill: New York, NY, USA, 1994. [Google Scholar]
  86. Schreiber, J.B.; Nora, A.; Stage, F.K.; Barlow, E.A.; King, J. Reporting structural equation modeling and confirmatory factor analysis results: A review. J. Educ. Res. 2006, 99, 323–338. [Google Scholar] [CrossRef]
  87. Doloi, H.; Sawhney, A.; Iyer, K. Structural equation model for investigating factors affecting delay in Indian construction projects. Constr. Manag. Econ. 2012, 30, 869–884. [Google Scholar] [CrossRef]
  88. Hulland, J. Use of partial least squares (PLS) in strategic management research: A review of four recent studies. Strateg. Manag. J. 1999, 20, 195–204. [Google Scholar] [CrossRef]
  89. Fornell, C.; Larcker, D.F. Evaluating structural equation models with unobservable variables and measurement error. J. Mark. Res. 1981, 18, 39–50. [Google Scholar]
  90. Chen, Y.Q.; Zhang, Y.B.; Liu, J.Y.; Mo, P. Interrelationships among critical success factors of construction projects based on the structural equation model. J. Manag. Eng. 2012, 28, 243–251. [Google Scholar] [CrossRef]
  91. Jung, Y.; Joo, M. Building information modelling (BIM) framework for practical implementation. Autom. Constr. 2011, 20, 126–133. [Google Scholar] [CrossRef]
  92. Selçuk Çıdık, M.; Boyd, D.; Thurairajah, N. Innovative capability of building information modeling in construction design. J. Constr. Eng. Manag. 2017, 143, 04017047. [Google Scholar] [CrossRef]
  93. Rajapathirana, R.J.; Hui, Y. Relationship between innovation capability, innovation type, and firm performance. J. Innov. Knowl. 2018, 3, 44–55. [Google Scholar] [CrossRef]
  94. Antwi-Afari, M.; Li, H.; Pärn, E.; Edwards, D.J. Critical success factors for implementing building information modelling (BIM): A longitudinal review. Autom. Constr. 2018, 91, 100–110. [Google Scholar] [CrossRef]
  95. Liu, Y.; Van Nederveen, S.; Hertogh, M. Understanding effects of BIM on collaborative design and construction: An empirical study in China. Int. J. Proj. Manag. 2017, 35, 686–698. [Google Scholar] [CrossRef]
  96. Adriaanse, A.; Voordijk, H.; Dewulf, G. The use of interorganisational ICT in United States construction projects. Autom. Constr. 2010, 19, 73–83. [Google Scholar] [CrossRef]
  97. Liang, C. Development of a multifunctional building information modelling (BIM) maturity model. In HKU Theses Online (HKUTO); The University of Hong Kong: Hong Kong, China, 2016. [Google Scholar]
  98. Chunduri, S.; Kreider, R.; Messner, J.I. A case study implementation of the BIM planning procedures for facility owners. In AEI 2013: Building Solutions for Architectural Engineering; ASCE: Reston, VA, USA, 2013; pp. 691–701. [Google Scholar]
  99. Wang, T.; Chen, H.-M. Integration of building information modeling and project management in construction project life cycle. Autom. Constr. 2023, 150, 104832. [Google Scholar] [CrossRef]
  100. Succar, B.; Sher, W.; Williams, A. An integrated approach to BIM competency assessment, acquisition and application. Autom. Constr. 2013, 35, 174–189. [Google Scholar] [CrossRef]
  101. Ahn, Y.H.; Kwak, Y.H.; Suk, S.J. Contractors’ transformation strategies for adopting building information modeling. J. Manag. Eng. 2016, 32, 05015005. [Google Scholar] [CrossRef]
  102. Yilmaz, G.; Akcamete, A.; Demirors, O. A reference model for BIM capability assessments. Autom. Constr. 2019, 101, 245–263. [Google Scholar] [CrossRef]
  103. Langroodi, B.P.; Staub-French, S. Change management with building information models: A case study. In Proceedings of the Construction Research Congress 2012: Construction Challenges in a Flat World, West Lafayette, IN, USA, 21–23 May 2012; pp. 1182–1191. [Google Scholar]
  104. Pittet, P.; Cruz, C.; Nicolle, C. An ontology change management approach for facility management. Comput. Ind. 2014, 65, 1301–1315. [Google Scholar] [CrossRef]
Figure 1. The research framework.
Figure 2. Research methodology.
Figure 3. Breakdown of survey respondents. CBIMMA: Construction BIM Modeling and Application; RBBIMDM: Road and Bridge BIM Design and Modeling; ABIMCPM: Application of BIM Construction Project Management; BIMWPCMA: BIM Whole Process Cost Management and Application; ICPB: Intelligent Construction and Prefabricated Buildings; TFPABDD: The Full Process Application of BIM Decoration Design; IBIMAAD: Innovation in BIM Application in Architectural Design; MAMEBIM: Modeling and Application of Mechanical and Electrical BIM; HBE: Highway Bridge Engineering; BEEAE: Built Environment and Energy Application Engineering; WSDSE: Water Supply and Drainage Science and Engineering.
Figure 4. CFA model diagram.
Figure 5. SEM model diagram.
Table 1. Background information of 12 respondents.

| Code | Educational Background | Major | University Level | Award Situation |
|---|---|---|---|---|
| R1 | Doctoral Candidate | Civil Engineering | Project 985 | BIM Competition Grand Prize |
| R2 | Postgraduate | Civil Engineering | Project 985 | First Prize in BIM Competition |
| R3 | Undergraduate Student | Civil Engineering | Project 985 | First Prize in BIM Competition |
| R4 | Postgraduate | Civil Engineering | Project 211 | First Prize in BIM Competition |
| R5 | Undergraduate Student | Architecture | Project 211 | BIM Competition Grand Prize |
| R6 | Undergraduate Student | Architecture | Project 211 | First Prize in BIM Competition |
| R7 | Undergraduate Student | Civil Engineering | Ordinary undergraduate | BIM Competition Grand Prize |
| R8 | Undergraduate Student | Civil Engineering | Ordinary undergraduate | BIM Competition Grand Prize |
| R9 | Undergraduate Student | Architecture | Ordinary undergraduate | First Prize in BIM Competition |
| R10 | Junior College Student | Civil Engineering | Junior college | BIM Competition Grand Prize |
| R11 | Junior College Student | Civil Engineering | Junior college | First Prize in BIM Competition |
| R12 | Junior College Student | Civil Engineering | Junior college | First Prize in BIM Competition |
Table 2. Reliability test.

| Dimension | Code | Cronbach α | Indicator of Instructor BIM Capability |
|---|---|---|---|
| Technical capability | JS1 | 0.890 | BIM theoretical knowledge |
| | JS2 | | Software application capability |
| | JS3 | | Capability of solving practical problems |
| | JS4 | | Innovation capability |
| | JS5 | | Professional knowledge in civil engineering |
| Organization capability | ZL1 | 0.871 | Teamwork capability |
| | ZL2 | | Communication capability |
| | ZL3 | | Coordination capability |
| Personnel capability | RY1 | 0.868 | Encouragement and support |
| | RY2 | | Vision and objectives |
| | RY3 | | Psychological counseling capability |
| Process capability | LC1 | 0.847 | Time management capability |
| | LC2 | | Leadership capability |
| | LC3 | | Project management capability |
| Support capability | JZ1 | 0.963 | Information collection and integration capability |
| | JZ2 | | Training and education |
| | JZ3 | | Acquisition of competition venues |
| | JZ4 | | Hardware configuration |
| | JZ5 | | Network environment |
| | JZ6 | | Adaptability |
| | JZ7 | | Obtaining software usage authorization |
| | JZ8 | | Financial support |
| | JZ9 | | Technical assistance |
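For readers unfamiliar with the reliability statistic reported in Table 2, the sketch below shows how Cronbach's α is conventionally computed from item-level questionnaire scores. The response matrix used here is randomly generated for illustration only; it is not the survey data collected in this study.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents x items matrix of Likert-scale scores."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical example: 100 respondents rating the five technical-capability
# items (JS1-JS5); the real questionnaire responses are not reproduced here.
rng = np.random.default_rng(42)
demo_scores = rng.integers(1, 6, size=(100, 5)).astype(float)
print(round(cronbach_alpha(demo_scores), 3))
```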
Table 3. GOF measures.

| Number | GOF Measure | Recommended Level of GOF Measure | Actual Measurement Result |
|---|---|---|---|
| 1 | Chi-square/degree of freedom (χ2/df) | 1 to 3 | 1.156 |
| 2 | GFI | 0 (no fit) to 1 (perfect fit) | 0.918 |
| 3 | AGFI | 0 (no fit) to 1 (perfect fit) | 0.897 |
| 4 | CFI | 0 (no fit) to 1 (perfect fit) | 0.992 |
| 5 | RMSEA | <0.05 (very good) to 0.1 (threshold) | 0.026 |
| 6 | IFI | 0 (no fit) to 1 (perfect fit) | 0.992 |
| 7 | TLI | 0 (no fit) to 1 (perfect fit) | 0.990 |

Note: GFI = goodness-of-fit index; AGFI = adjusted goodness-of-fit index; IFI = incremental fit index; TLI = Tucker–Lewis index; CFI = comparative fit index; RMSEA = root mean square error of approximation.
Table 4. Convergent validity analysis.

| Path Relationship | Estimate | AVE | CR |
|---|---|---|---|
| JS1 <--- Technical capability | 0.813 | 0.621 | 0.891 |
| JS2 <--- Technical capability | 0.775 | | |
| JS3 <--- Technical capability | 0.820 | | |
| JS4 <--- Technical capability | 0.721 | | |
| JS5 <--- Technical capability | 0.806 | | |
| ZL1 <--- Organization capability | 0.826 | 0.694 | 0.872 |
| ZL2 <--- Organization capability | 0.782 | | |
| ZL3 <--- Organization capability | 0.888 | | |
| RY1 <--- Personnel capability | 0.832 | 0.692 | 0.871 |
| RY2 <--- Personnel capability | 0.870 | | |
| RY3 <--- Personnel capability | 0.792 | | |
| LC1 <--- Process capability | 0.736 | 0.651 | 0.848 |
| LC2 <--- Process capability | 0.847 | | |
| LC3 <--- Process capability | 0.833 | | |
| JZ1 <--- Support capability | 0.846 | 0.745 | 0.963 |
| JZ2 <--- Support capability | 0.859 | | |
| JZ3 <--- Support capability | 0.871 | | |
| JZ4 <--- Support capability | 0.874 | | |
| JZ5 <--- Support capability | 0.863 | | |
| JZ6 <--- Support capability | 0.870 | | |
| JZ7 <--- Support capability | 0.869 | | |
| JZ8 <--- Support capability | 0.874 | | |
| JZ9 <--- Support capability | 0.843 | | |
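The AVE and CR values in Table 4 follow from the standardized loadings using the conventional formulas: AVE is the mean squared loading, and CR is the squared sum of loadings divided by that quantity plus the summed error variances. The short sketch below reproduces the technical capability row as a check; the same function applies to the other dimensions.

```python
def ave_and_cr(loadings):
    """Average variance extracted (AVE) and composite reliability (CR)
    computed from standardized factor loadings."""
    squared = [l ** 2 for l in loadings]
    ave = sum(squared) / len(loadings)
    total = sum(loadings)
    cr = total ** 2 / (total ** 2 + sum(1 - s for s in squared))
    return round(ave, 3), round(cr, 3)

# Standardized loadings of JS1-JS5 on technical capability (Table 4)
technical = [0.813, 0.775, 0.820, 0.721, 0.806]
print(ave_and_cr(technical))  # (0.621, 0.891) -- matches Table 4
```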
Table 5. Discriminant validity analysis.

| | Technical Capability | Organization Capability | Personnel Capability | Process Capability | Support Capability |
|---|---|---|---|---|---|
| Technical capability | 0.621 | | | | |
| Organization capability | 0.401 | 0.694 | | | |
| Personnel capability | 0.494 | 0.592 | 0.692 | | |
| Process capability | 0.307 | 0.260 | 0.487 | 0.651 | |
| Support capability | 0.603 | 0.439 | 0.532 | 0.429 | 0.745 |
| Square root of AVE | 0.788 | 0.832 | 0.833 | 0.807 | 0.863 |
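Discriminant validity in Table 5 is assessed with the Fornell–Larcker criterion: the square root of each dimension's AVE should exceed its correlations with the other dimensions. The sketch below performs this check, assuming the off-diagonal entries of Table 5 are the inter-construct correlations.

```python
import math

# AVE values (diagonal of Table 5) and off-diagonal entries
# (assumed here to be inter-construct correlations).
ave = {"Technical": 0.621, "Organization": 0.694, "Personnel": 0.692,
       "Process": 0.651, "Support": 0.745}
correlations = {
    ("Organization", "Technical"): 0.401,
    ("Personnel", "Technical"): 0.494, ("Personnel", "Organization"): 0.592,
    ("Process", "Technical"): 0.307, ("Process", "Organization"): 0.260,
    ("Process", "Personnel"): 0.487,
    ("Support", "Technical"): 0.603, ("Support", "Organization"): 0.439,
    ("Support", "Personnel"): 0.532, ("Support", "Process"): 0.429,
}

for (a, b), r in correlations.items():
    passes = r < math.sqrt(ave[a]) and r < math.sqrt(ave[b])
    print(f"{a} vs {b}: r = {r:.3f}, passes Fornell-Larcker = {passes}")
# Every correlation lies below both square roots of AVE,
# supporting discriminant validity among the five dimensions.
```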
Table 6. SEM model fitness test.

| Number | GOF Measure | Recommended Level of GOF Measure | Actual Measurement Result |
|---|---|---|---|
| 1 | Chi-square/degree of freedom (χ2/df) | 1 to 3 | 1.235 |
| 2 | GFI | 0 (no fit) to 1 (perfect fit) | 0.911 |
| 3 | AGFI | 0 (no fit) to 1 (perfect fit) | 0.891 |
| 4 | CFI | 0 (no fit) to 1 (perfect fit) | 0.987 |
| 5 | RMSEA | <0.05 (very good) to 0.1 (threshold) | 0.031 |
| 6 | IFI | 0 (no fit) to 1 (perfect fit) | 0.987 |
| 7 | TLI | 0 (no fit) to 1 (perfect fit) | 0.986 |
Table 7. SEM model path coefficients and significance tests.

| Code | Path | Standardized Path Coefficient | P Label |
|---|---|---|---|
| H1 | Instructor BIM capability → Technical capability | 0.69 | *** (p < 0.01) |
| H2 | Instructor BIM capability → Organization capability | 0.63 | *** (p < 0.01) |
| H3 | Instructor BIM capability → Personnel capability | 0.77 | *** (p < 0.01) |
| H4 | Instructor BIM capability → Process capability | 0.54 | *** (p < 0.01) |
| H5 | Instructor BIM capability → Support capability | 0.75 | *** (p < 0.01) |
Table 8. Dimensions and indicator weights of instructor BIM capability.

| Dimension | Dimension Weight | Indicator Code | Indicator Weight |
|---|---|---|---|
| Technical capability | 0.204 | JS1 | 0.206 |
| | | JS2 | 0.198 |
| | | JS3 | 0.208 |
| | | JS4 | 0.183 |
| | | JS5 | 0.206 |
| Organization capability | 0.186 | ZL1 | 0.336 |
| | | ZL2 | 0.316 |
| | | ZL3 | 0.348 |
| Personnel capability | 0.228 | RY1 | 0.338 |
| | | RY2 | 0.345 |
| | | RY3 | 0.317 |
| Process capability | 0.160 | LC1 | 0.306 |
| | | LC2 | 0.347 |
| | | LC3 | 0.347 |
| Support capability | 0.222 | JZ1 | 0.110 |
| | | JZ2 | 0.111 |
| | | JZ3 | 0.112 |
| | | JZ4 | 0.112 |
| | | JZ5 | 0.111 |
| | | JZ6 | 0.112 |
| | | JZ7 | 0.112 |
| | | JZ8 | 0.112 |
| | | JZ9 | 0.108 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
