Article

An Investigation of Work-Based Education and Training Needs for Effective BIM Adoption and Implementation: An Organisational Upskilling Model

1
School of Science, Engineering & Environment, The University of Salford, The Crescent, Salford M5 4WT, UK
2
Mott MacDonald, Altrincham WA14 1ES, UK
*
Authors to whom correspondence should be addressed.
Appl. Sci. 2021, 11(18), 8646; https://doi.org/10.3390/app11188646
Submission received: 9 August 2021 / Revised: 6 September 2021 / Accepted: 7 September 2021 / Published: 17 September 2021

Abstract

Research reveals that organisations in general are keen to provide their staff with the support needed to boost their competency in BIM and subsequently leverage the effectiveness of its implementation. However, employers need a decision-making tool to make better informed investments in specific work-based education and training that addresses the immediate upskilling needs of their employees. Therefore, the aim of this research project is to investigate the significance of Work-Based Education and Training (WBET) needs through the development of an Organisational Upskilling Model (OUM). A comprehensive literature review yielded 25 hypotheses that were tested for significance using a questionnaire survey completed by 73 AEC professionals working for a large-scale UK engineering consultancy. Based on the current expert sample, the study revealed a holistic inter-organisational agreement that technology training is in high demand, whereas the organisational body of knowledge needs only to be better publicised, as employees were unaware of its immediate availability. The OUM showed that the most influential variables for BIM Uptake were Attitude (R2 = 0.569 and Q2 = 0.395), User Competency (R2 = 0.523 and Q2 = 0.369), and Organisational Support (R2 = 0.400 and Q2 = 0.233). Informed by their in-house culture, the OUM enabled the sponsoring engineering consultancy to predict immediate WBET upskilling needs and plan for the required capital investment. However, the OUM may be applied by any BIM-adopting organisation seeking WBET-informed decision-making assistance for better upskilling, continuous improvement, organisational learning, and ultimately business growth.

1. Introduction

Following the UK Government’s recommendation to de-carbonise the industry in 2010 [1] and the launch of its Construction Strategy in 2011 [2], the UK sector has witnessed a building of momentum in the adoption and implementation of BIM that facilitates the digital transformation of the construction sector and the wider built environment. Based on the UK NBS National BIM 2020 Report [3], the rate at which the industry has managed to implement BIM on 100% of its projects increased from 22% in 2019 to only 23% in 2020. Accordingly, 56% of the UK AEC community attributed the modest implementation rate to a lack of in-house skills/expertise, and 48% to the ‘lack of training’; both remain two of the top three reported key barriers to the successful adoption of BIM and digital transformation. The Construction Manager Annual BIM Survey 2020 similarly reported the main organisational barrier to the adoption of BIM, or further adoption, as the lack of digital skills (64%) [4]. However, these figures do not necessarily mean there is a lack of BIM-enabled professionals, but rather a potential deficiency in organisational up-skilling/re-skilling and in continuing professional development that confounds the nature of the barriers and issues reported. Amongst research on BIM adoption and implementation, Liu, van Nederveen & Hertogh [5] state that there is little interest in lessons learned and knowledge exchange. Moreover, they state that the reported inadequacy in BIM skills is the responsibility of organisations and their associated learning strategies, which should direct the right training material to the right target audience.
Furthermore, the NBS National BIM 2018 report forecasted a 16% growth in the adoption of BIM over the following three-to-five years [6]. However, by 2020 the adoption rate had remained relatively static, with an increase of just 2%. Therefore, progressing at this pace undermines the ability of the UK AEC industry to develop the fundamental capability needed to saturate the implementation of BIM and digital transformation in preparation for achieving the vision of the Digital Built Britain (DBB) strategy by 2025 [7] and delivering the key recommendations of the National Infrastructure Commission’s ‘Data for the Public Good’ report, e.g., a National Digital Twin comprised of an ecosystem of connected digital twins to foster better outcomes from the built environment [8].
The slow uptake of the industry’s adoption and implementation of BIM and digital transformation is not only prevalent in the UK, but also globally [9,10]. Hence, the overall effectiveness of adoption and implementation is not fully established due to several associated barriers [11,12,13]. According to extensive studies, it is widely accepted that the most influential barrier to the uptake of BIM is the lack of individual competency pertaining to knowledge, skills, abilities, experience, behaviours and attitude [13,14,15,16]. Furthermore, such studies collectively acknowledge the need for education and training as a resolution, albeit from distinct and limited perspectives. These perspectives mainly focus on integrating BIM education and training within Higher Education (HE) with very little emphasis on the role that organisations play in upskilling/reskilling their in-house culture to drive effective adoption.
For instance, Santos et al. [15] touched upon education and training as the responsibility of Higher Education Institutes (HEIs), stating that current curricula are unsatisfactory for industrial needs, while disregarding the fact that the skilled professionals needed to structure the curricula are scarce in the first place. Alternatively, while investigating the benefits, risks, and challenges of BIM implementation, Ghaffarianhoseini et al. [11] concluded that the slow uptake in the implementation of BIM is impeded by traditional educational trends, whereby educators should rethink new means of teaching and evaluating BIM competencies. Furthermore, Yin [17] called on universities to urgently reform their curricula in an attempt to align higher education with current industrial trends. Moreover, Sacks & Pikas [18], Smith [19], and Uhm, Lee, & Jeon [20] reported that the majority of educational institutes focus on teaching BIM software technology for drafting and 3D modelling, despite a strong need for other non-technical competencies in BIM implementation processes, coordination, collaboration, and management. However, from an empirical point of view, these technical and non-technical competencies may be highly dependent on external and internal influential variables that dictate vocational training needs within organisations. Such influences include: government mandates, policies, knowledge sources, data proprietorships, roles and responsibilities, choice of BIM software, choice of common data environments, user attitudes, and cultural collaboration. Similarly, according to Liao et al. [13], organisational change resulting from BIM adoption requires organisations to educate and train their in-house cultures on bespoke processes that are tailored to suit the type and function of their specialised multi-disciplinary domains.
Organisations in general are committed to providing staff with the support needed to boost their competency and subsequently leverage the effectiveness of BIM implementation. Wang & Song [21] and Gokuc & Arditi [9] revealed that the perceived organisational support to BIM-user competency was 75% and 85%, respectively. However, little guidance is available to organisations pursuing the upskilling of their in-house cultures when it comes to strategic decision-making involving Work-Based Education and Training (WBET) investment focus and magnitude.
Furthermore, the current literature pertaining to WBET focuses on high-level recommendations in the form of verified variables, based on the results of exploratory reviews, conceptual frameworks, and user-acceptance models. Such variables range across socio-organisational, technical, and nontechnical WBET needs, and are randomly scattered within the extant literature; therefore, they lack hierarchical importance and significance. Moreover, it is unclear which variables overlap, supersede, or nullify others, as no research has proposed a binding framework or an in-situ evaluation mechanism that verifies the applicability of these variables to distinct organisational functions.
Therefore, this paper proposes the development of an Organisational Upskilling Model (OUM) that represents a guided framework of organised WBET variables, and identifies vocational priorities within an organisation. OUM focuses on interrogating intercultural competency needs so that it can accurately predict and inform WBET investment decisions based on the influential significance of the interacting variables within its framework.
OUM is regarded as an extension to the Technology Acceptance Model (TAM) of Davis et al. [22], which is extremely reliable for Information Systems research and development [21,23]. Extensively cited in [24], TAM is a predictive tool, specific to user-behaviour and accepted for developing Information Technology systems [22], hence its high applicability to BIM adoption. TAM is structured so that it allows for the testing of influencing factors, such as WBET variables, against the constructs of individual beliefs in order to predict and inform potential uptake of new technologies. In particular, TAM puts forward two user-behavioural independent variables. ‘Perceived Usefulness’ is the subjective belief in increased performance from the use of a new technology, which is influenced by the level of user-expectation towards operating a new technology with minimal effort, i.e., ‘Perceived Ease of Use’. These, in turn, determine the (positive/negative) acceptance of a specific technology (‘User-Satisfaction’) via an individual’s subjective (positive/negative) feeling about performing certain behaviour (‘Attitude’) and their intention to perform the specific behaviour (‘Behavioural Intention’) [22] (Figure 1).
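To make the TAM chain concrete, it can be written down as a simple path specification. The sketch below (Python) is purely illustrative: the construct names follow TAM as summarised above, while the dictionary layout and the terminal ‘BIM Uptake’ label (the OUM’s equivalent of actual system use) are assumptions for illustration rather than any standard notation.

# Illustrative sketch only: TAM's causal chain as a path specification.
# Construct names follow TAM as summarised above; the dict layout and the
# final 'BIM Uptake' label are assumptions for illustration.
TAM_PATHS = {
    "Perceived Ease of Use": ["Perceived Usefulness", "Attitude"],
    "Perceived Usefulness": ["Attitude", "Behavioural Intention to Use"],
    "Attitude": ["Behavioural Intention to Use"],
    "Behavioural Intention to Use": ["BIM Uptake"],  # actual use of the system
}

# External variables (e.g., the WBET variables elicited in Section 2) are
# hypothesised to enter the chain mainly through Perceived Ease of Use and
# Perceived Usefulness, which is how the proposed OUM extends this structure.
for source, targets in TAM_PATHS.items():
    for target in targets:
        print(f"{source} -> {target}")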
The proposed OUM is developed by eliciting WBET variables that are structured and connected to TAM based on a synthesis of the related literature; this is followed by their statistical validation in a path model framework. The overall framework enables the assessment of the predictive relevance and significance of the variables following the derivation and testing of the hypotheses, which leads organisations to focus their upskilling investments on inter-cultural specific needs and improvement opportunities.

2. Literature Review

In recent years, research studies aimed at enabling effective BIM implementation have focused explicitly on identifying adoption and implementation barriers (AIBs) or, inversely, critical success factors (CSFs). These studies were conducted with various objectives, endorsing generic remedial recommendations to the slow uptake of BIM that range from top-down socio-organisational management and governance approaches to bottom-up technical and nontechnical solutions [10,11]. With the exception of one highly related study, this review of the key literature critically examines 13 related studies conducted between 2017 and 2018. These studies are segregated by their research aims into three categories, namely exploratory reviews, conceptual frameworks, and theoretical models. While a core focus of this research paper is BIM uptake, the synthesis of the key literature revealed a distinct lack of studies specifically focused on assessing the impact of BIM WBET on user satisfaction and BIM uptake. Accordingly, the main objective of this literature review is to interpret and identify independent WBET-related variables that enable hypothesis derivation and the ontological extension of TAM into the proposed OUM.

2.1. Hypotheses Derivation from Exploratory Studies

In a recent study, Santos et al. [15] reviewed 381 BIM-related articles between 2009 and 2015, and found that BIM adoption and implementation barriers ranked first amongst nine other BIM categories in the top 100 most cited publications, totalling 25 articles. This may be attributed to the rapid changes in BIM, which impose a knockout effect on knowledge patterns [25]. While a wide variety of knowledge sources exist from specialised professional institutes, including UK BIM Framework, BSI, NBS, Professional Bodies, CIC and HM UK Government, BIM Task Group, UK BIM Alliance, BIM4 Community Groups, Centre for Digital Built Britain (CDBB), etc., BIM users appear disoriented and lack guidance on knowledge sourcing. Hence, 71% of industry professionals in the UK are reliant upon colleagues for BIM information, and the proportion who are confident in their BIM knowledge and skills is 54%, compared to 55% in 2017 [26]. These figures suggest that the BIM community is currently recycling available knowledge rather than actually improving it. Liao et al. [13] raised the same issue and recommended the ongoing training of employees to keep pace with the continual development of BIM knowledge. However, it is arguable that this type of information may simply be acquired by introducing users to their local BIM Body of Knowledge (BOK), which should incorporate information pertaining to BIM standards, practices, principles, and implementation processes for the profession.
On the other hand, as the lack of in-house BIM skills/expertise remains one of the most influential barriers to BIM adoption, NBS [3] reveals a serious deficiency in BIM user competency. Although many HEIs and other educational institutes are integrating BIM education and training in their curricula, albeit to varying degrees, the AEC industry—both in the UK and worldwide—is still experiencing a lack of educated, BIM-competent employees [20]. This is attributed to the fact that the majority of educational institutes are focused on training and teaching BIM software technology for drafting despite a strong need for other nontechnical competencies [18,19], which include aspects of BIM implementation, coordination, and management. Uhm et al. [20] share a similar view, adding that the acknowledgement of BIM competency will consolidate the missing constructs of knowledge, skills, and experience, which are sought by employers. Here, ‘BIM Competency’ refers to knowledge, training, and continuing professional development as a set of constructs measurable against performance standards and policies [27]. Accordingly, a BIM-competent user is regarded as an individual possessing BIM uptake (adoption and implementation) knowledge and expertise. Therefore, WBET may be understood as a planned organisational decision to invest in human capital for the purpose of leveraging employee-BIM competency. However, until HEIs and other educational institutes have developed curricula that shape BIM-competent mindsets, an issue yet to be resolved [28,29], it is only wise for organisations to invest in their own WBET programmes.
Accordingly, two variables were identified in this section, ‘Knowledge Acquisition’ and ‘User Competency’. According to Venkatesh, et al. [30], both variables are derived from the ‘Perceived Behavioural Control’ construct, which reflects the impact of constraints on behaviour encompassing resource facilitating conditions and self-efficacy. Arguably, ‘Knowledge Acquisition’ and ‘User Competency’ may be regarded as external variables influencing ‘Behavioural Intention to Use’ following the standard path of TAM through ‘Perceived Ease of Use’. Hence, provided there are adequate knowledge and skilled resources, it should be easy for an employee to use and control a new system. Therefore, since the term ‘User Competency’ encompasses knowledge and skills, it may be argued that ‘Knowledge Acquisition’ leverages the ‘Behavioural Intention to Use’ via ‘User Competency’. Thus, it may be hypothesised that:
Hypothesis 1 (H1).
User Competency is significant to Behavioural Intention to Use.
Hypothesis 2 (H2).
Knowledge Acquisition is significant to Behavioural Intention to Use.
Furthermore, Ozorhon & Karahan [14] found that the most influential critical success factors for BIM adoption were people-related. These include the availability of competent users, followed by the need for organisational commitment and the need for training. The authors justified their findings through citations that only emphasised the importance of these factors, stating that organisations should invest more in the innovative training and recruitment of BIM-experienced employees to allow for smooth processes. However, such recommendations are open to subjective interpretation; neither the size of the investment in training nor the required level of user experience is made clear. Meanwhile, similar subjectivity was noted by Ghaffarianhoseini et al. [11], who added that the size of investment in BIM is based on organisational appreciation of the incentives and performance benefits of BIM adoption, thus implying the need for organisational awareness of BIM’s perceived usefulness. Accordingly, an ‘Organisational Support’ variable, equivalent to the ‘Facilitating Conditions’ construct of Thompson, Higgins & Howell [31], is introduced. The ‘Organisational Support’ variable is defined as the extent to which employees believe that organisations are able to provide the necessary resources for successful system ‘Uptake’ [23]. Since this variable is defined from the employees’ point of view, its impact is expected to leverage ‘Perceived Ease of Use’, thus leading to the following hypothesis:
Hypothesis 3 (H3).
Organisational Support is significant to Perceived Ease of Use.
Alternatively, following research into the benefits and challenges of the adoption and implementation of BIM, Jin et al. [32] found that an organisation’s decision to provide technical training only at junior levels is perceived as a hindrance because senior management are unable to leverage work performance due to their lack of BIM knowledge. This implies that executive top management should question their knowledge about the applicability of their existing workflow processes and implementation methods, which should be redefined upon the adoption and implementation of BIM. Furthermore, process redefinitions should ensure that technical training is properly assigned to members of staff based on their job roles and responsibilities. The adoption and implementation of BIM is a subsidiary of organisational change that integrates new processes and technologies [13], thus implying a primary need for process and technical WBET. However, from an empirical point of view, whilst process training should equally cover all team members, the technical training requirements differ between junior and senior team members. Senior management, for instance, will only need non-intensive technical training to clarify the usability and capability of newly adopted BIM technologies and, accordingly, better inform project management and decision making. This view is shared by Liu et al. [5] in that the reported inadequacy in BIM skills relates to the organisation’s learning approach, whereby training is only provided to junior staff members instead of all BIM-related staff. Therefore, the lack of technical knowledge amongst senior members not only creates a skills gap that hinders the clear benefits of BIM, but also (based on industrial psychology studies) results in an imbalance between leadership expectations and project demands with less ability to control the implementation process [33]. Furthermore, according to Venkatesh et al. [30], effort-oriented variables influential to the ‘Perceived Ease of Use’ are more noticeable in the early stages of technology adoption when process hindrance is eclipsed by technical complications. Therefore, since the term ‘Training’ is defined as an organisational upskilling ‘effort’ that facilitates employees’ learning and job-competencies (Noe, 2010, cited by Stabile & Ritchie [34]), it may be regarded as an effort-oriented variable. Moreover, ‘Technical Training’ and ‘Process Training’ variables are interpreted and added to determine the effect of training on ‘Perceived Ease of Use’ through ‘User Competency’, whereby both ‘Technical’ and ‘Process Training’ variables are assumed to influence the previously identified ‘User Competency’ variable. Hence, the outcomes from education and training are regarded as competency manifestations [27], and the following may be hypothesised:
Hypothesis 4 (H4).
Technical Training is significant to Perceived Ease of Use.
Hypothesis 5 (H5).
Process Training is significant to Perceived Ease of Use.
Hypothesis 6 (H6).
Technical Training is significant to User Competency.
Hypothesis 7 (H7).
Process Training is significant to User Competency.
Hypothesis 8 (H8).
User Competency is significant to Perceived Ease of Use.
Hypothesis 9 (H9).
User Competency is significant to Perceived Usefulness.
In addition to the above, the reported BIM skills gap can hinder advancement and continuous improvement as senior management will be incapable of digesting the feedback of end-line users for process improvements and upgrades. This observation was asserted by Antwi-Afari et al. [16] who identified a need for Lean-thinking to enhance information exchange and knowledge management between all BIM participants. The main purpose of applying Lean is not only to eliminate waste from processes, but also to stimulate a learning culture that enables organisational learning through continuous improvement [5]. Lean is equivalent to operational excellence in adding value to any type of industry (Fredendall & Thürer, 2016). Lean was first introduced by the Toyota Production System (TPS) in the 1940s, reflecting Toyota’s success in creating an in-house culture capable of mitigating and adapting to change in context, while problem-solving, adding value, and cutting lead time at each step of the production process [35]. Therefore, the following may be hypothesised:
Hypothesis 10 (H10).
Lean is significant to Perceived Ease of Use.

2.2. Hypotheses Derivation from Existing Frameworks

In addition, Alreshidi et al. [10] referred explicitly to process improvement by citing three continuous improvement (CI/Kaizen) tools derived from renowned Lean principles, including ‘Just in Time’, ‘Right at first time by Stopping work and bringing issues to the surface’ and ‘Visual management check points’. Continuous Improvement is defined as the act of incremental positive progression, also known as Kaizen [36]; Kaizen is a Japanese term derived from two Japanese words: “Kai” meaning “Change” and “Zen” meaning “Good” [37]. While these Kaizen tools were regarded as process improvements, the authors called for process clarification by implying a need to integrate Kaizen tools in the WBET of BIM standards, wherever applicable, for effective collaboration and implementation. However, it is empirically arguable that these tools may be highly appreciated, not only in knowledge acquisition and process training, but also in technical training. For instance, the concepts of ‘Just in Time’ and ‘Right at First Time’ refer to the implementation activity itself. The principle of getting it right first time is integrated in almost all software applications in the form of instant pop-up notifications that inform BIM users of modelling errors and clashing elements. However, in a similar approach in Singapore, Liao et al. [13] proposed an organisational change framework based on the identification of BIM adoption and implementation barriers and drivers. The lack of skilled personnel and the need for training were most cited as both barriers and drivers for BIM adoption and implementation. Therefore, considering the aforementioned, a ‘Kaizen’ variable may be added to the framework of the proposed model. However, since this is a unique variable that had not been previously explored, its equivalence to the ‘Outcomes Expectations’ construct is assumed. It is defined in relation to information systems as the degree to which individuals believe that using a particular system would yield valued job-related outcomes [38]. On the other hand, according to Sacks & Pikas [18], education and training in Lean for continuous improvement is vital to successful BIM uptake and organisational learning. Organisational learning is defined as the collective ability of an organisation to learn and continuously improve where knowledge is central to business development and growth [39]. In fact, Learning Organisations use knowledge to continually catalyse the competency of their in-house cultures [40] in order to grow and succeed, which suggests that ‘Organisational Learning’ is a pillar for WBET. Accordingly, the following is hypothesised:
Hypothesis 11 (H11).
Continuous Improvement Kaizen in Process Training is significant to User Competency.
Hypothesis 12 (H12).
Continuous Improvement Kaizen in Technical Training is significant to User Competency.
Hypothesis 13 (H13).
Continuous Improvement Kaizen in Knowledge Acquisition is significant to User Competency.
Hypothesis 14 (H14).
Knowledge Acquisition is significant to Organisational Learning.
Moreover, Liao et al. [13] recommended that constant in-house training is needed for employees to cope with the continually developing policies and procedures of BIM. This type of information may be simply acquired by introducing users to their local BIM BOK, which should incorporate knowledge pertaining to BIM standards, practices, principles, and implementation processes for the profession. In line with this need, Wu et al. [28,29] developed a BIM BOK framework with objectives to identify basic BIM skills, benchmark user competency, and enable BIM education and training. It was observed that the four levels of BIM implementation—Plan, Coordinate, Manage, and Do—are analogous to a CI/Kaizen tool known as Plan-Do-Check-Act (PDCA). Rother [41] (pp. 133–138) defined PDCA as “an incremental continuous improvement method that iterates between current and target conditions of production”. Given this implicit reference to continuous improvement, the following may be hypothesised:
Hypothesis 15 (H15).
Continuous Improvement Kaizen is significant to Perceived Usefulness.

2.3. Hypothesis Derivation from Theoretical Models

A number of BIM-related theoretical models extending from TAM and the associated perceived usefulness and perceived ease of use have been proffered. For instance, the model developed by Wang & Song [21] investigated the fit between ‘User Satisfaction’ and ‘Organisational Support’ in the implementation of BIM. The model identified that perceived usefulness, perceived ease of use, attitude, senior management support, and executive management support were all imperative constructs for BIM user satisfaction. The results revealed that the most influential variables for BIM user satisfaction were executive management, followed by senior management support, and then perceived usefulness. Interestingly, the need to integrate perceived usefulness and perceived ease of use constructs in their relational model was based on the fact that the lack of WBET is regarded as a significant barrier to the effective adoption and implementation of BIM. However, the study maintained a general approach without reflecting on the results of perceived usefulness and perceived ease of use specific to WBET. Additionally, the authors argued that ‘User Attitude’ should be recognised as an influencing antecedent of perceived usefulness and perceived ease of use, and included it in their model as an independent variable. Meanwhile, based on TAM terminologies and its associated formulaic studies, ‘Attitude’ is determined by perceived usefulness and perceived ease of use (i.e., Attitude = Perceived Usefulness + Perceived Ease of Use) and not vice-versa [22]. However, since ‘Attitude’ was assessed as an independent variable, the degree of variance from assumptive external WBET factors could not be estimated. Such factors may include the impact of knowledge acquisition or technical training on user attitude, which were not considered. Therefore, the influence upon ‘Attitude’ is not supported by the proposed framework. Moreover, it may be argued that the high mean average of the ‘Attitude’ variable (reported at 4.47/5 from 118 respondents) could have changed the results of the hypothesis testing for the entire model. Therefore, this observation may be tested by the following hypotheses:
Hypothesis 16 (H16).
Knowledge Acquisition is significant to Attitude.
Hypothesis 17 (H17).
Technical Training is significant to Attitude.
Furthermore, in a similar approach, Howard, Restrepo & Chang [23] adopted the Unified Theory of Acceptance and Use of Technology (UTAUT) [30], an elaborated extension of TAM to evaluate the general impact of BIM adoption on BIM user behaviour in the UK. The authors of the UTAUT argued that TAM was based on voluntary user acceptance and required two additional external influential constructs, ‘Social Influence’ and ‘Facilitating Conditions’, supported by four moderators, namely ‘Gender’, ‘Age’, ‘Experience’, and ‘Voluntariness of Use’ [30]. The authors dropped the ‘Attitude’ construct, arguing that it becomes insignificant over time. Moreover, Howard et al. [23] dropped both the ‘Gender’ and ‘Age’ moderators due to inconsistencies with an 84-respondent sample size and considered that ‘Age’ was accommodated within the ‘User-Experience’ moderator. On the other hand, Howard et al. [23] reinstated ‘Attitude’ as an independent variable, as with Wang & Song [21], but with the direct influence on ‘Behavioural Intention’, citing a significantly outdated 2004 survey that highlighted ‘Resistance to Change’ as the most influential barrier hindering the adoption and implementation of BIM. However, it is suggested that ‘Attitude’ may only be tested in survey research with a targeted expert-sampling audience in order to better understand the trends of in-house cultures specific to a particular organisation. Otherwise, random sampling from public participants may lead to random inconsistent results based on the bias of different organisations.
This research proposes the retesting of ‘Attitude’ and ‘Behavioural Intention’, as Howard et al. [23] reported a significant relationship between ‘Attitude’ and ‘BIM Uptake’ and an insignificant relationship between ‘Attitude’ and ‘Behavioural Intention’. Therefore, Howard et al.’s approach to hypothesis testing completely contradicts TAM principles, as the authors’ results stated that perceived usefulness had no direct significance for ‘Behavioural Intention’, while perceived ease of use had a positive influence on ‘Behavioural Intention’. This can be tested via the following hypotheses:
Hypothesis 18 (H18).
Attitude is significant to Behavioural Intention to Use.
Hypothesis 19 (H19).
Attitude is significant to BIM Uptake.
On the other hand, the ‘Social Influence’ variable had a positive influence on ‘Behavioural Intention’, although the questionnaire statements relating to this variable may lead to discrepancies as they were adopted for two purpose-differing variables, i.e., ‘Social Factors’ and ‘Subjective Norm’, with the difference listed in Venkatesh et al. [30] (p. 452). The former (Social Factors) refers to the social influence of others while the latter (Subjective Norm) refers to an individual’s perception of what others believe about them. Accordingly, this study highlighted the additional need for the ‘Social Factors’ variable, adopted from Thompson, Higgins & Howell [31]. The ‘Social Factors’ variable relates to ‘Organisational Support’ in reflecting the current trends of organisational in-house cultures. Therefore, the study proposes to test if the ‘Social Factors’ variable representing the current culture is significant to the ‘Perceived Usefulness’ of ‘Organisational Support’, as it is natural that ‘Social Factors’ reflect the culture’s expectations of benefits and increased performance from BIM. Additionally, the study proposes testing whether ‘User Competency’ is significant for ‘Social Factors’, as this may highlight the importance of competent users in fostering the right organisational culture. This further conforms to the observations of Venkatesh et al. [30] requiring WBET to cover all levels of people for effective adoption and implementation. Accordingly, the following may be hypothesised:
Hypothesis 20 (H20).
Social Factors variable is significant to Perceived Usefulness.
Hypothesis 21 (H21).
User Competency is significant to Social Factors.
On the other hand, Gokuc & Arditi [9] investigated the impact of design tasks, organisational competency, and user competency on the project implementation performance of design firms by extending TAM into a Designer Competence Model (DCM). The model assesses the fit between BIM technology and designer-user competence in terms of experience, education, anxiety, and self-efficacy (competency). Results from the DCM reported no significant relationship between ‘User Competence’ and ‘Organisational Performance’, which is empirically arguable considering the “lack of in-house skills/expertise” barrier positioned as the primary hindrance to the adoption and implementation of BIM [3,6,26]. Furthermore, according to Liu et al. [5], extensive experience and competency in BIM leads to an increased opportunity of winning project tenders, which is empirically a reflection of organisational performance, as utilised by the Engineering News-Record (ENR) Top 100 best performing series [42]. Therefore, ‘User Competency’ should result in very high significance for ‘BIM Uptake’. Thus, it may be hypothesised that:
Hypothesis 22 (H22).
User Competency is the most significant external variable for BIM Uptake.
In addition, BIM collaboration in design and construction was evaluated empirically by Liu et al. [5] in China via Grounded Theory-led focus groups. The authors proposed a theoretical model, structured into three categories pertaining to Technology, People, and Process. Under Technology, ICT capacity and technology management contributed to people’s attitudes and role taking, which, in turn, influenced processes, in terms of trust, communication and leadership. Most significantly, the authors explicitly addressed a need for continuous improvement (CI/Kaizen) to enable effective collaboration and BIM competency. However, their research participants had little interest in lessons learned and continuous improvement despite the authors’ acknowledgement that industrial experience is leveraged through the aggregation of BIM knowledge. According to Liu et al. [5], the industry’s lack of interest in continuous improvement had persisted for more than ten years. This finding is not surprising, as research reveals that organisations face difficulties when establishing CI/Kaizen techniques, and the implementation of Lean in general is not a straightforward process [37,43]. Therefore, it is worthwhile exploring whether continuous improvement is better appreciated in current trends, where the following may be hypothesised:
Hypothesis 23 (H23).
CI/Kaizen is significant to User Competency.
While Lean for BIM has not been previously explored in a theoretical model, research from the manufacturing sector provides justification for further hypothesis testing to provide insight into ‘User-Job Satisfaction’ post-Lean adoption. Sim, Curatola & Banerjee [44] conducted a research case study on a USA manufacturing plant that endorses a Lean-CI environment, to investigate the overarching Lean principle of “Respect for Humanity” through the constructs of user-self autonomy, HR and manufacturing practices, perceived organisational support, and training. The significance of these constructs was assessed against ‘User-Job Satisfaction’, ‘Perceived User-Rewards’, and ‘Perceived User-Job Security’. In the study, Lean for continuous improvement was used as a validity measure. In the results, ‘Perceived User-Rewards’ was negatively related to ‘User-Job Satisfaction’, which indicates that users may be satisfied with the organisation’s perspective and incentives driven by the adoption of Lean. Furthermore, the prevalence of Lean was asserted by both ‘Perceived Organisational Support’ and ‘Manufacturing Practices’, which were positively related to ‘User-Job Satisfaction’. Unexpectedly, the ‘Training’ construct showed no significance for ‘User-Job Satisfaction’, which may be attributed to the good practice of Lean for continuous improvement, whereby processes improve incrementally in small daily steps allowing people to learn, make adjustments and innovate [40]. This appears to have been occurring, leading to minimal training needs from users. Therefore, the following may be hypothesised:
Hypothesis 24 (H24).
Lean is significant to BIM Uptake.
Hypothesis 25 (H25).
Lean is significant to Organisational Learning.

3. Proposed OUM Model and Research Methodology

The critical appraisal of key literature in relation to WBET and the associated theoretical models, conceptual frameworks, and recommendations has led to the development of a conceptual OUM that extends the TAM of Davis et al. [22] through the addition of nine new variables, which are logically integrated into its framework. The purpose of the proposed OUM is to overcome the limitations of previous hypothetical assumptions and deliver a new means of understanding the upskilling/reskilling needs of in-house cultures, so as to facilitate the effective adoption and implementation of BIM and drive digital transformation (Figure 2).

3.1. Approach

This research deliberately targets a group of BIM-enabled AEC professionals, which calls for a survey approach that offers an economical route to statistical generalisation [45] and rapidly quantifies large amounts of feedback. According to Yin [45], quantitative research follows a deductive approach that defines concepts and theories through data-measurement and prediction tools. Therefore, the study lends itself completely to a cross-sectional quantitative survey research method, which best aligns with the development of the proposed Organisational Upskilling Model.
The targeted group of BIM-enabled AEC professionals are represented by the sub-group of the sponsoring engineering consultant industry organisation. The organisation has been on a journey of digital transformation for the last five years, implementing a number of programmes to upskill the workforce. With an ever-growing need to rapidly develop the appropriate BIM skills, a new approach is required to determine the training that should be implemented. It is noted that this need is not unique to the sponsoring industry organisation but is common across all participants in the construction industry.
As a global organisation, the sponsor employs more than 16,000 people globally across all sectors of the construction industry. The sub-group represents an individual business sub-division within the organisation that employs approximately 500 AEC professionals. The sub-division operates across seven offices in the Western half of the UK.

3.2. Variables and Hypotheses

Surveys are operationalised with a series of variables that are systematically related in a theoretical model upon which hypotheses may be formulated [46]. The variables utilised in this research project are categorised as four independent variables (IV), nine dependent variables (DV) and one moderator variable (MV); all are derived from consistent findings from the critical review of key literature. It should be noted that the moderator variable in particular is used to speculate further hypotheses by influencing the significance of IVs on DVs [46]. The MV is mainly used to measure the increased or decreased impact on a DV from the addition of a controlling phenomenon, referred to as the moderator effect [47]. For instance, ‘CI/Kaizen’ has been introduced as an MV in order to assess the BIM up-takers’ need for continuous improvement over the IVs, including ‘Technical Training’, ‘Process Training’ and/or ‘Knowledge Acquisition’, impacting the ‘User Competency’ DV.
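The moderator effect described here is, in regression terms, an interaction between the moderator and an independent variable. The Python sketch below is only a conceptual illustration of that idea outside the PLS-SEM tooling: the variable names TT, KZ and UC are hypothetical stand-ins for Technical Training, CI/Kaizen and User Competency scores, and the data are simulated.

# Minimal sketch (assumed variable names, simulated data): a moderator effect
# estimated as an interaction term. TT = Technical Training, KZ = CI/Kaizen,
# UC = User Competency; in the actual study these are latent construct scores
# estimated by PLS-SEM, not raw columns.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({"TT": rng.normal(size=200), "KZ": rng.normal(size=200)})
df["UC"] = 0.5 * df["TT"] + 0.2 * df["KZ"] + 0.3 * df["TT"] * df["KZ"] + rng.normal(scale=0.5, size=200)

# 'TT * KZ' expands to TT + KZ + TT:KZ; the TT:KZ coefficient is the moderator
# (interaction) effect of CI/Kaizen on the TT -> UC relationship.
model = smf.ols("UC ~ TT * KZ", data=df).fit()
print(model.summary().tables[1])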

3.3. Questionnaire and Survey Design

The questionnaire design involved the adoption of statements associated with the selected variables and drawn from a variety of peer-reviewed studies of comparable acceptance models. The questionnaire survey consisted of 14 variables and 52 questionnaire statements. The questionnaire statements were piloted with three senior members from the sponsoring industry organisation. The valuable feedback led to the elaboration of some statements and the inclusion of some descriptions and clarifications in order to enhance the accuracy and comprehension of the questionnaire survey; this made the questionnaire survey more understandable to all participants regardless of their level of experience.

3.4. Sampling and Measurement

The survey utilised expert sampling to non-randomly select a group of respondents based on their direct engagement in a specific domain [48], i.e., BIM implementation and adoption. Email invitations were sent to 500 professionals working for the sponsoring organisation. As recommended by Kwong & Wong [49] and Hair et al. [47] for PLS-SEM researchers, the sample size for this study was based on the maximum number of independent variables pointing at a given dependent variable in the proposed model. For example, there are three independent variables pointing at ‘User Competency’, a dependent variable in the proposed OUM; under the two sets of guidelines this requires a minimum of 37 and 59 respondents, respectively. Accordingly, and for enhanced robustness, a minimum of 59 respondents was sought, while 73 complete responses were received, thereby exceeding the minimum response benchmark of 59 (approximately 12% of those invited). The demographics of the 73 respondents (14% return rate) represent various architectural and engineering disciplines operating from the distinct office locations of the sponsoring organisation within the United Kingdom (Table 1). Forty (55%) respondents are operating at a junior level with seven or fewer years’ industry experience and 33 (45%) at a senior level with over seven years’ industry experience. Furthermore, 67 (92%) respondents are hands-on architects and engineers working directly on BIM projects, in addition to six (8%) senior management directors who are key decision makers in the organisation’s uptake of BIM.
In terms of the questionnaire measurement instrument, respondents were asked to rate their level of agreement based on a five-point Likert scale (1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, and 5 = strongly agree) in conjunction with the studies of Thiele Schwarz et al. [36], Wang & Song [21], Gye-Soo [24], Sim et al. [44], Monecke & Leisch [50] and Sarstedt et al. [51], and the recommendation by Holt [52] on identifying the relative importance of a set of variables. The survey was conducted using Google Forms, a free online survey platform that allows for Likert scale measurements and the rapid export of data in Excel format, thus enabling effortless data entry into the statistical software. In terms of coding, the variables and corresponding associated indicators (questionnaire statements) were abbreviated based on common research methods proposed by Hair et al. [47], e.g., the ‘User Competency’ variable was abbreviated as ‘UC’ with associated indicators ‘UC1, UC2, UC3, …, UCn’.
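As a simple illustration of this coding step, the sketch below recodes Likert text responses into numeric indicator columns such as UC1 and UC2. The statement wording, the tiny in-memory table, and the mapping are hypothetical; in practice the raw table would be the Excel export of the Google Form.

# Minimal sketch (hypothetical statement wording and data): recoding Likert
# text responses into numeric indicator columns such as UC1, UC2, ...
import pandas as pd

LIKERT = {"Strongly disagree": 1, "Disagree": 2, "Neutral": 3,
          "Agree": 4, "Strongly agree": 5}

# Map each statement column onto its abbreviated indicator code (illustrative
# wording, not the study's actual statements).
indicator_codes = {
    "I am confident using the BIM tools required by my role": "UC1",
    "I understand my organisation's BIM workflow processes": "UC2",
}

raw = pd.DataFrame({
    "I am confident using the BIM tools required by my role": ["Agree", "Neutral", "Strongly agree"],
    "I understand my organisation's BIM workflow processes": ["Agree", "Agree", "Disagree"],
})

data = raw.rename(columns=indicator_codes)
for code in indicator_codes.values():
    data[code] = data[code].map(LIKERT)   # 1..5 numeric scale

print(data.describe())                    # per-indicator means and standard deviations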

4. Data Analysis and Findings

Structural Equation Modelling (SEM) can be an effective method where the objectives indicate the investigation of associations between variables and the strength of variables affecting a construct. Furthermore, according to Kline [53], SEM can be deployed to conduct multivariate regression in confirmatory and exploratory studies. Moreover, the strength of relationships can be explored in order to prioritise resources for the most important variables to better serve management purposes. In addition, unobserved variables (i.e., constructs that are not directly measurable) can be included in such analyses; therefore, SEM provides an ideal tool for business and management research studies [49]. The objectives of the study and the nature of the data collected determine the selection of the most appropriate SEM method. Due to the need for statistical generalisation, the relatively small sample size, the novelty of the conceptual model, and the capability to handle variables with non-normal distributions, Partial Least Squares Structural Equation Modelling (PLS-SEM) was considered the most appropriate SEM method, which accorded with recommendations proposed by Kwong & Wong [49] and Sarstedt et al. [51]. PLS-SEM is a statistical approach capable of providing reliable estimations from complex, predictive and hypotheses-testing models associated with almost any sample size [51]. Since the aim of PLS-SEM is to maximise variance, the metrics of PLS-SEM, including the Coefficient of Determination (R2), Significance (p-value and t-value), and Predictive Relevance (Q2), are capable of hypotheses testing in different combinations. The analysis was conducted using the SmartPLS software application [54] according to the guidelines and instructions provided by Sarstedt et al. [51].

4.1. Convergent Validity

4.1.1. Outer Loadings

The convergent validity assessment requires each indicator to receive an outer loading value of ≥0.7 to maintain its reliability, otherwise that indicator and its associated questionnaire statement would need to be dropped from the study. Table 2 provides the outer loading results for each of the questionnaire statements, highlighting those results below 0.7.
Accordingly, in this test, none of the proposed WBET variables were dropped. However, the ‘User Competency’, ‘Knowledge Acquisition’, ‘Organisational Support’, ‘Social Factors’ and ‘Lean’ variables each had one of their indicators removed (coded as UC3, KA1, OS6, SF1, and LN1), leaving the study with a total of 47 indicators. Although the mean averages and standard deviations fall within acceptable limits, the outer loading test detected a similarity between these indicators from within their corresponding variables. Furthermore, although three of the four indicators pertaining to ‘Social Factors’ failed the 0.7 threshold, dropping only one indicator (SF1) was enough to leverage the Average Variance Extracted for that variable to over 0.5. Meanwhile, the other failing indicators were retained in order to avoid eliminating the ‘Social Factors’ variable. Such an exceptional statistical decision was verified and recommended by Hair et al. [47]. Moreover, SF1 (“I use BIM because of the proportion of colleagues who use it.”) received a mean average of 2.534, indicating a broadly neutral result amongst the 73 respondents. However, this result does not necessarily mean that the statement is invalid, but rather that the responses sit towards the lower band of the scale. This indicates a lack of common agreement among the culture with respect to the statement, yet the mean still equates to just over 50% of the scale.
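For readers unfamiliar with this screening step, the sketch below approximates outer loadings as the correlation between each indicator and its construct score and then checks them against the 0.7 threshold. SmartPLS computes loadings inside the PLS algorithm itself, so this is only a conceptual approximation; the indicator names and data are simulated, not the study's results.

# Conceptual sketch (simulated data): outer loadings approximated as the
# correlation between each indicator and its construct score, then checked
# against the 0.7 reliability threshold.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
latent = rng.normal(size=73)                      # 73 respondents, as in the study
uc = pd.DataFrame({f"UC{i}": 0.8 * latent + rng.normal(scale=0.5, size=73)
                   for i in range(1, 5)})         # simulated UC1..UC4 indicators
score = uc.mean(axis=1)                           # naive construct score

loadings = uc.apply(lambda col: col.corr(score))  # one loading per indicator
weak = loadings[loadings < 0.7].index.tolist()    # candidates to drop
print(loadings.round(3))
print("Below 0.7 threshold:", weak)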

4.1.2. Composite Reliability and Average Variance Extracted

The Composite Reliability (ρc) and Average Variance Extracted (AVE) tests require all variables to achieve AVE > 0.5 and ρc > 0.6 so that, respectively, the degree of variance between the indicators is justified and the overall reliability of the collection of indicators used in the research is validated against variable redundancy measures. Accordingly, all variables surpassed the AVE > 0.5 and ρc > 0.6 thresholds (Table 3).
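As a worked illustration of these two metrics, the sketch below computes ρc and AVE for a single reflective construct from its outer loadings using the standard PLS-SEM definitions; the loading values themselves are illustrative, not taken from Table 3.

# Minimal sketch: Composite Reliability (rho_c) and Average Variance Extracted
# for one reflective construct, computed from its outer loadings. Standard
# PLS-SEM formulas; loading values are illustrative only.
import numpy as np

def composite_reliability(loadings):
    lam = np.asarray(loadings, dtype=float)
    num = lam.sum() ** 2
    # error variance of a standardised indicator is 1 - loading^2
    return num / (num + (1.0 - lam ** 2).sum())

def average_variance_extracted(loadings):
    lam = np.asarray(loadings, dtype=float)
    return (lam ** 2).mean()

loadings = [0.82, 0.77, 0.74, 0.71]                                  # hypothetical outer loadings
print("rho_c =", round(composite_reliability(loadings), 3))          # should exceed 0.6
print("AVE   =", round(average_variance_extracted(loadings), 3))     # should exceed 0.5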

4.2. Discriminant Validity

Cross Loadings and Fornell-Larcker Criterion

Discriminant Validity testing ensures that a variable is unique from other variables in the proposed model. It is defined as the degree to which a variable is distinct from other variables by measuring the correlations between overlapping indicators using two testing methods: Cross Loadings and the Fornell-Larcker criterion [47]. In the first method, an indicator’s outer loading on the associated variable should be greater than any of its cross-loadings on the remaining variables. Cross Loadings are best reported in a table where a successful indicator receives the highest loading in the same row. Alternatively, for the second method, the Fornell-Larcker criterion requires the square root of each variable’s AVE to be greater than its correlation with any other variable. Accordingly, no conflicts were reported from either the Cross Loadings or Fornell-Larcker assessments (Table 4 and Table 5).
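The Fornell-Larcker check lends itself to a very small piece of code. The sketch below compares the square root of each construct's AVE against its largest correlation with any other construct; the construct labels, AVE values, and correlation matrix are illustrative, not the figures from Tables 4 and 5.

# Minimal sketch (illustrative values): Fornell-Larcker criterion — sqrt(AVE)
# of each construct must exceed its correlations with every other construct.
import numpy as np
import pandas as pd

ave = pd.Series({"UC": 0.63, "TT": 0.59, "KA": 0.66})   # hypothetical AVEs
corr = pd.DataFrame(
    [[1.00, 0.48, 0.52],
     [0.48, 1.00, 0.41],
     [0.52, 0.41, 1.00]],
    index=ave.index, columns=ave.index,                  # hypothetical correlations
)

def fornell_larcker_ok(ave, corr):
    """True where sqrt(AVE) exceeds all off-diagonal correlations of that construct."""
    sqrt_ave = np.sqrt(ave)
    off_diag_max = corr.where(~np.eye(len(corr), dtype=bool)).abs().max()
    return sqrt_ave > off_diag_max

print(fornell_larcker_ok(ave, corr))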

4.3. Hypotheses Testing Measurements

The main objective of this section is to enable hypotheses testing. However, two antecedent tests, the Coefficient of Determination (R2) and Predictive Relevance (Q2), are required prior to hypotheses testing in order to better understand, discuss, and argue the results of statistical significance (p-values and t-values). On the one hand, R2 will predict the variance explained by the entire framework of OUM for each dependent variable, while on the other, Q2 will indicate the predictive capability of each dependent variable within OUM. Following the above antecedent tests, the statistical significance (p-values and t-values) will be evaluated according to the OUM’s 25 hypotheses.

4.3.1. Significance Measurement

According to Hair et al. [47], most researchers rely on the p-values and t-values to assess the probabilistic significance and confidence cut-off levels of the dependent variables. Indeed, p-values are regarded as determinants of significance through probabilistic error calculation between two variables. Moreover, p-values should achieve a maximum error threshold of 0.05, equivalent to 5%, where p < 0.05 determines a significant causal relationship between two variables. It should be noted that p-values are calculated in conjunction with t-values representing the cut-off degree at which the strength of significance is determined [47]. Therefore, for hypotheses testing at 5% error (p < 0.05) or 1% error (p < 0.01), the t-value should not fall below 1.96. Accordingly, a 5000-resample bootstrapping calculation was run in the SmartPLS 3.0 software to compute significance.
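To illustrate what the bootstrapping step produces, the sketch below resamples respondents 5000 times (the setting described above) and derives a t-value and two-tailed p-value for a single path coefficient. A simple OLS slope stands in for the PLS path coefficient, and the data are simulated, so this is only a conceptual approximation of what SmartPLS reports.

# Minimal sketch (simulated data): bootstrapping a path coefficient with 5000
# resamples to obtain a t-value and two-tailed p-value. A simple regression
# slope stands in for the PLS path coefficient.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 73                                          # sample size, as in the study
x = rng.normal(size=n)                          # e.g., Technical Training scores
y = 0.4 * x + rng.normal(scale=0.8, size=n)     # e.g., User Competency scores

def slope(x, y):
    return np.polyfit(x, y, 1)[0]

beta = slope(x, y)
boot = np.empty(5000)
for i in range(5000):
    idx = rng.integers(0, n, n)                 # resample respondents with replacement
    boot[i] = slope(x[idx], y[idx])

se = boot.std(ddof=1)                           # bootstrap standard error
t_value = beta / se
p_value = 2 * (1 - stats.t.cdf(abs(t_value), df=n - 1))
print(f"beta={beta:.3f}, t={t_value:.2f}, p={p_value:.4f}")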

4.3.2. Coefficient of Determination and Predictive Relevance

The Coefficient of Determination (R2) utilises the combined effects of the independent variables to explain predictive ability through the level of variance in each dependent variable. R2 is measured between 0 and 1 based on Equation (1) below (Coefficient of determination, where SS is the sum of squares), and denoted as weak (<0.25), moderate (0.26 to 0.5) or strong (>0.5) [46].
R^2 = \frac{S_{ab}^{2}}{SS_{a}\,SS_{b}} \quad (1)
Though the evaluation of R2 is a criterion for predictive analysis, research recommends checking the Predictive Relevance (Q2) value, which evaluates the predictive validity of independent to dependent variables in the entire model based on Equation (2), where SSE is the sum of squares due to error and SS is the sum of squares [46]. The predictive relevance Q2 of each dependent variable should be greater than 0 so that the model is considered capable of predicting influence over that variable. However, a dependent variable with a Q2 of 0 or less is considered irrelevant for prediction and accordingly insignificant to the study.
Q^2 = 1 - \frac{SSE}{SS} \quad (2)
Accordingly, this research considers the combination of t-value, p-value, R2, and Q2 as the means of analysis to satisfy significance-based hypotheses testing. Figure 3 presents a snapshot of the R2 and Q2 results from the SmartPLS 3.0 PLS-SEM application, in which the corresponding results are reported against each of the dependent variables.
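As a small worked example of Equations (1) and (2), the sketch below computes R2 as a squared correlation and Q2 as 1 − SSE/SS from a set of observed and predicted scores. The numbers are illustrative; in PLS-SEM, Q2 is normally obtained via the blindfolding procedure in SmartPLS rather than computed by hand.

# Minimal sketch (illustrative values): R^2 and Q^2 following Equations (1)-(2).
import numpy as np

def r_squared(observed, predicted):
    """Squared correlation between observed and predicted scores."""
    return np.corrcoef(observed, predicted)[0, 1] ** 2

def q_squared(observed, predicted):
    """Q^2 = 1 - SSE/SS, with SS taken about the mean of the observed scores."""
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    sse = ((observed - predicted) ** 2).sum()
    ss = ((observed - observed.mean()) ** 2).sum()
    return 1.0 - sse / ss

obs = np.array([3.0, 4.0, 2.0, 5.0, 4.0, 3.0])
pred = np.array([3.2, 3.8, 2.5, 4.6, 4.1, 2.9])
print("R2 =", round(r_squared(obs, pred), 3))   # > 0.5 would be rated 'strong'
print("Q2 =", round(q_squared(obs, pred), 3))   # > 0 indicates predictive relevance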

4.4. Hypotheses Testing Results

Significance is benchmarked at a maximum 5% error allowance for a hypothesis to be accepted. However, it should be noted that the rejection of any hypothesis should not be viewed negatively, but rather as objectively informative, offering alternative explanations of the hypothesised phenomenon. The Significance and Hypotheses Testing results are presented in Table 6.
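The acceptance rule applied in Table 6 can be stated compactly in code: a hypothesis is accepted when p < 0.05 and the t-value reaches the 1.96 cut-off described in Section 4.3.1. The thresholds follow the text above, and the example calls reuse figures reported later in the Discussion (H16 and H7); the helper function name is, of course, an assumption.

# Minimal sketch: the acceptance rule used for Table 6 (p < 0.05 and t >= 1.96).
def hypothesis_accepted(t_value, p_value, p_max=0.05, t_min=1.96):
    return p_value < p_max and abs(t_value) >= t_min

print(hypothesis_accepted(t_value=2.848, p_value=0.004))  # True  (e.g., H16)
print(hypothesis_accepted(t_value=0.588, p_value=0.557))  # False (e.g., H7)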

5. Discussion

As shown in Table 6, the Significance and Hypothesis testing resulted in the rejection of seven of the sixteen hypotheses. However, all hypotheses will be separately discussed in this section for validation purposes:
  • “Hypothesis 1 (H1). User Competency is significant to Behavioural Intention to Use” was accepted. This hypothesis was instigated by the status quo of BIM adoption in the UK emphasising the shortage of competent users in the industry. Accordingly, an organisation with the intention to uptake BIM within a limited timeframe should consider building in-house user competency as a primary objective. Hence, the strong significance of ‘User Competency’, supported by the combined effects of the entire framework, explained around 30% of the variance and 24% of predictive relevance in the ‘Behavioural Intention to Use’ variable.
  • “Hypothesis 2 (H2). Knowledge Acquisition is significant to Behavioural Intention to Use” was accepted. As with H1, this hypothesis was instigated by the status quo of BIM adoption in the UK and the need to leverage user competency by training and educating employees on workflow processes and the in-house resourcing of knowledge through the use of a dedicated, well publicised, private knowledge portal to ensure employees’ awareness. This point, in particular, was addressed in questionnaire statement “KA4—My organisation enables quick access to the relevant knowledge and training material”, which received a mean average of 3.151 or 69%. Thus, 31% of survey participants were unaware that the sponsoring organisation had an elaborate BIM knowledge portal to support and assist their day-to-day WBET and practice needs. Accordingly, it may be speculated that the ‘Knowledge Acquisition’ variable could have a double-star significance rating on behavioural intention should the knowledge portal be better publicised internally.
  • “Hypothesis 3 (H3). Organisational Support is significant to Perceived Ease of Use” was accepted. Following the calls of Ozorhon & Karahan [14] and Ghaffarianhoseini et al. [11] for greater organisational awareness of the benefits of BIM, this variable was introduced to assess the degree to which employees believe that their organisation’s support can lead to reduced effort in BIM learning and implementation. Accordingly, OUM explained 40% of variance and 23% of predictive relevancy in the ‘Organisational Support’ variable. Therefore, the combination of variables operationalising OUM may be highly accurate in reflecting the in-house culture’s perception of their leadership, thus providing senior management with vital decision-making support services.
  • “Hypothesis 4 (H4). Technical Training is significant to Perceived Ease of Use” and “Hypothesis 5 (H5). Process Training is significant to Perceived Ease of Use” were both rejected. When considering OUM in Figure 2, this result may be attributed to the indirect effect on Perceived Ease of Use, which is a successor to the User Competency variable. Since OUM explained 52% of variance and 37% of predictive relevancy in the ‘User Competency’ variable, the sponsoring organisation should focus not only on training, but also on building learning capacity and knowledge capability amongst employees, which is considered a challenge to both business and economic growth [55].
  • “Hypothesis 6 (H6). Technical Training is significant to User Competency” was accepted, whereas “Hypothesis 7 (H7). Process Training is significant to User Competency” was rejected. These results direct the sponsoring organisation to focus its training budget on providing technology training to employees. Meanwhile, the research revealed that the culture of the sponsoring organisation conveys relative confidence in their organisational BIM processes, standards and workflows.
  • Hypothesis 8 (H8). "User Competency is significant to Perceived Ease of Use" and Hypothesis 9 (H9). "User Competency is significant to Perceived Usefulness" were both accepted. These hypotheses were instigated by Venkatesh et al. [30], who argued that organisational facilitating conditions become insignificant when effort expectancy is leveraged. Accordingly, with reference to Table 6, it is clear that the significance of H8 and H9 is greater than that of H3, which verifies that organisational support may be substantially relaxed when competency is leveraged.
  • Hypothesis 10 (H10). "Lean is significant to Perceived Ease of Use" was accepted. This variable was instigated by Antwi-Afari et al. [16], who recommended Lean as a means of enhancing knowledge acquisition. Accordingly, it was added to OUM as an influencer of 'Organisational Support' in order to assess participants' perceptions of anticipated methods of effective BIM implementation. As such, Lean explains 30% of the variance in 'Perceived Ease of Use' and 20% of its predictive power. Thus, depending on the reflections of the in-house culture, a decision to take up Lean would be wise for the sponsoring organisation or any other organisation adopting OUM.
  • Hypothesis 11 (H11). "CI/Kaizen in Process Training is significant to User Competency", Hypothesis 12 (H12). "CI/Kaizen in Technical Training is significant to User Competency" and Hypothesis 13 (H13). "CI/Kaizen in Knowledge Acquisition is significant to User Competency" were all rejected. CI/Kaizen was introduced as a moderator of 'Technical Training', 'Process Training', and 'Knowledge Acquisition' following the implicit recommendations of Alreshidi et al. [10] regarding three different Kaizen tools (the interaction form in which such moderation effects are specified is restated after this list). However, the insignificance of these moderator variables may indicate a lack of clarity over the importance of CI/Kaizen within the sponsoring organisation. Although 'Process Training' was found to have no significant effect on 'User Competency' (t = 0.588 and p = 0.557), once 'CI/Kaizen' was introduced as a moderator it leveraged the p-value by 46%. Moreover, although insignificant at p = 0.11, the importance of process training was either underestimated by survey participants, or the participants were sufficiently aware of their project delivery workflows that additional process training appeared negligible. This is a clear example of how a rejected hypothesis may provide in-depth knowledge of the current in-house status.
  • Hypothesis 14 (H14). "Knowledge Acquisition is significant to Organisational Learning" was accepted, verifying that learning organisations use knowledge to continually catalyse the competency of their in-house cultures in order to grow and succeed, which signifies that 'Organisational Learning' is a pillar of WBET.
  • Hypothesis 15 (H15). "CI/Kaizen in Knowledge Acquisition is significant to Perceived Usefulness" was rejected. This hypothesis was based on the PDCA-like research approach to knowledge management by Wu et al. [28,29]. However, as was the case in hypotheses H11, H12, and H13, Kaizen had no impact except on process training. This variable may therefore require further research that specifically dissects the Kaizen tools across multiple additional relationships to enable a better understanding of its significance.
  • Hypothesis 16 (H16). "Knowledge Acquisition is significant to Attitude" and Hypothesis 17 (H17). "Technical Training is significant to Attitude" were both accepted. With reference to Table 6, OUM revealed that 'Technical Training' was significant to 'Attitude' (t = 1.992 * and p = 0.046), with 'Attitude' being the construct for which OUM explained the greatest variance (57%) and predictive relevancy (40%). Since the inclusion of the 'Attitude' variable was debated in the literature review, especially by Howard et al. [23] in terms of resistance to change, the predictive relevancy results position OUM as a strong identifier of attitude in any organisation that adopts it. In the case of the sponsoring organisation, the result may be interpreted as showing no resistance to change, since the drivers of 'Attitude' were clearly identified by OUM. These include 'Technical Training' and two further significant variables, namely 'Knowledge Acquisition' (t = 2.848 ** and p = 0.004) and 'User Competency' (t = 5.297 ** and p = 0).
  • Hypothesis 18 (H18). "Attitude is significant to Behavioural Intention to Use" was accepted. However, Hypothesis 19 (H19). "Attitude is significant to BIM Uptake" was rejected. The reason for this is manifested in the definition of the 'System Uptake' variable of TAM, which Davis et al. [22] defined as the positive or negative acceptance of a specific technology based on the relationship of all TAM variables. This definition is analogous to that of the 'Attitude' variable, a positive or negative feeling about an aspect or event; hence, it is almost impossible to relate the two attitudinal variables.
  • Hypothesis 20 (H20). "The Social Factors variable is significant to Perceived Usefulness" was rejected. Based on Howard et al. [23], who introduced social influence and its significant impact on 'Behavioural Intention to Use', this research considered that the best fit for the 'Social Factors' variable would be 'Organisational Support', as stated in the literature review. However, regardless of either view, the 'Social Factors' variable has no influence over any other variable in the study. Indeed, both the explained variance and the predictive relevance of this variable are extremely weak, requiring further investigation to clarify its relative impact in future studies.
  • Hypothesis 21 (H21). "User Competency is significant to Social Factors" was accepted, and is perhaps one of the few explanatory relationships for 'Social Factors', as argued in the literature review. Since 'Social Factors' is representative of in-house cultures [30], the hypothesis succeeded in signifying the relationship with less than 5% error variance. However, unless the 'Social Factors' variable is capable of signifying another relationship, such as behavioural intention, there is little value in reading further into its relationship with 'User Competency'.
  • Hypothesis 22 (H22). "User Competency is significant to BIM Uptake" was accepted. Indeed, t = 3.49 ** and p = 0 imply that the relationship is highly significant with 0% error variance, meaning that 'User Competency' is the most significant variable for 'BIM Uptake'.
  • Hypothesis 23 (H23). "CI/Kaizen is significant to User Competency" was accepted when 'CI/Kaizen' is treated as an independent variable rather than a moderator. Therefore, it may be assumed that, should the in-house culture of the sponsoring organisation receive more in-depth and specific clarity on CI/Kaizen in technical and process training, the results of H11 and H12 would be accepted. As noted in H15, this variable may require further research to better adjust its causality in the proposed framework.
  • Hypothesis 24 (H24). "Lean is significant to BIM Uptake" and Hypothesis 25 (H25). "Lean is significant to Organisational Learning" were both rejected. These hypotheses were based on the recommendation of Sim et al. [44] that Lean leverages organisational learning to a point where employees require less training. However, as verified in H10, Lean is only significant to 'Perceived Ease of Use' in this study, which raises the question of how Lean, Kaizen, and Organisational Learning are related.
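For reference, the 'variance explained' and 'predictive relevance' figures quoted throughout the list above are the standard PLS-SEM measures, i.e., the coefficient of determination (R2) of each endogenous construct and the Stone–Geisser predictive relevance (Q2) obtained through the blindfolding procedure (see, e.g., Hair et al. [47]). As a reading aid, and assuming standardised construct scores, these may be restated as
\[ R^{2} = 1 - \frac{\sum_{i}\left(y_{i}-\hat{y}_{i}\right)^{2}}{\sum_{i}\left(y_{i}-\bar{y}\right)^{2}}, \qquad Q^{2} = 1 - \frac{\mathrm{SSE}}{\mathrm{SSO}}, \]
where y_i and ŷ_i denote the observed and model-predicted scores of the endogenous construct, and SSE and SSO are the sums of squared prediction errors and squared observations produced by blindfolding; Q2 > 0 indicates that the model has predictive relevance for that construct. Similarly, the moderation hypotheses (H11–H13 and H15) follow the usual interaction-term formulation, in which a moderator M acting on a path from X to an endogenous construct Y is tested through the coefficient of the product term,
\[ Y = \beta_{1}X + \beta_{2}M + \beta_{3}\,(X \times M) + \zeta, \]
so that, for example, H12 tests the significance of the CI/Kaizen × Technical Training product term on User Competency (the specific interaction estimation approach, e.g., product-indicator or two-stage, is assumed rather than restated here).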

6. Conclusions

OUM provides a non-conventional predictive tool that can be applied to better adopt and implement BIM, while providing a detailed understanding of WBET needs that differ from one organisation to another. The results of OUM led the sponsoring organisation to a deeper understanding of the factors that influence 'Attitude', 'User Competency' and its own 'Organisational Support', as these variables reflected the strongest predictive relevancy within the organisation. In terms of 'Attitude', through OUM, the sponsoring organisation realised that there is no resistance to change within its culture, as long as its investments in 'Knowledge Acquisition' and 'Technical Training' are well managed and maintained.
On the other hand, while investigating the 'User Competency' variable, OUM highlighted a crucial reality: 31% of survey participants were unaware of the constituents of their knowledge portal that were already in place. Accordingly, the sponsoring organisation realised that more effort is needed in knowledge management so that 'Knowledge Acquisition', which is needed to leverage 'User Competency', is better promoted and publicised in the first instance. However, leveraging 'User Competency' was also related to other variables, such as 'Technical Training' and 'CI/Kaizen'. As such, OUM revealed that employees' demand for 'Technical Training' prevailed over 'Process Training', giving the sponsoring organisation greater confidence to prioritise the training budget towards the technical aspects of BIM adoption. In addition, OUM revealed that although employees were aware of the significance of 'CI/Kaizen' to 'User Competency', they lacked clarity on the subject and its integration, which may reflect a secondary need to invest in a continuous improvement programme. Furthermore, despite the need for better knowledge management and additional technical training, OUM revealed that organisational efforts were valued by the sponsoring organisation's employees. Hence, the reported significance between 'Organisational Support' and 'Perceived Ease of Use' implies that employees believe the organisation is progressively transitioning to BIM.
Finally, as 'User Competency' is reported as the strongest determinant of 'BIM Uptake', OUM managed to provide detailed insights into the WBET needs particular to the sponsoring organisation. These insights may be interpreted as guidelines for business decisions that not only enable capability and capacity building, but also provide the means to predict and strategically plan for BIM uptake and organisational learning at their highest levels.

Author Contributions

Conceptualization, J.S., J.U. and J.H.; methodology, J.S., J.U. and J.H.; validation, J.S.; formal analysis, J.S.; investigation, J.S., J.U. and J.H.; data curation, J.S. and J.H.; writing—original draft preparation, J.S., J.U. and J.H.; writing—review and editing, J.S., J.U. and J.H.; supervision, J.U. and J.H.; project administration, J.S., J.U. and J.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding authors.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. IGT [HM Government]. Low Carbon Construction Innovation and Growth Team Final Report; IGT [HM Government]: London, UK, 2010. Available online: http://www.carbonaction2050.com/sites/carbonaction2050.com/files/document-attachment/IGTLowCarbonConstruction.pdf (accessed on 15 June 2018).
  2. Cabinet Office. Government Construction Strategy; E. a. R. G. & BIS, C.S.U.: London, UK, 2011.
  3. NBS. 10th Annual BIM Report 2020; NBS Enterprises Ltd.: Newcastle Upon Tyne, UK, 2020.
  4. Construction Manager. Annual BIM Survey: Poor Digital Skills Hold Back Adoption; Chartered Institute of Building (CIOB): Berkshire, UK, 2020. Available online: https://www.constructionmanagermagazine.com/annual-bim-survey-poor-digital-skills-hold-back-adoption/ (accessed on 1 June 2020).
  5. Liu, Y.; van Nederveen, S.; Hertogh, M. Understanding effects of BIM on collaborative design and construction: An empirical study in China. Int. J. Proj. Manag. 2017, 35, 686–698. [Google Scholar] [CrossRef]
  6. NBS. Annual BIM Report 2018; NBS Enterprises Ltd.: Newcastle Upon Tyne, UK, 2018.
  7. DBB. Digital Built Britain: Level 3 Building Information Modelling-Strategic Plan; HM Government: London, UK, 2015.
  8. National Infrastructure Commission. Data for the Public Good; National Infrastructure Commission: London, UK, 2017.
  9. Gokuc, Y.T.; Arditi, D. Adoption of BIM in architectural design firms. Arch. Sci. Rev. 2017, 60, 483–492. [Google Scholar] [CrossRef]
  10. Alreshidi, E.; Mourshed, M.; Rezgui, Y. Factors for effective BIM governance. J. Build. Eng. 2017, 10, 89–101. [Google Scholar] [CrossRef] [Green Version]
  11. Ghaffarianhoseini, A.; Tookey, J.; Ghaffarianhoseini, A.; Naismith, N.; Azhar, S.; Efimova, O.; Raahemifar, K. Building Information Modelling (BIM) uptake: Clear benefits, understanding its implementation, risks and challenges. Renew. Sustain. Energy Rev. 2017, 75, 1046–1053. [Google Scholar] [CrossRef]
  12. Kassem, M.; Succar, B. Macro BIM adoption: Comparative market analysis. Autom. Constr. 2017, 81, 286–299. [Google Scholar] [CrossRef]
  13. Liao, L.; Teo, E.A.L. Organizational Change Perspective on People Management in BIM Implementation in Building Projects. J. Manag. Eng. 2018, 34, 04018008. [Google Scholar] [CrossRef]
  14. Ozorhon, B.; Karahan, U. Critical Success Factors of Building Information Modeling Implementation. J. Manag. Eng. 2017, 33, 04016054. [Google Scholar] [CrossRef]
  15. Santos, R.; Costa, A.A.; Grilo, A. Bibliometric analysis and review of Building Information Modelling literature published between 2005 and 2015. Autom. Constr. 2017, 80, 118–136. [Google Scholar] [CrossRef]
  16. Antwi-Afari, M.; Li, H.; Parn, E.; Edwards, D. Critical success factors for implementing building information modelling (BIM): A longitudinal review. Autom. Constr. 2018, 91, 100–110. [Google Scholar] [CrossRef]
  17. Yin, Q. Application of Case Teaching Method in BIM Learning. Int. J. New Dev. Educ. 2021, 3. [Google Scholar] [CrossRef]
  18. Sacks, R.; Pikas, E. Building Information Modeling Education for Construction Engineering and Management. I: Industry Requirements, State of the Art, and Gap Analysis. J. Constr. Eng. Manag. 2013, 139, 04013016. [Google Scholar] [CrossRef] [Green Version]
  19. Smith, P. BIM Implementation–Global Strategies. Procedia Eng. 2014, 85, 482–492. [Google Scholar] [CrossRef] [Green Version]
  20. Uhm, M.; Lee, G.; Jeon, B. An analysis of BIM jobs and competencies based on the use of terms in the industry. Autom. Constr. 2017, 81, 67–98. [Google Scholar] [CrossRef]
  21. Wang, G.; Song, J. The relation of perceived benefits and organizational supports to user satisfaction with building information model (BIM). Comput. Hum. Behav. 2017, 68, 493–500. [Google Scholar] [CrossRef]
  22. Davis, F.D.; Bagozzi, R.P.; Warshaw, P.R. User Acceptance of Computer Technology: A Comparison of Two Theoretical Models. Manag. Sci. 1989, 35, 982–1003. [Google Scholar] [CrossRef] [Green Version]
  23. Howard, R.; Restrepo, L.; Chang, C.-Y. Addressing individual perceptions: An application of the unified theory of acceptance and use of technology to building information modelling. Int. J. Proj. Manag. 2016, 35, 107–120. [Google Scholar] [CrossRef]
  24. Hair, J.F., Jr.; Sarstedt, M.; Hopkins, L.; Kuppelwieser, V.G. Partial least squares structural equation modeling (PLS-SEM): An emerging tool in business research. Eur. Bus. Rev. 2014. [Google Scholar] [CrossRef]
  25. Li, X.; Wu, W.; Shen, Q.; Wang, X.; Teng, Y. Mapping the knowledge domains of Building Information Modeling (BIM): A bibliometric approach. Autom. Constr. 2017, 84, 195–206. [Google Scholar] [CrossRef]
  26. NBS. National BIM Report 2019: The Definitive Industry Update; NBS Enterprises Ltd.: Newcastle Upon Tyne, UK, 2019.
  27. Succar, B.; Sher, W.; Williams, A. An integrated approach to BIM competency assessment, acquisition and application. Autom. Constr. 2013, 35, 174–189. [Google Scholar] [CrossRef]
  28. Wu, W.; Mayo, G.; McCuen, T.L.; Issa, R.R.A.; Smith, D.K. Building Information Modeling Body of Knowledge. I: Background, Framework, and Initial Development. J. Constr. Eng. Manag. 2018, 144, 04018065. [Google Scholar] [CrossRef]
  29. Wu, W.; Mayo, G.; McCuen, T.L.; Issa, R.R.A.; Smith, D.K. Building Information Modeling Body of Knowledge. II: Consensus Building and Use Cases. J. Constr. Eng. Manag. 2018, 144, 04018066. [Google Scholar] [CrossRef]
  30. Venkatesh, V.; Morris, M.G.; Davis, G.B.; Davis, F.D. User Acceptance of Information Technology: Toward a Unified View. MIS Q. 2003, 27, 425–478. [Google Scholar] [CrossRef] [Green Version]
  31. Thompson, R.L.; Higgins, C.A.; Howell, J.M. Personal Computing: Toward a Conceptual Model of Utilization. MIS Q. 1991, 15, 125. [Google Scholar] [CrossRef]
  32. Jin, R.; Hancock, C.; Tang, L.; Chen, C.; Wanatowski, D.; Yang, L. Empirical Study of BIM Implementation–Based Perceptions among Chinese Practitioners. J. Manag. Eng. 2017, 33, 04017025. [Google Scholar] [CrossRef] [Green Version]
  33. Nielsen, K.; Randall, R.; Christensen, K.B. Do Different Training Conditions Facilitate Team Implementation? A Quasi-Experimental Mixed Methods Study. J. Mix. Methods Res. 2016, 11, 223–247. [Google Scholar] [CrossRef] [Green Version]
  34. Stabile, C. Clarifying the Differences between Training, Development, and Enrichment: The Role of Institutional Belief Constructs in Creating the Purpose of Faculty Learning Initiatives. New Dir. Teach. Learn. 2013, 2013, 71–84. [Google Scholar] [CrossRef]
  35. Liker, J.K. The Toyota Way: 14 Management Principles, 1st ed.; McGraw-Hill: New York, NY, USA, 2004. [Google Scholar]
  36. Schwarz, U.V.T.; Nielsen, K.M.; Stenfors-Hayes, T.; Hasson, H. Using kaizen to improve employee well-being: Results from two organizational intervention studies. Hum. Relat. 2016, 70, 966–993. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  37. Álvarez-García, J.; Durán-Sánchez, A.; Río-Rama, M.D.L.C.D. Systematic bibliometric analysis on Kaizen in scientific journals. TQM J. 2018, 30, 356–370. [Google Scholar] [CrossRef]
  38. Compeau, D.R.; Higgins, C.A. Application of Social Cognitive Theory to Training for Computer Skills. Inf. Syst. Res. 1995, 6, 118–143. [Google Scholar] [CrossRef]
  39. Law, K.M.Y.; Chuah, K.B. PAL Driven Organizational Learning: Theory and Practices a Light on Learning Journey of Organizations; Springer International Publishing: Cham, Switzerland; New York, NY, USA, 2015. [Google Scholar] [CrossRef]
  40. Marsick, V.J.; Watkins, K.E. Demonstrating the Value of an Organization’s Learning Culture: The Dimensions of the Learning Organization Questionnaire. Adv. Dev. Hum. Resour. 2003, 5, 132–151. [Google Scholar] [CrossRef]
  41. Rother, M. Toyota Kata, 5th ed.; McGraw-Hill Education: Chennai, India, 2017. [Google Scholar]
  42. ENR. Engineering News-Record. 2018. Available online: https://www.enr.com/ (accessed on 17 August 2018).
  43. AlManei, M.; Salonitis, K.; Xu, Y. Lean Implementation Frameworks: The Challenges for SMEs. Procedia CIRP 2017, 63, 750–755. [Google Scholar] [CrossRef]
  44. Sim, K.L.; Curatola, A.P.; Banerjee, A. Lean Production Systems and Worker Satisfaction: A Field Study. Adv. Bus. Res. 2015, 6, 79–100. Available online: http://journals.sfu.ca/abr (accessed on 1 April 2018).
  45. Yin, R.K. Qualitative Research from Start to Finish; The Guilford Press: New York, NY, USA, 2011. [Google Scholar]
  46. Greenfield, T. Research Methods for Post Graduates, 2nd ed.; Arnold: New York, NY, USA, 2002. [Google Scholar]
  47. Hair, J.F., Jr.; Hult, G.T.M.; Ringle, C.M.; Sarstedt, M. A Primer on Partial Least Squares Structural Equation Modeling (PLS-SEM), 2nd ed.; SAGE Publications, Inc.: Los Angeles, CA, USA, 2017. [Google Scholar]
  48. Bhattacherjee, A. Social Science Research: Principles, Methods, and Practices, 2nd ed.; University of South Florida: Tampa, FL, USA, 2012; Available online: http://scholarcommons.usf.edu/oa_textbooks/3 (accessed on 15 June 2018).
  49. Kwong, K.; Wong, K. Partial Least Squares Structural Equation Modeling (PLS-SEM) Techniques Using SmartPLS. Mark. Bull. 2013, 24, 1–32. [Google Scholar]
  50. Monecke, A.; Leisch, F. semPLS: Structural Equation Modeling Using Partial Least Squares. J. Stat. Softw. 2012, 48, 1–32. [Google Scholar] [CrossRef] [Green Version]
  51. Sarstedt, M.; Ringle, C.M.; Smith, D.; Reams, R.; Hair, J.F. Partial least squares structural equation modeling (PLS-SEM): A useful tool for family business researchers. J. Fam. Bus. Strat. 2014, 5, 105–115. [Google Scholar] [CrossRef]
  52. Holt, G.D. Asking questions, analysing answers: Relative importance revisited. Constr. Innov. 2014, 14, 2–16. [Google Scholar] [CrossRef]
  53. Kline, R.B. Principles and Practice of Structural Equation Modeling; The Guilford Press: New York, NY, USA, 2011. [Google Scholar]
  54. SmartPLS. Available online: https://www.smartpls.com/ (accessed on 1 April 2018).
  55. Abbasnejad, B.; Nepal, M.; Ahankoob, A.; Nasirian, A.; Drogemuller, R. Building Information Modelling (BIM) adoption and implementation enablers in AEC firms: A systematic literature review. Arch. Eng. Des. Manag. 2020, 1–23. [Google Scholar] [CrossRef]
Figure 1. Technology Acceptance Model (TAM) [22].
Figure 2. Proposed Organisational Upskilling Model (OUM).
Figure 3. Coefficient of Determination results from SmartPLS 3.0.
Table 1. Demographics of respondents.

Experience | Grade | Title | Count | Totals
Junior (0 to 7 years) | A | Apprentice | 2 | 40
 | B | Graduate | 17 |
 | C | Professional (non-chartered) | 11 |
 | D | Professional (chartered) | 10 |
Senior (Over 7 years) | E | Professional (chartered-senior) | 15 | 33
 | F | Associate | 12 |
 | G | Director | 3 |
 | H | Director (senior) | 3 |
Total no. of participants | | | 73 |
Table 2. Questionnaire statements and outer loading results.

Indicators | Questionnaire Statements | Outer Loading (λ ≥ 0.7)
AT1 | BIM implementation makes my work more interesting. | 0.898
AT2 | BIM is better than CAD. | 0.889
AT3 | Adopting BIM is a wise decision. | 0.826
BI1 | I intend to be BIM-enabled in the next 6 months. | 0.948
BI2 | I predict that I will be implementing BIM in the next 6 months. | 0.958
BI3 | I plan to use Adopt BIM in the next 6 months. | 0.96
BU1 | I am very pleased that my organisation has adopted BIM. | 0.896
BU2 | I am very satisfied that my organisation has adopted BIM. | 0.892
BU3 | I am delighted that my organisation has adopted BIM. | 0.851
CI1 | If I am guided on methods to continuously improve, my effectiveness will be increased. | 0.947
CI2 | If I am guided on methods to continuously improve, I will save time on daily tasks. | 0.942
CI3 | If I am guided on methods to continuously improve, I will increase the quality of output of my job. | 0.943
KA1 | If I was educated and trained in PAS1192 series, CIC and BSI standards, it would be easy for me to adopt BIM. | 0.804
KA2 | If I was educated and trained in BIM Project Delivery Processes, it would be easy for me to adopt BIM. | 0.874
KA3 | Given the resources, opportunities and knowledge it takes to utilise BIM, it would be easy for me to adopt BIM. | 0.857
KA4 | My organisation enables quick access to the relevant knowledge and training material. | 0.612
LN1 | My organisation is working hard to eliminate lead time (delay) by making project deliveries more efficient through BIM. | 0.673
LN2 | My organisation is working hard to meet and exceed quality expectations. | 0.803
LN3 | My organisation is working hard to provide a healthy working environment. | 0.743
LN4 | My organisation is striving to become a Learning Organisation. | 0.856
LN5 | My organisation has a clear vision which I am aware of. | 0.781
LN6 | My organisation maintains a collaborative culture. | 0.761
OL1 | In my organization, people openly discuss mistakes in order to learn from them. | 0.804
OL2 | In my organization, people help each other learn. | 0.83
OL3 | In my organization, people view problems in their work as an opportunity to learn. | 0.864
OL4 | In my organization, people are encouraged to give open and honest feedback to enable process improvement. | 0.927
OS1 | My organisation understands the benefits of BIM. | 0.712
OS2 | My Organisation has sufficient technical capabilities for adopting BIM. | 0.776
OS3 | My organisation provides proper education and training for BIM adoption and implementation. | 0.812
OS4 | My organisation provides proper guidance for project delivery using BIM. | 0.878
OS5 | My organisation provides specialised training to suit my job role and responsibility. | 0.784
OS6 | My organisation utilizes a specific person, group or external consultants to support BIM difficulties. | 0.64
PE1 | I found it easy to get BIM software to do what I want it to do. | 0.898
PE2 | In general, BIM software is easy to use. | 0.898
PE3 | In general, I found BIM software to be flexible to interact with. | 0.909
PT1 | If I was given BIM project delivery guidance and training, I will be more skilful. | 0.89
PT2 | If I was given BIM project delivery guidance and training, I would find BIM easy to implement. | 0.831
PT3 | If I was given BIM project delivery guidance and training, further BIM-related learning becomes easier for me. | 0.896
PU1 | Using BIM software and guidelines in my job will increase my productivity. | 0.947
PU2 | Using BIM software and guidelines in my job will improve my performance. | 0.894
PU3 | Using BIM software and guidelines in my job will enhance my effectiveness. | 0.962
PU4 | Using BIM software and guidelines in my job is more useful than using CAD. | 0.825
SF1 | I prefer BIM because of the proportion of colleagues who use it. | 0.593
SF2 | The senior management in this organisation is helpful when implementing BIM. | 0.744
SF3 | My Line Manager advocates BIM implementation. | 0.647
SF4 | In general, the organisation has Adopted BIM. | 0.67
TT1 | If I was given technical training, it would be easy for me to become skilful using BIM software. | 0.928
TT2 | If I was given technical training, I would find BIM software easy to use. | 0.865
TT3 | If I was given technical training, further learning becomes easier for me. | 0.874
UC1 | I could complete a job or task using BIM software with minimal support from others. | 0.892
UC2 | I could complete a job or task using BIM software if I could call someone for instant help if I got stuck. | 0.868
UC3 | My perception of BIM for project delivery is clear and understandable. | 0.616
Table 3. AVE and composite reliability results.

Variables | Code | Composite Reliability (ρc ≥ 0.6) | AVE (> 0.5)
Attitude | AT | 0.904 | 0.76
BIM Uptake | BU | 0.911 | 0.774
Behavioural Intention to Use | BI | 0.969 | 0.913
CI/Kaizen | CI | 0.961 | 0.891
Knowledge Acquisition | KA | 0.902 | 0.754
Lean | LN | 0.901 | 0.647
Organisational Learning | OL | 0.917 | 0.735
Organisational Support | OS | 0.901 | 0.646
Perceived Ease of Use | PE | 0.929 | 0.813
Perceived Usefulness | PU | 0.95 | 0.826
Process Training | PT | 0.906 | 0.762
Social Factors | SF | 0.794 | 0.562
Technical Training | TT | 0.919 | 0.791
User Competency | UC | 0.904 | 0.825
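As a reading aid, the two thresholds reported in Table 3 correspond to the standard composite reliability and average variance extracted measures, computed from the standardised outer loadings λi of Table 2 (for which the usual λ ≥ 0.7 rule implies that a construct explains at least half of each indicator's variance). Assuming standardised indicators, so that the error variance of indicator i is 1 − λi2, these are
\[ \rho_{c} = \frac{\left(\sum_{i=1}^{n}\lambda_{i}\right)^{2}}{\left(\sum_{i=1}^{n}\lambda_{i}\right)^{2} + \sum_{i=1}^{n}\left(1-\lambda_{i}^{2}\right)}, \qquad \mathrm{AVE} = \frac{1}{n}\sum_{i=1}^{n}\lambda_{i}^{2}, \]
where n is the number of indicators assigned to the construct.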
Table 4. Cross loadings results.
VariablesIndicatorsATBIBUCIKALNOLOSPEPTPUSFTTUC
AttitudeAT10.8980.5250.4830.190.3240.1260.0040.1710.420.3150.7180.2190.3690.495
AT20.8890.4720.3790.0930.259−0.017−0.103−0.0270.2540.330.6910.1190.3140.402
AT30.8260.4070.4960.1940.2750.08−0.0150.0890.1550.4050.5510.2610.4670.387
Behavioural Intention to UseBI10.5540.9480.3750.2730.3510.1510.1460.1580.3840.4550.5260.1940.510.59
BI20.4610.9580.330.2350.3980.1430.1050.1970.3170.4530.4710.1980.4990.588
BI30.5250.960.3960.2470.2920.1280.1490.0840.2130.4450.4830.1570.5320.515
BIM UptakeBU10.5830.3110.8960.5120.3240.440.2960.4530.1670.2820.5180.4860.3790.385
BU20.3630.4340.8920.5310.2750.5470.4910.5560.2440.350.4150.5310.5040.335
BU30.4190.260.8510.4310.1810.3160.30.2690.0420.3060.3980.3060.3160.125
CI/KaizenCI10.1870.2310.560.9470.40.5040.3380.3340.110.4610.4080.1860.4870.354
CI20.1520.220.5170.9420.3550.4420.3510.3180.1710.4470.3980.2490.4970.348
CI30.1760.2920.5140.9430.4330.5190.3620.3880.1770.4950.3440.2110.5420.421
Knowledge AcquisitionKA10.2360.2120.2370.3590.8470.3310.2240.2960.2670.4610.3080.1930.4680.428
KA20.3130.2960.2980.4040.9050.2490.2550.3450.2640.5040.40.2070.4220.513
KA30.3020.4120.2460.3350.8520.2210.1390.2350.3110.5310.4340.2230.4520.603
LeanLN20.1590.2140.4630.3460.1780.7960.6110.5410.110.1260.0830.5410.2320.221
LN3−0.0330.040.3270.4010.210.7410.5260.4490.0650.223−0.0910.3560.3310.129
LN40.0260.0980.3710.5220.310.8740.4820.60.250.160.1270.5980.2690.181
LN50.0790.0920.50.4760.2250.820.4910.5060.2270.1460.1750.3780.2210.1
LN60.0560.1450.3470.3150.2960.7840.510.4130.2910.1080.1370.3780.1870.141
Organisational LearningOL1−0.0340.1120.2890.3110.1550.4180.8040.297−0.0840.2020.0270.2860.3070.072
OL20.0010.1160.290.3030.2630.6130.830.4160.0140.0580.0260.3770.230.169
OL3−0.1640.0330.2880.2060.0750.4640.8640.485−0.10.08−0.1190.3530.226−0.002
OL40.0140.1840.4880.4080.2710.6760.9270.5020.0680.3010.0650.4290.4120.153
Organisational SupportOS10.0920.1180.550.4730.3090.5070.3960.7170.1660.2040.1130.4360.2720.25
OS2−0.0460.0450.2750.1920.2330.5040.4180.7670.2810.0290.0060.4380.1740.213
OS30.0550.1110.3340.2210.2460.440.3380.8370.3550.0740.0830.480.1370.244
OS40.0750.1370.3990.30.3270.550.410.8930.4120.2520.150.5290.3120.322
OS50.1780.1810.4420.3110.2310.5320.4570.7930.3910.1780.1470.5060.270.274
Perceived Ease of UsePE10.3140.2790.1610.1780.3170.185−0.0670.330.8970.1890.40.3040.3410.475
PE20.2360.2940.1750.0840.2580.2640.0150.4590.8990.1960.2630.3290.2390.424
PE30.3270.290.1510.1710.2990.190.020.3230.9080.190.4320.260.2010.361
Process TrainingPT10.3970.4390.3520.5120.4130.1750.2170.0770.1240.8880.5220.1280.70.492
PT20.2770.4040.250.3170.6470.1530.1250.3110.2960.8320.390.1980.5970.529
PT30.360.390.3210.4580.4650.1650.1990.1220.1460.8970.4520.1390.7280.385
Perceived UsefulnessPU10.6850.4720.4640.4380.4370.12900.1330.410.4890.9470.2480.5330.513
PU20.5650.4060.4640.4070.3230.1570.0810.1250.4190.5140.8940.2480.5730.375
PU30.6950.4880.5040.3980.4320.1230.0120.1320.3790.5020.9620.2680.560.515
PU40.7730.5030.4080.2320.4050.004−0.0420.0770.2820.4080.8250.105 0.3830.463
Social FactorsSF20.0960.0720.4340.3040.1740.5820.3130.6360.3310.1240.1730.7690.0950.165
SF30.2690.230.2810.0170.1790.190.2690.2060.1990.1810.2390.7610.2740.178
SF40.0990.0860.5190.2720.1990.680.4540.6490.2160.0470.0660.7190.2110.139
Technical TrainingTT10.4420.4560.3810.4720.4790.2890.3240.30.3480.6420.5620.2670.9270.619
TT20.3250.4390.3750.3640.3660.1640.2880.2760.2610.6870.3650.2190.8640.441
TT30.3810.5350.4650.5770.5040.3430.340.2140.1690.7410.5380.2090.8760.524
UserCompetencyUC10.5060.5620.1720.2430.480.05−0.0410.1830.4950.4740.4870.1690.4830.905
UC20.3950.5110.4250.4770.6070.2990.2580.4080.3550.5050.4560.2270.6080.912
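Table 4 is read against the cross-loading criterion for discriminant validity: each indicator should load more strongly on its assigned construct than on any other construct. With λi,j denoting the loading of indicator i on construct j and c(i) its assigned construct, the criterion assumed here is
\[ \lambda_{i,c(i)} > \lambda_{i,j} \quad \text{for all } j \neq c(i). \]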
Table 5. Fornell-Larcker results.

Variables | Code | AT | BU | BI | CI | KA | LN | OL | OS | PE | PU | PT | SF | TT | UC
Attitude | AT | 0.872
BIM Uptake | BU | 0.519 | 0.880
Behavioural Intention to Use | BI | 0.54 | 0.386 | 0.956
CI/Kaizen | CI | 0.182 | 0.562 | 0.264 | 0.944
Knowledge Acquisition | KA | 0.33 | 0.301 | 0.361 | 0.421 | 0.868
Lean | LN | 0.074 | 0.502 | 0.147 | 0.519 | 0.303 | 0.804
Organisational Learning | OL | -0.043 | 0.416 | 0.141 | 0.372 | 0.234 | 0.649 | 0.857
Organisational Support | OS | 0.092 | 0.495 | 0.151 | 0.369 | 0.335 | 0.632 | 0.504 | 0.804
Perceived Ease of Use | PE | 0.326 | 0.18 | 0.319 | 0.162 | 0.325 | 0.234 | -0.013 | 0.408 | 0.902
Perceived Usefulness | PU | 0.754 | 0.507 | 0.518 | 0.404 | 0.444 | 0.111 | 0.011 | 0.128 | 0.408 | 0.909
Process Training | PT | 0.398 | 0.355 | 0.472 | 0.497 | 0.577 | 0.189 | 0.209 | 0.189 | 0.212 | 0.525 | 0.873
Social Factors | SF | 0.228 | 0.511 | 0.191 | 0.227 | 0.24 | 0.572 | 0.428 | 0.598 | 0.33 | 0.238 | 0.176 | 0.750
Technical Training | TT | 0.436 | 0.46 | 0.539 | 0.54 | 0.514 | 0.308 | 0.359 | 0.294 | 0.291 | 0.561 | 0.774 | 0.262 | 0.889
User Competency | UC | 0.494 | 0.331 | 0.59 | 0.398 | 0.599 | 0.195 | 0.123 | 0.328 | 0.467 | 0.519 | 0.539 | 0.219 | 0.602 | 0.908
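The diagonal entries of Table 5 are the square roots of the AVE values reported in Table 3, and the Fornell–Larcker criterion requires each of them to exceed the construct's correlations with every other construct:
\[ \sqrt{\mathrm{AVE}_{X}} > \left|\operatorname{corr}(X,Y)\right| \quad \text{for all } Y \neq X. \]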
Table 6. Significance and hypotheses testing results.

ID | Hypothesis | t-Value | p-Value | Result
H1 | User Competency -> Behavioural Intention to Use | 3.393 ** | 0.001 | Accepted
H2 | Knowledge Acquisition -> Behavioural Intention to Use | 2.318 * | 0.021 | Accepted
H3 | Organisational Support -> Perceived Ease of Use | 2.14 * | 0.032 | Accepted
H4 | Technical Training -> Perceived Ease of Use | 1.648 | 0.099 | Rejected
H5 | Process Training -> Perceived Ease of Use | 0.531 | 0.596 | Rejected
H6 | Technical Training -> User Competency | 2.34 * | 0.019 | Accepted
H7 | Process Training -> User Competency | 0.588 | 0.557 | Rejected
H8 | User Competency -> Perceived Ease of Use | 2.715 ** | 0.007 | Accepted
H9 | User Competency -> Perceived Usefulness | 6.038 ** | 0 | Accepted
H10 | Lean -> Perceived Ease of Use | 2.136 * | 0.033 | Accepted
H11 | CI/Kaizen in Process Training -> User Competency | 1.597 | 0.11 | Rejected
H12 | CI/Kaizen in Technical Training -> User Competency | 1.779 | 0.075 | Rejected
H13 | CI/Kaizen in Knowledge Acquisition -> User Competency | 1.1 | 0.271 | Rejected
H14 | Knowledge Acquisition -> Organisational Support | 1.992 * | 0.046 | Accepted
H15 | CI/Kaizen in Knowledge Acquisition -> Perceived Usefulness | 1.08 | 0.28 | Rejected
H16 | Knowledge Acquisition -> Attitude | 2.848 ** | 0.004 | Accepted
H17 | Technical Training -> Attitude | 1.992 * | 0.046 | Accepted
H18 | Attitude -> Behavioural Intention to Use | 6.059 ** | 0 | Accepted
H19 | Attitude -> BIM Uptake | 1.268 | 0.205 | Rejected
H20 | Social Factors -> Perceived Usefulness | 1.442 | 0.149 | Rejected
H21 | User Competency -> Social Factors | 1.954 * | 0.049 | Accepted
H22 | User Competency -> BIM Uptake | 3.49 ** | 0 | Accepted
H23 | CI/Kaizen -> User Competency | 2.996 ** | 0.003 | Accepted
H24 | Lean -> BIM Uptake | 0.965 | 0.335 | Rejected
H25 | Lean -> Organisational Learning | 0.897 | 0.37 | Rejected
Hypotheses testing: * p < 0.05; ** p < 0.01.
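As a reading aid only, the following minimal sketch shows how the star ratings in Table 6 follow from the reported bootstrap t-values, assuming two-tailed tests under a standard normal approximation (an assumption; depending on its settings, SmartPLS may instead use the t distribution, although this approximation closely reproduces the p-values reported above).

from scipy.stats import norm

def significance(t_value: float) -> tuple[float, str]:
    """Two-tailed p-value and star rating for a bootstrap t statistic."""
    # Normal approximation to the bootstrap sampling distribution (assumption).
    p = 2.0 * (1.0 - norm.cdf(abs(t_value)))
    stars = "**" if p < 0.01 else "*" if p < 0.05 else ""
    return p, stars

# Example: H3, Organisational Support -> Perceived Ease of Use (t = 2.14).
p, stars = significance(2.14)
print(f"p = {p:.3f} {stars}")  # prints: p = 0.032 *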
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
