Article

Assessing Educators’ Soft Skills: Developing a Self-Assessment Instrument

1 Teaching Innovation & Research, Te Pūkenga Whitireia and WelTec, Wellington 5010, New Zealand
2 Business Management Programme, Te Pūkenga Whitireia and WelTec, Wellington 5010, New Zealand
3 School of Health Sciences, Massey University, Palmerston North 4410, New Zealand
4 Healthcare Department, New Zealand Quality Research and Innovation, Auckland 4442, New Zealand
5 Information Technology Programmes, Te Pūkenga Whitireia and WelTec, Wellington 5010, New Zealand
* Author to whom correspondence should be addressed.
Adm. Sci. 2023, 13(9), 208; https://doi.org/10.3390/admsci13090208
Submission received: 24 June 2023 / Revised: 11 September 2023 / Accepted: 11 September 2023 / Published: 19 September 2023
(This article belongs to the Section Leadership)

Abstract: Educators play multifaceted roles in supporting students’ academic growth, necessitating a diverse knowledge base and a variety of soft skills. The COVID-19 pandemic has brought about significant changes in the education environment, compelling educators to adapt to these new demands. Consequently, nurturing soft skills among educators has become crucial to effectively address evolving educational challenges. This paper presents the development and validation process of an online questionnaire aimed at measuring Te Pūkenga educators’ self-assessment of their soft skills before and during the COVID-19 pandemic. The questionnaire comprises 28 Likert-type scale questions, encompassing 14 identified soft skills, alongside 6 additional questions on sociological and academic factors. A two-stage approach for questionnaire development and validation was used. In stage one, the questionnaire was created through a literature review and the identification of soft skills and independent variables. Stage two involved a content validity check by 10 educators and academic experts, leading to refinement based on their feedback. Subsequently, a pilot study was conducted with 50 random respondents to determine the validity and reliability of the instrument, and a preliminary data analysis was performed. The results of the validation process confirmed the questionnaire’s validity and reliability, as we hypothesised, indicating its potential as a useful research tool for a planned research project. Further research involving a broader range of tertiary institutions can enhance the scale’s validity and reliability, strengthening its applicability across diverse educational institutions and research settings as a measure of educators’ self-assessed soft skills and of how challenging experiences and events in their professional environment influence the implementation of these skills.
By embracing and fostering soft skills among educators, educational institutions can better equip their staff to meet the evolving demands and complexities of modern education.

1. Introduction

In this increasingly competitive and changing world, every academic institution works to develop its employees and up-skill staff to enhance their positive impact on students. Regardless of the technical skills required for a vocation, it is vital to develop numerous soft skills, such as communication and interaction skills, problem-solving skills, and behavioural skills, to utilise technical skills and knowledge in a workplace (Junrat et al. 2014).
Soft skills are intangible assets that characterise and qualify an individual’s personality to meet the requirements of workplace proficiency (Fernández-Arias et al. 2021). These skills are not technical or specific to a particular job but are strongly linked to personal qualities and attitudes towards social and management skills. Therefore, the acquisition of these soft skills is essential for educators to successfully navigate sudden changes in workplace situations, interpret and understand complex scenarios, and effectively guide the design of professional development training courses that enhance their personal qualities (De Pietro et al. 2019). By acquiring these soft skills, educators can elevate their overall effectiveness and adaptability in the ever-changing educational landscape.
The COVID-19 pandemic, which started in early 2020, was a global catastrophe with a profound impact on various aspects of our lives. In particular, it has necessitated significant changes in the teaching methods employed by educational institutions (Babbar and Gupta 2022). As a result, there has been an increased demand for staff to develop their skills in distance/online delivery methods and the implementation of new technical tools that facilitate teaching and learning activities in the higher education system (Ahmad et al. 2020; Antón-Sancho et al. 2021).
This necessary shift in teaching approach during the COVID-19 pandemic has significantly affected educators’ professional lives, potentially influencing how they utilise their soft skills. Due to the intangibility of soft skills, along with the impact of COVID-19, recognising, quantifying, and evaluating them requires measuring educators’ level of acquisition of each skill and assessing how the implementation of these skills has been impacted during the pandemic.
In New Zealand, Te Pūkenga—New Zealand Institute of Skills and Technology serves as the largest tertiary vocational education provider, comprising 16 institutes of technology and polytechnics. During the COVID-19 pandemic, the teaching methods at Te Pūkenga have undergone a major transformation, rapidly transitioning from face-to-face learning to digital and distance learning options.
However, it remains unclear to what extent the educators in Te Pūkenga have had the ability to adapt to the changes in teaching methods and the altered working environment during COVID-19. It is, therefore, appropriate to measure their level of awareness regarding the required soft skills in their professional lives and understand the pandemic’s impact on their ability to apply these skills.
Hence, following the New Zealand government’s announcement to end the COVID-19 Protection Framework’s “traffic light” system in September 2022 and the subsequent restoration of normal operations for tertiary education providers, we developed a research project to assess the self-awareness of Te Pūkenga educators with regard to the soft skills required in their professional roles under ordinary working circumstances. This research project also aimed to measure educators’ self-awareness of how external factors, such as COVID-19, could influence their soft skill abilities and behaviours in response to changes in and challenges of their working tasks and environment. This research project provides valuable insights into Te Pūkenga educators’ perceptions of their soft skills and their preparedness to effectively maintain their soft skill performance, even when faced with challenging factors.
Since soft skills are intangible unless manifested in behaviours, a self-assessment questionnaire, which allows for educators’ self-reflection on behaviours and personal opinions on their soft skills, is an appropriate tool to measure these skills (Jardim et al. 2022). Consequently, to conduct this intended research project, an online self-assessment questionnaire was developed to measure Te Pūkenga educators’ self-awareness of their level of acquisition of the required soft skills and how their use of these skills was impacted during the COVID-19 pandemic.
This paper presents the process of developing the online self-assessment questionnaire tool for our proposed main research project. Our study objective in this paper is to assess our hypothesis: “The questionnaire instrument is expected to exhibit strong internal consistency (reliability) demonstrated by a Cronbach’s alpha coefficient, and will demonstrate validity by showing positive correlations among constructs, as well as positive association between the variables and the questionnaire questions”. By testing this hypothesis, our intention is to confirm the appropriateness of the development process of the questionnaire and to validate the suitability of the developed questionnaire as a research instrument for our subsequent main project.

2. Background

There are limited self-assessment instruments to measure educators’ soft skills. Most of the available tools were developed to examine the level of students’ soft skills (Chamorro-Premuzic et al. 2010; Escolà-Gascón and Gallifa 2022; Jardim et al. 2022) in higher education.
Chamorro-Premuzic et al. (2010) designed a self-report inventory to assess the importance and development of 15 soft skills among tertiary students in UK universities. The assessed skills comprised self-management, communication, interpersonal, teamwork, working under pressure, creativity, critical thinking, willingness to learn, attention to detail, taking responsibility, planning and organising, insight, maturity, professionalism, and emotional intelligence. The authors studied the relationship between these skills and both academic and occupational success and investigated the influence of individual factors on these skills. However, the authors did not provide the comprehensive psychometric properties of the inventory.
Escolà-Gascón and Gallifa (2022) developed the SKILLS and Attitudes in the ONE Questionnaire (SKILLS-in-ONE) to study the most prevalent soft skills in the educational community. This self-reported inventory assesses 13 soft skills measured through 74 items using Likert scales ranging from 0 (completely disagree) to 5 (completely agree). The 13 soft skills are commitment, originality, integrity, entrepreneurial orientation, ineffectiveness, critical thinking, rigorousness, accuracy, underdetermination, involvement, environmental awareness, teamwork, and autonomy. The authors presented a comprehensive psychometric analysis of this inventory, reporting the Cronbach’s alpha and McDonald’s omega coefficient for each scale. In addition, the construct validity of the instrument was measured using an Exploratory Factor Analysis (EFA) and a Confirmatory Factor Analysis (CFA) on different samples, which demonstrated the reliability and validity of the instrument. However, SKILLS-in-ONE was only applied to students aged 16 to 18 years old, thus limiting its interpretation and generalisability.
A similar study was conducted by Jardim et al. (2022), who developed a soft skills inventory to evaluate the effects of intra- and interpersonal skills as well as professional skills on students’ success in an educational environment. The inventory comprised 49 items grouped into six domains: self-determination, resilience, empathy, assertiveness, social support, and teamwork. A psychometric analysis showed that this inventory is a reliable and valid instrument: Cronbach’s alpha measured the internal consistency of each item, while EFA, CFA, and item response theory tested the validity of the items. Although all these instruments have proven psychometric properties, they could not be used to meet the research objectives of our study, as none of these prior studies addressed the impact of external factors, such as COVID-19, on educators’ use of soft skills.
Considering the gap in the literature, this study aims to develop and validate an online self-assessment questionnaire instrument that targets Te Pūkenga educators to measure not only their self-awareness of their possessed and deployed soft skills but also the impact that occurred regarding their strategic utilisation of these skills during the COVID-19 pandemic. The participants’ responses serve as valuable data for our proposed research project to gauge the participants’ awareness of the soft skills related to their professional role and COVID-19’s possible impact on utilising these skills.

3. Bandura’s Social Cognitive Theory

Bandura’s social cognitive theory (SCT) focuses on the interplay among individuals, their environment, and their behaviour (Bandura 1986). This theory emphasises that educators actively shape and influence their environment and behaviour. At its core lies the concept of self-efficacy, which reflects educators’ confidence in their ability to adeptly handle the responsibilities, tasks, and challenges within their working environment. This self-efficacy significantly affects critical academic outcomes, such as students’ achievements and motivation as well as the overall well-being within the working environment (Brennan 2017; Govindaraju 2021).
Incorporating various influential elements, such as an individual’s personality and cognitive processes and the surrounding environment, this cognitive theory analyses decision-making and competency behaviour. Soft skills, in this context, refer to personality traits that enable educators to succeed in their roles and interactions with others. Within the framework of social cognitive theory (SCT), it is posited that personality traits, encompassing personal, cognitive, and environmental factors, are interconnected and play a role in driving an individual’s engagement with specific tasks, obligations, and challenges related to their professional role (Denler et al. 2014; Ghazali et al. 2021).
This understanding is correlated with our questionnaire structure, as it acknowledges the interconnection between the COVID-19 pandemic and educators’ behaviour and professional implementation of their soft skills. This correlation with soft skills implies that individuals who closely match their occupational orientation, possess solid social competencies, and have a positive psychological makeup are inclined to display favourable occupational behaviour (De Pietro et al. 2019). Nevertheless, the COVID-19 pandemic disrupted the anticipated connections among these skills. However, research examining the impact of the pandemic on the application of these skills remains relatively limited (Antón-Sancho et al. 2021).
Numerous studies utilised SCT to explore the impact of the pandemic on the behaviour of educators. For example, Alolaywi (2021) used SCT to investigate how the pandemic affected the teaching practices of Pakistani educators, finding that their attitudes and beliefs towards technology were significant predictors of their online teaching behaviour during the pandemic. Similarly, Perifanou et al. (2021) applied SCT to examine how the pandemic affected the teaching practices of Spanish university professors, discovering that their self-efficacy and beliefs about online teaching predicted their online teaching behaviour during the pandemic.
In our study, a set of 14 soft skills was adopted and defined according to the Bochum Inventory of Personality and Competences (BIP) questionnaire. These competencies are related to positive job outcomes.
By comprehending individuals’ self-perception of their personality traits, social skills, and other personal qualities, as well as their recognition of the influence of external factors like the pandemic on their self-efficacy and utilisation of soft skills, a comprehensive understanding of an individual’s strengths, weaknesses, and overall behavioural tendencies can be achieved. These insights are essential for educational institutions to effectively support their educators during challenging times and ensure a conducive learning environment for students.

4. Method

Our research methodology is based on a well-established model for online survey questionnaire development, adapted from Strachota et al. (2006).
Strachota et al. (2006) proposed that utilising an online survey questionnaire proves to be an effective means of collecting valuable employee feedback for research, evaluation, and fostering continuous professional development. These researchers developed a model specifically tailored for online survey questionnaires, targeted at the development of adult educators and Human Resources staff. This model was further supported by two case examples, which offer a practical demonstration of the systematic process involved in creating online surveys. We have utilised the Strachota model as the foundation for our instrument development process and tailored a two-stage model for the creation of our questionnaire, as shown in Figure 1.

4.1. Stage One: Development of Initial Questionnaire

This consists of three steps: literature review and identification of soft skills, identification of independent variables, and questionnaire development.

4.1.1. Step 1: Literature Review and Identification of Soft Skills

It is important to define and outline the soft skills to be measured; therefore, we conducted a thorough review of previous studies to identify the most relevant and accepted soft skills for professionals in the workplace that are relevant to the education context.
After conducting an extensive literature review, our focus shifted to the Bochum Inventory of Personality and Competences (BIP) questionnaire. This scientifically developed assessment tool was specifically designed to evaluate 14 personality traits, commonly referred to as soft skills. These critical soft skills include Ambition, Influence, Leadership, Conscientiousness, Adaptability, Taking Action, Interpersonal Communication, Sociability, Social Sensitivity, Teamwork, Sharing Opinions and Ideas with Others, Emotional Robustness, Working and Coping under Pressure, and Self-Confidence. The BIP questionnaire organises these traits into four primary constructs: occupational orientation, occupational behaviour, social competencies, and psychological constitutions. This categorisation enables the comprehensive understanding and assessment of an individual’s soft skills within the context of their job role. Consequently, it provides valuable insights into their potential for success in the workplace (Batista-Foguet et al. 2016).

4.1.2. Step 2: Identification of Independent Variables

Two types of independent variable were identified: (a) sociological factors including gender and ethnicity and (b) academic factors including area of knowledge, work experience, and occupational background. These variables were selected based on their potential to influence the usage of soft skills by educators (Batista-Foguet et al. 2016).

4.1.3. Step 3: Questionnaire Development

The questions were developed through multiple rounds of focus group discussions led by the researchers, who are academic educators and managers. The initial questionnaire was drafted based on the soft skills identified in step 1 and the independent variables identified in step 2. The questionnaire consisted of two main parts.
In the first part, there are 28 questions using a Likert-type scale response ranging from “strongly agree” (1) to “strongly disagree” (5). The Likert-type response format was selected because it is the most popular way of collecting survey data and has several advantages: it is easy to use for self-assessment responses and adaptable for simultaneously measuring many different constructs, making it a suitable choice for this research (Artino et al. 2014).
Each soft skill is evaluated through a pair of consecutive questions, both relying on educators’ self-assessment. The first question, which pertains to the period before the COVID-19 pandemic in our analysis, asks participants to assess their awareness of the specific soft skill being measured within the context of their professional practice. This question provides a definition of the soft skill in question (Appendix B) to ensure a consistent understanding among all participants as they respond. Participants’ responses span a range of agreement to disagreement, enabling us to gauge the extent to which educators are aware of the importance of these soft skills as contributing factors to the maintenance of their professional competence.
The second question, which pertains to the period during the COVID-19 pandemic in our analysis, asks participants to self-assess how they perceive COVID-19’s impact on the application of the measured soft skill, building upon their earlier self-awareness and evaluation of the significance of this skill in their professional roles. Participants’ responses again span a range of agreement to disagreement, allowing us to uncover educators’ perspectives on how their utilisation of these soft skills was affected throughout the COVID-19 pandemic, while considering the self-awareness established in the preceding questions.
The second part of the questionnaire comprises six questions related to sociological and academic factors: gender, ethnicity, area of knowledge, work experience, and occupational background. These questions are included to gain a better understanding of how these factors influence the cognitive development of soft skills among Te Pūkenga educators.

4.2. Stage Two: Validation Stage

This consists of three steps: content validity check, pilot study, and preliminary analysis for construct validity and reliability.

4.2.1. Step 4: Content Validity Check

The content validity of a survey questionnaire is related to the clarity and relevance of the designed questions, ensuring that it measures what it is supposed to measure (Strachota et al. 2006). This is also known as face validity, and it involves experts from the field providing feedback on the questionnaire structure in terms of representativeness, clarity of language, appropriateness of wording, and relevance to the measured items. For our research questionnaire, 10 educators and academic experts working under Te Pūkenga were invited to verify content validity in terms of question clarity, readability, representativeness, accuracy, and relevance regarding the constructs. Feedback from these experts was used to refine the questionnaire for the next step, the pilot study.

4.2.2. Step 5: Conducting a Pilot Study

Once our research questionnaire was established and refined based on the experts’ feedback on content validity, a pilot study was conducted. The SurveyMonkey tool was used to administer the final questionnaire (Appendix A), and the planned delivery mode was Web-based. The participants were selected using a systematic probability sampling method, also known as the systematic random sampling method. The questionnaire link was distributed through email invitation to the research offices of the 16 polytechnic institutions of Te Pūkenga, asking them to share the email invitation with their academic staff. The eligibility criteria for respondents’ participation were as follows: (1) currently being associated with Te Pūkenga as an educator and (2) having experience working as an educator both prior to and during the COVID-19 pandemic. These eligibility criteria ensured that the participants had valuable insights and experiences related to the impact of the pandemic on their soft skills as educators.
A minimum sample size of 30 was deemed sufficient for conducting the validity and reliability tests for the questionnaire instrument (Artino et al. 2014; de Winter et al. 2009; Jung 2013; Rouquette and Falissard 2011). A total of 200 participants successfully completed the questionnaire. Following this, we decided that a sample size of 50 participants would be appropriate for carrying out our analysis of reliability and validity tests. To achieve this, we employed a systematic sampling method with an interval of 4. Our sampling process was facilitated using an Excel spreadsheet. The participant list was sufficiently randomised, and representatives were chosen utilising an Excel formula. Furthermore, the starting point (row) for the selection process was randomly determined to minimise bias in participant selection.
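The systematic selection with an interval of 4 described above can be sketched in a few lines of Python; the respondent labels and random seed below are illustrative placeholders rather than the study’s actual data:

```python
import random

def systematic_sample(participants, interval=4, seed=None):
    """Systematic sampling: choose a random starting index within the
    first interval, then take every `interval`-th participant after it."""
    rng = random.Random(seed)
    start = rng.randrange(interval)  # random starting row minimises selection bias
    return participants[start::interval]

# 200 completed responses reduced to a 50-participant subsample.
respondents = [f"respondent_{i:03d}" for i in range(1, 201)]
subsample = systematic_sample(respondents, interval=4, seed=42)
print(len(subsample))  # 200 / 4 = 50
```

In a spreadsheet, the equivalent approach is to pick a random starting row and then select every fourth row thereafter, as was done in the study using Excel formulas.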
The 50 randomly selected participants encompassed a diverse range of individuals. In terms of gender, 48% of the respondents were female, 46% were male, and 4% identified as non-binary. The majority of educators were of Pakeha (European) ethnicity, constituting 56%, while Māori and Pacifica each accounted for 8%, and Asians represented 10%.
The participants in this pilot study came from different areas of expertise. The domains with the highest representation were Business, Management, and Accounting, as well as Social Science, each comprising 15% of the respondents. Closely following, Learning Support accounted for 13%, Information Technology for 11%, and Health for 8% of the surveyed educators. The participants’ years of expertise varied, with “over 21” years of experience representing the largest percentage, at 38%, followed by “4 to 9” years, at 33%.
In terms of occupational background, around 42% of the participants were from both the Industry and Education sectors, while 46% were solely from the Education sector. Focusing on the polytechnics involved in the study, the participants’ distribution spanned 11 institutions. Approximately 35% were from Toi Ohomai Institute of Technology, 21% from The Open Polytechnic of New Zealand, and 16% from Wellington Institute of Technology (WelTec). A preliminary data analysis of the participants’ responses was conducted to assess the validity and reliability of the instrument, as described in step 6.

4.2.3. Step 6: A Preliminary Analysis for Construct Validity and Reliability

In this pilot study, our hypothesis posits that the survey questionnaire exhibits satisfactory measures of validity and reliability. These findings underscore its appropriateness as a self-assessment instrument for measuring educators’ self-awareness regarding the acquisition of soft skills, as well as reporting their observations on how the implementation of these skills was impacted during COVID-19.
To support this hypothesis, SPSS 28 software was used to conduct a preliminary analysis of the responses collected from the pilot study participants (n = 50), covering construct validity, factor analysis, and reliability analysis, to determine whether the questions in the questionnaire suitably captured educators’ self-awareness of their soft skills acquisition and the influence of COVID-19 on implementing these skills.

5. Results

5.1. Construct Validity

Pearson correlation analysis was conducted to calculate the correlation coefficient (r) and thereby test for the expected positive statistical relationships among the measured constructs in our questionnaire. The calculated Pearson correlation results for pre-COVID-19, which show positive linear relationships between the constructs, are presented in Table 1. This indicates that the constructs are related to each other in meaningful ways, with all correlation coefficients (r) around 0.6 or higher, a level typically considered to signify a strong positive relationship.
A correlation coefficient analysis was also conducted for the during-COVID-19 period to determine the linear relationships among the constructs under the impact of COVID-19.
Based on our literature review, we expected that the constructs’ relationships would be affected during COVID-19; the Pearson correlation results for this period are presented in Table 2. The results show that the linear relationships among the constructs remained positive during the pandemic but varied in strength from weak to moderately positive. A moderately positive correlation is present among occupational orientation, occupational behaviour, and social competencies, while psychological constitution has a weak correlation with all other constructs. These results indicate that COVID-19 affected educators’ implementation of soft skills and disrupted the relationships among these constructs, which aligns with our expectations.
The identified positive linear correlations among the evaluated constructs pre-COVID-19 and during the COVID-19 period signify that the questions formulated within our questionnaire are valid for appraising the acquisition of these constructs. As a result, these questions adeptly encapsulate the intended constructs, thus affirming the validity of the constructs. Additionally, the variability of scores within each construct is closely clustered around the mean, as indicated by the standard deviation (SD) results for both the pre-COVID-19 and during COVID-19 periods. This consistency in interpretation might be attributed to the inclusion of item definitions in the related questions, which could contribute to affirming the construct validity.

5.2. Factor Analysis

To further validate our questionnaire model and confirm the alignment of questions with their corresponding soft skills pre-COVID-19 and during COVID-19, we conducted an Exploratory Factor Analysis (EFA) using a principal axis factor analysis to uncover the underlying connections among the items and the latent factor. The number of each item in the provided results tables (Table 3 and Table 4) represents the position that the same item occupies in the administered questionnaire, while the initial question pertained to obtaining consent.
EFA was performed using a cohort of 50 educators. Although EFA is typically applied to larger sample sizes (N ≥ 200), prior research demonstrated that EFA can yield reliable results for samples of N ≤ 50 under specific conditions: when the data exhibit a limited number of factors, substantial loadings, and elevated communalities (de Winter et al. 2009; Jung 2013; Rouquette and Falissard 2011). Moreover, a Kaiser–Meyer–Olkin Measure of Sampling Adequacy (KMO) value exceeding 0.6 (Kaiser 1974), along with Bartlett’s Test of Sphericity yielding a significance value of p < 0.001 (Zwick and Velicer 1986), supports the appropriateness of the sample for EFA. This study identified four factors, with most items displaying high loadings and communalities exceeding 0.6, affirming the suitability of EFA for this sample (Mundfrom et al. 2005).
We opted for a principal axis factor analysis over a principal component analysis because our study’s focus was on exploring the factor structure through the examination of the relationships among items, rather than solely reducing the item count. In general, researchers typically deem factors with fewer than three items and item loadings below 0.3 as undesirable outcomes; such results warrant potential removal or further investigation (Costello and Osborne 2005; Yong and Pearce 2013). This study found that the factors were correlated (Table 1); therefore, an oblique method of factor rotation was selected (Meyers et al. 2016). This choice was made because an orthogonal rotation presumes the factors to be uncorrelated, whereas the personal characteristics identified in this study to measure soft skills were pragmatically correlated with each other to some extent (Costello and Osborne 2005). To simplify the structure of the output, the direct oblimin method was chosen (Yong and Pearce 2013).
The EFA test conducted on the 14 pre-COVID items included a principal axis factor analysis with an oblimin rotation. Table 3 displays the items and the factor loading results for each item. The items under each factor were tested for loading because the items were designed to index four constructs: occupational orientation, occupational behaviour, social competencies, and psychological constitution. The first factor accounted for 61% of the variance, with the cumulative variance explained rising to 70%, 74%, and 80% after the second, third, and fourth factors, respectively.
The first factor, which aimed to index occupational orientation, had strong loadings on the three attributed items. The second factor, which aimed to index occupational behaviour, had strong loadings overall, with a very strong loading (0.991) on the first attributed item (question 10), “I am able to adapt to the changes required in my work”. The third factor, which aimed to index social competencies, was highly loaded on the five attributed items, 14, 16, 18, 20, and 22, while the first attributed item (14), which referred to the question “I am able to maintain good relationships with colleagues”, had its highest loading (0.905). The fourth factor, which aimed to index psychological constitution, loaded strongly on all attributed items, 24, 26, and 28, with loadings of 0.843, 0.896, and 0.770, respectively.
The high factor loadings for the items related to the pre-COVID-19 period indicate a strong connection between our questionnaire questions and the latent factors representing soft skills acquisition in that period. This finding supports our hypothesis that the questionnaire validly and accurately measures the intended self-awareness of soft skill acquisition in the pre-COVID-19 period.
Examining the during COVID-19 results, when the same method was adopted to explore the factor structure of the 14 items measuring the impact of COVID-19 on the use of these skills, the EFA results did not produce a distinct factor structure for the during COVID-19 items. As a result, a further EFA was conducted, requesting four factors to meet the theoretical conceptualisation. These EFA results identified four factors that accounted for 64% of the variance. However, the fourth factor only had a loading on item 13. A factor should have at least three items to be labelled as a factor (Tabachnick et al. 2007). Therefore, factor 4 was deleted. Item loadings with values <0.3 were omitted (Costello and Osborne 2005) for clarity. The results of the three-factor model, as presented in Table 4, showed that the items were loaded under different factors than expected. For instance, items 25 and 27, measuring soft skills under psychological constitution; items 23 and 19, measuring social competencies; and item 5, measuring occupational orientation skills, were loaded under factor 1. The three-factor model identified through the EFA is different in structure from the BIP model used in the item-generation phase. Regardless, the impact of COVID-19 on each of the soft skills is captured in the items; this provides evidence to affirm our hypothesis that this questionnaire is a valid self-assessment instrument to report the impact of COVID-19 on educators’ implementation of soft skills.
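The pruning rules applied above (suppressing loadings below 0.3 for clarity and flagging factors with fewer than three salient items for removal) can be sketched as a small helper. The `prune_loadings` function is a hypothetical illustration of these two rules, not part of the study’s analysis code:

```python
import numpy as np

def prune_loadings(L, min_loading=0.3, min_items=3):
    """Suppress small loadings and flag under-identified factors.

    L is an (items x factors) loading matrix. Loadings below
    min_loading are masked to 0 for display, and factors retaining
    fewer than min_items salient loadings are reported for removal.
    """
    L = np.asarray(L, dtype=float)
    masked = np.where(np.abs(L) >= min_loading, L, 0.0)
    salient = (np.abs(masked) > 0).sum(axis=0)  # salient items per factor
    drop = [j for j in range(L.shape[1]) if salient[j] < min_items]
    return masked, drop
```

For example, a factor on which only a single item loads above 0.3 would be returned in `drop`, mirroring the deletion of factor 4 described above.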

5.3. Reliability Test

To measure the consistency and stability of our questionnaire, internal consistency (Cronbach’s alpha) was computed for both the pre-COVID-19 and during COVID-19 periods, as shown in Table 5. All the scale reliabilities for the soft skill domains ranged between 0.72 and 0.85 for both periods, indicating appropriate internal consistency and stability and supporting the high reliability of our instrument (Hair 2009).
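For readers wishing to reproduce such figures, Cronbach’s alpha can be computed directly from an item-score matrix. The sketch below is a generic NumPy implementation offered for illustration only; the `cronbach_alpha` name is ours, not that of the software used in this study:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance of totals),
    where k is the number of items in the scale.
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of summed scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)
```

Each soft skill domain’s alpha would be obtained by passing only that domain’s item columns.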

6. Discussion

The objective of this pilot study was to assess the validity and reliability of our newly developed questionnaire, which is intended to serve as a research instrument for the self-assessment of educators’ self-awareness of soft skills, as well as for the self-reporting of their perceptions of how the incorporation of these skills was impacted during the COVID-19 pandemic. The pilot study was conducted with educators from Te Pūkenga institutions, and its findings support the questionnaire’s suitability for use by educational institutions.
We used a mixed-method approach to achieve the research goal. The first stage, qualitative research, included a literature review to identify the most relevant soft skills for professionals in the workplace, leading us to 14 soft skills, which were grouped into four constructs based on the Bochum Inventory of Personality and Competences (BIP). These 14 soft skills became the variables of our questionnaire. The initial questionnaire comprised 28 Likert-type scale questions, with each soft skill measured by two questions. The second stage, questionnaire validation, was conducted in two steps: first, a qualitative content validity test with 10 experts, which confirmed the acceptability of the questionnaire and helped to refine the items; second, quantitative pilot testing with 50 participants from Te Pūkenga. Mixed statistical methods were utilised to verify the questionnaire’s validity and reliability, including Pearson correlation, EFA, and Cronbach’s alpha.
The Pearson correlation was computed to investigate construct validity for the pre-COVID-19 period, and the results presented in Table 1 show that all four constructs were significantly correlated with one another. The strongest positive correlation, which would be considered a very large effect size according to Cohen (2013), was between social competencies and occupational behaviour: r = 0.850; p < 0.01. This means that respondents who found social competencies a highly relevant soft skill were very likely to rate occupational behaviour as highly relevant. Occupational behaviour was also positively correlated with occupational orientation scores (r = 0.596), a medium-to-large effect according to Cohen (2013).
The during COVID-19 correlation results are presented in Table 2. The purpose of this analysis was to investigate the impact of COVID-19 on the soft skill constructs within the work environment. Table 2 shows that four of the six pairs of constructs were significantly correlated. There was a medium positive correlation, according to Cohen (2013), between social competencies and occupational behaviour: r = 0.445; p < 0.01. This means that when COVID-19 affected social-competencies-related soft skills, it was likely that occupational-behaviour-related soft skills such as adaptability, implementing decisions, and conscientiousness were also affected. A similar interpretation applies to occupational behaviour and occupational orientation (r = 0.421), social competencies and occupational orientation (r = 0.380), and psychological constitution and social competencies (r = 0.311). The correlations between psychological constitution and occupational orientation and between psychological constitution and occupational behaviour were not statistically significant.
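The correlation computation and effect-size labelling used in these interpretations can be sketched as follows. Both `pearson_r` and `cohen_label` are hypothetical helper names introduced for illustration, and the 0.1/0.3/0.5 cutoffs follow Cohen’s conventional thresholds rather than the study’s exact wording:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation between two construct score vectors."""
    return float(np.corrcoef(x, y)[0, 1])

def cohen_label(r):
    """Rough effect-size label following Cohen's conventions."""
    r = abs(r)
    if r >= 0.5:
        return "large"
    if r >= 0.3:
        return "medium"
    if r >= 0.1:
        return "small"
    return "negligible"
```

Under these thresholds, the r = 0.850 reported for the pre-COVID-19 period is a large effect, while r = 0.445 and r = 0.311 for the during COVID-19 period fall in the medium band.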
The Pearson correlation results for both the pre-COVID-19 and during-COVID-19 periods substantiated the construct validity demonstrated by the positive linear correlations. These outcomes validated that the designed questionnaire items effectively capture the intended underlying constructs, further bolstering the questionnaire’s validity as a suitable research tool.
The further validation method, by conducting EFA, confirmed that a four-factor structure provides an acceptable conceptual basis for assessing the soft skills of educators in the pre-COVID-19 period. The factor structure can be interpreted as follows. Factor 1, occupational orientation (including items 2, 4, and 6), highlights the work factor that professionally motivates an individual. It includes the purpose for which an individual engages in achievement behaviour and their general orientation towards the task, including beliefs and feelings about success, failure, and ability (Elliot and Thrash 2001). It indicates the desire to influence others and the work process (Busch 2018) and the ability to inspire and guide others towards achieving organisational goals (Paais and Pattiruhu 2020). Factor 2, occupational behaviour (including items 8, 10, and 12), is based on the attitude with which educators approach tasks. This includes beliefs in and engagement with goal-directed behaviours and the demonstration of responsibility and accuracy (Mirković et al. 2020). It incorporates the ability to effectively adapt and manage oneself in changing environments (Kim 2017) and the capacity to convert intentions into actions (Phillips et al. 2020). Factor 3, social competencies, indicates skills related to responding appropriately to the emotional state of others, the ability to effectively communicate in all contexts (Ting-Toomey and Dorjee 2018), maintaining collaborative connections and interactions with colleagues (Mokgwane 2021), and teamwork and efficiency in sharing opinions (containing items 14, 16, 18, 20, and 22), while factor 4, psychological constitution, heavily draws on the ability and resilience to cope with stress and challenging situations (Shydelko 2017), the capacity to work under pressure, and self-awareness (containing items 24, 26, and 28).
The during COVID-19 EFA analysis identified a three-factor model. Even though the derived model was different from the BIP model used in the item-generation phase, it appeared to adequately present all 14 soft skills relevant to educators. Notably, the analysis captured the differential deployment of these skills: for example, items under psychological constitution, such as the impact on emotional robustness due to COVID-19, overlapped with social competency skills regarding the ability to maintain relationships with colleagues and to share views. Baker et al. (2021) reported that during the pandemic, most teachers identified emotional and mental stressors due to increased workload, online teaching challenges, and the negative impact of work on family/self. Their study found an increased inter-relation between the stressors and their impact on social and occupational competencies. Their findings corroborated the significant correlation among the constructs reported in our findings.
Further evidence of the validity and reliability of the questionnaire may be taken from its construct being closely linked to the soft skills assessed in a study conducted in the American context on university teachers from different regions (Fernández-Arias et al. 2021). A similar study was conducted to assess the soft skills required to enhance digital competency in the COVID-19 environment for Latin American university teachers from the lowest Global Innovative Index (GII) countries, which are less digitally developed (Antón-Sancho et al. 2021). The constructs identified in both these studies are in line with the findings of our study. The five constructs of these studies include work motivation (e.g., results orientation, the initiative for change, and leadership), work behaviour (conscientiousness, flexibility, and action orientation), social skills (social intelligence, sociability, teamwork, and influence), psychic structure (emotional stability, work ability, and self-confidence) and additional competences (sense of control, competitiveness, mobility, leisure orientation, and image distortion). Some of these constructs were also addressed by other instruments studied by authors in an educational context; for example, Jardim et al. (2022) extracted five factors—self-determination, social support, teamwork, assertiveness, and resilience—in their soft skills inventory. Alpay and Walsh (2008) and Chamorro-Premuzic et al. (2010) identified self-determination and teamwork. The fact that all these constructs presented medium-to-high correlations supports the argument that they are complementary skills that contribute to the professional development of educators.
The EFA results of this study validate the relevance of each question in the developed questionnaire, demonstrating its effectiveness in assessing the attributed soft skills and in capturing the influence of COVID-19 on their proficient application within their respective constructs. Consequently, this confirms the appropriateness of the questionnaire as a measuring tool for our broader main study, which seeks to comprehensively understand the ramifications of COVID-19 in addition to assessing educators’ self-awareness regarding their acquisition of necessary soft skills.
The reliability test measured the internal consistency of the items within their constructs for both the pre-COVID-19 and during COVID-19 periods. The evaluated constructs were occupational orientation, occupational behaviour, social competencies, and psychological constitution. Notably, the Cronbach’s alpha test yielded high values across all constructs for both periods, indicating robust internal consistency. This underscores the questionnaire’s reliability as a research tool for our subsequent main research.
In summary, by confirming both the validity and reliability of our developed questionnaire through various methods, the tested hypotheses are substantiated. This underscores the efficacy of our developed self-assessment questionnaire as a dependable research tool for accurately collecting data in line with our proposed main research.

7. Limitations and Future Studies

The potential limitations identified from aspects of this research process include sample size, the inherent influence of independent variables, validity considerations, and the type of survey questions used.
Obtaining a sufficiently large sample size can be a challenge for any research project. While larger samples can yield more reliable results and enhance the validity of the instrument (Hicks et al. 1996; Perneger et al. 2015), the scale development in this study was based on a small pilot test; a larger sample size is planned for the main study. In our view, further research with a larger sample size would further enhance the validity and reliability of the newly developed scale.
The questionnaire solely relied on self-reported data, and there is the possibility of participants rating themselves highly due to social desirability bias. Previous research showed that social desirability bias could affect the accuracy of the data collected through self-reported instruments (Fisher and Katz 2000). In the context of our self-assessment questionnaire, the data provided could be skewed due to respondents’ tendency to more positively perceive themselves. Our data collection process was planned to reduce incentives for respondents to provide overly positive self-assessments because participation was voluntary and anonymous (Steenkamp and De Jong 2010). Therefore, incorporating additional measures or sources of data would add value and may be necessary to complement the self-reported data.
The developed questionnaire was designed for Te Pūkenga educators and was assessed by Te Pūkenga experts and educators. Future studies could be conducted to replicate the validity and reliability of this instrument in measuring educators’ soft skills across other higher education institutions. A qualitative approach was used to construct the questions, and a mixed approach, which is highly favourable, was used to test the validity and reliability. However, it is essential to assess the measurement’s discriminant validity, ensuring that it accurately measures distinct personality traits and is not influenced by or conflated with other related measurements (Batista-Foguet et al. 2016).
Moreover, all the questions in this instrument are closed-ended, using a Likert rating scale. The use of a single question type may not provide researchers with the depth of responses possible with mixed question types and constrains respondents to choose one of the provided ratings. Future studies could aim at developing an open-ended questionnaire or an interview approach to gain better insights into soft skills, although the analysis of such responses could be challenging (Kelley et al. 2003).
In summary, while this study uncovered valuable insights into educators’ soft skills and their relevance during the COVID-19 pandemic, it is not without its limitations. Future research should endeavour to surmount these limitations by expanding sample sizes, diversifying data sources, and refining questionnaire designs. By addressing these areas, subsequent studies can advance our understanding of educators’ soft skills and their strategic utilisation, ultimately contributing to more comprehensive and robust research in this domain.

8. Conclusions

Educational institutions encounter challenges in identifying and understanding the necessary soft skills of their educators, which require targeted professional development training to enhance the professional practice of these skills. Consequently, it is essential to inform educators about the specific soft skills required for effective professional practice, raise their awareness of their level of acquisition for each skill, and assess their ability to apply these skills under unforeseen changes in their working conditions.
The conducted pilot study represents a satisfactory and sufficiently representative sample of the target population, ensuring that the preliminary data results are reliable and analysable. The correlation coefficient values, factor analysis results, and Cronbach’s alpha measures for this research questionnaire confirm the validity and reliability of this tool as a research instrument to measure the proposed constructs and provide valuable insights into how Te Pūkenga educators perceive the significance of soft skills in their profession and how the COVID-19 pandemic has impacted the application of these skills.
Completing this questionnaire also facilitates non-formal feedback for the participants. As they personally assess their grasp of the requisite soft skills for their professional roles and gauge how COVID-19 influenced their application of these skills, this introspection confers several significant advantages. These include the cultivation of heightened self-awareness concerning the essential soft skills that are vital for effective educational practice, as well as the precise identification of areas necessitating improvement and targeted training. Consequently, this questionnaire serves to promote their personal and professional development.
We developed a self-assessment research tool aimed at measuring how an individual’s awareness of soft skills influences their capacity to effectively identify the impact of challenges on their professional role and evaluate their personal adaptability traits during these challenges within their work environment. This self-assessment tool has the potential for future adaptation and refinement to tackle analogous challenging scenarios or events that may impact educators’ performance within their professional environment. Such adaptation could be undertaken by educational institutions to comprehensively evaluate their educators’ proficiency in the essential soft skills, while assessing the extent to which external challenges may affect these competencies. This strategic approach to conceptualising and measuring soft skills stands poised to further assist educational institutions in designing training programs tailored to address areas in which their educators have attained lower scores.
For the future development of the pilot study, it becomes necessary to assess soft skills within varied samples sourced from diverse higher educational institutions. This strategic approach facilitates the evaluation of the instrument’s validity across different educational contexts, supporting subsequent effective refinements. Additionally, alternative sampling methods should be explored to minimise the potential for bias in establishing a representative sample. Our ultimate goal for the future is to position this questionnaire as a dependable and valid research instrument, not exclusively designed for Te Pūkenga but open to adaptation by fellow researchers and educational institutions seeking similar self-assessment tools for measuring educators’ self-awareness regarding their acquisition of soft skills and the exploration of external factors’ impact.

Author Contributions

Conceptualization, A.A.-S., P.Y. and E.A.; methodology, A.A.-S., E.A. and R.P.; validation, P.Y., R.P., O.G. and E.A.; formal analysis, P.Y., R.P. and O.G.; investigation, A.A.-S. and C.A.M.; resources, C.A.M.; data curation, O.G., E.A. and A.A.-S.; writing—original draft preparation, All authors; writing—review and editing, All authors; visualization, A.A.-S. and R.P.; supervision, A.A.-S.; project administration, C.A.M.; funding acquisition, A.A.-S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Whitireia & WelTec, grant number RWHW 072023-31.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Institutional Review Board (or Ethics Committee) of Whitireia & WelTec (RP 332–2022 date of approval 20 June 2022).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data that support the findings of this study are available upon request from the corresponding author ([email protected]). The data are not publicly available due to their containing information that could compromise the privacy of the research participants.

Acknowledgments

The authors would like to thank the participants and the anonymous reviewers for their contribution. In addition, we would like to thank the Wellington Institute of Technology and the Whitireia Community Polytechnic Ethics Committee.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Questionnaire Questions
Q2: As an educator, personal achievement (success) is a key motivator in my job.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q3: Working remotely during COVID-19 gave me the same professional success/achievement opportunities.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q4: It is important to me to be able to influence work environment processes and thinking as part of my job.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q5: Working remotely during COVID-19 impacted my ability to influence workplace decisions, processes or thinking as part of my job.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q6: Leadership opportunities are important for me as an educator.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q7: Working remotely during COVID-19, I had the same opportunities for professional leadership and role modelling.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q8: As an educator, being conscientious is a key motivator in my job.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q9: During COVID-19, completion of tasks and meeting deadlines were a valuable skill for me as an educator.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q10: I am able to adapt to the changes required in my work.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q11: I am able to adapt to the changes required to teach during COVID-19.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q12: I am effective at implementing decisions as an educator.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q13: Working remotely during COVID-19 impacted my effectiveness in implementing decisions.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q14: I am able to respond appropriately to the emotional state of others in social situations within a work context.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q15: I am able to respond appropriately to work-related sensitive situations during COVID-19.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q16: I am able to foster/strengthen my professional connections in my work.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q17: During COVID-19, fostering/strengthening my professional connections was impacted.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q18: I am able to maintain good relationships with colleagues.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q19: During COVID-19, I am able to maintain good relationships with colleagues.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q20: I value collaborative activities and teamwork as part of my work.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q21: I am able to contribute effectively to my professional teamwork activities during COVID-19.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q22: I am able to assert and share my professional opinions and ideas effectively.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q23: During COVID-19, using online technologies affected my ability to share my views effectively.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q24: I feel that I am emotionally robust.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q25: COVID-19 impacted my emotional robustness.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q26: As an educator, I feel I am able to work under pressure.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q27: I felt additional stress working during COVID-19.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q28: I am confident in myself in a work setting.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q29: My confidence in a work setting has been impacted by COVID-19.
Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree
Q30. What gender do you identify as?
  • Male
  • Female
  • Non-binary
  • Other
  • Prefer not to answer
Q31. What is your ethnicity? (Please select all that apply.)
  • Māori
  • Pacific
  • Pakeha (NZ European)
  • Asian
  • Latino or American
  • African
  • European
  • Middle Eastern
  • Australian
  • Other (please specify)
Q32. Your main area of knowledge.
  • Humanities and Art
  • Social Science
  • Science
  • Engineering
  • Information Technology
  • Health
  • Art
  • Creative and Design
  • Māori and Pacific
  • Business, Management, and Accounting
  • Hospitality
  • Trades
  • Learning Support
  • Other (please specify)
Q33. Years of experience.
  • Less than 4 Years
  • 4–9 Years
  • 10–15 Years
  • 16–20 Years
  • 21+ Years
Q34. Occupational background.
  • Education
  • Industry
  • Both
  • Other (please specify)
Q35. Which institution are you currently working for?
  • Ara Institute of Canterbury
  • Otago Polytechnic
  • Unitec Institute of Technology
  • Eastern Institute of Technology
  • Southern Institute of Technology
  • Universal College of Learning
  • Manukau Institute of Technology
  • Tai Poutini Polytechnic
  • Waikato Institute of Technology
  • Nelson Marlborough Institute of Technology
  • The Open Polytechnic of New Zealand
  • Wellington Institute of Technology
  • NorthTec
  • Toi Ohomai Institute of Technology
  • Western Institute of Technology at Taranaki
  • Whitireia New Zealand
  • Other (please specify)

Appendix B

Soft Skills Definitions
No. | Personal Characteristic | Soft Skill | Description
1 | Occupational orientation | Ambition/success or achievement motivation | The drive to set and achieve personal and professional goals. A positive attitude towards success and the personal resilience to cope with challenges and failure (Elliot and Thrash 2001).
2 | | Influence/impact | The interest in shaping and changing processes and environments. The ability to persuade others to change or expand their thinking/decisions in a professional setting (Busch 2018).
3 | | Leadership/role modelling | The ability to inspire and guide others towards achieving organisational or professional goals. The active influence and demonstration of improved behaviours or ways of working (Paais and Pattiruhu 2020).
4 | Occupational behaviour | Conscientiousness: being careful/thorough/diligent | The consistent demonstration of responsible, deliberate, organised goal-directed behaviours, with a focus on accuracy over speed (Mirković et al. 2020).
5 | | Adaptability | The capacity to effectively adapt and manage oneself in unfamiliar or changing circumstances (Kim 2017).
6 | | Taking action/implementing decisions | Effectiveness in converting intentions into action and completing tasks. The ability to focus on the primary aspects of a task and mitigate distractions (Phillips et al. 2020).
7 | Social competencies | Interpersonal communication | The ability to effectively communicate with others in a variety of contexts, including professional relationships, team settings, conflict resolution, and motivation (Ting-Toomey and Dorjee 2018).
8 | | Sociability/collegiality | The inclination to support and encourage colleagues. The social skills that build and maintain collaborative connections and interactions with others (Mokgwane 2021).
9 | | Social sensitivity | The ability to understand and appropriately respond to the emotional state of others. The capacity to create respectful and cohesive environments in social and professional interactions (Borge and Mercier 2019).
10 | | Teamwork/collaboration | The readiness to contribute to team activities and work well with others towards shared goals or outcomes. An interest and willingness to learn from the opinions and views of colleagues (Ning 2011).
11 | | Sharing opinions and ideas with others | The confidence to express one’s own views, share professional opinions, and introduce new ideas to others (cf. Ames and Flynn 2007).
12 | Psychological constitution | Emotional robustness | The ability and resilience to cope with stress and challenging situations in personal and professional settings. An understanding of effective strategies that support the ability to do so (Shydelko 2017).
13 | | Working and coping under pressure | The ability to effectively perform professional roles even in stressful situations. The ability to recognise and manage symptoms of stress and use techniques and strategies to mitigate stress (Bardy et al. 2017).
14 | | Self-confidence | The recognition and acceptance of one’s own strengths and weaknesses and a positive belief in those abilities. The ability to manage realistic expectations and to handle criticism in a mature manner (Cusack 2018).

References

  1. Ahmad, Esraa, Ahmed Al-Sa’di, and Kieran Beggs. 2020. A formative assessment framework using game-quiz educational approach. Paper presented at the IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE), Takamatsu, Japan, December 8–11. [Google Scholar]
  2. Alolaywi, Yasamiyan. 2021. Teaching online during the COVID-19 pandemic: Teachers’ perspectives. Journal of Language and Linguistic Studies 17: 2022–45. [Google Scholar] [CrossRef]
  3. Alpay, Esat, and Elaine Walsh. 2008. A skills perception inventory for evaluating postgraduate transferable skills development. Assessment and Evaluation in Higher Education 33: 581–98. [Google Scholar] [CrossRef]
  4. Ames, Daniel R., and Francis J. Flynn. 2007. What breaks a leader: The curvilinear relation between assertiveness and leadership. Journal of Personality and Social Psychology 92: 307. [Google Scholar] [CrossRef]
  5. Antón-Sancho, Álvaro, Diego Vergara, and Pablo Fernández-Arias. 2021. Self-assessment of soft skills of university teachers from countries with a low level of digital competence. Electronics 10: 2532. [Google Scholar] [CrossRef]
  6. Artino, Anthony R., Jr., Jeffrey S. La Rochelle, Kent J. Dezee, and Hunter Gehlbach. 2014. Developing questionnaires for educational research: AMEE Guide No. 87. Medical Teacher 36: 463–74. [Google Scholar] [CrossRef]
  7. Babbar, Mansi, and Tushita Gupta. 2022. Response of educational institutions to COVID-19 pandemic: An inter-country comparison. Policy Futures in Education 20: 469–91. [Google Scholar] [CrossRef]
  8. Baker, Courtney N., Haley Peele, Monica Daniels, Megan Saybe, Kathleen Whalen, Stacy Overstreet, and the New Orleans Trauma-Informed Schools Learning Collaborative. 2021. The experience of COVID-19 and its impact on teachers’ mental health, coping, and teaching. School Psychology Review 50: 491–504. [Google Scholar]
  9. Bandura, Albert. 1986. Social Foundations of Thought and Action. Englewood Cliffs: Prentice Hall, pp. 23–28. [Google Scholar]
  10. Bardy, Roland, Arthur Rubens, and Paul Eberle. 2017. Soft skills and job opportunities of migrants: Systemic relationships in the labor market. Business Ethics and Leadership 1: 5–21. [Google Scholar] [CrossRef]
  11. Batista-Foguet, Joan, Alaide Sipahi-Dantas, Laura Guillén, Rosario Martínez Arias, and Ricard Serlavós. 2016. Design and evaluation process of a personal and motive-based competencies questionnaire in Spanish-speaking contexts. Spanish Journal of Psychology 19: E14. [Google Scholar] [CrossRef]
  12. Borge, Marcela, and Emma Mercier. 2019. Towards a micro-ecological approach to CSCL. International Journal of Computer-Supported Collaborative Learning 14: 219–35. [Google Scholar] [CrossRef]
  13. Brennan, Mary Kay. 2017. Innovations in Assessing Practice Skills: Using Social Cognitive Theory, Technology, and Self-Reflection. Ph.D. thesis, University of St. Thomas, Saint Paul, MN, USA. [Google Scholar]
  14. Busch, Holger. 2018. Power motivation. In Motivation and Action. Cham: Springer, pp. 335–68. [Google Scholar]
  15. Chamorro-Premuzic, Tomas, Adriane Arteche, Andrew J. Bremner, Corina Greven, and Adrian Furnham. 2010. Soft skills in higher education: Importance and improvement ratings as a function of individual differences and academic performance. Educational Psychology 30: 221–41. [Google Scholar] [CrossRef]
  16. Cohen, Jacob. 2013. Statistical Power Analysis for the Behavioral Sciences. Routledge eBooks. New York: Academic Press. [Google Scholar] [CrossRef]
  17. Costello, Anna B., and Jason Osborne. 2005. Best practices in exploratory factor analysis: Four recommendations for getting the most from your analysis. Practical Assessment, Research and Evaluation 10: 7. [Google Scholar]
  18. Cusack, Sandra. 2018. Critical educational gerontology and the imperative to empower. In Teaching and Learning in Later Life. New York: Routledge, pp. 61–75. [Google Scholar]
  19. De Pietro, O., N. Altomari, and O. De Pietro. 2019. A tool to measure teachers’ soft skills: Results of a pilot study. Advances in Social Science and Culture 1: 245–57. [Google Scholar] [CrossRef]
  20. de Winter, Joost C. F., Dimitra Dodou, and Peter A. Wieringa. 2009. Exploratory factor analysis with small sample sizes. Multivariate Behavioral Research 44: 147–81. [Google Scholar] [CrossRef]
  21. Denler, Heidi, Christopher Wolters, and Maria Benzon. 2014. Social Cognitive Theory. Available online: https://project542.weebly.com/uploads/1/7/1/0/17108470/social_cognitive_theory__education.com.pdf (accessed on 24 June 2023).
  22. Elliot, Andrew J., and Todd M. Thrash. 2001. Achievement goals and the hierarchical model of achievement motivation. Educational Psychology Review 13: 139–56. [Google Scholar] [CrossRef]
  23. Escolà-Gascón, Álex, and Josep Gallifa. 2022. How to measure soft skills in the educational context: Psychometric properties of the SKILLS-in-ONE questionnaire. Studies in Educational Evaluation 74: 101155. [Google Scholar] [CrossRef]
  24. Fernández-Arias, Pablo, Álvaro Antón-Sancho, Diego Vergara, and Amelia Barrientos. 2021. Soft skills of American University teachers: Self-concept. Sustainability 13: 12397. [Google Scholar] [CrossRef]
  25. Fisher, Robert J., and James E. Katz. 2000. Social-desirability bias and the validity of self-reported values. Psychology and Marketing 17: 105–20. [Google Scholar] [CrossRef]
  26. Ghazali, Ahmad Faiz, Ahmad Kamalrulzaman Othman, Yusnita Sokman, Noor Azrin Zainuddin, Aishah Suhaimi, Nurkhairany Amyra Mokhtar, and Rahmawati Mohd Yusoff. 2021. Investigating social cognitive theory in online distance and learning for decision support: The case for community of Inquiry. International Journal of Asian Social Science 11: 522–38. [Google Scholar] [CrossRef]
  27. Govindaraju, Vimala. 2021. A review of social cognitive theory from the perspective of interpersonal communication. Multicultural Education 7: 1–5. [Google Scholar]
  28. Hair, Joseph F. 2009. Multivariate Data Analysis. Pearson eBooks. London: Pearson Education. Available online: https://ci.nii.ac.jp/ncid/BB03463866 (accessed on 24 June 2023).
  29. Hicks, C., D. Hennessy, and F. Barwell. 1996. Development of a psychometrically valid training needs analysis instrument for use with primary health care teams. Health Services Management Research 9: 262–72. [Google Scholar] [CrossRef] [PubMed]
  30. Jardim, Jacinto, Anabela Pereira, Paula Vagos, Inês Direito, and Sónia Galinha. 2022. The Soft Skills Inventory: Developmental procedures and psychometric analysis. Psychological Reports 125: 620–48. [Google Scholar] [CrossRef] [PubMed]
  31. Jung, Sunho. 2013. Exploratory factor analysis with small sample sizes: A comparison of three approaches. Behavioural Processes 97: 90–95. [Google Scholar] [CrossRef] [PubMed]
  32. Junrat, Sitthisomjin, Chaiwan Jenphop, Rongraung Suravee, and Somprach Kanokorn. 2014. Soft skills for university library staff in Thailand. Procedia—Social and Behavioral Sciences 112: 1027–32. [Google Scholar] [CrossRef]
  33. Kaiser, Henry F. 1974. An index of factorial simplicity. Psychometrika 39: 31–36. [Google Scholar] [CrossRef]
  34. Kelley, Kate, Belinda Clark, Vivienne Brown, and John Sitzia. 2003. Good practice in the conduct and reporting of survey research. International Journal for Quality in Health Care 15: 261–66. [Google Scholar] [CrossRef]
  35. Kim, Young Yun. 2017. Cross-cultural adaptation. In Oxford Research Encyclopedia of Communication. Oxford: Oxford University Press. [Google Scholar] [CrossRef]
  36. Meyers, Lawrence S., Glenn Gamst, and Anthony J. Guarino. 2016. Applied Multivariate Research: Design and Interpretation. Newbury Park: Sage Publications. [Google Scholar]
  37. Mirković, Biljana, Ivana Zečević, and Nela Marinković. 2020. The big five personality traits as determinants of teachers’ achievement motivation. Nastava i Vaspitanje 69: 171–82. [Google Scholar] [CrossRef]
  38. Mokgwane, Pako. 2021. The role of leader sociability on follower functionality: Literature review. Pan-African Journal of Education and Social Sciences 2: 1. [Google Scholar]
  39. Mundfrom, Daniel J., Dale G. Shaw, and Tian Lu Ke. 2005. Minimum sample size recommendations for conducting factor analyses. International Journal of Testing 5: 159–68. [Google Scholar] [CrossRef]
  40. Ning, Huiping. 2011. Adapting cooperative learning in tertiary ELT. ELT Journal 65: 60–70. [Google Scholar] [CrossRef]
  41. Paais, Maartje, and Jozef R. Pattiruhu. 2020. Effect of motivation, leadership, and organizational culture on satisfaction and employee performance. Journal of Asian Finance, Economics and Business 7: 577–88. [Google Scholar] [CrossRef]
  42. Perifanou, Maria, Anastasios A. Economides, and Katerina Tzafilkou. 2021. Teachers’ digital skills readiness during COVID-19 pandemic. International Journal of Emerging Technologies in Learning 16: 238–51. [Google Scholar] [CrossRef]
  43. Perneger, Thomas V., Delphine S. Courvoisier, Patricia M. Hudelson, and Angèle Gayet-Ageron. 2015. Sample size for pre-tests of questionnaires. Quality of Life Research 24: 147–51. [Google Scholar] [CrossRef]
  44. Phillips, Jack, Patti Phillips, and Rebecca Ray. 2020. Proving the Value of Soft Skills: Measuring Impact and Calculating ROI. Alexandria: American Society for Training and Development. [Google Scholar]
  45. Rouquette, Alexandra, and Bruno Falissard. 2011. Sample size requirements for the internal validation of psychiatric scales. International Journal of Methods in Psychiatric Research 20: 235–49. [Google Scholar] [CrossRef] [PubMed]
  46. Shydelko, Anna V. 2017. Emotional stability of an individual: Research into the topic. Science and Education 17: 85–89. [Google Scholar] [CrossRef]
  47. Steenkamp, Jan-Benedict E. M., and Martijn G. De Jong. 2010. A global investigation into the constellation of consumer attitudes toward global and local products. Journal of Marketing 74: 18–40. [Google Scholar] [CrossRef]
  48. Strachota, Elaine M., Simone C. O. Conceição, and Steven W. Schmidt. 2006. An instrument development model for online surveys in human resource development and adult education. New Horizons in Adult Education and Human Resource Development 20: 24–37. [Google Scholar] [CrossRef]
  49. Tabachnick, Barbara G., Linda S. Fidell, and Jodie B. Ullman. 2007. Using Multivariate Statistics. Boston: Pearson, vol. 5. [Google Scholar]
  50. Ting-Toomey, Stella, and Tenzin Dorjee. 2018. Communicating across Cultures. New York: Guilford Publications. [Google Scholar]
  51. Yong, An Gie, and Sean Pearce. 2013. A beginner’s guide to factor analysis: Focusing on exploratory factor analysis. Tutorials in Quantitative Methods for Psychology 9: 79–94. [Google Scholar] [CrossRef]
  52. Zwick, William R., and Wayne F. Velicer. 1986. Comparison of five rules for determining the number of components to retain. Psychological Bulletin 99: 432–42. [Google Scholar] [CrossRef]
Figure 1. Instrument development. Source: the authors.
Table 1. Pearson correlations among different constructs and means (M) and standard deviations (SD) (pre-COVID-19). Source: the authors.

| Construct | Occupational Orientation | Occupational Behaviour | Social Competencies | Psychological Constitution | M | SD |
|---|---|---|---|---|---|---|
| Occupational orientation | 1 | | | | 1.99 | 0.65 |
| Occupational behaviour | 0.596 ** | 1 | | | 1.67 | 0.64 |
| Social competencies | 0.601 ** | 0.850 ** | 1 | | 1.82 | 0.75 |
| Psychological constitution | 0.622 ** | 0.751 ** | 0.824 ** | 1 | 1.91 | 0.76 |

** Correlation is significant at the 0.01 level (2-tailed). N = 50.
Table 2. Pearson correlations among different constructs and means (M) and standard deviations (SD) (during COVID-19). Source: the authors.

| Construct | Occupational Orientation | Occupational Behaviour | Social Competencies | Psychological Constitution | M | SD |
|---|---|---|---|---|---|---|
| Occupational orientation | 1 | | | | 2.68 | 0.78 |
| Occupational behaviour | 0.421 ** | 1 | | | 2.24 | 0.65 |
| Social competencies | 0.380 ** | 0.445 ** | 1 | | 2.34 | 0.60 |
| Psychological constitution | 0.056 | 0.110 | 0.311 * | 1 | 2.83 | 1.04 |

** Correlation is significant at the 0.01 level (2-tailed). * Correlation is significant at the 0.05 level (2-tailed). N = 50.
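The coefficients in Tables 1 and 2 are standard Pearson product-moment correlations between respondents' construct scores. As a minimal illustrative sketch (not the authors' analysis code, which is not published), the coefficient for two equal-length score vectors can be computed as:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length score lists."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    # Unnormalised covariance and variances; the 1/n factors cancel in the ratio
    sxy = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sxx = sum((a - mean_x) ** 2 for a in x)
    syy = sum((b - mean_y) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Perfectly aligned ratings correlate at 1.0; perfectly reversed ratings at -1.0
print(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]))  # 1.0
print(pearson_r([1, 2, 3, 4], [4, 3, 2, 1]))  # -1.0
```

In practice a library routine such as `scipy.stats.pearsonr` would also supply the two-tailed p-values that underlie the significance markers in the tables.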
Table 3. EFA-based factor loadings with oblimin rotation (pre-COVID-19). Source: the authors.

| Item | Factor 1 | Factor 2 | Factor 3 | Factor 4 |
|---|---|---|---|---|
| 2 | 0.679 | | | |
| 6 | 0.640 | | | |
| 4 | 0.603 | | | |
| 10 | | 0.991 | | |
| 12 | | 0.690 | | |
| 8 | | 0.554 | | |
| 14 | | | 0.905 | |
| 16 | | | 0.863 | |
| 20 | | | 0.819 | |
| 22 | | | 0.796 | |
| 18 | | | 0.740 | |
| 26 | | | | 0.896 |
| 24 | | | | 0.843 |
| 28 | | | | 0.770 |
Table 4. EFA-based factor loadings with oblimin rotation (during COVID-19). Source: the authors.

| Item | Factor 1 | Factor 2 | Factor 3 |
|---|---|---|---|
| 25 | 0.835 | | |
| 27 | 0.777 | | |
| 29 | 0.710 | | |
| 23 | 0.493 | | |
| 5 | 0.437 | | |
| 19 | 0.335 | | |
| 15 | | 0.900 | |
| 21 | | 0.845 | |
| 17 | | 0.618 | |
| 11 | | 0.368 | |
| 3 | | | 0.851 |
| 7 | | | 0.538 |
| 9 | | | 0.444 |
Table 5. Scale reliability coefficient alphas for the constructs. Source: the authors.

| Construct (N = 50) | Item | Cronbach's Alpha (pre-COVID-19) | Cronbach's Alpha (during COVID-19) |
|---|---|---|---|
| Occupational Orientation | Achievement motivation | 0.753 | 0.776 |
| | Leadership | | |
| | Influence/impact | | |
| Occupational Behaviour | Adaptability | 0.721 | 0.748 |
| | Implementing decisions | | |
| | Conscientiousness | | |
| Social Competencies | Social sensitivity | 0.723 | 0.772 |
| | Interpersonal communication | | |
| | Teamwork | | |
| | Sharing opinions and ideas with others | | |
| | Sociability/collegiality | | |
| Psychological Constitution | Emotional robustness | 0.743 | 0.851 |
| | Working and coping under pressure | | |
| | Self-confidence | | |
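The reliability coefficients in Table 5 are Cronbach's alphas, computed from the item variances and the variance of the summed scale score: α = k/(k − 1) · (1 − Σs²ᵢ/s²ₜₒₜₐₗ). A minimal sketch of this calculation, using stdlib sample variances and made-up Likert responses (the data below are illustrative, not the study's):

```python
from statistics import variance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a list of item-score lists
    (one list per item, one entry per respondent)."""
    k = len(item_scores)
    # Per-respondent totals across all k items of the scale
    totals = [sum(resp) for resp in zip(*item_scores)]
    sum_item_var = sum(variance(item) for item in item_scores)
    return k / (k - 1) * (1 - sum_item_var / variance(totals))

# Three hypothetical Likert items answered by five respondents
items = [
    [3, 4, 2, 5, 4],
    [3, 5, 2, 4, 4],
    [2, 4, 3, 5, 5],
]
print(round(cronbach_alpha(items), 3))  # 0.886
```

Values of 0.7 or higher, as reported for all four constructs in Table 5, are conventionally taken to indicate acceptable internal consistency.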

Share and Cite

MDPI and ACS Style

Al-Sa’di, A.; Yamjal, P.; Ahmad, E.; Panjabi, R.; Allott McPhee, C.; Guler, O. Assessing Educators’ Soft Skills: Developing a Self-Assessment Instrument. Adm. Sci. 2023, 13, 208. https://doi.org/10.3390/admsci13090208

