Article

Towards a Sustainable Multidimensional Approach to English Proficiency Proof in the Post-Pandemic Era: Learning from the Legacy of COVID-19

Mingwei Pan and Jian Tao
1 Shanghai Centre for Research in English Language Education, Shanghai International Studies University, Shanghai 200083, China
2 School of Foreign Studies, Shanghai University of Finance and Economics, Shanghai 200433, China
* Author to whom correspondence should be addressed.
Sustainability 2022, 14(9), 5568; https://doi.org/10.3390/su14095568
Submission received: 7 April 2022 / Revised: 24 April 2022 / Accepted: 3 May 2022 / Published: 5 May 2022
(This article belongs to the Special Issue Towards Sustainable Language Learning and Teaching)

Abstract

The COVID-19 pandemic has had unprecedented impacts on higher education, particularly on the admission of international applicants required to submit English proficiency proof (EPP). To learn from the legacy of COVID-19, this study examines the changing EPP-related practices of 77 top-ranking higher education institutions (HEIs) across two application rounds in 2020 and 2021. Given that HEIs function as test users in the admission process, we adopted the construct of test utilisation to inform data analysis. Findings suggest a trend of HEIs embracing newly emerging at-home online tests and combining traditional standardised tests with other options, such as online interviews and/or in-house placement or diagnostic tests; such approaches also extended the enrolment timeline. The study thus demonstrates the enriched scope of test utilisation and proposes a sustainable multidimensional approach, featuring digital and temporal elements, to EPP for higher-education admissions in the post-pandemic era. Practical implications for university admissions are also discussed.

1. Introduction

The COVID-19 pandemic has caused sweeping global societal changes in which individuals' lives have become 'tele-everything' and tech-driven under new social norms [1]. Education and academia, as crucial components of life, were not immune to the unprecedented disruptions caused by the pandemic [2,3,4]. In particular, due to lockdown and social-distancing requirements, higher-education institutions (HEIs) began to deliver virtual classes or experienced temporary shutdowns (see the Higher Education Research and Development 2020 special collection of essays reflecting on the global impact of COVID-19). The impact also extended to universities' admission practices due to the cancellation of high-stakes sit-in tests, such as the International English Language Testing System (IELTS) and the Test of English as a Foreign Language (TOEFL). Consequently, international applicants intending to study in English-as-the-medium-of-instruction (EMI) contexts were forced to submit alternative forms of English proficiency proof (EPP) that HEIs would accept [5]. To resolve this issue and for the sake of sustainable development, HEIs chose to accept new online versions of standardised tests, such as the at-home TOEFL [6] and IELTS [7]. Some HEIs also altered their methods of delivering university-wide tests; for example, Iowa State University delivered its English placement test in an outdoor location with test takers and raters wearing masks, face shields, and gloves (see the Language Assessment Quarterly 2021 special issue The quest to maintain language assessment quality during a pandemic: Stories from higher education). In addition, the pandemic created possibilities for change in higher education, including the increased digitalisation of teaching and testing [4,8]. Given the pandemic's impact and the opportunities it created, this study explores admission requirements and examines the changing practices surrounding the submission of EPP. For the sustainable development of higher education in the post-pandemic era, a multidimensional approach is then proposed and discussed as our lesson from the pandemic.
As HEIs often become test users during the admission process, test utilisation [9] has emerged as a highly relevant issue. Because international applicants were unable to access the required tests during COVID-19, HEIs risked delivering inequitable admission decisions if they focused solely on test scores. To avoid this potential inequity, HEIs were expected to adapt their utilisation of language tests or introduce new practices to assess international applicants during the pandemic. Thus, we find it imperative to examine the changing practices of HEIs as test users during COVID-19. With the aim of learning from the legacy of COVID-19, this study offers insight into the practices of higher-education admissions in the post-pandemic era. To examine the trend, the study involves a two-year time frame and analyses documents from the official websites of 77 top-ranking EMI HEIs in 2020 and 2021. The two research questions (RQs) below are addressed:
RQ1: How have HEIs changed their practices of using English tests in response to COVID-19?
RQ2: How have HEIs changed their practices of combining different EPP sources in response to COVID-19?

2. Literature Review

This study focuses on the changing practices of HEIs as test users during the COVID-19 emergency. Thus, we first review the role of HEIs as test users in creating EPP-related policies, followed by the role that language tests play among various EPPs in higher-education admissions. Then, we turn to the construct of test utilisation [9] that this study adopts.

2.1. Role of HEIs as Test Users in Admissions

While HEIs do not set admission requirements alone, they often play a decisive role in requiring language tests for the admission of international students. HEIs must follow nationwide policies set by governments, which are viewed as macro-level actors [10]. For example, the Australian government released a set of criteria to regulate EPP requirements and language support in Australian universities [11]. In Britain, the British Council similarly released the Inspection Criteria for British Council Accreditation to regulate the language practices and management of EMI universities, although these criteria apply only to HEIs that voluntarily join the council. Few additional government-imposed regulatory measures exist, indicating that HEIs themselves act as the true decision makers in 'develop(ing) explicit language policies in response to a changing language context' [12] (p. 231). For admission decisions, HEIs often rely on language test scores to stipulate admission requirements; test utilisation is thus a highly relevant issue.
However, applying score-based EPP stipulations across HEIs also invites problems. Most HEIs claim to set their minimum TOEFL/IELTS scores on the basis of the level of English proficiency that prospective students need to study in an EMI context. However, previous research indicates that HEIs as test users may possess varying levels of knowledge about standardised tests and the meaning of their scores [13,14,15,16]. Stakeholders may also have reservations about the use of IELTS scores to predict prospective students' academic performance [17]. This issue can be exacerbated when admission policy makers are given limited time to decide whether to adopt new tests in emergencies such as the pandemic.
Moreover, previous research has problematised the use of traditional standardised tests as the sole source of EPP. For example, HEIs may admit students of mixed proficiency levels when multiple language tests are accepted without a clear equivalence among them [18]. Consequently, HEIs' EPP criteria may sometimes select applicants with insufficient English proficiency or communicative competence in an academic context [9]. Students at EMI HEIs have also been reported to be weak in academic skills [19,20]. With standardised tests temporarily unavailable during the pandemic, it may be high time for HEIs to adjust their EPP-related policies and innovate their utilisation of tests.

2.2. EPPs in University Admissions

The previous literature on EPPs primarily runs in two strands. First, for many years, HEIs accepted TOEFL and/or IELTS scores as the major, if not the only, type of EPP [21,22]. However, stipulating minimum entry requirements is difficult because of the dilemma between educational standards and business imperatives [18]. When the low quality or low success rates of graduates reflect negatively on the quality of an HEI's education, the HEI may consider raising its language entry requirements; this ensures that only those with sufficient language proficiency are admitted, thereby increasing their chances of academic success [23]. However, if an HEI's language requirement is set too high, it may receive fewer eligible applicants and hence face financial and managerial pressure [24]. Therefore, some HEIs offer presessional language courses for applicants whose language test scores fall slightly below the expected minimum requirements to better prepare them for EMI education [25]. Such a compromise may be reasonable given the scepticism that still surrounds the appropriate use of test scores and the practice of using standardised tests as linguistic gatekeepers or predictors of academic success [22].
The other strand of research concerns forms of EPP other than language test scores. Oliver, Vanderford, and Grote surveyed Australian universities about their English language entry requirements for students from non-English-speaking backgrounds [22]. The survey revealed that, although standardised tests such as TOEFL and IELTS still topped the list of acceptable EPPs, some Australian universities also accepted other types of proof. For example, one non-score-based proof was EMI learning experience in postsecondary schooling, followed by intensive academic English courses and similar experiences.
Apart from those pre-enrolment qualifications, HEIs also tend to use in-house assessments, such as post-enrolment language assessment (PELA) [26,27]. Compared with large-scale standardised language tests, PELA may serve many purposes, an important one being to provide precise feedback on newly enrolled students' English proficiency or needs in the target discipline. One example, the Diagnostic English Language Needs Assessment [28], shows that language education based on PELA diagnostic feedback can be highly effective and discipline-specific after enrolment [29].
The above research demonstrates that EPPs are not limited to standardised test scores or pre-enrolment requirements; this idea has important implications for HEIs’ practices in adapting their EPP-related practices during COVID-19.

2.3. Test Utilisation for University Admissions

Since standardised language tests are often associated with high-stakes educational decisions, HEIs' efforts to accommodate test takers became more important during COVID-19. Isbell and Kremmel [30] indicated that test users, while prioritising the sustainable development of higher education, may have concerns about selecting at-home tests, particularly regarding their new test construct, which allows for more interactivity, multimodality, and authentic representation of lifelike communication than paper-based tests do. Despite these advantages, however, concerns about the fairness of online tests are even more pronounced [31,32,33]. Test takers have had varying access to tests during the pandemic, which greatly affected HEIs' decisions about which tests to use and how to use them.
The extant literature has established that test fairness depends not only on the tests themselves but also on how they are utilised by HEIs. The Standards for Educational and Psychological Testing regard test fairness as a crucial test quality, linking it directly to validity [34]. Kunnan proposed test fairness as an overarching quality that encompasses all aspects of a test, including test validity, absence of bias, access to the test, conditions of test administration, and social consequences [35]. Among these, social consequences, a broad notion, can be narrowed down to the washback or impact of language tests on learning and teaching. Such consequences are particularly relevant to higher-education admissions and were more explicitly conceptualised as test utilisation in Xi's work [9]. Taking a systematic approach, Xi defined test fairness as 'comparable validity for all relevant groups' of different backgrounds [9]. In other words, inferences related to domain definition, evaluation, generalisation, explanation, extrapolation, and utilisation are established and interconnected to create an overall argument for test fairness. Specifically, test utilisation, commonly understood as how a test is used for various purposes, draws connections between 'score-based interpretations and test use' in ways that ensure 'test scores and other information provided to users are relevant, useful and sufficient for evaluating the adequacy of international students' English proficiency' [9] (p. 157). By 'relevant', test users are supposed to be clear about the test construct, which should closely reflect the English language ability required in an academic context. Likewise, test scores are expected to be reported in a 'useful' manner, for example, by reporting subscores for different language skills rather than a total score alone. 'Sufficient' is reflected in the meanings attached to test scores, with detailed descriptions of what prospective candidates can do with the English language. With adequate information, test users are able to interpret test (sub)scores and use tests appropriately. In this study, test users may refer to test scores for multiple purposes, such as admission, coursework provision, and teaching assistant selection (see Figure 1). That said, whether a test is used as intended depends on the appropriate use of test scores and other information provided to test users; in this context, HEIs act as test users who must decide whether the information is relevant, useful, and sufficient for evaluating the adequacy of international applicants' English proficiency in an EMI context.
Drawing on the above literature, this study views test utilisation as essential for HEIs in making admission decisions. Informed by Xi’s construct [9], as illustrated in Figure 1, this study divides the use of test scores by HEIs into three categories: (1) Type 1: decisions about admission; (2) Type 2: providing appropriate English support to international students; and (3) Type 3: selecting international teaching assistants. These three types of test utilisation informed the data analysis in this study (see Section 3.2).
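To make the taxonomy concrete, the sketch below renders the three utilisation types as a small data structure of the kind one might use when operationalising the coding scheme. This is our illustration only (the study itself coded data in NVivo), and the class name and example policy statements are hypothetical.

```python
from enum import Enum

class TestUtilisation(Enum):
    """Three uses of test scores by HEIs, following Xi's construct [9]."""
    ADMISSION = 1        # Type 1: decisions about admission
    ENGLISH_SUPPORT = 2  # Type 2: providing appropriate English support
    TA_SELECTION = 3     # Type 3: selecting international teaching assistants

# Hypothetical policy statements mapped onto the taxonomy for illustration.
examples = {
    "A minimum DET score of 130 is required for entry.": TestUtilisation.ADMISSION,
    "Scores below 130 require additional ESL coursework.": TestUtilisation.ENGLISH_SUPPORT,
    "The DET cannot satisfy the requirement for teaching assistants.": TestUtilisation.TA_SELECTION,
}

for statement, utilisation in examples.items():
    print(f"Type {utilisation.value}: {statement}")
```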

3. Methodology

We conducted a qualitative study analysing the changing EPP-related admission practices of 77 HEIs across two application years. In addressing the RQs, we collected data from the official websites of HEIs’ admissions offices in 2020, when they were first affected by the pandemic outbreak, and in 2021, when they continued to be affected. We drew on the construct of test utilisation (see Figure 1) to facilitate systematic data analysis on EPP-related practices.

3.1. Data Collection

Purposive sampling was adopted to collect policy documents from 77 HEIs on the basis of several criteria [36]. We began with the top 100 institutions in the 2019 QS ranking because it was then the most recently updated international ranking approved by the International Ranking Expert Group after COVID-19 took hold. We included only EMI institutions that required EPP submissions, ultimately targeting 77 HEIs (see Appendix A).
Subsequently, we searched for EPP-related documents on the websites of the chosen HEIs, which yielded textual data totalling more than 120,000 words. Regarding the timeline of data collection, we carefully considered the most significant intervening factor in this study, namely COVID-19. The 2020 data collection period thus spanned 11 March to 15 May 2020. We began data collection on 11 March 2020 because, on that date, the World Health Organisation escalated COVID-19 from an epidemic to a pandemic, a classification that was still in effect globally at the time of writing. Since it takes time for HEIs to respond and update their EPP policies, we chose 15 May 2020 as the end of the collection period, as most of the surveyed HEIs closed their applications around that date. The 2021 data, covering the same 77 HEIs, were collected following the same criteria and procedures. The two sets of policy data were then collected for analysis.

3.2. Data Analysis

We used NVivo 12 to assist with data analysis, which was both bottom-up and theory-informed. Bearing RQ1 in mind, we started with a bottom-up approach, performing open coding on all relevant data, since we initially did not have a clear idea of which practices were specific responses to the pandemic. As coding continued, we found keywords such as 'temporarily' and 'interim' helpful in flagging COVID-19-related practices, which narrowed the scope of coding. We identified the varieties of language tests used by HEIs and then categorised them into two major groups: international and regional standardised tests. Within each group, we used descriptive statistics to analyse the use of specific tests, some of which were used only conditionally (see Figure 2 and Figure 3).
Informed by the construct of test utilisation [9], we continued with content analysis [36] to examine how those tests were utilised (RQ2). Building on the open coding of the test data, we conducted axial coding and adopted the three types of test-utilisation purposes [9] (see Figure 1) to generate the analytic codes. For example, Extract 1 relates to how the Duolingo English Test (DET), a newly emerged online test, was used together with an additional in-house test or post-enrolment coursework; this example falls within Type 2 test utilisation [9], or providing appropriate language support (text underscored, followed by HEI number and year of data collection).
(1) The minimal Duolingo English Test score to be eligible for admission to… However, applicants who are selected for admission but have a score below 130 may be required to complete additional English language testing or coursework (76/2020).
Axial coding revealed that the majority of the open codes could be interpreted as using scores for admissions (Type 1 test utilisation) and for ESL coursework or language support (Type 2 test utilisation), while very few codes pertained to selecting teaching assistants (Type 3 test utilisation). While the tests were employed for different purposes, they were also used in combination with other forms of assessment; hence, patterns of using multiple assessments were generated as well. For example, Extract 1 above was also coded as 'new online test used conditionally with PELA'. All coding was completed independently by two researchers and one research assistant, and any disagreements were resolved through discussion.
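As a minimal sketch of the tallying step, the code below shows how axially coded extracts could be aggregated into descriptive counts by utilisation type and assessment pattern. The coding itself was done in NVivo 12, so this reimplementation is illustrative only; apart from the first record, which mirrors Extract 1, the sample codes are hypothetical.

```python
from collections import Counter

# Each record: (HEI number, year, utilisation type, assessment pattern).
# The first record mirrors Extract 1; the others are hypothetical.
coded_extracts = [
    (76, 2020, "Type 2", "new online test used conditionally with PELA"),
    (54, 2021, "Type 1", "newly emerged online test used unconditionally"),
    (44, 2021, "Type 1", "newly emerged online test combined with TOEFL/IELTS"),
]

# Descriptive statistics: frequency of each utilisation type and pattern.
type_counts = Counter(utilisation for _, _, utilisation, _ in coded_extracts)
pattern_counts = Counter(pattern for _, _, _, pattern in coded_extracts)

print(type_counts)     # e.g., Counter({'Type 1': 2, 'Type 2': 1})
print(pattern_counts)
```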

4. Results

4.1. Changing Practices of Using English Tests

4.1.1. Embracing Online Tests and Extending Test Validity Period

Document analysis reveals one major trend: the rise of online testing, in the form of either traditional or newly emerged tests. For example, for sustainable development, more HEIs turned to online standardised tests such as the TOEFL Home Edition and the IELTS Indicator, both newly adapted online versions of traditional tests. As shown in Figure 2, most of the HEIs in 2020 accepted TOEFL Home Edition (N = 56) and IELTS Indicator (N = 55) scores as EPPs. Moreover, the number of HEIs accepting these two standardised tests increased in 2021 (TOEFL N = 65, IELTS N = 56), indicating wider acknowledgement of online tests among HEIs. This finding reveals that online tests, although new in their form of delivery, were actively used for admission decisions (Type 1 test utilisation) [9].
However, the use of online tests also presents a nuanced picture. First, not all of the HEIs were open to online testing. Slightly more HEIs rejected the TOEFL ITP Plus, which combines a paper-based test with an unscored speaking video interview (for details, see https://www.ets.org/s/cv/toefl/institutions/itp-plus-china/, accessed on 7 May 2021), in 2021 (N = 6) than in 2020 (N = 4). Similarly, although only two HEIs rejected the IELTS Indicator in 2020, four more HEIs explicitly stated their rejection of this new form of IELTS in 2021. Moreover, HEIs did not provide reasons for dismissing the TOEFL ITP Plus or the IELTS Indicator as EPPs, as illustrated in Extracts 2 and 3. This might be because HEIs as test users find it difficult to reach valid admission decisions using these new test forms, given that online tests may not be completely comparable with traditional sit-in tests [6,30,31,32].
(2) Any other proficiency tests, exams, and/or other measurement other than the TOEFL or IELTS are not accepted. This includes the IELTS Indicator test and the TOEFL ITP Plus (9/2020, 2021).
(3) For students applying for the 2021–2022 academic year, we are unable to accept the IELTS Indicator and TOEFL ITP Plus at this time (68/2021).
Another major trend relates to the use of newly emerged online tests. These tests, such as the DET, were originally designed to be taken at home and rarely appeared in HEIs' EPP lists before the pandemic. As shown in Figure 2, the DET grew in popularity over the two years of our data collection, as the number of HEIs accepting it as a new type of EPP doubled from 2020 to 2021. Among these HEIs, we also identified two ways of using the DET: unconditionally or conditionally. In 2021, 31 HEIs, compared with 12 in 2020, used the test unconditionally; that is, they treated the DET score as equal to TOEFL/IELTS scores, as shown in the extracts below.
(4) TOEFL, IELTS, Duolingo English Test … are recommended for applicants whose primary language is not English or students who had not attended an English language school for the last three years (20/2021).
(5) The Duolingo English Test will be accepted as an alternative to TOEFL and IELTS for proof of English proficiency for impacted applicants (54/2021).
Extracts 4 and 5 exemplify how the DET enjoys equal status with other traditional standardised tests and is even recommended as an alternative form of EPP. This example falls under Type 1 test utilisation of choosing a newly emerged standardised test for admission decisions [9].
Despite the increased acceptance of the DET, some HEIs elected to conditionally use the test, that is, its use was accompanied by additional conditions such as presessional coursework. As such, several HEIs treated the DET as a source of supplemental EPP information. For example, Extracts 6 and 7 reveal that the DET was not utilised for immediate admission decisions.
(6) The university approved additional English language tests, including TOEFL iBT Special Home Edition and Duolingo (for entry into the presessional English language course only) (45/2020).
(7) For the 2021 spring cohort only, the Duolingo test will be accepted for registration of presessional English courses (37/2021).
Extracts 6 and 7 indicate that the DET was used only to evaluate whether presessional language support would be appropriate. Both extracts fall within Type 2 test utilisation [9].
In addition to their aforementioned changing practices of test utilisation, the HEIs also shifted their focus from postponing the EPP submission deadline in 2020 to extending the validity of applicants’ existing EPPs in 2021. In 2020, 21 HEIs ‘extended’ their application deadlines, as Extract 8 shows. However, in 2021, instead of postponing EPP submission deadlines, approximately half of the surveyed HEIs ‘temporarily’ extended the validity period(s) of TOEFL/IELTS, as shown in Extract 9.
(8) In light of pressure resulting from COVID-19… the university extended certain academic deadlines for candidacy (19/2020).
(9) The validity of IELTS, TOEFL … was temporarily extended from two to three years (33/2021).
On the basis of the above findings, a general trend is that HEIs are increasingly accepting online tests, including at-home versions of standardised tests and newly emerged ones, for admission decisions. While the popularity of online tests is on the rise, the use of the DET presents a more nuanced picture. Tests such as the DET may only be used for decisions on language support or in combination with additional assessments. In addition, the HEIs changed their approach to EPPs by extending the validity periods of test scores.

4.1.2. Promoting In-House Tests and Accepting Regional Tests

Apart from using online standardised tests, the HEIs also invigorated their own in-house tests. These tests, whose inception predates the pandemic, were used alongside online standardised tests and were thus given a greater role during the pandemic.
In-house tests are primarily used in two ways. First, as placement tests, they are used to place international students into different language courses when their TOEFL/IELTS scores fall slightly below the expected thresholds; in other words, they are applied in combination with online standardised tests. Extracts 10 and 11 offer examples of HEIs using in-house tests to determine whether applicants with borderline English proficiency need to take 'recommended' ESL (10) or 'presessional' courses (11). This can be useful when HEIs would otherwise base admission decisions on online tests alone, since online tests, particularly TOEFL/IELTS, are not identical to their sit-in counterparts in test construct, which could raise concerns of construct irrelevance [34]. Therefore, in-house tests are combined with online standardised tests for the Type 2 test utilisation of determining ESL coursework or language support [9].
(10) In certain circumstances, with programme approval, admission may be granted with the following scores. However, in order to be admitted, an English assessment test is required upon arrival, and any recommended English as a Second Language (ESL) course must be successfully completed in your first semester (44/2020, 2021).
(11) Your whole test will be reviewed to provide us with a holistic assessment of your English language proficiency and to determine whether you are eligible for direct entry or for one of our presessional English courses. If deemed necessary, an additional inhouse assessment may be required (72/2021).
Second, in-house tests also diagnose students' post-enrolment English proficiency in an academic context, similar to PELA, as reviewed above (see Section 2.2). Extracts 12 and 13 show that, in addition to the EPPs they have provided, students are also required to take a diagnostic assessment or an English evaluation test to identify their 'current level of academic English skills'.
(12) If you are enrolling in a bachelor’s degree programme …, you may need to meet the Academic English Language Requirement as screened in the diagnostic assessment.
(13) … newly admitted international graduate students should take the English Evaluation Test (EET) as a means of determining the current level of academic English skills. The EET provides new students and their advisors with information to help them prepare for success in their graduate activities… (1/2021)
The above extracts both fall into the Type 2 test utilisation of using tests to determine the necessity of language support. Although this approach is not new, HEIs preferred to combine in-house tests with at-home standardised tests during the pandemic, because in-house tests are intended to enhance students' academic success in a more discipline-specific context.
In addition to using self-developed tests, the HEIs also expanded the inventory of accepted regional tests, as illustrated in Figure 3. Over the two years, an increasing number of HEIs accepted regional or nationwide tests, such as the International Baccalaureate, the Michigan English Test, and the Trinity English Test, which are not globally accessible. One exception is the College English Test Band 6 (CET-6), administered in Mainland China. In 2020, one UK-based and two Hong Kong-based HEIs accepted the CET-6 as an alternative EPP, possibly because applicants could not access IELTS/TOEFL. However, in 2021, only one HEI continued to accept this test, a sign of the interim use of this specific regional test.
As such, in addition to online tests, HEIs also employ their own in-house tests for placement and diagnostic purposes. These tests are used either independently or alongside standardised test scores to assess students' academic language proficiency. Other existing regional tests also gained popularity as EPP sources diversified during the pandemic, while the CET-6 was largely discontinued in 2021.

4.2. Changing Practices of Combining Different EPP Sources

Building upon the above findings on the changing types of EPP, data analysis revealed that multiple types of assessment are increasingly combined. HEIs are inclined to combine standardised tests with other alternatives for admission decisions, particularly in 2021. Three patterns were identified: (1) TOEFL/IELTS with newly emerged online tests, (2) TOEFL/IELTS with online interviews, and (3) newly emerged/regional tests with presessional coursework.
The first pattern highlights the use of multiple online tests in the digitalised era. For example, Extract 14 reveals that the DET was adopted as a temporary EPP solution and that applicants are still required to submit TOEFL/IELTS scores after admission. Such a practice might be interpreted as HEIs’ cautious approach to newly emerged online tests such as the DET.
(14) …accepts the DET as supplemental information but it does not fulfil the TOEFL/IELTS requirement. Candidates are still required to submit their TOEFL/IELTS score afterwards (44/2021).
The second pattern features the use of online interviews combined with at-home TOEFL/IELTS. Extract 15 showcases an HEI’s recommendation that applicants submit an online interview recording through ‘a third-party provider’.
(15) …applicants can use the TOEFL Home Edition... in support of their admission application, non-native English speakers can also choose to submit a recorded, unscripted interview through a third-party provider as supplementary information (52/2021).
Both of the above patterns belong to Type 1 test utilisation. In some cases, however, the two combinations coexist in HEIs' EPP practices, but for different purposes. Extract 16 shows that, compared with TOEFL/IELTS, the DET was used as an expedient for selecting teaching assistants, which illustrates Type 3 test utilisation [9].
(16) The DET cannot be used to satisfy the English proficiency requirement for teaching assistants. Students who submit a Duolingo score for admission need to either take the TOEFL/IELTS at a later date or take the English Proficiency Interview (EPI) (54/2021).
Extract 16 also illustrates that the HEI took extra measures to remedy the shortcomings of the DET when selecting teaching assistants, who were required to have a higher level of oral English proficiency. Thus, the HEI collected additional information by either requesting TOEFL/IELTS scores or interviewing candidates after the test.
In keeping with the third pattern, Extract 17 indicates that the HEI asked applicants with unsatisfactory DET scores to enrol in a presessional course. Likewise, Extract 18 shows that, although the HEI accepted the CET-6, it was not used as a straightforward entry requirement but as a way to evaluate the applicants’ English proficiency and determine presessional language support.
(17) Applicants with a DET score below 105 are required to take Academic English Study as presessional module (65/2021).
(18) …we accept the following additional tests as evidence of English language proficiency: … Chinese College English Test (note that this must be accompanied by Presessional English Programme) (58/2020).
Given that HEIs as test users may not be familiar enough with newly emerged and regional tests to employ them for admission decisions, combining the test scores with coursework may be a safer solution to ensure that international students possess adequate language proficiency in an EMI context.
In summary, HEIs favour collecting various sources of EPP. In particular, when sit-in tests are suspended while some newly emerged or regional tests are available, HEIs tend to combine them into a flexible approach of requesting EPPs. Even when only newly emerged online tests are available, the HEIs can supplement them with interviews or presessional coursework.

5. Discussion

To learn from the legacy of COVID-19, this study has revealed HEIs' changing practices of using English tests and their tendency to combine different EPPs. Regarding RQ1, most of the surveyed HEIs embraced online tests, and most accepted at-home versions of TOEFL/IELTS as alternative solutions. However, the IELTS Indicator was rejected by several more HEIs in 2021 than in 2020. This is likely because, unlike the TOEFL Home Edition, which is almost identical to the TOEFL iBT except for its proctoring process [7], the IELTS Indicator, as its name suggests, serves only indicative purposes [6,32]; in other words, its potentially new construct might alarm test users. These patterns could also stem from the controversy surrounding the use of TOEFL/IELTS as the sole source of EPP [13,14]. When the pandemic forced testing companies to move their tests online, these tests might have been regarded as a less sound predictor of English proficiency, an idea corroborated by existing studies [14,15,16,17]. Nevertheless, given the unprecedented pandemic, a large number of HEIs chose to accept online standardised tests. This may also be understood as a solution to the financial and managerial pressure of sustaining higher education into the post-pandemic era [18].
The digitalisation trend is also reflected in the fact that the DET, a representative newly emerged online test, played a more significant role as a source of EPP in 2021 than in 2020. While some HEIs remain sceptical of its use, the DET was accepted by a growing number of HEIs in 2021. This might be attributed not only to its vigorous publicity efforts over the past year and an updated test review [37] but also to its strength of having been an online test since its inception. After the pandemic is over, owing to concerns surrounding test access, standardised tests may not be restored identically to their past versions. Instead, it is likely that, for sustainable HEI development, more standardised tests will go online [4], while more effort is needed to ensure the comparability of sit-in and online tests in the digitalised era [30,31].
Regarding timeline decisions, HEIs shifted from postponing EPP submission deadlines to extending the validity periods of test scores. This can be viewed as a practice shift: in 2020, the HEIs seemed to passively wait and see how the pandemic would unfold, whereas in 2021, they proactively extended test validity periods. In doing so, HEIs implemented new policies in response to a changing context [9]. As the pandemic persists, HEIs attempt to prioritise their sustainable development, thus leaving considerable room for timeline adjustment.
Apart from online standardised tests, HEIs also applied their own in-house tests, used more regional tests, and even combined various sources of EPP. While using in-house and regional tests is not new, combining these tests with online standardised tests can better inform HEIs' test utilisation. One benefit is that in-house tests can work in concert with online standardised tests for admission decisions, since using standardised tests alone as gatekeepers may be controversial [10]. Another is that some in-house tests, such as PELA, can help admissions staff become better informed by assessing students' discipline-specific academic literacy and communicative skills in an EMI context [27,28]. This practice may be viewed as conducive to the sustainable development of higher education, as it extends the quality control of prospective students by setting thresholds at both the admission and post-enrolment stages.
During COVID-19, HEIs combined online standardised tests, especially at-home TOEFL/IELTS, with newly emerged tests such as the DET or with online interviews. In some cases, they also combined newly emerged or regional tests with presessional coursework. Oliver, Vanderford, and Grote suggested that non-score-based evidence such as coursework can play a role in admission decisions [22]. In an EMI context, prior coursework can even better prepare students [25]. Therefore, on the basis of the findings, we adapted the construct of test utilisation [9] to propose a sustainable multidimensional approach to test utilisation for higher-education admissions in the post-pandemic era. This approach features added digital and temporal elements. In essence, for the sustainable development of higher education, a one-time standardised test score may no longer be the sole source of admission decision making. Instead, higher education admissions should be approached with increased flexibility both before and after enrolment in the digitalised era.
As illustrated in Figure 4, in the post-pandemic era, this sustainable multidimensional approach features a temporal dimension so that HEIs' decision-making process can unfold over a longer period and an applicant's chance of admission does not rest on a single opportunity. Specifically, the approach involves pre- and post-enrolment assessments to provide a more precise picture of applicants' academic English skills in an EMI context. International standardised tests, such as the TOEFL and the IELTS (upper left box in Figure 4), continue to play a pivotal role as EPP options. However, different EPP combinations for different test utilisations are also available. First, combined with TOEFL/IELTS, the requirement of oral interviews or regional tests can better screen prospective students, leading to more informed admission decisions; this is consistent with the first type of test utilisation [9], namely, admission decisions. Second, the combined use of in-house tests with TOEFL/IELTS or regional tests can connect students with more relevant academic and language support in various discipline-specific studies, which relates to providing appropriate English support to international students, i.e., Type 2 test utilisation [9]. Third, conducting oral interviews may remedy the deficits of newly emerged tests or provide an authentic context for HEIs to select teaching assistants with the required oral proficiency (Type 3 test utilisation [9]). By extending the quality control of prospective students both before and after enrolment, this approach helps ensure the sustainable development of higher education. In addition, the proposed approach is conducive to enhancing the test fairness of the EPPs provided by international applicants.
Figure 4 also illustrates another major trend regarding EPPs: in the technology-driven era, more HEIs embrace the use of online tests, whether at-home TOEFL/IELTS or newly emerged online tests such as the DET. This is primarily because the pandemic, while accelerating the digitalisation of teaching and testing [2,3,32], has also created greater public reliance on digital tools [1]. Given the collective embrace of 'tele-everything', HEIs are likely to opt for more diversified EPPs that are more conveniently accessible online. As such, by adapting Xi's construct of test utilisation [9] and drawing on the legacy of HEIs' changing EPP practices, this sustainable multidimensional approach may offer HEIs more leeway in utilising tests for admissions in the post-pandemic, technology-driven era.

6. Conclusions

This study examined HEIs' changing EPP practices in response to COVID-19. Specifically, HEIs embraced online tests, applied their own in-house tests, and accepted more regional tests as EPPs. They also combined TOEFL/IELTS with newly emerged tests, oral interviews, and/or presessional coursework for admission decisions. Although online tests have acknowledged weaknesses [31,32,33] (for example, TOEFL Home Edition test takers have reported long waits for proctors to appear and give directions), they served as alternatives to traditional EPPs during the pandemic. Thus, HEIs' changing practices may well inform EPP-related policymaking in the post-pandemic era.
On the basis of the findings, we adapted the construct of test utilisation [9] so that decision making based on the different types of test utilisation is aligned with a temporal dimension, namely, the pre- and post-enrolment phases. The sustainable multidimensional approach highlights that informed admission decisions in higher education are unlikely to stem from a single source; rather, they are based on assessments in both pre- and post-enrolment phases. This approach may not be just an interim solution, because HEIs globally are still experiencing a decline in newly enrolled international students (for example, overall international student enrolment in the US fell by 16% in autumn 2020 [38]). Therefore, for the sake of sustainable development, it is important to adopt a flexible approach that embraces online tests and uses tests at different enrolment phases in the digitalised era. Even if the pandemic is soon over, the proposed approach may better inform HEIs globally in accepting new forms of proof from international applicants in an unpredictable future in which country- or region-wide public health issues might arise.
However, our findings need to be interpreted with some caveats. First, the scope of the study is limited, as we only investigated 77 top-ranking global EMI HEIs. Investigating HEIs of different levels or rankings may yield a more comprehensive view. Second, by necessity, we imposed a specific time period for the investigation, although we were aware that HEIs might continue to update their admission policies depending on pandemic lockdown policies. Third, the study is largely text-based; our findings could be triangulated with additional studies, perhaps by interviewing test users such as HEI admission policy makers about the formulation of their institutions’ changing practices.

Author Contributions

Conceptualization, M.P. and J.T.; methodology, J.T.; software, J.T.; validation, M.P. and J.T.; formal analysis, M.P. and J.T.; investigation, M.P. and J.T.; resources, M.P. and J.T.; data curation, M.P. and J.T.; writing—original draft preparation, M.P. and J.T.; writing—review and editing, M.P. and J.T.; visualization, M.P.; supervision, J.T.; project administration, M.P.; funding acquisition, M.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Innovative Research Team of Shanghai International Studies University (grant number 2020114050).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. List of the 77 EMI HEIs in the QS Top 100 Ranking

1. Massachusetts Institute of Technology
2. Stanford University
3. Harvard University
4. California Institute of Technology
5. University of Oxford
6. University of Cambridge
7. ETH Zurich
8. Imperial College London
9. University of Chicago
10. University College London
11. National University of Singapore
12. Nanyang Technological University
13. Princeton University
14. Cornell University
15. Yale University
16. Columbia University
17. University of Edinburgh
18. University of Pennsylvania
19. University of Michigan
20. Johns Hopkins University
21. The Australian National University
22. The University of Hong Kong
23. Duke University
24. University of California Berkeley
25. University of Toronto
26. The University of Manchester
27. King’s College London
28. University of California Los Angeles
29. McGill University
30. Northwestern University
31. The Hong Kong University of Science and Technology
32. The London School of Economics and Political Science
33. The University of Melbourne
34. University of California San Diego
35. The University of Sydney
36. New York University
37. The University of New South Wales
38. Carnegie Mellon University
39. University of British Columbia
40. The University of Queensland
41. The Chinese University of Hong Kong
42. University of Bristol
43. Delft University of Technology
44. University of Wisconsin
45. The University of Warwick
46. City University of Hong Kong
47. Brown University
48. University of Amsterdam
49. Monash University
50. University of Texas at Austin
51. University of Washington
52. Georgia Institute of Technology
53. University of Glasgow
54. University of Illinois at Urbana-Champaign
55. Durham University
56. The University of Sheffield
57. University of Zurich
58. University of Birmingham
59. University of Copenhagen
60. KU Leuven
61. University of Nottingham
62. University of North Carolina at Chapel Hill
63. The University of Auckland
64. Rice University
65. University of Malaya
66. Ohio State University
67. The University of Western Australia
68. Boston University
69. Lund University
70. University of Leeds
71. Pennsylvania State University
72. University of Southampton
73. University of St Andrews
74. Eindhoven University of Technology
75. Purdue University
76. University of California Davis
77. Washington University

References

  1. Anderson, J.; Rainie, L.; Vogels, E.A. Experts Say the ‘New Normal’ in 2025 Will Be Far More Tech-Driven, Presenting More Big Challenges. Available online: https://www.pewresearch.org/internet/2021/02/18/experts-say-the-new-normal-in-2025-will-be-far-more-tech-driven-presenting-more-big-challenges/ (accessed on 6 April 2022).
  2. Carusi, F.T.; Di Paolantonio, M.; Hodgson, N.; Ramaekers, S. Doing academia in “COVID-19 Times”. Antistasis 2020, 10, 1–8. [Google Scholar]
  3. Corbera, E.; Anguelovski, I.; Honey-Rosés, J.; Ruiz-Mallén, I. Academia in the time of COVID-19: Towards an ethics of care. Plan. Theory Pract. 2020, 21, 191–199. [Google Scholar] [CrossRef]
  4. Pokhrel, S.; Chhetri, R. A literature review on impact of COVID-19 pandemic on teaching and learning. High. Educ. Future 2021, 8, 133–141. [Google Scholar] [CrossRef]
  5. Ockey, G.J. An overview of COVID-19’s impact on English language university admissions and placement tests. Lang. Assess. Quar. 2021, 18, 1–5. [Google Scholar] [CrossRef]
  6. Clark, T.; Spiby, R.; Tasviri, R. Crisis, collaboration, recovery: IELTS and COVID-19. Lang. Assess. Quar. 2021, 18, 17–25. [Google Scholar] [CrossRef]
  7. Papageorgiou, S.; Manna, F.V. Maintaining access to a large-scale test of academic language proficiency during the pandemic: The launch of TOEFL iBT Home Edition. Lang. Assess. Quar. 2021, 18, 36–41. [Google Scholar] [CrossRef]
  8. Green, W.; Anderson, V.; Tait, K.; Ly Thi, T. Precarity, fear and hope: Reflecting and imagining in higher education during a global pandemic. High. Educ. Res. Dev. 2020, 39, 1309–1312. [Google Scholar] [CrossRef]
  9. Xi, X. How do we go about investigating test fairness? Lang. Test. 2010, 27, 147–170. [Google Scholar]
  10. Fenton-Smith, B.; Gurney, L. Actors and agency in academic language policy and planning. Curr. Issues Lang. Plan. 2016, 17, 72–87. [Google Scholar] [CrossRef]
  11. Murray, N. Standards of English in Higher Education: Issues, Challenges and Strategies; Cambridge University Press: Cambridge, UK, 2016. [Google Scholar]
  12. Liddicoat, A.J. Language planning in universities: Teaching, research and administration. Curr. Issues Lang. Plan. 2016, 17, 231–241. [Google Scholar] [CrossRef] [Green Version]
  13. Hamid, M.O.; Hoang, N.T.H.; Kirkpatrick, A. Language tests, linguistic gatekeeping and global mobility. Curr. Issues Lang. Plan. 2019, 20, 226–244. [Google Scholar] [CrossRef]
  14. Menken, K. High-stakes tests as de facto language education policies. In Language Testing and Assessment; Shohamy, E., Ed.; Springer: Berlin/Heidelberg, Germany, 2019; pp. 385–396. [Google Scholar]
  15. O’Loughlin, K. Developing the assessment literacy of university proficiency test users. Lang. Test. 2013, 30, 363–380. [Google Scholar] [CrossRef]
  16. Dunworth, K.; Drury, H.; Kralik, C.; Moore, T. Degrees of Proficiency: Building A Strategic Approach to University Students’ English Language Assessment and Development; Final Report; Australian Government Office for Learning and Teaching: Canberra, Australia, 2013.
  17. Hyatt, D. Stakeholders’ perceptions of IELTS as an entry requirement for higher education in the UK. J. Furth. High. Educ. 2013, 37, 844–863. [Google Scholar] [CrossRef]
  18. Murray, N. University gatekeeping tests: What are they really testing and what are the implications for EAP provision? JACET J. 2018, 62, 15–27. [Google Scholar]
  19. Berman, R.; Cheng, L. English academic language skills: Perceived difficulties by undergraduate and graduate students, and their academic achievement. Can. J. Appl. Linguist. 2010, 4, 25–40. [Google Scholar]
  20. Johnson, E.M. An investigation into pedagogical challenges facing international tertiary- level students in New Zealand. High. Educ. Res. Dev. 2008, 27, 231–243. [Google Scholar] [CrossRef]
  21. Coley, M. The English language entry requirements of Australian universities for students of non-English speaking background. High. Educ. Res. Dev. 1999, 18, 7–17. [Google Scholar] [CrossRef]
  22. Oliver, R.; Vanderford, S.; Grote, E. Evidence of English language proficiency and academic achievement of non-English-speaking background students. High. Educ. Res. Dev. 2012, 31, 541–555. [Google Scholar] [CrossRef] [Green Version]
  23. Benzie, H.J. Graduating as a ‘native speaker’: International students and English language proficiency in higher education. High. Educ. Res. Dev. 2010, 29, 447–459. [Google Scholar] [CrossRef]
  24. Ruegg, R.; Petersen, N.; Hoang, H.; Ma, M. Effects of pathways into university on the academic success of international undergraduate students. High. Educ. Res. Dev. 2021, 40, 1283–1297. [Google Scholar] [CrossRef]
  25. Green, A. Washback to learning outcomes: A comparative study of IELTS preparation and university pre-sessional language courses. Assess. Educ. Princ. Policies Pract. 2007, 14, 75–97. [Google Scholar] [CrossRef]
  26. Read, J. Issues in post-entry language assessment in English-medium universities. Lang. Teach. 2015, 48, 217–234. [Google Scholar] [CrossRef]
  27. Elder, C.; Bright, C.; Bennett, S. The role of language proficiency and academic success: Perspectives from a New Zealand university. Melb. Pap. Lang. Test. 2007, 12, 24–58. [Google Scholar]
  28. Read, J. Assessing English Proficiency for University Study; Palgrave Macmillan: London, UK, 2015. [Google Scholar]
  29. Murray, N.; Hicks, M. An institutional approach to English language proficiency. J. Furth. High. Educ. 2016, 40, 170–187. [Google Scholar] [CrossRef]
  30. Isbell, D.; Kremmel, B. Test review: Current options in at-home language proficiency tests for making high stakes decisions. Lang. Test. 2020, 37, 600–619. [Google Scholar] [CrossRef]
  31. Ilgaz, H.; Adanır, G.A. Providing online exams for online learners: Does it really matter for them? Educ. Inf. Technol. 2020, 25, 1255–1269. [Google Scholar] [CrossRef]
  32. Raman, R.; Vachharajani, H.; Nedungadi, P. Adoption of online proctored examinations by university students during COVID-19: Innovation diffusion study. Educ. Inf. Technol. 2021, 26, 7339–7358. [Google Scholar] [CrossRef]
  33. Gulevich, O. Fairness of exams: Learning motivation and students’ assessment of teachers’ actions. Soc. Psychol. Soc. 2013, 4, 130–142. [Google Scholar]
  34. AERA; APA; NCME. Standards for Educational and Psychological Testing; National Council on Measurement in Education and the American Council on Education: Washington, DC, USA, 2014. [Google Scholar]
  35. Kunnan, A.J. Test fairness. In European Language Testing in a Global Context: Proceedings of the ALTE Barcelona Conference; Milanovic, M., Weir, C., Eds.; Cambridge University Press: Cambridge, UK, 2004; pp. 27–48. [Google Scholar]
  36. Cohen, L.; Manion, L.; Morrison, K. Research Methods in Education, 7th ed.; Routledge: London, UK, 2011. [Google Scholar]
  37. Wagner, E. Duolingo English Test (Revised Version). Lang. Assess. Quar. 2020, 17, 300–315. [Google Scholar] [CrossRef]
  38. US Association of International Educators. 2020. Available online: https://www.nafsa.org/policy-and-advocacy/policy-resources/sustain-international-student-enrollment-us-colleges-and-universities-impacted-covid-19 (accessed on 6 April 2022).
Figure 1. Test utilisation for higher-education admissions.
Figure 2. HEI practices of using international standardised tests in 2020 and 2021. * The TOEFL Home Edition only refers to TOEFL ITP Plus.
Figure 3. HEI practices of using regional tests in 2020 and 2021.
Figure 4. A sustainable multidimensional approach to test utilisation for higher-education admissions.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
