Evaluating Perceptions towards the Consequential Validity of Integrated Language Proficiency Assessment
Abstract
1. Introduction
2. Materials and Methods
2.1. Participants and Setting
2.2. Data Collection Instruments
2.3. Teacher Questionnaire and Interviews
2.4. Student Questionnaire
3. Results
- How well PEP graduates were prepared
- Identified strengths and weaknesses in students’ academic performance
When I got a class in the very beginning of the semester, I asked how many students came from PEP. About twenty out of thirty raised hands. Out of those twenty, five or six will be very good, very well-equipped. Although they are still, especially in speaking they would be very shy and very insufficient. Let’s say five out of twenty would be equipped in terms of writing and can do the work in discipline. About ten will not be up to standard, really. So, they struggle. About five, they shouldn’t be there at all. And I am trying to be realistic. I mean knowing the context, knowing the educational background and the possibilities what could be done in PEP in a certain amount of time, feasibility…. I’m taking in all those factors and I’m trying to give a kind of realistic and generous answer: thirty per cent of prep school graduates shouldn’t be there. They are not ready. They are effectively, really, still in intermediate or even pre-intermediate level, in some cases. And somehow, they managed to slip through the net.
- Students’ performance in their departmental courses
Unfortunately, most of the students are unable to follow the class because of the language problem. And they are unable to ask questions in foreign language. And that affects the course very bad and negatively. We talk, we show, we discuss, we explain, and we expect students to interact with us to join to the class to contribute to the class, ask the questions, discuss the concepts with us. But they prefer to stay silent and just watch. Then I feel, and most of us feel like, we’re just lecturing in front of a wall. That’s a big concern (T1).
There is a mismatch between students’ educational background and the skills required at the university. Schools are busy with teaching students how to solve a multiple-choice question without having the knowledge. Students focus on the correct answer rather than why that’s the correct answer. Often why is never asked. Thinking, evaluating, criticizing is a mind-set and most students do not seem to have that. Here at the university, they need to be formatted and it is very difficult (T2).
The ability to synthesize ideas is really, widely-challenging. Sometimes we’re not sure whether it is language issue or whether it is a critical thinking issue. I mean, synthesizing is putting ideas together. For example, seeing, detecting the patterns, similarities, new connections between them, there is a critical thinking skill. We’re not sure, and the students are not generally very good at it (T4).
- Suggestions for improvement
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A
Appendix A.1. Consequential Validity Teacher Questionnaire
- Years of teaching experience (please tick one):
1–5 | 6–10 | 11–15 | 16–20 | More than 20 years |
- Do you think students who completed Preparatory School Program are well-prepared regarding their English language ability and academic skills training? Check the degree in the table below that reflects your point of view.
| Not Prepared | Fairly Prepared | Prepared | Well Prepared |
---|---|---|---|---|
1. Reading academic texts and understanding the main ideas | | | | |
2. Taking reading notes | | | | |
3. Understanding lectures | | | | |
4. Taking listening notes | | | | |
5. Writing an organised essay | | | | |
6. Discussing ideas and expressing opinions clearly and accurately in their speech | | | | |
7. Asking questions | | | | |
8. Using a range of vocabulary appropriately | | | | |
9. Using a range of grammatical structures in their written and spoken work | | | | |
10. Using different sources (notes, summaries etc.) to support ideas in their written and spoken work | | | | |
11. Giving feedback to peers | | | | |
12. Revising own written work based on given feedback | | | | |

If you have further comments about any other English language ability and academic skills, please write them here:
- Where do you see the strengths and weaknesses of the students who completed Preparatory School Program? Please comment under relevant headings in the table below.
| Strengths of the Students Who Completed Preparatory School Program | Weaknesses of the Students Who Completed Preparatory School Program |
---|---|---|
In Writing | | |
In Reading | | |
In Listening | | |
In Speaking | | |
In using Grammar and Vocabulary | | |

If you have any further comments about any other use of English for Academic Purposes at mainstream departmental university courses, please write them here.
- Do you have any suggestions for improving the Preparatory School Program?
Appendix A.2. Consequential Validity Student Questionnaire
- What is your department?
- Where do you see the strengths and weaknesses of the students who completed Preparatory School Program? Please comment under relevant headings in the table below.
| Strengths of the Students Who Completed Preparatory School Program | Weaknesses of the Students Who Completed Preparatory School Program |
---|---|---|
In Writing | | |
In Reading | | |
In Listening | | |
In Speaking | | |
In using Grammar and Vocabulary | | |

If you have any further comments about any other use of English for Academic Purposes at mainstream departmental university courses, please write them here.
- Do you have any suggestions to improve the Preparatory School Program to prepare you better for the use of English for Academic Purposes at mainstream departmental university courses?
Appendix A.3. Interview Guide
- Correspondence between exit criteria of the English proficiency test and expected academic skills at the departments. Could you share your observations regarding student achievement:
  - In Writing
  - In Reading
  - In Listening
  - In Speaking
  - In use of Grammar and Vocabulary
  - In academic skills
- What skills are required at the department?
- Do you think that students who complete the Prep Program can cope with the academic demands of your department?
- What needs to be done in the Prep Program to better equip these students for their departments?
- Other Views
- Round up and Thanks
Departments | N |
---|---|
Psychology | 2 |
International Relations | 2 |
Undergraduate English | 2 |
Architecture | 4 |
Mathematics | 2 |
Aviation | 1 |
Hotel Management and Tourism | 2 |
Engineering | 3 |
Business Administration | 1 |
| Not Prepared f (%) | Fairly Prepared f (%) | Prepared f (%) | Well Prepared f (%) | SD |
---|---|---|---|---|---|
Reading academic texts and understanding the main ideas | 3 (18) | 9 (53) | 5 (29) | 0 (0) | 0.83 |
Taking reading notes | 1 (6) | 13 (77) | 2 (12) | 0 (0) | 0.44 |
Understanding lectures | 1 (6) | 8 (47) | 7 (41) | 0 (0) | 0.62 |
Taking listening notes | 3 (18) | 8 (47) | 4 (24) | 1 (6) | 0.83 |
Writing an organized essay | 3 (18) | 9 (53) | 3 (18) | 0 (0) | 0.65 |
Discussing ideas and expressing opinions clearly and accurately in their speech | 9 (53) | 5 (29) | 2 (12) | 1 (6) | 0.92 |
Asking questions in class | 8 (47) | 5 (29) | 3 (18) | 1 (6) | 0.95 |
Using a range of vocabulary appropriately | 5 (29) | 11 (65) | 1 (6) | 0 (0) | 0.56 |
Using a range of grammatical structures in their written and spoken work | 7 (41) | 9 (53) | 1 (6) | 0 (0) | 0.61 |
Using different sources (notes, summaries, etc.) to support ideas in their written and spoken work | 10 (59) | 4 (24) | 2 (12) | 0 (0) | 0.73 |
Giving feedback to peers | 7 (41) | 7 (41) | 2 (12) | 0 (0) | 0.70 |
Revising own written work based on given feedback | 5 (29) | 7 (41) | 3 (18) | 0 (0) | 0.74 |

| Not Prepared f (%) | Fairly Prepared f (%) | Prepared f (%) | Well Prepared f (%) | SD |
---|---|---|---|---|---|
Reading academic texts and understanding the main ideas | 1 (3) | 7 (18) | 20 (51) | 10 (26) | 0.75 |
Taking reading notes | 2 (5) | 11 (28) | 18 (46) | 7 (18) | 0.81 |
Understanding lectures | 1 (3) | 9 (23) | 20 (51) | 9 (23) | 0.76 |
Taking listening notes | 1 (3) | 12 (31) | 12 (31) | 14 (36) | 0.89 |
Writing an organized essay | 3 (8) | 6 (15) | 13 (33) | 17 (44) | 0.95 |
Discussing ideas and expressing opinions clearly and accurately in their speech | 2 (5) | 14 (36) | 18 (46) | 5 (13) | 0.77 |
Asking questions in class | 2 (5) | 10 (26) | 19 (49) | 8 (21) | 0.81 |
Using a range of vocabulary appropriately | 4 (10) | 12 (31) | 16 (41) | 7 (18) | 0.90 |
Using a range of grammatical structures in their written and spoken work | 4 (10) | 11 (28) | 18 (46) | 6 (15) | 0.87 |
Using different sources (notes, summaries, etc.) to support ideas in their written and spoken work | 5 (13) | 14 (36) | 15 (39) | 5 (13) | 0.88 |
Giving feedback to peers | 3 (8) | 15 (39) | 12 (31) | 9 (23) | 0.92 |
Revising own written work based on given feedback | 3 (8) | 8 (21) | 13 (33) | 14 (36) | 0.96 |
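
The two preparedness tables above report, for each questionnaire item, the response frequency (f), the percentage of respondents, and the standard deviation (SD) of the coded ratings. The sketch below illustrates how such descriptive statistics can be obtained from 4-point Likert data coded 1 = Not Prepared through 4 = Well Prepared. It is a minimal, hypothetical example, not the authors' analysis procedure: the coding scheme, the use of population SD, the handling of missing responses, and the sample responses are all assumptions, so it is not guaranteed to reproduce the published figures.

```python
# Minimal sketch (not the authors' analysis code): descriptive statistics for one
# 4-point Likert questionnaire item, coded 1 = "Not Prepared" ... 4 = "Well Prepared".
from collections import Counter
from statistics import pstdev

LABELS = ["Not Prepared", "Fairly Prepared", "Prepared", "Well Prepared"]

def summarise_item(responses):
    """Return frequency, percentage, and SD for one questionnaire item.

    `responses` is a list of integer codes (1-4); missing answers are simply
    omitted here, which is an assumption about how non-responses were handled.
    """
    counts = Counter(responses)
    n = len(responses)
    freqs = {label: counts.get(code, 0) for code, label in enumerate(LABELS, start=1)}
    percents = {label: round(100 * f / n) for label, f in freqs.items()}
    sd = round(pstdev(responses), 2)  # population SD; the article does not state which SD was used
    return freqs, percents, sd

# Hypothetical responses from 17 teachers for a single item (illustrative only)
example = [1, 1, 2, 2, 2, 2, 2, 2, 2, 3, 3, 3, 3, 3, 4, 4, 4]
freqs, percents, sd = summarise_item(example)
print(freqs, percents, sd)
```
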
Students’ Weaknesses in Language and Academic Skills | N | % |
---|---|---|
Reading | 7 | 37 |
Listening | 4 | 21 |
Writing | 8 | 42 |
Speaking | 15 | 79 |
Grammar | 9 | 47 |
Vocabulary | 10 | 53 |
Understanding exam questions | 6 | 32 |
Academic Skills (discourse synthesis, knowledge of citation, evaluation of online sources) | 3 | 16 |
Themes | N | % |
---|---|---|
Inadequate speaking skills of the students | 15 | 79 |
Effect of educational background on student achievement | ||
Exam orientedness | 3 | 16 |
Mismatch between prior learning and university culture | 4 | 21 |
Low motivation for reading | 8 | 42 |