Washback Effects of Diagnostic Assessment in Greek as an SL: Primary School Teachers’ Perceptions in Cyprus
Abstract
1. Introduction
2. On Washback and Language Assessment Literacy
3. The Context of the Study
3.1. The Integrating Policy in Cyprus Primary School Education
3.2. The GDLA Tool
4. Materials and Methods
4.1. Research Questions
4.2. Study Design, Instrumentation, and Data Collection
5. Results
5.1. Questionnaire
5.1.1. Construct Validity and Reliability
5.1.2. Background Information
5.1.3. Analysis per Factor
5.2. Interviews
It’s the love the kids give you. It’s the immediate results you can have (.) It’s what the learners share, they say “I learned that from you”, the love, ↑<the results, what you see in children>(I2)
When you teach for the first time, it’s kind of a stressful procedure (.) I’m telling you the truth about me and how it fits my personality. Of course, you get moral satisfaction from the learners and their progress. I believe if I do it for a second year in a row the satisfaction or the creativity would be greater. I need to first clear things in my head, understand the procedure (.) and then I can be creative too.(I3)
↑The groups (.) in this case, we had a big problem. The available teaching hours were very few (…) for example, I had eight teaching hours per week, and I used five of them in the A1 group. But what was left for the A2 one? How could I work through the descriptors and cover the communication themes of the curriculum? When you experience such a stressful situation, I think that creativity is reduced.(I1)
Two teaching hours per week! (…) the teaching hours were the minimum (…) I had the students for two hours per week (.) but of course they were not enough.(I4)
Diagnostic assessment is of pivotal importance for young migrant learners: It shows you the way. You gain valuable feedback = especially in the oral modality.(I2)
The tool is well-structured and so are its guidelines.(I2)
↑I think I had a lot of help from the tool (.) I mean the rapport I developed with the students during the speaking component had helped me a lot to understand each one’s personality.(I3)
I now turn back some months ago and I realize that the tool gave me a clear picture of the students’ strengths and weaknesses.(I4)
That was a fair procedure with a <carefully planned scoring>. ↑ However, reflecting on the students’ written productions gave me a hard::: time. I didn’t know how to do it successfully (.) I need more detailed guidelines, more training on that.(I1)
The kids could not respond to the tool’s activities at the beginning of the year. It took them more time (…) we did not have such problems during the formative or summative assessment (.) At the final assessment things happened faster.(I2)
We had problems in the communication themes that the students had not yet been taught. For example, the weather forecast theme (.) That was difficult for them.(I2)
We implemented the test at the ↑beginning of the school year (.) and so we could see what some students remembered from the previous year. Summer holidays were in between (.) Maybe they needed some revision courses (…) maybe that was unfair for them.(I4)
I4: There was not enough time for the test (…) some children read in <a very slow pace>, some of them spelled out the words (.)
Researcher: Wasn’t that indicative of their needs, though?
I4: It is a parameter, ↑ yes. I wrote that down back then (.) it is a parameter that I had to consider.
I now see my notes <after scoring and reflecting on the learners’ performance> (.) and they give me information that might not come to my notice.(I2)
The tool gave us information, for sure. It helped us not to form groups based just on our intuitive judgements.(I2)
I could look at the results and say “↑Ok, good, here <in the reading section> some students could manage the task (.) and some other kids could not read at all: =so it helped me in forming my level-appropriate groups.(I4)
The four skills that we emphasize on were very clear (.) so we had help from the tool on that, to have a vivid picture of where the students are in listening and speaking or in reading and writing (…) the tool’s contribution was crucial and it definitely helped us a lot to proceed with a fair placement of the learners in the appropriate groups for the preparatory courses.(I2)
I could give extra lessons to a student of mine, if I noticed from the tool that she needed help in listening skills (.) ↑I knew I needed to plan more listening activities.(I2)
The tool was a BIG help:: Till now, I used to plan my teaching in a theme-oriented way (.) The skills didn’t really matter. I could have several lessons in a row teaching vocabulary and grammar (.) < e.g., clothes and accessories and how to build simple sentences on shopping>. =And as for listening (.) I was content with the teacher–student exchange within the classroom (.) I am involved in a change process now I search for age- and level-appropriate material to include it in listening tasks (.) real and authentic listening tasks::(I3)
The tool gave me guidance on what activities and tasks to choose or plan (.) I moved forward one step at a time.(I4)
Yes, the tool guides me in preparing my lessons as well (.) Till now I used extended narratives and drill and practice activities(.) but ↑content matters. The tool has dialogues, posters, invitations, and other text types.(I5)
I can’t say I looked at the test content (.) no, no (.) we weren’t so far away from what the test contained (.) the test did not determine how I was going to handle the content (.) I didn’t have the test as a model in order to decide on the material.(I2)
(…) I think, being based on this tool, I could design my own tests and plan other assessment processes for the evaluation of each thematic unit I teach.(I5)
The activities are of graded difficulty (.) They function as a compass in grading the difficulty of the tasks we plan so that all students’ needs are met.(I5)
The test is structured in a hierarchical way, progressing from the simple to the more difficult activities (.) I tried the same with my classes.(I3)
The test is based on the new curriculum (…) that means there is consistency and continuity.(I3)
I must admit that having such a material available, I felt safe. (.) There was a textbook, a guide, a curriculum to rely on; (.) I was confident to “transmit” the relevant knowledge and develop the appropriate skills.(I3)
6. Discussion
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A
- Questionnaire
- Q1. Education:
- Q2. Years of teaching experience in preparatory courses of Greek as a second language:
- Q3. Teaching hours of Greek as a second language per week during this school year:
- Q4. I teach preparatory classes of Greek as a second language to:
- Q5. During this school year I have set up…preparatory courses of Greek as a second language:
- Q6. I participated in the Teacher Training Networks for Greek as a second language.
- Q1. Teaching Greek as a second language in preparatory classes gives space to my creativity.
- Q2. Teaching Greek as a second language in preparatory classes gives me satisfaction.
- Q3. I consider the provision of a diagnostic tool during the first contact with migrant students:
- Q4. A diagnostic tool is useful to provide feedback on oral skills.
- Q5. A diagnostic tool is useful to provide feedback on written skills.
- Q6. I apply diagnostic assessment to all migrant students.
- Q7. During the implementation of diagnostic assessment, I provide students with additional guidelines, each time they cannot understand the aim of a task.
- Q8. The time needed to apply the GDLA tool was the one given in the guidelines.
- Q9. The GDLA tool is aligned with the new curriculum for Greek as an SL.
- Q10. The GDLA tool is aligned with the curriculum’s accompanying descriptors.
- Q11. The GDLA tool includes useful implementation guidelines.
- Q12. The GDLA tool includes a valid and reliable scoring system for the students.
- Q13. The GDLA tool includes an easy scoring system for the educator.
- Q14. The GDLA tool contributes to highlighting the strengths and needs of the students.
- Q15. The GDLA tool contributes to the successful placement of students in the respective preparatory classes based on their level of Greek proficiency.
- Q16. The GDLA tool helps me determine the duration of the teaching I dedicate to each skill.
- Q17. The GDLA tool guides me on my teaching material (content, worksheets, tasks, tools, etc.).
- Q18. The GDLA tool helps me to apply differentiated learning instruction.
- Q19. The GDLA tool shows me on which skills to focus during my teaching.
- Q20. The GDLA tool shows me how to best design language tests and assessments.
Appendix B
- Do you feel content or creative when teaching Greek as a second language in preparatory classes? Yes or no? Why do you think you feel that way?
- Do you think that using a diagnostic tool is important when teaching migrant students? Yes or no? Why?
- Which language skills are important to receive feedback on when using a diagnostic tool?
- Do you see an alignment between the GDLA tool and the new SL curriculum and its accompanying descriptors? If yes, where do you see this alignment?
- How do you find the implementation process of the GDLA tool?
- How do you find the scoring system of the GDLA tool?
- Do you believe that the GDLA tool may influence your teaching in preparatory classes? Yes or no? If yes, in what ways?
- Do you believe that the GDLA tool may influence the assessment forms you use in the preparatory classes? If yes, in what ways?
Appendix C
| Factors (F) | Items | Mean | SD |
|---|---|---|---|
| F1: GDLA’s Washback on Teaching and Assessment | Classroom time management | 3.32 | 0.96 |
| | Teaching material | 3.50 | 0.92 |
| | Differentiated teaching | 3.44 | 0.93 |
| | Skills/modalities to emphasize on | 3.74 | 0.86 |
| | Language testing and assessment | 3.47 | 0.89 |
| | Needs analysis | 3.76 | 0.89 |
| | Placement | 3.75 | 0.90 |
| F2: GDLA’s Usefulness and Credibility | Useful guidelines | 3.87 | 0.89 |
| | Valid and reliable scoring | 3.66 | 0.94 |
| | Easy scoring | 3.83 | 0.94 |
| F3: Feedback and Importance of Diagnostic Assessment | Feedback on skills in the oral modality | 4.29 | 0.87 |
| | Feedback on skills in the written modality | 4.17 | 0.85 |
| | Importance of diagnostic assessment | 4.38 | 0.85 |
| | Implementation of diagnostic assessment | 4.55 | 0.74 |
| F4: GDLA’s Alignment with the SL Curriculum | Alignment with the SL curriculum descriptors | 3.71 | 0.83 |
| | Alignment with the CGSL’s programmatic text | 3.76 | 0.77 |
| F5: Motivation in SL Teaching | Creativity | 3.51 | 1.05 |
| | Satisfaction | 3.68 | 1.03 |
Notes
1. The following example illustrates such a policy. A qualified GAL teacher serves at school A, which has a high share of migrant students. Under the current transfer model for teachers in Cyprus primary schools, if a more experienced teacher (in total years of employment, but not in years of teaching GAL preparatory classes) asks to be transferred to that specific school, school A, then the qualified GAL teacher may be obliged to transfer to any other school (which may or may not have a need for GAL teachers). It must also be noted that school leaders are not involved in teacher recruitment for their schools (European Commission 2019a).
2. The GDLA’s component for first graders is built upon an integrated approach to oral communication (listening and speaking). The students participate in a discussion with their teacher on everyday routines (school, home, siblings, hobbies, pets, etc.) and are asked to talk about pictures tightly connected to their experiences and interests, e.g., listening to music or playing. They also listen and point at the relevant picture (this might include greetings, comparisons, etc.). Some of the tasks aim at the understanding of contextualized and/or illustrated functional vocabulary (“show and tell” or “find the picture”). The GDLA’s component for first graders is available at https://www.pi.ac.cy/pi/files/epimorfosi/entaxi/diagnostiko_dokimio_a_dimotikou.pdf (accessed on 4 November 2021).
3. The GDLA’s component for second to sixth graders assesses the learners’ performance in all four language skills through authentic/genuine texts (both oral and written, mainly multimodal, e.g., posters, invitations, a weather forecast, visiting a store, shopping at the school canteen, writing an email). Available at https://www.pi.ac.cy/pi/files/epimorfosi/entaxi/diagnostiko_dokimio_v_eos_st_taxeis.pdf (accessed on 4 November 2021).
References
- Alderson, Charles J. 2005. Diagnosing Foreign Language Proficiency: The Interface between Learning and Assessment. London and New York: Continuum.
- Alderson, Charles J. 2011. Innovations and Challenges in Diagnostic Testing. In Selected Papers from 2011 PAC/The Twentieth International Symposium on English Teaching. Taipei: English Teachers’ Association of the Republic of China.
- Alderson, Charles J., and Diane Wall. 1993. Does washback exist? Applied Linguistics 14: 115–29.
- Alderson, Charles J., and Liz Hamp-Lyons. 1996. TOEFL Preparation Courses: A Study of Washback. Language Testing 13: 280–97.
- Andrews, Stephen. 2004. Washback and Curriculum Innovation. In Washback in Language Testing: Research Contexts and Methods, 1st ed. Edited by Liying Cheng, Yoshinori Watanabe and Andy Curtis. Mahwah: Lawrence Erlbaum Associates, New York: Routledge.
- Antonopoulou, Niovi. 2004. The Washback Effect of Testing on Teaching Greek as a Second/Foreign Language. Language Testing Update 36: 100–4.
- Bachman, Lyle F., and Adrian S. Palmer. 1996. Language Testing in Practice: Designing and Developing Useful Language Tests. Oxford: Oxford University Press.
- Bailey, Alison L. 2017. Assessing the language of young learners. In Encyclopedia of Language and Education, 3rd ed. Edited by Elana Shohamy, Iair G. Or and Stephen May. Berlin: Springer, pp. 323–42.
- Bailey, Alison L., Margaret Heritage, and Frances A. Butler. 2013. Developmental considerations and curricular contexts in the assessment of young language learners. The Companion to Language Assessment 1: 421–39.
- Brown, James D., and Thom Hudson. 1998. The alternatives in language assessment: Advantages and disadvantages. University of Hawai’i Working Papers in ESL 16: 79–103.
- Burrows, Catherine. 2004. Washback in classroom-based assessment: A study of the washback effect in the Australian adult migrant English program. In Washback in Language Testing, 1st ed. Edited by Liying Cheng and Yoshinori Watanabe. Mahwah: Lawrence Erlbaum, pp. 113–28.
- Butler, Yuko Goto. 2016. Assessing young learners. In Handbook of Second Language Assessment, 1st ed. Edited by Dina Tsagari and Jayanti Banerjee. Berlin and Boston: De Gruyter Mouton, pp. 359–76.
- Cheng, Liying, and Andy Curtis. 2004. Washback or backwash: A review of the impact of testing on teaching and learning. In Washback in Language Testing: Research Contexts and Methods. Edited by Liying Cheng, Yoshinori Watanabe and Andy Curtis. Mahwah: Lawrence Erlbaum Associates, pp. 3–17.
- Cheng, Liying, and Yoshinori Watanabe. 2004. Washback in Language Testing: Research Contexts and Methods. Mahwah: Lawrence Erlbaum Associates.
- Cheng, Liying. 2005. Changing Language Teaching through Language Testing: A Washback Study. Cambridge: Cambridge University Press.
- Collins, John B., and Nicholas H. Miller. 2018. The TOEFL (ITP): A survey of teacher perceptions. Shiken 22: 1–13.
- Coste, Daniel, Danièle Moore, and Geneviève Zarate. 2009. Plurilingual and Pluricultural Competence. Studies towards a Common European Framework of Reference for Language Learning and Teaching. Strasbourg: Council of Europe Publishing.
- Davies, Alan. 2008. Textbook trends in teaching language testing. Language Testing 25: 327–47.
- Davies, Alan. 2014. Fifty Years of Language Assessment. In The Companion to Language Assessment. Edited by Antony John Kunnan. Hoboken: Wiley Online Library, vol. 1, pp. 1–21.
- Dörnyei, Zoltán. 2007. Research Methods in Applied Linguistics. Oxford: Oxford University Press.
- European Commission. 2019a. Peer Counselling on Integration of Students with a Migrant Background into Schools. Brussels: European Commission. Available online: https://www.pi.ac.cy/pi/files/epimorfosi/entaxi/Peer_counselling_integration_of_migrant_students_final_report.pdf (accessed on 19 September 2021).
- European Commission. 2019b. Proposal for a Council Recommendation on a comprehensive approach to the teaching and learning of languages. European Journal of Language Policy 11: 129–37.
- Fan, Tingting, Jieqing Song, and Zheshu Guan. 2021. Integrating Diagnostic Assessment into Curriculum: A Theoretical Framework and Teaching Practices. Language Testing in Asia 11: 1–23.
- Fulcher, Glenn. 2012. Assessment Literacy for the Language Classroom. Language Assessment Quarterly 9: 113–32.
- Fulcher, Glenn. 2020. Operationalizing Assessment Literacy. In Assessment Literacy. Edited by Dina Tsagari. Cambridge: Cambridge Scholars.
- Garcia, Ofelia, and Wei Li. 2014. Translanguaging: Language, Bilingualism, and Education. London: Palgrave Macmillan.
- Green, Anthony. 2007. Washback to Learning Outcomes: A Comparative Study of IELTS Preparation and University Pre-sessional Language Courses. Assessment in Education: Principles, Policy & Practice 14: 75–97.
- Green, Anthony. 2013. Washback in language assessment. International Journal of English Studies 13: 39–51.
- Hamp-Lyons, Liz. 1997. Washback, Impact and Validity: Ethical Concerns. Language Testing 14: 295–303.
- Hasselgreen, Angela, Cecile Carlsen, and Hidegunn Helness. 2004. European Survey of Language Testing and Assessment Needs. Report: Part One—General Findings. Available online: http://www.ealta.eu.org/documents/resources/survey-report-pt1.pdf (accessed on 29 October 2021).
- Hughes, Arthur. 1989. Testing for Language Teachers. Cambridge: Cambridge University Press.
- Inbar, Ofra, Elana Shohamy, and Claire Gordon. 2005. Considerations involved in the language assessment of young learners. ILTA Online Newsletter 2: 3.
- Inbar-Lourie, Ofra. 2013. Guest Editorial to the special issue on language assessment literacy. Language Testing 30: 301–7.
- Inbar-Lourie, Ofra. 2017. Language assessment literacy. In Language Testing and Assessment, 3rd ed. Edited by Elana Shohamy, Iair G. Or and Stephen May. Cham: Springer, pp. 257–68.
- Jang, Eunice E. 2013. Diagnostic assessment in language classrooms. In The Routledge Handbook of Language Testing. Edited by Glenn Fulcher and Fred Davidson. Abingdon: Routledge, pp. 134–48.
- Jimola, Folasade Esther, and Graceful Onovughe Ofodu. 2019. ESL Teachers and Diagnostic Assessment: Perceptions and Practices. ELOPE: English Language Overseas Perspectives and Enquiries 16: 33–48.
- Kościółek, Jakub. 2020. Children with migration backgrounds in Polish schools: Problems and challenges. Annales. Series Historia et Sociologia 30: 643–56.
- Kunnan, Antony John. 2013. Language assessment for immigration and citizenship. In The Routledge Handbook of Language Testing, 1st ed. Edited by Glenn Fulcher and Fred Davidson. New York: Routledge, pp. 176–91.
- Kyriakou, Nansia. 2014. Investigating Teaching and Learning Greek as an Additional Language in Public Primary Schools in Cyprus. Paper presented at INTCESS14—International Conference on Education and Social Sciences, Istanbul, Turkey, February 3–5; Edited by Ferit Uslu. Istanbul: OCERINT, pp. 548–59.
- Lam, Ricky. 2015. Language Assessment Training in Hong Kong: Implications for Language Assessment Literacy. Language Testing 32: 169–97.
- Lee, Yong-Won. 2015. Diagnosing Diagnostic Language Assessment. Language Testing 32: 299–316.
- Leung, Constant, and Jo Lewkowicz. 2017. Assessing Second/Additional Language of Diverse Populations. In Encyclopedia of Language and Education, 3rd ed. Edited by Elana Shohamy, Iair G. Or and Stephen May. Berlin: Springer, pp. 343–58.
- Levi, Tziona, and Ofra Inbar-Lourie. 2020. Assessment literacy or language assessment literacy: Learning from the teachers. Language Assessment Quarterly 17: 168–82.
- Liu, Jianda, and Ximei Li. 2020. Assessing Young English Learners: Language Assessment Literacy of Chinese Primary School English Teachers. International Journal of TESOL Studies 2: 36–49.
- Malone, Margaret E. 2013. The essentials of assessment literacy: Contrasts between testers and users. Language Testing 30: 329–44.
- Menken, Kate. 2006. Teaching to the Test: How No Child Left Behind Impacts Language Policy, Curriculum, and Instruction for English Language Learners. Bilingual Research Journal 30: 521–46.
- Menken, Kate. 2013. Emergent Bilingual Students in Secondary School: Along the Academic Language and Literacy Continuum. Language Teaching 46: 438–76.
- Menken, Kate. 2017. High-Stakes Tests as De Facto Language Education Policies. In Encyclopedia of Language and Education, 3rd ed. Edited by Elana Shohamy, Iair G. Or and Stephen May. Berlin: Springer, pp. 385–96.
- Min, Jiayi, and Moonyoung Park. 2020. Investigating Test Practices and Washback Effects: Voices from Primary School English Teachers in Hong Kong. The Korea English Language Testing Association 15: 77–97.
- Mitsiaki, Maria, and Ioannis Lefkos. 2018. ELeFyS: A Greek Illustrated Science Dictionary for School. Paper presented at XVIII EURALEX International Congress, Ljubljana, Slovenia, July 17–21; Edited by Jaka Čibej, Vojko Gorjanc, Iztok Kosem and Simon Krek. Ljubljana: Ljubljana University Press, Faculty of Arts, pp. 373–85.
- Mitsiaki, Maria, Chrysovalanti Giannaka, and Despo Kyprianou. 2020a. Greek Diagnostic Language Assessment Tool: 1st Graders. Cyprus Pedagogical Institute. Available online: https://www.pi.ac.cy/pi/files/epimorfosi/entaxi/diagnostiko_dokimio_a_dimotikou.pdf (accessed on 14 November 2021).
- Mitsiaki, Maria, Chrysovalanti Giannaka, and Despo Kyprianou. 2020b. Greek Diagnostic Language Assessment Tool: 2nd to 6th Graders. Cyprus Pedagogical Institute. Available online: https://www.pi.ac.cy/pi/files/epimorfosi/entaxi/diagnostiko_dokimio_v_eos_st_taxeis.pdf (accessed on 14 November 2021).
- Mitsiaki, Maria. 2020. Curriculum of Greek as a Second Language. Cyprus Pedagogical Institute. Available online: https://www.pi.ac.cy/pi/files/epimorfosi/entaxi/aps_isbn.pdf (accessed on 14 November 2021).
- MOECSY. 2021. Intercultural Education. Statistics. Department of Primary Education. Available online: http://www.moec.gov.cy/dde/diapolitismiki/statistika_dimotiki.html (accessed on 20 September 2021).
- Papakammenou, Irini. 2020. The Importance of Washback Effect in Teachers’ Assessment Literacy. The Stepping Stone for more Learner-centred Exam-classes. In Language Assessment Literacy: From Theory to Practice. Edited by Dina Tsagari. Cambridge: Cambridge Scholars Publishing, pp. 286–303.
- Pehlivan Şişman, Emine, and Kağan Büyükkarcı. 2019. A Review of Foreign Language Teachers’ Assessment Literacy. Sakarya University Journal of Education 9: 628–50.
- Petridou, Alexandra, and Yasemina Karagiorgi. 2017. Validation of Greek Language Proficiency Tests “Milas Ellinika I” in Cyprus Context. Epistimes Tis Agogis 3: 80–99.
- Poehner, Matthew E., Kristin J. Davin, and James P. Lantolf. 2017. Dynamic Assessment. In Language Testing and Assessment. Edited by Elana Shohamy, Iair G. Or and Stephen May. Cham: Springer International Publishing, pp. 243–56.
- Popham, James W. 2009. Assessment Literacy for Teachers: Faddish or Fundamental? Theory into Practice 48: 4–11.
- Rea-Dickins, Pauline, and Catriona Scott. 2007. Washback from Language Tests on Teaching, Learning and Policy: Evidence from Diverse Settings. Assessment in Education: Principles, Policy & Practice 14: 1–7.
- Rea-Dickins, Pauline, and Sheena Gardner. 2000. Snares and Silver Bullets: Disentangling the Construct of Formative Assessment. Language Testing 17: 215–43.
- Sachdev, Sheetal B., and Harsh V. Verma. 2004. Relative importance of service quality dimensions: A multisectoral study. Journal of Services Research 4: 93–116.
- Scarino, Angela. 2013. Language assessment literacy as self-awareness: Understanding the role of interpretation in assessment and in teacher learning. Language Testing 30: 309–27.
- Shohamy, Elana G. 2009. Language Tests for Immigrants: Why Language? Why Tests? Why Citizenship? In Discourse Approaches to Politics, Society and Culture. Edited by Gabrielle Hogan-Brun, Clare Mar-Molinero and Patrick Stevenson. Amsterdam: John Benjamins Publishing Company, vol. 33, pp. 45–60.
- Shohamy, Elana G. 2017. Critical Language Testing. In Language Testing and Assessment, 3rd ed. Edited by Elana Shohamy, Iair G. Or and Stephen May. Cham: Springer, pp. 441–54.
- Shohamy, Elana. 2001. The Power of Tests: A Critical Perspective on the Uses of Language Tests. Language in Social Life Series. New York: Longman.
- Stecher, Brian, Tammi Chun, and Sheila Barron. 2004. The effects of assessment-driven reform on the teaching of writing in Washington State. In Washback in Language Testing: Research Contexts and Methods. Edited by Liying Cheng, Yoshinori Watanabe and Andy Curtis. Mahwah: Lawrence Erlbaum Associates, pp. 53–69.
- Taylor, Lynda, and Jayanti Banerjee. 2013. Communicating the theory, practice and principles of language testing to test stakeholders: Some reflections. Language Testing 30: 403–12.
- Tsagari, Dina, and Jayanti Banerjee. 2014. Language Assessment in the Educational Context. In The Routledge Handbook of Educational Linguistics, 1st ed. Edited by Martha Bigelow and Johanna Ennser-Kananen. New York: Routledge, pp. 339–52.
- Tsagari, Dina, and Karin Vogt. 2017. Assessment Literacy of Foreign Language Teachers around Europe: Research, Challenges and Future Prospects. Papers in Language Testing and Assessment 6: 41–63.
- Tsagari, Dina, and Liying Cheng. 2017. Washback, Impact, and Consequences Revisited. In Language Testing and Assessment. Edited by Elana Shohamy, Iair G. Or and Stephen May. Berlin: Springer, pp. 359–72.
- Tsagari, Dina. 2007. Review of Washback in Language Testing: What Has Been Done? What More Needs Doing? Available online: https://files.eric.ed.gov/fulltext/ED497709.pdf (accessed on 14 November 2021).
- Tsagari, Dina. 2009. The Complexity of Test Washback: An Empirical Study. Frankfurt am Main: Peter Lang GmbH.
- Tsagari, Dina. 2011. Investigating the ‘assessment literacy’ of EFL state school teachers in Greece. In Classroom-Based Language Assessment. Edited by Dina Tsagari and Ildikó Csépes. Frankfurt am Main: Peter Lang, pp. 169–90.
- Vogt, Karin, and Dina Tsagari. 2014. Assessment Literacy of Foreign Language Teachers: Findings of a European Study. Language Assessment Quarterly 11: 374–402.
- Watanabe, Yoshinori. 1996. Investigating Washback in Japanese EFL Classrooms: Problems of Methodology. Australian Review of Applied Linguistics. Supplement Series 13: 208–39.
- Watanabe, Yoshinori. 1997. The Washback Effects of the Japanese University Entrance Examinations of English: Classroom-Based Research. Ph.D. Thesis, University of Lancaster, Lancaster, UK.
- Watanabe, Yoshinori. 2008. Methodology in washback studies. In Washback in Language Testing: Research Contexts and Methods. Edited by Liying Cheng and Yoshinori Watanabe. Mahwah: Lawrence Erlbaum Associates, pp. 19–36.
- Zhang, Limei, and Kaycheng Soh. 2016. Assessment Literacy of Singapore Chinese Language Teachers in Primary and Secondary Schools. In Teaching Chinese Language in Singapore. Edited by Kaycheng Soh. Singapore: Springer Singapore, pp. 85–103.
| Factors (F) | Items | 1 | 2 | 3 | 4 | 5 |
|---|---|---|---|---|---|---|
| F1: GDLA’s Washback on Teaching and Assessment | Q.B16 Classroom time management | 0.884 | | | | |
| | Q.B17 Teaching material | 0.862 | | | | |
| | Q.B18 Differentiated teaching | 0.840 | | | | |
| | Q.B19 Skills/modalities to emphasize on | 0.736 | | | | |
| | Q.B20 Language testing and assessment | 0.710 | | | | |
| | Q.B14 Needs analysis | 0.626 | | | | |
| | Q.B15 Placement | 0.610 | | | | |
| F2: GDLA’s Usefulness and Credibility | Q.B11 Useful guidelines | | 0.823 | | | |
| | Q.B12 Valid and reliable scoring | | 0.781 | | | |
| | Q.B13 Easy scoring | | 0.771 | | | |
| F3: Feedback and Importance of Diagnostic Assessment | Q.B4 Feedback on skills in the oral modality | | | 0.801 | | |
| | Q.B5 Feedback on skills in the written modality | | | 0.786 | | |
| | Q.B3 Importance of diagnostic assessment | | | 0.636 | | |
| | Q.B6 Implementation of diagnostic assessment | | | 0.600 | | |
| F4: GDLA’s Alignment with the SL Curriculum | Q.B10 Alignment with the SL curriculum descriptors | | | | 0.838 | |
| | Q.B9 Alignment with the CGSL’s programmatic text | | | | 0.821 | |
| F5: Motivation in SL Teaching | Q.B1 Creativity | | | | | 0.904 |
| | Q.B2 Satisfaction | | | | | 0.901 |
| Variance Explained (%) | | 44.72 | 10.18 | 7.87 | 7.39 | 5.58 |
| Cumulative Variance (%) | | 44.72 | 54.90 | 62.77 | 70.16 | 75.74 |
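The varimax-rotated loadings and explained-variance figures above are of the kind that can be reproduced with standard statistical tooling. The following is a minimal, illustrative Python sketch and not the authors’ analysis script: it assumes a hypothetical wide-format file `responses.csv` with one row per teacher and columns `Q.B1`–`Q.B20` holding the 5-point Likert ratings, and it uses the open-source `factor_analyzer` package for a five-factor EFA plus a hand-rolled Cronbach’s alpha for internal consistency.

```python
# Illustrative sketch only -- NOT the authors' analysis code.
# Assumes a hypothetical CSV "responses.csv" with one row per teacher
# and columns Q.B1 ... Q.B20 containing 1-5 Likert ratings.
import pandas as pd
from factor_analyzer import FactorAnalyzer

df = pd.read_csv("responses.csv")
items = [f"Q.B{i}" for i in range(1, 21)]

# Five-factor exploratory factor analysis with varimax rotation,
# mirroring the structure reported in the table above.
fa = FactorAnalyzer(n_factors=5, rotation="varimax")
fa.fit(df[items])

loadings = pd.DataFrame(fa.loadings_, index=items,
                        columns=[f"F{i}" for i in range(1, 6)])
print(loadings.round(3))

# Proportion of variance explained per factor and the cumulative total.
_, prop_var, cum_var = fa.get_factor_variance()
print((prop_var * 100).round(2), (cum_var * 100).round(2))

def cronbach_alpha(data: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = data.shape[1]
    item_vars = data.var(axis=0, ddof=1)
    total_var = data.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Internal consistency of, e.g., the F1 items (Q.B14-Q.B20 in the table above).
f1_items = ["Q.B14", "Q.B15", "Q.B16", "Q.B17", "Q.B18", "Q.B19", "Q.B20"]
print(round(cronbach_alpha(df[f1_items]), 2))
```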
| Background Variable | Category | n | % |
|---|---|---|---|
| Q.A1 Education | Bachelor’s Degree in General Education | 115 | 49.1 |
| | Master’s Degree in General Education | 105 | 44.9 |
| | Master’s Degree in Greek as an SL | 14 | 6.0 |
| Q.A2 Years of Teaching in SL classes | Up to 2 years | 154 | 65.8 |
| | Up to 4 years | 53 | 22.6 |
| | 5 or more years | 27 | 11.6 |
| Q.A3 Hours of SL teaching per week | 0 to 6 hours | 153 | 65.4 |
| | 7 to 13 hours | 57 | 24.3 |
| | 14 or more hours | 24 | 10.3 |
| Q.A4 Types of SL classes | 1st graders | 34 | 14.5 |
| | 2nd–6th graders | 97 | 41.5 |
| | 1st–6th graders | 103 | 44.0 |
| Q.A5 Number of SL classes | Up to 2 classes | 134 | 57.2 |
| | Up to 4 classes | 65 | 27.8 |
| | 5 or more classes | 35 | 15.0 |
| Q.A6 Participation in SL Teacher Training Networks | Yes | 112 | 47.9 |
| | No | 122 | 52.1 |
| Factors | M | SD |
|---|---|---|
| F1 GDLA’s Washback on Teaching and Assessment | 3.56 | 0.75 |
| F2 GDLA’s Usefulness and Credibility | 3.78 | 0.85 |
| F3 Feedback and Importance of Diagnostic Assessment | 4.34 | 0.62 |
| F4 GDLA’s Alignment with the SL Curriculum | 3.73 | 0.77 |
| F5 Motivation in SL Teaching | 3.59 | 0.98 |
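Factor-level means and standard deviations of the kind reported above are conventionally obtained by averaging each respondent’s ratings over the items a factor comprises and then describing the resulting composite. The snippet below is a minimal sketch under the same hypothetical data layout as in the earlier example (file name and column names are assumptions, not the authors’ materials); the item-to-factor mapping follows the factor-analysis table.

```python
# Illustrative sketch only: factor-level composites as the mean of each
# respondent's ratings on the items the factor comprises (see the EFA table above).
import pandas as pd

factor_items = {  # mapping taken from the factor-analysis table; column names are assumed
    "F1": ["Q.B14", "Q.B15", "Q.B16", "Q.B17", "Q.B18", "Q.B19", "Q.B20"],
    "F2": ["Q.B11", "Q.B12", "Q.B13"],
    "F3": ["Q.B3", "Q.B4", "Q.B5", "Q.B6"],
    "F4": ["Q.B9", "Q.B10"],
    "F5": ["Q.B1", "Q.B2"],
}

df = pd.read_csv("responses.csv")  # hypothetical wide-format file, one row per teacher
composites = pd.DataFrame({f: df[cols].mean(axis=1) for f, cols in factor_items.items()})

# Sample mean and standard deviation per factor, analogous to the table above.
print(composites.agg(["mean", "std"]).round(2).T)
```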
| Background Variable | Category | n | Interviewees (I) |
|---|---|---|---|
| Education | Bachelor’s Degree in General Education | | |
| | Master’s Degree in General Education | 6 | I1, I2, I3, I4, I5, I6 |
| | Master’s Degree in Greek as an SL | | |
| Years of Teaching in SL classes | Up to 2 years | 3 | I3, I6 |
| | Up to 4 years | 2 | I2, I4, I5 |
| | 5 or more years | 1 | I1 |
| Hours of SL teaching per week | 0 to 6 hours | | |
| | 7 to 13 hours | 4 | I1, I2, I3 |
| | 14 or more hours | 2 | I4, I5 |
| Types of SL classes | 1st graders | | |
| | 2nd–6th graders | | |
| | 1st–6th graders | 6 | I1, I2, I3, I4, I5, I6 |
| Number of SL classes | Up to 2 classes | 1 | I2 |
| | Up to 4 classes | 3 | I1, I3, I6 |
| | 5 or more classes | 2 | I4, I5 |
| Participation in SL Teacher Training Networks | Yes | 6 | I1, I2, I3, I4, I5, I6 |
| | No | | |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Mitsiaki, M.; Kyriakou, N.; Kyprianou, D.; Giannaka, C.; Hadjitheodoulou, P. Washback Effects of Diagnostic Assessment in Greek as an SL: Primary School Teachers’ Perceptions in Cyprus. Languages 2021, 6, 195. https://doi.org/10.3390/languages6040195