Students’ Perception of the Use of a Rubric and Peer Reviews in an Online Learning Environment
Abstract
1. Introduction
2. Materials and Methods
2.1. Participants
2.2. Instruments
2.3. Procedure
2.4. The Survey
2.5. Data Analysis
3. Results
Question Identifier | Question | Strongly Agree (1) | Agree (2) | Neutral (3) | Disagree (4) | Strongly Disagree (5)
---|---|---|---|---|---|---
**Category A: Peer assessment performance** | | | | | |
Q1 | My performance improved over the five peer assessment opportunities. | 14% | 40% | 26% | 17% | 2%
**Category B: Clearer learning outcomes and summative assessment requirements** | | | | | |
Q2 | The opportunity to be involved in peer assessment helped develop my content knowledge of what was expected in the final assessments. | 19% | 45% | 26% | 5% | 5%
Q3 | The rubric has helped me understand the learning outcomes and final assessment requirements. | 19% | 52% | 17% | 7% | 5%
Q4 | I clearly understand the learning outcomes and final assessment requirements following the opportunity to engage in peer assessment. | 19% | 26% | 29% | 21% | 5%
**Category C: Better structure and quality writing** | | | | | |
Q5 | The rubric has helped structure my writing and improve my writing style. | 29% | 43% | 19% | 5% | 5%
Q6 | I have developed the ability to write analytically, using evidence to support my answers. | 31% | 43% | 19% | 5% | 2%
**Category D: Deep approach to learning and critical thinking** | | | | | |
Q7 | I have developed the ability to analyse content and think critically about the subject knowledge. | 24% | 45% | 21% | 7% | 2%
Q8 | The peer assessment helped me with a deeper understanding of the subject knowledge of the module. | 17% | 48% | 26% | 5% | 5%
**Category E: Better analysis, application and synthesis of theory and concepts** | | | | | |
Q9 | The peer assessment and rubric have helped apply the subject knowledge to real-life situations. | 19% | 36% | 36% | 5% | 5%
Q10 | I feel confident in answering essay-type questions for this module following the peer-assessment experience. | 10% | 36% | 31% | 14% | 10%
4. Discussion
- “I followed the rubric and used all the elements, which improved my work” (Participant #41).
- “Upon self-reflection, I could reanalyse my submitted work and dedicate more time to investigating my errors and, as such, try to improve on the identified weak areas …” (Participant #3).
- “Through self-reflection, I could reanalyse my own work that was submitted. I dedicated time to understanding my own errors and tried my best to improve” (Participant #33).
- “I followed the rubric, used all the elements given, and this improved my work and the structure of my writing” (Participant #41).
- “I started using the structure of introduction, linking theory to the scenario, and concluding. The rubric helped me structure my writing” (Participant #35).
- “I followed the rubric and ensured I had an introduction, body and conclusion, checked my spellings and punctuation, and applied the concepts to the scenario instead of regurgitating the theory” (Participant #5).
- “Yes, I understood all the content from the textbook, but applying the content to the scenarios helped me develop a deeper understanding of the content” (Participant #41).
- “Yes, they (rubric and peer assessment) required deep analysis of the module, and the rubric helped with steering me in the correct direction in terms of answering and analysing” (Participant #24).
- “It (the rubric) has developed my critical thinking ability and taught me how to apply such a skill in the real-life application as well as in different scenarios like essay-type questions” (Participant #9).
- “Yes … I used just to paraphrase content from the textbook, forgetting that evidence is crucial in the proper application. Now I can say it assisted me with thinking deeper about the content and giving analytical responses relating to the question at hand” (Participant #27).
- “Yes, I could use the textbook knowledge on real-life situations” (Participant #25).
- “It helped me because I realised I had to think a little bit beyond just what the textbook says; I need to apply my thoughts on whatever it is about and base my answers on that” (Participant #5).
- “Yes. I had a better understanding of the work without forgetting” (Participant #6).
- “Yes, because I knew that after stating a statement, I had to justify it using the given passage” (Participant #29).
- “Yes, I was able to link the theory to the scenario without losing the context of the answer” (Participant #22).
- “I would answer or explain what is being asked without relating it to the scenario…” (Participant #48).
- “I would use anything but without evidence and linking theory to the case study/scenario given …” (Participant #17).
- “It forced me to think critically about what I learned and how I could apply it to a given situation in a way that was both efficient and beneficial” (Participant #17).
- Another student (Participant #27) indicated that “applying the rubric forced him to think creatively in formulating his answers for the long questions”.
- “The reviews should be conducted by lecturers and not students” (Participant #15).
- “Lecturers should mark the long questions” (Participant #43).
- “It has to be evaluated by the lecturers before the marks are given to students because some students were lazy when they carried out the reviews” (Participant #12).
5. Conclusions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
Question Identifier | Open-Ended Questions
---|---
Qa | How did you answer the essay-type questions before the introduction of the rubric?
Qb | How did you go about answering the essay-type questions after the introduction of the rubric?
Qc | Have the peer assessment and rubric helped you analyse and apply the subject content? If yes, how did they help you?
Qd | Have the peer assessment and rubric helped you think critically about the subject content and develop a deeper understanding? If yes, how did they help you?
Qe | Any suggestions for improvement of the peer review assessment process?
| Themes |
|---|
| Clear performance criterion |
| Structured writing |
| Deep approach to learning and critical thinking |
| Better analysis and application of theory |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations. |
© 2022 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Mphahlele, L. Students’ Perception of the Use of a Rubric and Peer Reviews in an Online Learning Environment. J. Risk Financial Manag. 2022, 15, 503. https://doi.org/10.3390/jrfm15110503