Concept Paper

Towards a Framework to Support the Implementation of Digital Formative Assessment in Higher Education

by Sila Kaya-Capocci 1, Michael O’Leary 2 and Eamon Costello 3,*

1 Faculty of Education, Agri Ibrahim Cecen University, Agri 04100, Turkey
2 School of Policy & Practice, Dublin City University, D09DY00 Dublin, Ireland
3 School of STEM Education Innovation and Global Studies, Dublin City University, D09Y0A3 Dublin, Ireland
* Author to whom correspondence should be addressed.
Educ. Sci. 2022, 12(11), 823; https://doi.org/10.3390/educsci12110823
Submission received: 22 July 2022 / Revised: 23 September 2022 / Accepted: 13 November 2022 / Published: 17 November 2022
(This article belongs to the Special Issue Assessment and Evaluation in Higher Education—Series 2)

Abstract

This paper proposes a framework to support the use of digital formative assessment in higher education. The framework is informed by key principles and approaches underpinning effective formative assessment and, more specifically, by approaches to formative assessment that leverage the functionalities of technology. The overall aim is to provide a structured conceptualisation of digital formative assessment that supports the planning of lectures and other teaching and learning activities in higher education classrooms. At the heart of the framework is a 12-cell grid comprising 4 key formative assessment strategies (sharing learning intentions and success criteria, questioning and discussion, feedback, and peer- and self-assessment) crossed with 3 functionalities of technology (sending and displaying, processing and analysing, and interactive environments). These functionalities of technology serve as the basis for integrating digital tools into formative assessment for effective teaching and learning. For each cell in the grid, an exemplary digital formative assessment practice is described. The paper highlights the framework’s potential for enhancing the practice of digital formative assessment and its significance in light of the ongoing digital transformation, and it concludes by suggesting a programme of research that might be undertaken to evaluate its utility and impact in higher education contexts.

1. Introduction

Conceptual tools and frameworks in the literature to date have lacked clear visual supports and high-level guidance for educators on how to plan their lectures effectively around digital formative assessment. Frameworks are therefore needed to help educators engage in pedagogical planning for their unique classroom environments. This paper attempts to address this gap by helping educators use digital tools and formative assessment strategies intentionally to promote effective formative assessment. To that end, we propose a framework designed to help educators overcome a variety of challenges in pedagogical planning, such as knowing when and how to provide effective feedback [1,2].
Formative assessment has been defined as any classroom process that seeks to gather and use information about the current status of student learning with the intention of identifying what needs to be done to progress learning [3]. Digital formative assessment refers to formative assessment approaches that utilise “all features of the digital learning environment” to support the assessment of student progress [4]. However, questions remain about how to draw on concepts from both of these areas (digital technologies and formative assessment) so that educators can use, both conceptually and practically, elements that are often presented with different emphases in the two bodies of literature. Hence, we review some key research here, first to make these ideas accessible to the general reader and then to address the gap of how to integrate digital formative assessment, bringing specific elements of that research together through a conceptual framework. The framework proposed in this paper thus supports the use of digital formative assessment in higher education and is informed by key principles underpinning effective formative assessment and, more specifically, by approaches to digital formative assessment that enhance teaching and learning in higher education. In presenting a set of 12 exemplar practices as part of the framework, the intention is to help practitioners conceptualise, plan, and implement digital formative assessment in higher education classrooms. In this paper, we use digital technologies as a broad term covering digital learning environments, including virtual learning environments, although in some cases the former can be a subset of the latter. Through their functionalities, we point to what a teacher can practically do in any technological environment.
This paper draws on the outcomes of a research project funded by the European Commission under Erasmus+, which was designed to enhance the use of digital technologies for the assessment of transversal skills in Science, Technology, Engineering, and Mathematics (STEM) at the post-primary level (see www.atsstem.eu, accessed on 3 February 2020). The digital formative assessment framework developed to support this project [5] is repurposed in this paper for use in higher education contexts.
This paper is organised as follows. Following this introduction, the recent literature is examined in a non-systematic way [6] to review how formative assessment and digital formative assessment are being conceptualised across different educational contexts and the efficacy of different strategies and approaches that might be used to implement digital formative assessment. The literature review serves to highlight an important gap in terms of what is required to enhance the practice of digital formative assessment in higher education. In Section 3, the digital formative assessment framework is introduced and explained. Next, an in-depth discussion is presented on the possible contributions of the framework to enhancing the practice of digital formative assessment, including how it might be conceptualised and implemented in support of teaching and learning. After discussing why this framework should be used, the paper concludes with the potential implications for higher education, including related exemplars that can be implemented in higher education and a programme of research that might be undertaken to evaluate its efficacy in different higher education contexts.

2. Formative Assessment and Digital Formative Assessment in Higher Education

2.1. Formative Assessment and Its Strategies for Effective Implementation

Following the publication of a seminal work by Black and Wiliam in 1998 highlighting the impact of formative assessment on learning [7], much of the research conducted on the topic in the intervening years has been focused on primary and post-primary education (e.g., [8]). Wiliam and Thompson proposed five key strategies to conceptualise formative assessment [9]:
  • Clarifying and sharing learning intentions and criteria for success;
  • Engineering effective classroom discussions, questions, and learning tasks;
  • Providing feedback that moves learners forward;
  • Activating students as instructional resources for one another (peer assessment);
  • Activating students as the owners of their own learning (self-assessment).
Wiliam and Thompson believed that these five strategies can help establish where learners are in their learning, where they are going, and what needs to be done to get them there [9]. The literature supporting the use of formative assessment in higher education began to appear in the early 2000s, with commentators such as Yorke claiming that formative assessment had the potential to support the development of students’ employability by developing a range of skills, understandings, and personal attributes that employers value [10]. Others, such as López-Pastor and Sicilia-Camacho, identified further benefits of formative assessment, including increased student motivation, more responsible and autonomous learners, and the development of lifelong learning strategies [11]. Nicol and Macfarlane-Dick advocated that formative assessment, particularly feedback as one of the formative assessment strategies, should be used in higher education to raise understanding of learning and assessment and to empower learners to take control of their own learning [12]. The five strategies have also been taken up by many other researchers across education sectors. For example, Klapwijk and van den Burg investigated the impact of involving students in sharing and clarifying learning intentions related to 21st century skills in primary design and technology education, an area in which they found no prior research [13]. They found that this formative assessment strategy supports self-evaluation and feedback uptake when used in the context of real-life design projects. Jiang investigated how teachers used questioning as a formative assessment strategy to improve students’ thinking and language learning, as well as to inform decision making, in Chinese tertiary institutions [14]. Based on the findings, Jiang recommended the use of higher-order questions, as well as recall questions, to improve learning, and the more frequent use of group discussions to give each student an opportunity to express their ideas. Several researchers have examined the importance and implementation of feedback [15,16,17], finding that effective feedback can improve both student learning and motivation. Iglesias Pérez, Vidal-Puga, and Pino Juste explored the relationship between formative evaluation by the lecturer and students’ self- and peer assessments [18]. While they found strong agreement between peer and lecturer assessment, they identified only moderate agreement between self- and lecturer assessment, suggesting that peer assessment has higher validity and reliability than self-assessment. They also highlighted the need for conceptual clarification in relation to peer and self-assessment in higher education [18].
Significantly, the researchers mentioned in the previous paragraph, and others such as Medland [19], also highlighted several challenges for formative assessment in higher education, many of which resonate with those discussed by Bennett in the school context [20]. Of particular note are the lack of teacher knowledge and experience in implementing formative assessment and the lack of student experience with it. For example, López-Pastor and Sicilia-Camacho highlighted the need for further conceptual clarification around formative assessment in higher education [11], and the lack of teachers’ knowledge and planning in using formative assessment was also noted by Gikandi, Morrow, and Davis [2].
These studies show that although feedback has been well researched as a formative assessment strategy, the other strategies are less studied. Furthermore, gaps in teachers’ knowledge of how these strategies can be used, and in their planning of lessons around them, point to the need for a framework that can help educators engage in pedagogical planning.

2.2. Digital Tools to Use for and the Ways to Integrate Them into Digital Formative Assessment

Not surprisingly, the extent to which technology can be used to support formative assessment has been a feature of the literature, with, for example, Luckin, Clark, Avramides, Hunter, and Oliver [21] and O’Leary, Scully, Karakolidis, and Pitsia [22] highlighting its potential to improve the efficiency of many aspects of the process. Following a review of 32 studies published since 2010, McLaughlin and Yan highlighted two benefits of digital formative assessment: (1) a knowledge and achievement benefit resulting from more immediate and targeted feedback and (2) a complex cognitive processes benefit resulting from an enhanced capacity to self-regulate [23]. A benefit of digital formative assessment considered important by Bhagat and Spector [1] was the improved learning motivation of students. Barana and Marchisio found that teachers and students identified “anytime, anywhere availability” and prompt feedback among the many benefits of the automated digital formative assessment model used in two high schools [24].
Digital formative assessment can be highly context-specific [25]. Nevertheless, promising results relating to formative assessment strategies in digital formative assessment have been reported in studies conducted in various countries. For example, research by Petrovic, Pale, and Jeren with undergraduate engineering students in Croatia found that the participants used the available online formative assessments by choice and that elaborated feedback aided by digital formative assessment led to better exam results [16]. In South Africa, a study on peer assessment conducted by Baleni concluded that effective digital formative assessment encouraged student-centred learning through formative feedback and enriched learners’ commitment [26]. To implement formative assessment strategies effectively using digital tools, some studies focused on particular software: learning management systems (a term used interchangeably with virtual learning environments and course management systems) such as Moodle, Blackboard, Edmodo, Pocket Study, Edsby, Canvas, and WebCT, and classroom response systems (used interchangeably with clickers, personal response systems, student response systems, audience response systems, electronic voting systems, and classroom performance systems) such as Kahoot!, Socrative, Formative, and Plickers (for a review of approaches to digital formative assessment, see [23]). For example, Cohen and Sasson explored the role of online quizzes in Moodle (a popular learning management system) in a blended learning environment and concluded that Moodle can improve instructional design and formative assessment in higher education [27]. Elmahdi, Al-Hattami, and Fawzi investigated the effectiveness of using Plickers, a classroom response system tool, in improving learning [28]. Data collected via a questionnaire, including 3 open-ended questions, from 166 undergraduates in Bahrain indicated that using Plickers for formative assessment improved participation, saved time, provided feedback, gave equal participation opportunities, and created a fun and exciting learning environment. The researchers also noted potential issues, such as educators’ experience, network problems, and a lack of infrastructure in schools. Ismail et al. investigated the impact of Kahoot! and e-Quiz as digital formative assessment tools with 36 medical students in Malaysia [29]. The analysis of five focus group discussions suggested that the students found Kahoot! to be an attractive learning tool, learning guide, and source of motivation. The results found Kahoot! to be better than e-Quiz in motivating learners to study, creating a focus on important concepts, reflecting on learning, and ultimately increasing academic performance. Similarly, a study by Zhan et al. explored the impact of Mentimeter, Kahoot!, and Google+ with two experimental groups (n1 = 18, n2 = 15) and one control group (n3 = 20) during one semester of a general education course in China [30]. The researchers argued that the integration of digital or online technologies can increase the effectiveness of formative assessment and learners’ engagement by using low-cost tools to create interesting assessment tasks while enhancing meaningful interactions with the content, peers, and self. The results suggested using tools such as Mentimeter to clarify assignment criteria and Kahoot! to develop a sense of group belonging, while Google+ showed a lower impact.
A study conducted in Spain between 2014 and 2018 and reported by González-Gómez, Jeong, and Cañada-Cañada investigated the impact of digital formative assessment tools on the motivation and achievement of 311 second-year primary education undergraduates over 4 years [15]. Pre- and post-motivation surveys showed that the participants’ motivation towards the course and attitude towards science increased. Classroom assessments, lab assessments, exams, and post-test grades were analysed over the 4 years to determine the participants’ achievement, and the mean scores pointed to a notable increase in science achievement. An additional finding was that personalised feedback had a positive impact on participants’ motivation and achievement. However, the competency of the educator in integrating digital and online tools into day-to-day practice has been highlighted by, among others, Ismail et al. [29] and Zhan et al. [30]. These examples are specific instances of the findings of a recent systematic review of the field, which identified low-stakes quizzing as a particularly powerful approach [31].
What this selection of studies highlights is the range of technologies, research approaches, and study contexts at play. However, it is arguable that they do not address technology integration or formative assessment at a high level. Framed as a question that highlights a gap in this research: “How can these tools be integrated into digital formative assessment while using formative assessment strategies?” Building on the above formative assessment strategies and utilising the principles of design research [32], the Formative Assessment in Science and Mathematics Education (FaSMEd) initiative proposed a model describing ways of integrating technology into formative assessment. This model incorporates three functionalities of technology in the formative assessment process [33]:
  • Sending and displaying: These actions facilitate communication between the different actors in the formative assessment process, and they can be thought of as facilitating the elicitation and learner response processes. A classroom response system where learners reply to items using phones or tablets and where results are displayed for the class would be an example of this.
  • Processing and analysing: These actions are adopted when technology supports the interpretation phase of formative assessment, such as extracting or summarizing relevant data. An example of this would be a data dashboard summarizing learner performance against the stated criteria or in comparison to other learners.
  • Providing an interactive environment: These actions enable learners to work individually or collaboratively to explore content and may include features from the other two categories. Examples include specialised software that allows learners to explore simulations or geometrical drawings, plugins that enable formative self-quizzes to be embedded in instructional videos, and custom-designed environments for complex peer and group learning (e.g., Peerwise).
Each of these three technology functionalities may be useful for any of the formative assessment strategies discussed earlier.
While the literature is replete with calls for professional development in the area, it is also the case that frameworks that could be used to support such work in a practical way are lacking [34]. Indeed, what this brief review of several studies shows is the context-specific nature of operationalising effective formative assessment and the multifactorial nature of both the teaching itself and its evaluation. This issue of how teachers can tackle this complexity via conceptual framing is explored in the section to follow.

3. A Framework for Implementing Digital Formative Assessment in Higher Education and Its Contributions

This section proposes a conceptual framework to harness the potential of digital formative assessment in higher education and discusses its possible contributions, thereby adding to the emerging literature. It seeks to address an important gap in the field by providing guidance on the use of digital tools, utilising formative assessment strategies, and supporting educators with their implementation; this is the major contribution of this paper.
For the framework, we utilised the five key strategies proposed by Wiliam and Thompson [9] to conceptualise formative assessment and the three functionalities of technology as a means of integrating digital tools into formative assessment before, during, and after the teaching process. The FaSMEd project showed that integrating technology into formative assessment strategies can benefit students, peers, and teachers in the school context. Building on that finding, we propose a framework in Table 1 that brings these two strands (formative assessment and digital technologies) together to provide a useful way of thinking about how digital formative assessment might be implemented in a practical way in higher education.
In Table 1, Wiliam and Thompson’s [9] original five strategies are reduced to four, with peer and self-assessment combined to focus on this critical aspect of assessment carried out by someone other than the teacher. A conceptual framework necessarily tries to reduce information to create more general (and hopefully useful) abstractions. This reduction, however, means that certain aspects lose emphasis, and we return to this issue under the limitations of the framework, as peer and self-assessment are distinct concepts in their own right. The formative assessment strategies are numbered, while the digital technology functionalities are assigned letters. Hence, cell 1A refers, for example, to how learning outcomes can be displayed using digital technology.
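To make this structure concrete, the grid can be modelled as a simple data structure. The following is a minimal sketch of our own (illustrative only, not part of the framework materials themselves), showing how the 12 cell identifiers arise from crossing the two strands:

```python
from itertools import product

# The four formative assessment strategies (rows of Table 1).
STRATEGIES = {
    1: "Clarifying and sharing learning outcomes and success criteria",
    2: "Classroom discussion, questioning, and learning tasks",
    3: "Feedback",
    4: "Peer and self-assessment",
}

# The three digital technology functionalities (columns of Table 1).
FUNCTIONALITIES = {
    "A": "Sending and displaying",
    "B": "Processing and analysing",
    "C": "Interactive environment",
}

# Crossing the strands yields the 12 cell identifiers used in this paper.
CELLS = [f"{s}{f}" for s, f in product(STRATEGIES, FUNCTIONALITIES)]
print(CELLS)
# ['1A', '1B', '1C', '2A', '2B', '2C', '3A', '3B', '3C', '4A', '4B', '4C']
```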
The digital formative assessment framework aims to harness the potential of digital formative assessment in higher education. It supports educators in conceptualising digital formative assessment by providing guidance on the use of digital tools and the utilisation of formative assessment strategies, which helps them plan their practice and gives them a useful way of thinking about practical implementation in higher education. Although we believe our framework is useful and practical, we acknowledge that the challenges to digital formative assessment identified by other researchers are possible barriers to implementing its ideas. In the following, we therefore discuss the potential benefits, uses, and challenges of the framework in terms of conceptualising digital formative assessment, planning lectures accordingly, and implementing it in higher education.
This framework is a conceptual tool to help educators engage with their classroom environments. It also allows them to have conversations with colleagues about the enactment of its constituent elements, such as the formative assessment strategies and the functionalities of technology. However, promoting conceptual understanding of digital formative assessment and ensuring its effectiveness can be a challenge. To improve its effectiveness, a shared understanding of the learning goals and assessment criteria and their role in the digital formative assessment process should be promoted [2,30]. The framework we propose is a conceptual tool that can start a pedagogical conversation on such aspects. Educators cannot simply take this tool, tick all 12 cells in the framework, and expect an infallible or even coherent learning design. Rather, they should start by discussing the strategies in the rows, which are the cornerstones of effective formative assessment. One of the key ideas in the framework is to embrace all of the formative assessment strategies, rather than focusing only on feedback, as is commonly observed in the literature. For each strategy, they should then move across the columns to examine which digital tool functionalities might enable it. By promoting effective use of formative assessment strategies and technology functionalities, the framework also provides conceptual clarification for formative assessment practices, as suggested by López-Pastor and Sicilia-Camacho [11]. However, understanding how formative assessment functions within online and blended learning is challenging [2], and while our framework provides a structure that can be used for planning purposes, it offers limited practical guidance on how, for example, effective feedback should be structured.
Conceptual frames in the literature to date have lacked visual support and information for educators on how to plan their lectures effectively around digital formative assessment. This framework therefore aims to help educators engage in pedagogical planning for their own classroom environments: it can help them decide which digital tools and formative assessment strategies to use, and how to use them, to promote the effective and practical use of digital formative assessment. In such planning, our framework can help overcome a variety of challenges, such as providing effective feedback and knowing when and how to provide it for effective learning [1,2]. Almost all of the literature we reviewed referred to feedback in digital formative assessment as a means of improving learning achievement and motivation. Different aspects of feedback were examined, such as its type (how elaborate it is), its promptness, and the number of attempts allowed [16,27,35]. Yet other formative assessment strategies, such as the implementation of peer and self-assessment and the construction of learning intentions and success criteria, are not discussed as thoroughly. Peer and self-assessment can be further deconstructed. For example, giving feedback, rather than receiving it, in a peer formulation is said to have positive effects [36], which aligns peer and self-assessment along some dimensions. However, other research focuses on the unique characteristics of external feedback (from teachers, peers, parents, and computer-based systems) and does not consider self- and peer assessment together (see, for example, [37] and, as highlighted earlier, [18]).
To improve the effectiveness of formative assessment during teaching, our framework proposes focusing on all formative assessment strategies rather than only one. The framework gives educators ways of integrating these strategies into their teaching, for example, through sending and displaying, processing and analysing, and interactive learning environments. By doing so, we support educators in familiarising themselves with effective ways of using formative assessment strategies at different stages of the teaching and learning process. Other difficulties may also arise during the planning of lectures, such as developing a coherent practice and pedagogical discourse and acknowledging ethical principles in formative and divergent assessment processes [11]. It may be difficult to find a balance between summative and formative assessment in digital environments, and this may challenge educators to identify when and how to provide digital formative assessment for effective learning [1]; increasing educators’ conceptual understanding and providing them with new perspectives and practices of digital formative assessment may help to overcome this challenge. In this way, we contribute to the development of “teacher feedback literacy” and its interplay with student feedback literacy, as called for in the literature [34].
Concerns have also been raised about implementing digital formative assessment in both school and higher education contexts. While some of these issues relate to the learning environment, such as network problems and technical infrastructure [24,28,30], others relate to educators’ and students’ knowledge, skills, and attitudes. Concerning the learning environment, educators should be sensitive to issues such as internet connections and the number of digital tools available and aim to create a self-regulated, creative, valid, and reliable learning environment [2]. For example, if only one standard notation is accepted for mathematical formulas, students could lose all marks for a question because of a small notational mistake; this may decrease the validity of the approach as well as cause frustration that negatively affects learning [24]. This paper cannot fix such problems, but it can at least alert educators to these potential learning environment-related issues.
A lack of experience on the part of learners and educators regarding the effective use of digital formative assessment [11], and their unfamiliarity and lack of facility with digital and online tools [24,28,29,30], may also impede learning and negatively affect the implementation of digital formative assessment. For an impactful implementation, Medland, for example, suggested providing opportunities for students in higher education to familiarise themselves with the concepts of formative assessment, supporting the development of academic staff, and establishing the concepts internally at the institutional level [19]. Achieving such goals takes time and effort, and this may sometimes be perceived as an excessive workload, which can hinder the effectiveness of digital formative assessment [11]. However, to benefit from digital formative assessment and promote its effective implementation, we should keep in mind that the integration of technology into formative assessment is not the goal in itself [4,22]. Perceiving technology as just a tool for advancement may also be problematic. The features of technological tools have shaped our lives for a very long time; the human hand, for example, evolved in shape and function as people learned to use tools. Similarly, technological tools affect the assessment process for better and for worse. To ensure that digital formative assessment develops in beneficial ways, its full potential should be conceptualised.
The framework can also be used as a reflective device as well as a forward planning tool. Its strands can be overlaid on an existing practice or set of practices to find gaps or inefficiencies. For example, an educator may see that a particular way they use a tool in their teaching is hard to align with the framework, suggesting that the practice could be tweaked or changed entirely to ensure, firstly, that the core principles of formative assessment underpin learning enactments and, secondly, that technology is not being used just for its own sake but in one of the key ways digital technology can aid assessment, as per the column headings. Whilst we did say that users of the framework should start at the left and read to the right, and that formative assessment principles should underpin learning design, we also sound a note of caution that we are not giving pedagogy a simplistic primacy over technology. Rather, education and technology can be seen more as a dance [38] of two things tangled together. Educators will need to use the framework iteratively, switching back and forth between the pedagogical and technical dimensions to build learning designs. The decisions involved may not be straightforward, but the result should be sets of individually testable practices. In this respect, we aim to contribute to building assessment and feedback literacy, which researchers have advocated should be enabled via a pragmatic approach that “addresses how teachers manage the compromises inherent in disciplinary and institutional feedback practice” [34].
The framework presented here provides one possible way to structure conversations about building rich and effective formative assessment practices in digital environments. Ultimately, we call for researchers to build cultures of digital feedback literacy with an approach that is research-led, digitally centred, and practically useful for educators.

4. Proposed Use of the Framework in Higher Education

Applied Example of the Framework

To give an applied illustration, we provide examples across Table 1 of how digital formative assessment might be activated in a practical way in higher education. Although we name specific tools to make the examples concrete and vivid, we are not specifically advocating for those tools; the challenge for users of the framework is to translate these examples into tools they are already familiar with. Educators should draw on their existing skills where they can in the first instance. The framework may, however, reveal gaps in their technical skillsets, highlighting areas for professional development where a particular technique suggested by the examples is not in their existing toolset. The aim is for educators to use the lessons from the research, synthesised as best practice and brought to life via the examples in the framework, alongside real-time feedback from their students, their reflective capacity, and their classroom instincts. We advocate for engagement in teaching as something that is theory-informed, physically and digitally enacted, and an expression of individual creativity.
The examples across each of the 12 cells of Table 1 are presented in the following, crossing the two strands of digital technology functionalities and formative assessment strategies.
Cell 1A (1: Clarifying and Sharing Learning Outcomes and Success Criteria; A: Sending and Displaying)
Learning outcomes pertaining to each lecture are sent to learners through the learning management system or virtual learning environment at least one day prior and are also displayed on screen during the lecture.
Cell 1B (1: Clarifying and Sharing Learning Outcomes and Success Criteria; B: Processing and Analysing)
At the beginning of the lecture, learners are asked to review the learning outcomes and, using clicker devices or their mobile phones and online tools such as Poll Everywhere or Kahoot!, provide an indication of how challenging they think the learning outcomes will be for them.
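As a minimal illustration of the processing and analysing involved here, assuming the poll tool can export raw ratings (the outcomes and numbers below are invented), the responses might be summarised as follows:

```python
from collections import Counter
from statistics import mean

# Hypothetical poll export: each learner rates the anticipated difficulty of
# a learning outcome from 1 (straightforward) to 5 (very challenging).
responses = {
    "LO1: explain the principles of formative assessment": [2, 3, 4, 4, 5, 3],
    "LO2: distinguish formative from summative assessment": [1, 2, 2, 3, 2, 1],
}

# Report a mean and a rating distribution per outcome, so the educator can
# see at a glance which outcomes learners expect to find challenging.
for outcome, ratings in responses.items():
    dist = dict(sorted(Counter(ratings).items()))
    print(f"{outcome}: mean {mean(ratings):.1f}, distribution {dist}")
```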
Cell 1C (1: Clarifying and Sharing Learning Outcomes and Success Criteria; C: Interactive Environment)
Using an online shared writing space such as Google Docs, the educator and learners work together following the lecture to create success criteria for a particular learning outcome, which is then uploaded into the learning management system or virtual learning environment.
Cell 2A (2: Classroom Discussion, Questioning, and Learning Tasks; A: Sending and Displaying)
At least one day prior to the lecture, using the poll tool in the learning management system or virtual learning environment, an anonymous online quiz is posted to gauge learners’ current understanding of the topic. The educator begins class the next day by sharing the aggregated results of the quiz and then discussing where knowledge gaps exist or where a deeper understanding of the complexity of the topic is required.
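One possible way to operationalise the step of locating knowledge gaps is sketched below; the quiz data are hypothetical and the 70% threshold is an arbitrary choice for illustration:

```python
# Hypothetical export from the anonymous pre-lecture quiz: proportion of
# learners answering each question correctly.
quiz_results = {
    "Q1: define formative assessment": 0.92,
    "Q2: feedback versus grading": 0.61,
    "Q3: consequential validity": 0.34,
}

GAP_THRESHOLD = 0.70  # below this, flag the topic for class discussion

# Flag the questions where aggregate performance suggests a knowledge gap.
gaps = [q for q, p in quiz_results.items() if p < GAP_THRESHOLD]
for q in gaps:
    print(f"Discuss in class: {q} ({quiz_results[q]:.0%} correct)")
```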
Cell 2B (2: Classroom Discussion, Questioning, and Learning tasks; B: Processing and Analysing)
Following this discussion, the next day, a brief case study is made available in the learning management system or virtual learning environment, where the learners are required to apply their knowledge to identify the best of five possible solutions to a related problem. After selecting their chosen solution, the poll tool in the learning management system or virtual learning environment presents an aggregated summary of what other learners chose from the range of options.
Cell 2C (2: Classroom Discussion, Questioning, and Learning Tasks; C: Interactive Environment)
Having viewed the aggregated results of the above poll, where there was a range of responses across the options, learners are invited to post a brief comment to a discussion forum to explain why they chose their particular solutions. They are also asked to say whether, with the benefit of reflection and the opportunity of seeing other learners’ comments, they would like to revise their original choices.
Cell 3A (3: Feedback; A: Sending and Displaying)
After submitting a draft outline and proposed structure for a forthcoming written assignment through the learning management system or virtual learning environment, the learners receive by email a 2-min MP3 recording with personalised feedback from their educator. Also attached to the email are a rubric, which the audio clip explicitly refers to when providing guidance on the assessment criteria, and a grading schedule that will be used in assessing the next and final version of the written assignment.
Cell 3B (3: Feedback; B: Processing and Analysing)
After having returned all of the learners’ personalised audio clips, the following day, the educator shares on a large screen during class a colour-coded rubric which illustrates the overall strengths and weaknesses of the draft assignment outlines and where learners need to focus their attention in order to ensure they meet the performance criteria. This information allows learners to compare and contrast their own personalised audio feedback against that given to the class at large.
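A small sketch of how such a colour-coded summary might be computed, assuming criterion-level scores can be exported from the marking tool; the criteria, scores, and traffic-light thresholds below are invented for illustration:

```python
# Hypothetical class-level rubric data: mean score (0-1) per criterion
# across all draft assignment outlines.
criterion_means = {
    "Argument structure": 0.81,
    "Use of sources": 0.55,
    "Academic style": 0.38,
}

def traffic_light(score: float) -> str:
    """Map a criterion mean to a colour code for the class-level display."""
    if score >= 0.70:
        return "green"   # criterion broadly met by the class
    if score >= 0.50:
        return "amber"   # partially met; worth revisiting
    return "red"         # a clear focus area for the class

for criterion, score in criterion_means.items():
    print(f"{criterion}: {score:.0%} -> {traffic_light(score)}")
```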
Cell 3C (3: Feedback; C: Interactive Environment)
Following the above feedback on the strengths and weaknesses of the initial draft outlines and proposed structures, an optional online workshop is offered the next day through Zoom, where the educator is available to respond to questions, elaborate on how the criteria will be interpreted, and discuss ways in which learners can improve their performance in crafting the final written assignment.
Cell 4A (4: Peer and Self-Assessment; A: Sending and Displaying)
Leading up to a major assessment task, the educator creates a site for the course in Peerwise. This free online platform enables learners to develop their own questions and answers in a multi-choice format relevant to the course content. It also encourages them to write explanations of their answers to other learners’ questions in their own words which can be viewed by fellow classmates.
Cell 4B (4: Peer and Self-Assessment; B: Processing and Analysing)
Within the Peerwise online environment, learners can then rate the quality of their fellows’ questions and answers as they prepare for formal assessment tasks over the coming weeks and perhaps the final examination. This quality rating feature helps learners to identify better-quality questions and understand the answers in the words of other learners. It also provides a summary of how other learners have answered the questions in comparison to them.
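To illustrate the kind of processing and analysing described here (Peerwise provides such summaries itself; this sketch does not use its API, and all data are invented), peer ratings and a self-versus-cohort comparison might be computed as follows:

```python
from statistics import mean

# Hypothetical data: star ratings (1-5) given to peer-authored questions,
# the proportion of the cohort answering each correctly, and one learner's
# own answers.
question_ratings = {"Q101": [4, 5, 4], "Q102": [2, 3, 2], "Q103": [5, 4, 5]}
cohort_correct = {"Q101": 0.78, "Q102": 0.41, "Q103": 0.66}
my_correct = {"Q101": True, "Q102": False, "Q103": True}

# List the better-rated questions first, then compare self against cohort.
for q in sorted(question_ratings, key=lambda k: mean(question_ratings[k]), reverse=True):
    status = "correct" if my_correct[q] else "incorrect"
    print(f"{q}: avg rating {mean(question_ratings[q]):.1f}, "
          f"cohort {cohort_correct[q]:.0%} correct, you were {status}")
```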
Cell 4C (4: Peer and Self-Assessment; C: Interactive Environment)
Extending the use of Peerwise, the educator creates a dedicated discussion forum in the learning management system or virtual learning environment for learners to discuss some of their answers and seek feedback from peers on where they are still grappling with some of the questions or even disagree with the correct answers provided through the platform. They can also form their own study groups by posting requests to this forum.
The framework proposed here is not exhaustive. As one reviewer of this paper helpfully pointed out, the activities in each cell are not necessarily mutually exclusive and could happen at the same time. A weakness of the framework is that it could misrepresent the complexity of the classroom, but equally, its strength is that it can help to simplify and provide focus to practically plan and execute evidence-informed pedagogies.

5. Conclusions

In this paper, digital formative assessment was conceptualised as including all features of the digital learning environment that support the flow of information used to modify teaching and learning. The incorporation of technology into assessment is presented as a means to an end and not a goal in its own right. It is argued that digital formative assessment must enhance teaching and learning and be supported by validating evidence, for it is seriously compromised if it does not improve learning (the consequential validity argument).
A good deal of the commentary reviewed was conceptual in nature and lacked specificity in terms of implementation. Frameworks that could be used to support educators in developing effective classroom practices through digital formative assessment were missing, as were empirical studies focused on how digital formative assessment could be used to improve learning outcomes in higher education. As a first step in addressing these lacunae, the digital formative assessment framework presented here was developed to provide educators with a way of (1) conceptualising digital formative assessment by providing information on the formative assessment strategies and functionalities of technology, (2) planning lectures that integrate digital tools with formative assessment in a structured, logical way, and (3) integrating practical digital formative assessment practices into the classroom. The paper concludes by providing examples that show the framework’s potential for enhancing the practice of digital formative assessment and for moving from theory to practice.
Moving forward, those adopting the digital formative assessment framework must do so with their own individual higher education contexts in mind and should be prepared to utilise digital tools that go beyond the exemplars provided. Over time, it is hoped that the framework will be the focus of empirical trials that present evidence of its utility and practicality as a device for implementing digital formative assessment. Studies will need to be conducted on which digital formative assessment approaches are most commonly used and which need to be promoted. Most importantly, it should be possible to gather consequential validity evidence by using the 12-cell framework presented in Table 1 to develop a measurement scale that can link digital formative assessment implementation in higher education with learning outcomes.
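As a purely illustrative sketch of how the 12-cell grid might seed such a measurement scale, the following generates one hypothetical Likert-style item per cell; the wording is ours, and any real instrument would require proper validation:

```python
from itertools import product

# Strategy and functionality phrasings, reworded from Table 1 so that each
# generated statement reads naturally. These are hypothetical draft items.
strategies = {
    1: "clarify and share learning outcomes and success criteria",
    2: "run classroom discussion, questioning, and learning tasks",
    3: "provide feedback that moves learners forward",
    4: "support peer and self-assessment",
}
functionalities = {
    "A": "sending and displaying",
    "B": "processing and analysing",
    "C": "interactive environment",
}

# One draft scale item per cell, to be rated from 1 (never) to 5 (always).
for (s, strategy), (f, func) in product(strategies.items(), functionalities.items()):
    print(f"[{s}{f}] I use {func} tools to {strategy}.")
```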

Author Contributions

Conceptualisation, M.O. and S.K.-C.; literature review, S.K.-C.; framework development, M.O.; applied examples, E.C. and M.O.; discussion, E.C. and S.K.-C.; writing—original draft preparation, S.K.-C.; writing—review and editing, M.O. and E.C. All authors have read and agreed to the published version of the manuscript.

Funding

Co-funded by the Erasmus+ Programme of the European Union. Grant no. 606696-EPP-1-2018-2-IE-EPPKA3-PI-POLICY/2018-3262/007.

Institutional Review Board Statement

The study did not require ethical approval.

Informed Consent Statement

Not applicable.

Data Availability Statement

The study did not report any data.

Acknowledgments

Mark Brown (Chair in Digital Learning and Director of the National Institute for Digital Learning (NIDL), DCU) is thanked and acknowledged as a critical friend for his feedback on drafts of the manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Bhagat, K.K.; Spector, J.M. Formative assessment in complex problem-solving domains: The emerging role of assessment technologies. J. Educ. Technol. Soc. 2017, 20, 312–317.
  2. Gikandi, J.W.; Morrow, D.; Davis, N.E. Online formative assessment in higher education: A review of the literature. Comput. Educ. 2011, 57, 2333–2351.
  3. Assessment Reform Group. Assessment for Learning: 10 Principles. Research-Based Principles to Guide Classroom Practice Assessment for Learning. 2002. Available online: https://www.hkeaa.edu.hk/doclibrary/sba/hkdse/eng_dvd/doc/Afl_principles.pdf (accessed on 3 February 2020).
  4. Looney, J. Digital Formative Assessment: A Review of the Literature. 2019. Available online: http://www.eun.org/documents/411753/817341/Assess%40Learning+Literature+Review/be02d527-8c2f-45e3-9f75-2c5cd596261d (accessed on 30 November 2019).
  5. How to Design an ATS STEM Implementation. Available online: https://www.atsstem.eu/ (accessed on 3 October 2020).
  6. Huelin, R.; Iheanacho, I.; Payne, K.; Sandman, K. What’s in a Name? Systematic and Non-Systematic Literature Reviews, and Why the Distinction Matters. The Evidence Forum, May 2015; pp. 34–37. Available online: https://www.evidera.com/wp-content/uploads/2015/06/Whats-in-a-Name-Systematic-and-Non-Systematic-Literature-Reviews-and-Why-the-Distinction-Matters.pdf (accessed on 18 March 2019).
  7. Black, P.; Wiliam, D. Assessment and Classroom Learning. Assess. Educ. Princ. Policy Pract. 1998, 5, 7–74.
  8. Lane, R.; Parrila, R.; Bower, M.; Bull, R.; Cavanagh, M.; Forbes, A.; Jones, T.; Leaper, D.; Khosronejad, M.; Pellicano, L.; et al. Formative Assessment Evidence and Practice Literature Review; AITSL: Melbourne, Australia, 2019. Available online: https://www.lpofai.edu.au/media/u5ahfia0/literature-review.pdf (accessed on 30 November 2019).
  9. Wiliam, D.; Thompson, M. Integrating Assessment with Learning: What Will It Take to Make It Work? In The Future of Assessment: Shaping Teaching and Learning; Dwyer, C.A., Ed.; Routledge: New York, NY, USA, 2008; pp. 53–82.
  10. Yorke, M. Formative Assessment in Higher Education: Its Significance for Employability, and Steps towards Its Enhancement. Tert. Educ. Manag. 2005, 11, 219–238.
  11. López-Pastor, V.; Sicilia, A. Formative and shared assessment in higher education. Lessons learned and challenges for the future. Assess. Eval. High. Educ. 2017, 42, 77–97.
  12. Nicol, D.J.; Macfarlane-Dick, D. Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Stud. High. Educ. 2006, 31, 199–218.
  13. Klapwijk, R.; van den Burg, N. Involving students in sharing and clarifying learning intentions related to 21st century skills in primary design and technology education. Des. Technol. Educ. Int. J. 2020, 25, 8–34.
  14. Jiang, Y. Exploring Teacher Questioning as a Formative Assessment Strategy. RELC J. 2014, 45, 287–304.
  15. González-Gómez, D.; Jeong, J.S.; Cañada-Cañada, F. Examining the Effect of an Online Formative Assessment Tool (OFAT) of Students’ Motivation and Achievement for a University Science Education. J. Balt. Sci. Educ. 2020, 19, 401–414.
  16. Petrović, J.; Pale, P.; Jeren, B. Online formative assessments in a digital signal processing course: Effects of feedback type and content difficulty on students learning achievements. Educ. Inf. Technol. 2017, 22, 3047–3061.
  17. Winstone, N.E.; Boud, D. The need to disentangle assessment and feedback in higher education. Stud. High. Educ. 2020, 47, 656–667.
  18. Iglesias Pérez, M.C.; Vidal-Puga, J.; Juste, M.R.P. The role of self and peer assessment in Higher Education. Stud. High. Educ. 2020, 47, 683–692.
  19. Medland, E. Assessment in higher education: Drivers, barriers and directions for change in the UK. Assess. Eval. High. Educ. 2016, 41, 81–96.
  20. Bennett, R.E. Formative assessment: A critical review. Assess. Educ. Princ. Policy Pract. 2011, 18, 5–25.
  21. Luckin, R.; Clark, W.; Avramides, K.; Hunter, J.; Oliver, M. Using teacher inquiry to support technology-enhanced formative assessment: A review of the literature to inform a new method. Interact. Learn. Environ. 2017, 25, 85–97.
  22. O’Leary, M.; Scully, D.; Karakolidis, A.; Pitsia, V. The state-of-the-art in digital technology-based assessment. Eur. J. Educ. 2018, 53, 160–175.
  23. McLaughlin, T.; Yan, Z. Diverse delivery methods and strong psychological benefits: A review of online formative assessment. J. Comput. Assist. Learn. 2017, 33, 562–574.
  24. Barana, A.; Marchisio, M. Ten Good Reasons to Adopt an Automated Formative Assessment Model for Learning and Teaching Mathematics and Scientific Disciplines. Procedia Soc. Behav. Sci. 2016, 228, 608–613.
  25. Winstone, N.E.; Balloo, K.; Carless, D. Discipline-specific feedback literacies: A framework for curriculum design. High. Educ. 2020, 83, 57–77.
  26. Baleni, Z.G. Online formative assessment in higher education: Its pros and cons. Electron. J. e-Learn. 2015, 13, 228–236.
  27. Cohen, D.; Sasson, I. Online quizzes in a virtual learning environment as a tool for formative assessment. J. Technol. Sci. Educ. 2016, 6, 188–208.
  28. Elmahdi, I.; Al-Hattami, A.; Fawzi, H. Using Technology for Formative Assessment to Improve Students’ Learning. Turk. Online J. Educ. Technol. 2018, 17, 182–188.
  29. Ismail, M.A.-A.; Ahmad, A.; Mohammad, J.A.-M.; Fakri, N.M.R.M.; Nor, M.Z.M.; Pa, M.N.M. Using Kahoot! as a formative assessment tool in medical education: A phenomenological study. BMC Med. Educ. 2019, 19, 230.
  30. Zhan, Y.; Sun, D.; Chan, N.C.; Chan, K.W.; Lam, T.S.; Lee, T.H. Enhancing learning engagement through formative e-assessment in general education foundation course tutorials. In Blended Learning for Inclusive and Quality Higher Education in Asia; Springer: Singapore, 2021; pp. 281–300.
  31. Morris, R.; Perry, T.; Wardle, L. Formative assessment and feedback for learning in higher education: A systematic review. Rev. Educ. 2021, 9, e3292.
  32. Swan, M. Design Research in Mathematics Education. In Encyclopedia of Mathematics Education; Lerman, S., Ed.; Springer: Dordrecht, The Netherlands, 2014.
  33. European Commission. Formative Assessment in Science and Mathematics Education (FaSMEd) Summary Report; European Commission: Brussels, Belgium, 2016. Available online: https://cordis.europa.eu/docs/results/612/612337/final1-final-fasmed-summary-report-final.pdf (accessed on 5 June 2018).
  34. Carless, D.; Winstone, N. Teacher feedback literacy and its interplay with student feedback literacy. Teach. High. Educ. 2020, 1–14.
  35. Hettiarachchi, E.; Mor, E.; Huertas, M.A.; Guerrero-Roldán, A.E. Introducing a Formative E-Assessment System to Improve Online Learning Experience and Performance. J. Univers. Comput. Sci. 2015, 21, 1001–1021.
  36. Nicol, D.; Thomson, A.; Breslin, C. Rethinking feedback practices in higher education: A peer review perspective. Assess. Eval. High. Educ. 2014, 39, 102–122.
  37. Narciss, S. Conditions and effects of feedback viewed through the lens of the interactive tutoring feedback model. In Scaling Up Assessment for Learning in Higher Education; Springer: Singapore, 2017; pp. 173–189.
  38. Anderson, T.; Dron, J. Three generations of distance education pedagogy. Int. Rev. Res. Open Distrib. Learn. 2011, 12, 80–97.
Table 1. A framework for conceptualising, planning, and implementing digital formative assessment in higher education. Columns A to C are the digital technology functionalities.

| Formative Assessment Strategies | A. Sending and Displaying | B. Processing and Analysing | C. Interactive Environment |
|---|---|---|---|
| 1. Clarifying and Sharing Learning Outcomes and Success Criteria | 1A | 1B | 1C |
| 2. Classroom Discussion, Questioning, and Learning Tasks | 2A | 2B | 2C |
| 3. Feedback | 3A | 3B | 3C |
| 4. Peer and Self-Assessment | 4A | 4B | 4C |

