 
 
Article

The Potential to Build Collective Capacity for Organisational Learning in the Context of Teachers’ Use of Digital Technology for School Improvement

by
Susila Davis-Singaravelu
Undergraduate Admissions and Outreach, University of Oxford, Oxford OX1 2JD, UK
Educ. Sci. 2022, 12(1), 33; https://doi.org/10.3390/educsci12010033
Submission received: 12 November 2021 / Revised: 23 December 2021 / Accepted: 3 January 2022 / Published: 7 January 2022

Abstract

This article considers how digital spaces focused on whole school improvement combined with supportive leadership may be mobilised towards building collective capacity for evidence-informed practice and organisational learning. This topic originated from a qualitative, multi-method design-based research (DBR) project that studied practitioners’ use of an online resource for primary school practitioners called Pathways for school improvement, designed by Oxford University Press (OUP). Semi-structured interviews, participant observations and a documentary analysis were conducted with teachers and senior leaders in five primary schools across England between 2014 and 2016. Connections were made with the dynamic approach to school improvement (DASI) that encourages practitioners to systematically engage with a variety of evidence in their reflections and efforts to design school and classroom improvement strategies. Pathways’ four-step system and series of systematised tasks under each step seemed to provide opportunities for practitioners to explore elements of theory and practice in conjunction with empirical and pupil performance data, and potentially guide them through how to collaborate with others in developing specific whole school approaches to improvement. Opportunities and challenges in developing collective capacity for improvement are also explored.

1. Introduction

This article explores school practitioners’ use of digital technology for school improvement and teachers’ professional development and learning. The study applied the dynamic approach to school improvement, DASI [1], as a lens to investigate how teachers and school leaders compiled and used different forms of evidence to inform their school improvement practice. The research focused on how practitioners both collaboratively and more autonomously engaged with a mixture of resources that may have included an online platform called OUP Pathways for school improvement [2]. Levin and Schrum [3] (p. 651) argue that technology implemented without accompanying professional development (informal or formal) relies on teachers to learn and explore new teaching practices on their own time, likely resulting in “uneven use of technology for instruction and only pockets of change”. Bakkenes, Vermunt and Wubbels [4] report that teachers considering their practice in more informal learning environments, compared to more structured settings (peer coaching and collaborative project groups), recounted more instances of experiencing negative emotions and continuing their old or existing practices. They also found fewer instances of experimentation, surprise and the development of new ideas. Messmann and Mulder [5] (p. 81), however, observed that work activities that support innovation such as “idea generation” (including “adapting ideas from external resources”) and “idea promotion” relied on more informal approaches, including social interactions and negotiation to move strategies forward. ‘Learning’ and ‘working’ are interlinked, although individuals’ learning goals or how learning is organised can vary [6]. OUP Pathways seemed to function both as a formal space for learning and knowledge building and as a space where more structured and unstructured conversations could take place centred on a unifying topic. The digital space appeared to focus practitioners’ attention on an ‘issue’ and provide a means, via document templates and a shared language, with which to communicate with others on that issue. In contrast to other educational technology-related projects that tend to focus more on students’ perspectives, this study’s unique contribution to knowledge lies in its investigation of teachers’ school improvement practices and professional learning in the context of online technology, combined with the application of the DASI in practical, authentic settings.
As explained later in the Methodology section, data for this study were collected between 2014 and 2016. Considering current global events, it is important, therefore, to acknowledge the change in the national context pre- and post-COVID-19. Weiner et al. [7] (p. 1) point to the extraordinary “scale and rapidity with which educators [have] had to respond to school closures and fundamentally shift all aspects of their work” during the ongoing pandemic. Simultaneously, however, there is early evidence of “emergency education models” formulated during the pandemic also being considered (for better or worse) as exemplars for propagation post-COVID-19 [8] (p. 109), such as using online learning to enhance ‘creativity’ and formative feedback [9]. Although it is acknowledged that technology developments are fast-moving and the present article provides a context-specific analysis of the use of Pathways at ‘a point in time’, the study also offers opportunities to explore more enduring aspects of school improvement that are of interest regardless of the type of technology available. These include the potential development of capacities for learning in organisations and the concept of ‘school improvement spaces’ that go beyond the actual digital technology itself—looking, for instance, at practitioners’ individual and collective approaches both in and around a shared problem—which remain persistent concerns for school improvement research. With this in mind, a description of Pathways is provided next, followed by a review of the literature pertinent at the time that the research was conducted.

Overview of Pathways

OUP Pathways is a resource that sits on a bigger online platform for schools and families called Oxford Owl [10]. Pathways is organised into 23 “issues”:
  • Ten focusing on “whole-school improvement”: “Assessment for Learning” (provided for free until 2017), “Managing and Implementing the National Curriculum” (currently still free), “Assessment without Levels”, “Closing the Gap”, “Developing Best Practice in the Early Years”, “Effective Governance”, “Effective Self-Evaluation”, “Outstanding SEND and Inclusion Practice”, “Parental Engagement” (very recently provided for free) and “Raising Boys’ Achievement”; and
  • Thirteen focusing on “teaching and learning”, including nine English Pathways (“Building an Outstanding Reading School”, “Developing Early Writing”, “Guided Reading”, “Improving Big Writing Practice”, “Improving Grammar”, “Improving Phonics”, “Improving Spelling”, “Improving Writing” and “Teaching Comprehension”) and four dedicated to mathematics (“Mastering Mathematics”, “Mathematical Reasoning”, “Number and Calculation” and “Problem Solving in Mathematics”).
Each issue or Pathway contains four steps (“audit”, “strategic planning”, “take action” and “evaluate impact”), which are in turn divided into smaller “tasks” made up of tools such as templates, exemplars and educational research summaries. Tools include, for example, surveys to conduct with pupils and staff (e.g., audits of existing strategies deployed to solve specific problems), which aid practitioners’ work; PowerPoint presentations with ideas for communicating potential changes to teachers, pupils, parents and governors; and improvement plan templates and exemplars based on learning from other schools. Every Pathway also contains a section entitled “Background Research” outlining research evidence that, for instance, helps to define the “issue”, summarises historical and contemporary education policies and expectations, and suggests strategies to mitigate the issue in different primary school settings. Users can create their own login to Pathways or share a whole school account if they prefer. Each subscription costs £499 regardless of the number of users per school and includes all Pathways.
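To make the structure just described easier to follow, the sketch below models the issue, step, task and tool hierarchy as a simple nested data structure. It is purely illustrative: the class names, fields and example contents are assumptions made for this sketch, not OUP’s actual data model or terminology.

```python
# Illustrative sketch only: one way to represent the issue -> step -> task -> tool
# hierarchy described above. Names and contents are placeholders, not OUP's schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Tool:
    name: str   # e.g., "Staff questionnaire"
    kind: str   # e.g., "template", "exemplar", "research summary"

@dataclass
class Task:
    title: str
    tools: List[Tool] = field(default_factory=list)

@dataclass
class Step:
    # One of the four steps named in the text.
    name: str   # "audit", "strategic planning", "take action" or "evaluate impact"
    tasks: List[Task] = field(default_factory=list)

@dataclass
class Pathway:
    issue: str                      # e.g., "Closing the Gap"
    category: str                   # "whole-school improvement" or "teaching and learning"
    steps: List[Step] = field(default_factory=list)
    background_research: str = ""   # summary of research evidence and policy context

# Hypothetical example instance based on the description above.
closing_the_gap = Pathway(
    issue="Closing the Gap",
    category="whole-school improvement",
    steps=[
        Step("audit", tasks=[Task("Survey staff and pupils",
                                  tools=[Tool("Staff questionnaire", "template")])]),
        Step("strategic planning"),
        Step("take action"),
        Step("evaluate impact"),
    ],
)
print(len(closing_the_gap.steps))  # 4
```

Modelling a Pathway this way simply underlines that every issue shares the same four-step skeleton, with tools attached at the level of individual tasks.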
The study was framed by the principles of design-based research (DBR). Its exploratory nature called for a more open approach to data collection, one that emphasised the opportunities afforded by individual and collective meaning-making. As such, the more formatively structured DBR methodology was chosen. Collins, Joseph and Bielaczyc [11] (p. 16) put forward four essential factors that support the use of DBR in the context of studying approaches to learning and development, i.e., the need:
  • To address theoretical questions about the nature of learning in context;
  • For approaches to the study of learning phenomena in the real world rather than the laboratory;
  • To go beyond narrow measures of learning; and
  • To derive research findings from formative evaluation.
The use of DBR dovetailed with the need for educational effectiveness and improvement research to move ahead with insights into technology:
“The field of school improvement is, almost by definition, a multi-year endeavour, yet both hardware and software are changing at rates that make longitudinal study of their effects almost inherently irrelevant.”
[12] (p. 19)
The three research questions addressed by the study are:
  • What are practitioners’ views and perceptions, arising from their use of Pathways, of the functionality of the platform and its tools?
  • What role does practitioners’ use of Pathways (and other related resources) play in supporting individual schools’ improvement journeys and approaches, and in supporting practitioners’ professional development experiences?
  • How were Pathways and its constituent components designed against the background of the school improvement and school effectiveness literature, and how are they evolving in light of schools’ and practitioners’ experiences of using Pathways?
This paper discusses RQ1 and RQ2. The next section outlines some of the theoretical and practical aspects associated with schools’ and Pathways’ approaches to school improvement.

2. Literature Review

2.1. School Improvement and Effectiveness

School improvement has been recognised as a series of overlapping processes taking place within a collective endeavour that can “significantly enhance the quality of teaching and learning” [13] (p. 5). In addition, school improvement may involve strategies that encompass valuable lessons and signposts that schools can access in order to construct their own facilities and capacities for improvement, and a means of developing a professional learning community in which teachers and pupils learn and progress together. Combining these characteristics, school improvement has also been framed as:
“[A] distinct approach to educational change that enhances student outcomes as well as strengthening the school’s capacity for managing change.”
[14] (p. 3)
School improvement can also be seen as a path of growth or maturation over time that is dynamic in nature and responsive to internal and external stimuli [15], including the dissemination and exchange of knowledge or ‘propagation’, after which organisational growth may continue into different initiatives and the cultivation of new ideas [16] (pp. 29–32). The notion of schools and ideas as ‘living organisms’ encourages researchers and practitioners to consider educational institutions as evolving structures and settings where strategies are being tested and applied in vivo, and hence require due care and attention in the event that they ‘fall ill’ and become weak. Theoretical models of school improvement such as the “dynamic model of educational effectiveness” [17] seek to forge stronger ties between educational effectiveness research (EER) and improvement practices [18]. The metaphor of the ‘living’ or developing organisation is extended in a more material sense with the dynamic approach to school improvement or DASI [1] which, by the authors’ account, provides schools and practitioners with more than just a model of the different factors that are associated with learning outcomes [19]. The DASI, supported by a body of empirical evidence testing the validity of the dynamic model [20,21],
“stresses the importance of collecting data about the functioning of factors at the classroom and school level to identify teacher and school improvement needs, respectively. In this way, an evidence-based and theory-driven approach to improvement can be gradually developed.”
[18] (p. 105)
Each factor in the DASI is defined along five dimensions, where frequency quantitatively measures the “functioning” of that factor, and focus, stage, quality and differentiation qualitatively measure the functioning of effectiveness at a classroom/school/system level [18]. This links closely with what Silins and Mulford [22] refer to as the “reframing” of schools as learning organisations, and with Louis, Toole and Hargreaves’s [23] characterisation of organisations’ examination of and learning from past practices as ambiguous and unclear. This invokes the topic of organisational learning (OL), or the “detection and correction of error” [24]. One component that has been linked with OL is that of knowledge management (KM) systems, often associated with information technology and seen as a means of supporting continuous OL [25] and self-evaluation, providing the ‘feedback loop’ necessary for schools to build and sustain their capacity for change [26]. In the current COVID-19 climate, Huber and Helm [27] (p. 239) point to the relevance of ‘barometer surveys’, for example, which rapidly assess perceptions and opinions from multiple perspectives and aim to “serve the heterogeneous informational needs of different target groups”. KM systems can take the form of mechanisms to, for example, formulate curriculum goals or analyse student attainment data. In the context of Pathways, the potential for such a platform to support the collection of audit, feedback and evaluation data from different sources presents opportunities to explore what Pathways is designed to achieve and how examples of implementation of that design look in practical settings. Bain and Swan [26] (p. 676) are not alone in pointing out that “moving from intention to application is always a difficult transition”, and they go on to discuss ideas around the design of feedback tools to enable school improvement, such as “solution mapping”, which share similarities with school self-evaluation.
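As a purely illustrative aside, the five dimensions along which each DASI factor is measured can be sketched as a simple record. The field names follow the description above; the example values are hypothetical and are not drawn from the DASI literature or from this study’s data.

```python
# Minimal illustrative sketch, not an implementation from the DASI literature:
# a record capturing the five dimensions along which each effectiveness factor
# is defined (frequency measured quantitatively; focus, stage, quality and
# differentiation judged qualitatively).
from dataclasses import dataclass

@dataclass
class FactorMeasurement:
    factor: str            # e.g., "teacher questioning" (hypothetical example)
    level: str             # "classroom", "school" or "system"
    frequency: float       # quantitative measure of how often the factor operates
    focus: str             # qualitative judgement of specificity and purpose
    stage: str             # qualitative judgement of when the factor operates
    quality: str           # qualitative judgement of the properties of the factor
    differentiation: str   # qualitative judgement of adaptation to different groups

example = FactorMeasurement(
    factor="teacher questioning",
    level="classroom",
    frequency=0.7,
    focus="specific, single-purpose",
    stage="throughout the lesson",
    quality="open questions with follow-up",
    differentiation="adapted to prior attainment groups",
)
print(example.level, example.frequency)
```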
Chapman [28] (p. 32) organises school self-evaluation (SSE) efforts at three levels— “within schools”, “between schools” and “beyond schools”—focusing on the structures and processes available to support “teaching, learning and capacity for change” across practitioner, organisational and systemic boundaries. MacBeath [29] (p. 2) defines SSE as:
“[O]ngoing, embedded in the day-to-day work of classroom and school, formative in character, honest in its assessment of strengths and weakness, rigorous in its concern for evidence.”
MacGilchrist, Myers and Reed [30] (p. 330) also refer to three levels of SSE that collectively contribute towards the “intelligent school”, namely the “micro” (which focuses on the quality of learning in classrooms), “meso” (which examines the role of the school and its management procedures in improving student outcomes), and “macro” (which considers how all the different forms of knowledge feed into the ‘big picture’). Antoniou, Myburgh-Louw and Gronn [31] (p. 192) describe SSE as an “ongoing quest for evidence” that requires a positive self-evaluative culture, one where teachers can [32] (p. 7):
  • Generate questions and engage in regular dialogue amongst their peers and pupils about learning;
  • Develop a critical interest in current research about learning and use the knowledge to reflect on their own practice; and
  • Be willing to take a step back from their own position and make more systematic use of research evidence to explore their individual classroom practice and learning context (rather than relying on unexamined assumptions).
In practical terms, improvement-type processes can ‘fire up’ following a ‘spark’ such as SSE or the observation of a school’s particular functions or ways of working, including a response to inspection judgments in England [33,34,35]. Reeves [36] (p. 59) outlines the following steps for the development of research and design solutions using DBR principles:
  • The examination of practical problems by researchers and practitioners working in partnership;
  • The development of responses to problems underpinned by existing design principles and technological innovations;
  • Iterative cycles of testing and the refinement of those responses; and
  • Regular reflection to inductively produce ‘design principles’ and further enhance solution implementation.
Meanwhile, the major steps of DASI [18] (p. 109) set out for school stakeholders and the “advisory and research team” are:
  • Clearly establishing and agreeing on the main aims of school improvement and on student learning being the main function of the school;
  • Addressing the many contextual school and classroom factors that can influence teaching and learning;
  • Collecting evaluation data to help identify improvement priorities, for example, via an analysis of SSE;
  • Formulating school improvement strategies based on available knowledge and evidence;
  • Undertaking formative evaluation via implementation monitoring; and
  • Undertaking summative evaluation via the measurement of the impact of the DASI.
Comparing Reeves’ design principles [36] (p. 59) with the major steps of the DASI, several parallels can be drawn. Both configurations:
  • Call for an analysis and understanding of the context and nature of the practical problem of interest by researchers and practitioners.
  • Propose the collection of data and strategy design to develop possible solutions to the problem.
  • Call for reflection to enhance strategy implementation.
The models differ slightly in their starting points; the DASI asks practitioners to establish “clarity and consensus” on the general aims of school improvement [1] and puts student learning at the centre of any strategy or intervention developed in the process, whereas the DBR approach may apply to a whole host of different problems, including student outcomes. Equally, similarities can be found between the DASI and organisational design models that incorporate some form of ‘redesign’ in their approach, e.g., “contingency theory” [37], “organisational ambidexterity” [38], and “design rules” [39]. Each model calls for a good understanding of the organisation and for adaptation, change, redesign and the achievement of an equilibrium with environmental and contextual factors that may influence the organisation’s performance [40].
Leask and Younie [41] (p. 279) reflect on how “national knowledge management initiatives” to systematise educational research evidence in the past have created a fractured array of smaller ‘networks’ that remain disconnected from each other. Leask and Preston [42] suggest that the critical characteristics identified by teachers for effective online spaces include being able to search across different kinds of research evidence (not just systematic reviews, meta-analyses and randomised controlled trials but other types of smaller-scale evidence) and to work with peers interested in similar issues. These characteristics present potential design implications for a resource such as Pathways (which is not part of the ‘national policy’ but has national coverage). Although English schools can currently access a variety of databases to source research evidence such as the Teaching and Learning toolkit [43], EPPI Centre [44], University of York [45] and Evidence-Based Teachers Network [46], there is still no comprehensive portal of research evidence that brings different types of studies together, which “poses challenges for the future funding and management of databases” [41] (p. 281). It may then be useful to view Pathways, or any other online resource with a similar remit, as a kind of ‘extended’ environment or space that encompasses people, places and artefacts, both on- and offline. For example, Pathways may provide a collective space for practitioners to take a step back and consider different types of evidence, reflect on their practice, and investigate the opportunities and possible problems of different types of experimentation, akin to professional learning communities [47,48], personal learning environments [49] and communities of practice [50]. Opportunities to enhance these aspects seem ever more relevant during COVID-19, which has seen teachers more physically and emotionally isolated in their homes [51].

2.2. Organisational Learning (OL): Contingency Theory

Effective schooling is recognised as a dynamic, enduring process [18]. The notion that to be considered ‘effective’, schools are expected to adapt to changing contexts is consistent with contingency theory [37,52] which also underpins the dynamic model [18,53]. Contingencies may include the environment, the size of the organisation and the organisational strategies being deployed [37]. The main tenet of contingency theory states that the corresponding ‘fit’ between organisational characteristics and contingencies leads to high performance; as such, organisations are constantly seeking ways to adapt and attain the ‘best fit’ in changing circumstances.
The idea of adaptation corresponds to organisations such as schools attempting to reach a temporary ‘equilibrium’ or “fit”; for example, when a school is in Special Measures, operations may become more hierarchical and mechanistic in order to adapt to, and seek to stabilise, a likely unstable and uncertain state after the judgment is given. This pattern is also consistent with the “lines of success” identified by Day et al. [54] (p. 185) to illustrate leadership strategies applied at different points in schools’ improvement journeys, where successful leaders are seen to “enact leadership practices in contextually appropriate forms”, such as “setting directions, redesigning the organisation, developing people and use of data” [55] (p. 578). When a school emerges from the ‘cloud’ of Special Measures or is in the midst of aiming for an ‘outstanding’ Ofsted grade, the school may be experiencing more in the way of stability, and so can afford to ‘destabilise’ somewhat through experimentation and a more participatory approach in order to innovate. The timing of when to switch between mechanistic and organic modes of operation may vary. Moreover, contingency theory links with how a school builds and stores capacity for change and enhances student outcomes (in line with its improvement goals), particularly as the school’s intake changes and educational policy is reformed. The DASI represents the process of schools, including those identified as most effective, trying to attain ‘fit’ within changing sets of contingencies [18,19].
OL has its limits. Commenting on revelations from research by Argyris and Schön [21] (pp. 39–40) such as letting “buried failures lie” and avoiding the ‘surfacing’ or testing of opposing views in organisational conflict, Scheerens [56] (p. 19) warns of what may lie beneath the “less contested” areas of school self-evaluation: “resistance, immunisation against potential criticism, and barriers to organizational learning”. Creemers and Kyriakides [1] (pp. 165–166), meanwhile, stress the importance of school stakeholder buy-in to any school improvement project:
“It is important to acknowledge that each member of the school may have his/her own views on what the priorities for improvement are. However, after the results of the [SSE] have identified a priority area for improvement, all stakeholders are expected to show a willingness to work on this area.”
Luyten, Visscher and Witziers [57] (p. 271) recommend that school effectiveness research (SER) should focus less on the “well-known key variables” and instead try to draw more on theory to help “explain specific phenomena” that can also adapt to individual school settings and contexts. They give an example around “organisational functioning” and how the development of theories in this area might be useful:
“[T]he concept of ‘‘the learning organisation’’ implies a causal chain; the sustainability of an organisation relies on innovation and improvement that, in turn, can be attained only by unlocking individual potential and enhancing commitment by creating favourable organisational conditions.”
[57] (pp. 271–272)
Reflecting on the current pandemic, Weiner et al. [7], for instance, raise the issue of “psychological safety” [58], defined as the degree to which people feel comfortable speaking up and asking for help; this in turn may affect an organisation’s capacity for organisational learning. Weiner et al. [7] (p. 2) note the dependence of psychological safety on “organizational factors and specifically, differences in accountability, principal autonomy, professional culture, and teacher decision-making”, all of which are relevant to platforms and resources such as Pathways.

2.3. Building Collective Capacity for Improvement

School-wide capacity for improvement can be defined as:
“[S]chool conditions that support teaching and learning, enable the professional learning of the staff, and provide a means for implementing strategic actions aimed at continuous school improvement.”
[59] (p. 97)
Hargreaves [60] (p. 22) puts forward a model for organisational capacity in which individual-level “intellectual capital” (knowledge and experience) and “social capital” (mutual trust) merge to generate organisational capacity and leadership. This framework lends support towards notions of how collective capacity might be built in schools; essentially, it harnesses the combined efforts, skills and expertise of individuals and teams to develop the whole organisation. However, there is less mention in this framework of intra-organisation conflict and of differences in individuals’ understandings, such as those between “teachers’ belief systems” and “knowledge systems” [61,62]. In some instances, however, tension and conflict may also act as catalysts for organisational change [37]. Matthews and Sammons [34] (p. 18) point to post-inspection questionnaires indicating headteachers’ recognition of the value of Ofsted’s “external assessment of the quality, strengths and weaknesses” of schools, and how “improvement through inspection” should not be misinterpreted as the more causal-driven idea of “improvement by inspection”. What seems crucial here is the opportunity for ‘gestalt’ type combinations of capital that may potentially contribute to whole school ‘thinking’ and improvement. This fits with the idea of school principals as “professional leaders” rather than “head educators” [63] (p. 241). Sammons et al. [64] (p. 4) report that although overly bureaucratic and overlapping management layers hindered progress, effective staff development “supported the development of collaborative cultures and practice by removing barriers to multi-agency working”. Caldwell [65] lists characteristics commonly associated with the “gestalt” approach to “innovation in professionalism”: networking and teamworking within and beyond school settings, and professional knowledge based on existing educational research and other evidence rather than reliance on individuals’ conceptions of knowledge alone. This approach is also consistent with Vangrieken et al.’s [66] “teacher collaboration continuum”, which moves from completely individualised working to collaborative working in its fullest form, and with Brouwer et al.’s [67] conception of teacher teams evolving into work communities. Additionally, considering the ongoing pandemic, Darling-Hammond and Hyler [68] (p. 457) highlight the heightened importance of school leaders establishing strategies that focus on “transforming educator professional learning opportunities” and creating spaces for critical reflection and collaboration.
Sleegers and Leithwood [69] identify two perspectives commonly used in the theoretical and empirical literature on school improvement and educational change: the “inside” view which focuses on the internal capacity of schools to create supportive contexts for teacher learning and educational change; and the “outside” view which focuses on the implementation efforts that take place externally to reform schools and systems. Thoonen et al. [70] highlight the distinction between these two perspectives using two approaches, first identified by Chin and Benne [71]:
  • The normative-re-educative approach, which is the larger process of understanding how teachers “work and live through individual and collective reflection on beliefs and practices”; and
  • The empirical-rational approach, which assumes that teachers, as rational actors, will “implement changes in their classrooms which are demonstrated to improve student learning” [70] (p. 443).
The DASI [1] considers both perspectives where, in a sense, change is co-constructed by practitioners, leaders and other school stakeholders (via internal transformations and capacity for change) in order to meet the goals and demands of an external model, which includes a process of mutual adaptation [70] (p. 444). These two strands appear to combine, as proposed by Hargreaves [60], to form a kind of organisational leadership ‘collective’. In studying both schools’ and OUP’s perspectives, the research outlined in this paper aims to bridge not only the fields of school improvement and educational digital technologies, but also (to a smaller extent) these ‘inside’ and ‘outside’ perspectives that have emerged in the author’s own perspective and DBR approach. Next, the methodology of the study is outlined.

3. Methodology

The Pathways study employed multiple methods in the qualitative paradigm, including: investigating the perspectives of school users and designers using semi-structured interviews with teachers, leaders and OUP staff; unstructured observations of staff and steering group meetings; and the analysis of documents and performance data from schools, Ofsted and OUP. Each method was applied while working with schools as they made decisions on their use of Pathways for distinct improvement purposes. Pathways was potentially viewed as a remote ‘light touch’ advisory and research team (in support of the DASI model).
Data were collected from five primary schools and OUP between 2014 and 2016. This encompassed over 30 h of interviews and 20 h of observations, and included numerous documents created and adapted by practitioners (in parts using Pathways), existing documents from each school (school improvement plans, action plans, Ofsted reports and so on) and over 50 instances of journal entries and analytical memos. A total of 28 interviews with schools and OUP were transcribed and imported into NVivo. These can be broken down into the following numbers of interviews per site (some of which also contain observational data): Magpie (11), Raven (4), Sparrow (5), Jackdaw (2), Jay (2) and OUP (4). Jackdaw and Jay Schools decided a year into the project not to engage with Pathways, and hence have fewer data sources.
In addition to NVivo being used to code data sources, the software was used as a repository to store data from the different study sites. Holding all data in one location allowed the author to code the different documents using a common set of ‘nodes’ and themes, some of which emerged more inductively from the raw data, some more deductively as concepts from the literature, and some as a hybrid of inductive and deductive approaches. Different tactics were used to generate meaning and the cross-case synthesis, with reference to DBR and DASI-focused questions and based on guidelines offered by Miles, Huberman and Saldaña [72] (p. 277):
  • Noting patterns and themes to cluster the characteristics of practitioners’ engagement with Pathways in the context of their schools’ improvement plans;
  • Making metaphors, contrasts and comparisons, subsuming particulars from individual schools and practitioners into the general, and making sense of everyday language and practice alongside educational theory; and
  • Building a logical chain of evidence (in schools’ improvement ‘journeys’), finding intervening components and relationships and mismatches, and drawing together a ‘story’ of schools before, during and after Pathways.
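To illustrate the hybrid coding approach described above, the sketch below organises a handful of ‘nodes’ by their origin (deductive concepts from the literature, inductive themes from the raw data, or a hybrid of the two) and groups coded data sources by node to support cross-case comparison. The node names and sources are hypothetical examples invented for this sketch; they are not the study’s actual NVivo codebook or data, and the structure is not NVivo’s internal data model.

```python
# Minimal illustrative sketch of a hybrid coding scheme (hypothetical node
# names and sources; not the study's actual NVivo codebook).
from collections import defaultdict

# Each node is tagged with its origin: "deductive" (from the literature),
# "inductive" (from the raw data) or "hybrid".
codebook = {
    "DASI: evaluation of teaching and learning environment": "deductive",
    "SSE and audit activity": "deductive",
    "leadership support and workload protection": "inductive",
    "Pathways as a shared space for discussion": "hybrid",
}

# Coded segments reference a data source and one or more nodes.
coded_segments = [
    {"source": "Interview_SchoolA_2014", "nodes": ["leadership support and workload protection"]},
    {"source": "Observation_SchoolB_2015", "nodes": ["SSE and audit activity",
                                                     "Pathways as a shared space for discussion"]},
]

# Group sources by node to support a simple cross-case comparison.
by_node = defaultdict(list)
for segment in coded_segments:
    for node in segment["nodes"]:
        by_node[node].append(segment["source"])

for node, sources in by_node.items():
    print(f"{node} ({codebook[node]}): {sources}")
```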
Each school was given a pseudonym to keep schools’ and practitioners’ identities anonymous. As the wider platform being studied is called Oxford Owl [10], every school was given an appropriate name of a bird. From this point on all schools will be referred to by their pseudonyms: Magpie, Raven, Sparrow, Jackdaw and Jay. Names of research participants will usually follow the convention of ‘Title’ ‘Bird name’ (e.g., the headteacher of Magpie School is Mrs. Magpie). Where there is more than one main actor per organisation, initials based on their job role or other defining characteristic will be used to identify participants. Table 1 shows each school’s characteristics at the time of the research. Magpie School had the highest percentage of pupils with English as an additional language (EAL) status and with special educational needs (SEN) statements or education, health and care (EHC) plans. Magpie was also an autism resource base (ARB) for their local authority. The schools with the highest percentages of Pupil Premium (PP) pupils were Raven and Jay, which were well above the national average.

4. Findings: Cross-Case Analysis

4.1. Pathways as a Potential Tool for Task ‘Mediation’ and Enabling Leadership Capacity

Pathways appeared to offer several opportunities for ‘mediated autonomy’ within at least two of the five schools in the study. Magpie’s headteacher, Mrs. Magpie, distributed a portion of leadership activities and roles, and in the process, a kind of individual, professional ‘autonomy’, via the strategic use of the Teaching and Learning Responsibility 3 scheme (TLR3s). TLR3s, a government fixed-term teaching and learning allowance initiative that began in 2013 [74], provided a means to remunerate early career teachers (Ms. FET2, Ms. KS1 and Ms. KS2) to manage individual Pathways as part of an integrated school improvement plan and to foster professional development. The headteacher explained the rationale behind the TLR3 and how it was “perfect” for what was required of the Managing and Implementing the New National Curriculum Pathway started a year before, also emphasising the important role of leadership:
“You have to have it [Pathways] strategic from the top. The person chosen [Ms. FET2] was not a senior leader but was felt to be ready to take on a project. She came to the SLT [Senior Leadership Team] meetings and staff meetings to deliver. It raised her self-esteem, her career; built her confidence. She’s in her third year of teaching—she will go on to do something great.”
(Mrs. Magpie_MS_Dec-2014)
OUP, as an external arbiter of educational research evidence and school improvement resources (a role that might have formerly been fulfilled by Magpie’s local education authority) was seen as a reliable resource to support the staff at Magpie through their improvement plans. Hence, the ‘mediation’ to the school improvement efforts that were being made was partly furnished by the headteacher and by Pathways as the digital technology, which both served as sources of guidance, support and direction or ‘leadership’.
“It’s not just coming from me. It’s a whole professional body like OUP […] ‘Here’s a questionnaire, here’s an audit’, much easier for me. Launch at the staff meeting, [we will] use [the] Parental Engagement booklet to work alongside. That’ll be my tool for the INSET [in-service training] day.”
(Mrs. Magpie_MS_Aug-2015)
Raven, a school judged as ‘good’ by Ofsted in 2014 after being put in ‘Special Measures’ in 2012, used the system, templates and links to materials as discussion pieces among members of the senior leadership team and, in the Closing the Gap Pathway, to present research evidence and strategies around the PP to the whole school staff. Questions in surveys, for example, became agenda items and open-ended sources of inquiry among the headteacher and her team. Mrs. Raven also appeared to place a high value on the very precise and focused nature of the MAID analysis tool, a Pathways proforma designed for SSE and audit-type tasks that asks Pathways users to list strategies or activities to maintain, amend, introduce and discard:
“We’ve done lots of talking, but we need hard data, [I asked my team] ‘can you put something together?’ In Sept for planning, we’ll see what was successful, what can we do again, replicate, plan do review cycle. MAID—we do it now, over May half term, where are we up to with our SIP [School Improvement Plan].”
(Mrs. Raven_RS_Jun-2015)
Mrs. Raven and her team seemed to be employing elements of the DASI (without citing or necessarily being aware of the approach), specifically around the evaluation of the teaching and learning environment; for example, they examined more closely the opportunities for learning in different pupil groups and what could potentially be improved (by both adding and taking away). The use of Pathways for staff learning and development and, again, the modelling of WAGOLL (what a good one looks like) for her teachers were highlighted:
“[My SLT] looked at the gaps of achievement between children with PP [Pupil Premium] and not, how many PP children access clubs, we had an intervention class in Easter, baselining in April and updating last week, she’s done that and produced a report. If it’s always me, it doesn’t get spread around. I’ll get different resources, staff training from Pathways, use with staff.”
(Mrs. Raven_RS_Jun-2015)
Mrs. Raven listed the resources taken from the Closing the Gap Pathway and adapted them to Raven’s requirements:
“[1] Used PowerPoint for the staff meeting, [2] posters for display, [3] questions for governors, [4] timetables, [5] staff questionnaires. Took us up to about April. In the summer term, [6] [PP] expenditure—a ‘getting ready kit’ for Ofsted. [7] Summary statements that go on the website, ready for Ofsted.”
(Mrs. Raven_RS_Jun-2015)
Mrs. Raven went on to describe a review carried out by the deputy headteacher of an intervention for PP children using Leuven scales:
“You can see the difference between March and June. What she looked at then was all children, Pupil Premium and reading, writing, maths scores of their attainment and progress. At reception, who’s made expected progress, 82%, [PP] 81%. Writing 84% all, 88% [PP], so we have made an impact and ditto in maths, 93 vs. 94 […] We can look at progress vs. attainment, and talk about impact of interventions. We also look at impact on the wellbeing scores.”
(Mrs. Raven_RS_Jun-2015)
By contrast, in Sparrow School, the headteacher was less active in managing Pathways. One of the school’s teachers recognised the potential merits of Pathways as a resource for her own professional development (before Pathways materials could be applied towards Sparrow’s overarching aim of improving their pupils’ attainment in maths).
In Magpie School, the research project also appeared to provide a ‘canvas’ to experiment on and sketch out possible strategies to improve specific aspects of parental engagement and reading. As a researcher and external observer, the author was privy to conversations discussing pupils’ and parents’ questionnaire results, staff meetings and action plans organised around Pathways’ resources, and attempts to solve practical problems involving communication to parents and staff, and had access to school information.
“What pulls this together is two people who are really keen [Ms. KS1 and Ms. KS2], with an action plan! [Mrs. Magpie smiled at the two teachers.] And great having [the author], we can’t just say we haven’t done anything since the last time you came. It’s someone who leads it from the outside.”
(Mrs. Magpie_MS_Feb-2016)
As in Raven School, Pathways appeared to spark ideas in team discussions in Magpie School, but in contrast to Raven, Magpie participants adhered closely to the four-step structure even after starting their third Pathway. Again, similar to Raven, Pathways was used in conjunction with other resources such as the Key [75] and evidence sought from other organisations. Magpie’s leadership support for their continued use of, and trust in, Pathways seemed strong. Early career classroom teachers expressed feeling enabled to act more autonomously and develop their own professional practice while collaborating with other practitioners, using Pathways materials as a basis for wider communication and debate. This mode of working was described as fulfilling and as one that promoted more transparency in their communications with colleagues and in decision-making procedures.
For example, a Pathways questionnaire was sent out to parents in Magpie School asking for their views on how to improve communications with them a few weeks after the interview above. The results were reported as revealing but unsurprising:
  • Homework was raised as a “big thing”:
“There’s not much scope for parents to engage. They wanted more work to do with them [their children]. We’re thinking of projects, sheets of instructions… We’ve gone around the teachers and asked them which parents are unwilling to engage, and created a list. Next step is to share it with staff and ask who’s worked with these families. From the questionnaires, we have a list of parents who said they want to help with science, we’ll give those names to the science coordinators.”
(Ms. KS1_MS_Jan-2016)
  • The use of the somewhat problematic phrase ‘unwilling to engage’ perhaps indicated a mode of communicating with parents predicated more on teachers’ terms than families’. However, after learning about parents’ views on homework, Ms. KS1 acknowledged:
“Homework—after speaking to teachers, it’s more of an after-thought. The other thing that came up was not being communicated to enough. It’s quite hard. We have newsletters every short term (six times a year). Telling parents what topics are being covered, maths, English, science. They want a much more detailed description… We are going to do an interim questionnaire online and hopefully get a bigger response.”
(Ms. KS1_MS_Jan-2016)
It was decided in January that a workshop for parents would be held in March (a ‘stay and learn’ lesson demonstration), based on ideas derived from Pathways. In February, it transpired that (some) parents may not have been keeping up with school events, including those close to home:
“Parents don’t read the newsletter. I just found out a dinner lady who has a child here—she didn’t know an event was happening—Monday’s stay and learn [in March]. So, we are now making personal invitations!”
(Mrs. Magpie_MS_Feb-2016)
Mrs. Magpie and her team, without necessarily being aware of the DASI, seemed to be promoting the design of their school improvement initiatives based on different forms of evidence. Mrs. Magpie appeared to consider the different factors and dimensions of school effectiveness from the perspective of different stakeholders in Magpie School, in supporting the collection of data from pupils, parents and teachers on the school’s functioning. Every time we met, Mrs. Magpie, Ms. KS1 and Ms. KS2 appeared to have moved a previous school improvement project further with a new survey or by trialling a new approach with their parents, and then eliciting information from the people involved about how particular strategies might be improved, slowly moving towards engaging with parents on their terms, rather than teachers’. Most of all, it came across fairly strongly that Mrs. Magpie was trying to develop a professional community among the teachers who were managing Pathways and their colleagues, including teaching assistants, in engaging with educational research and taking different approaches to implement action and evaluation. In addition, the team really seemed to engage with the author and Pathways as a kind of advisory and research team composite and to participate in conversations about school improvement research. The opportunity to tangibly share the workload of a project, in combination with a supportive senior leadership team and school ethos, was also raised in relation to the two teachers working together on Pathways:
“[Mrs. Magpie] will give us time out of our working day to work on this. Having two of us working on it—especially this kind of Pathway, it’s made for two people! ([Ms. KS2 interjects:] Bouncing ideas off, getting each other started early in the morning, reminding each other what the next steps are etc.!)… It would take a lot more time outside of school hours. Now we just have quick PPA [Planning, Preparation and Assessment] time to talk to each other.”
(Ms. KS1_MS_Feb-2016)
Raven’s leadership support manifested more in the form of workload protection; Pathways’ resources were still used but by proxy (that is, not credited explicitly by the headteacher to other staff as ‘Pathways’ templates), particularly towards the end of their engagement with Pathways. For example, I asked Mrs. Raven how Pathways could ‘work’ if the tools were first scrutinised and chosen by her, and not individual practitioner-users:
“It’s a workload issue. There is so much staff are doing outside of Pathways. I shared the information, but I’ve decided which tools to use. Not added an extra online element to do. My Deputy, Assistant Head and Inclusion Leader have looked at all the pupil premium data and produced the next evaluation, not me. So now there is a structure in place, they understood what had to happen from the initial model.”
(Mrs. Raven_RS_Jun-2015)
Staff in Magpie and Raven seemed to apply their knowledge of substantive issues such as improving parental engagement in combination with other practitioners’ views and professional experience, moving from more global problems to local, practical solutions.
“Looking at questions in the audit tool, that makes you ask questions of yourself. When I looked at some of the case studies, like the case study about fathers I was interested to pursue. That struck a chord. We have a few fathers who clearly are single parents, and we’re doing as much as we can to engage them. We’re looking at gaps in our practice using Pathways. When we run parents’ sessions, ‘who tends to come?’… ‘Who’s not coming? Why?’ ‘What can we do?’”
(Mrs. Raven_RS_Oct-2015)

4.2. Pathways as a Potential ‘School Improvement Space’

Pathways seemed to provide a ‘space’, both online and offline, for conversations to materialise in schools around concepts such as ‘improvement’, ‘development’ or ‘effectiveness’ that might usually be viewed by practitioners (and researchers) as rather abstract and amorphous. Using different tools, templates and guides, a series of ‘unfinished’ artefacts were created to support each school’s improvement journey, which also contributed towards practitioners’ continuing professional development (CPD). In contrast to Sparrow School, both Raven and Magpie appeared to be more comfortable reflecting on their practice with reference to external research evidence and learning theories. Sparrow School seemed to place more weight on individuals’ knowledge and past experiences than on educational research evidence and theory. About halfway through the research study, Sparrow School’s impressions of Pathways, in terms of the value it could potentially add to teachers’ work, were more ambivalent.
In contrast to the local (education) authority (LA) landscape surrounding the other four schools in this study, Sparrow spoke highly of the quality and range of support available in their county, but also noted that schools were being encouraged to seek help beyond their network and adapt to the newer system of Local Leaders of Education (LLEs) [76] and Self-Evaluation Partners (SEPs) [77]. The headteacher felt “lucky” to have an LA that still had ‘subject leaders’ who could provide training. Sparrow’s headteacher, Mr. Sparrow, expressed difficulty with choosing and starting a Pathway because:
“The problem in my head is that the things we are going to focus on are not going to go anywhere, things like [national assessments] and all that. There’s no right answer. They’re not going to change.”
(Mr. Sparrow_SS_May-2015)
There seemed to be a difficulty moving between more ‘generalised problems’ and more ‘specific’ and ‘localised problems’ in one’s own school. In this case, what did the government’s broader objective of reforming national assessments mean for this one school? Furthermore, what were the specific things that this school had to do to achieve a good ‘inspection’ judgement and continue to meet other changing external requirements? In this case study, the absence of consistency between Mr. Sparrow’s approach and the DASI seemed more evident. So far, ensuring the clarity of improvement objectives and the use of educational research evidence to inform practice [1] had not been made apparent in Sparrow School’s use of Pathways.
The problem of connecting the school’s improvement priorities to Pathways was compounded by (understandable) resistance from a classroom teacher who expressed the view that she could not see how Pathways or even the concept of ‘whole school improvement’ featured in her daily teaching practice:
“From my point of view, and I don’t know if I’m being really negative, it seems like an awful lot for a teacher to go through time-wise, to pick out what you want. [For example] the new ‘Early Years Baseline’ which is changing next year, I go to the meeting that tells me what I need to do for it and how, the assessments I’ve got to use. I don’t know what that [Pathways] will add to what I already know. There’s a lot of thinking and writing down [of] what you’re going to do anyway.”
(CT_SS_May-2015)
The idea of recognising teachers’ roles and stages of development in matching appropriate school improvement and professional development interventions rings true with the DASI and was a factor I passed on to OUP as feedback around the design of Pathways. I explained to the Sparrow staff that a school might determine a priority and then go into Pathways and get ideas, and that it was perhaps more unusual to go about it the other way around. Paradoxically, had the headteacher consulted the Self-Evaluation Pathway (one of the ten launched in 2013), what seemed to be confusion around the diagnosis of the school’s priorities might have been partially abated. The classroom teacher responded:
“I understand how it’s all there to help with the planning… but as a class teacher, I’m not in any teaching responsibility leadership anything, so this seems far-fetched from… [the day to day].”
(CT_SS_May-2015)
This raised the initial problem again, but this time in reverse: moving from ‘special’ or ‘localised’ cases to the more general. This was perhaps made more conspicuous by the headteacher’s (initial) interest in Pathways. Hopkins, Reynolds and Grey [78] (p. 37) certainly acknowledge some of the complexities inherent in strategic and action planning, and point to difficulties some leaders face in prioritising and ‘synchronising’, and how they tend to ‘act intuitively’ rather than make use of available feedback loops. With reference to the DASI, Sparrow School seemed to initially bypass the following steps: (a) establishing the main aims of school improvement and functions; (b) considering the contextual and classroom factors that can influence teaching and learning; and (c) collecting SSE data. Moving straight to step (d) (i.e., formulating improvement strategies), professional judgement (by practitioners and the individuals/groups surrounding them) and other types of evidence appeared to override the need for steps (a), (b) and (c). In contrast, Mrs. Magpie and Mrs. Raven appeared to attempt strategies consistent with steps (a) and (b) with the help of Pathways. From the cross-case analysis, part of the successful implementation of Pathways appears to lie in schools recognising the need for steps (a), (b) and (c). Professional judgement and evidence also form the basis of decision-making capacity in empirical investigation, so it may have been a case of knowing the ‘dosage’ of professional judgement vs. ‘pure’ empirical investigation to apply to a given situation.
Teachers often voice feelings of decreased autonomy in schools and of constantly rushing to meet unclear targets and respond to external changes in policy and Ofsted requirements. Step (c) is a stage at which some autonomy seemed able to be reintroduced into the way schools operate; Magpie and Raven were examples of this. Of course, this depends on how something such as Pathways is implemented across the school. The implementation of Pathways may also reduce the impression of autonomy if the tasks are introduced in a rigid, prescriptive manner. Related to steps (d) (formulating improvement strategies) and (e) (implementation monitoring), Magpie’s headteacher’s approach more explicitly and openly offered opportunities for reflection to all practitioners, parents and pupils, as each group had completed surveys, contributed to discussions at different points and attended meetings and workshops (e.g., staff meetings and parents’ workshops). This more explicit approach with Pathways affords different prospects for collaboration and reflection amongst staff, pupils and parents. There may be potential problems with this approach as well; not all voices may be heard equally and more vocal parents are perhaps easier to remember, although in some instances silent parents may also stand out.

5. Discussion

5.1. Pathways as a Potential ‘School Improvement Space’

Evidence from this study indicates that using Pathways is consistent with applying the DASI in one’s school improvement framework, linked to the idea of using SSE data and educational research evidence to convert the more abstract into something more achievable. Although some examples have been identified and explored in the cross-case analysis, this does not mean that this always happens, but certainly that the possibilities exist. Resources such as Pathways in combination with leadership scaffolds may also offer structured opportunities to engage with a ‘global’ educational problem (e.g., the equity gap in achievement) as well as more local and situated problems (how to narrow the gap between pupil groups in a particular school context), breaking a bigger goal into smaller steps and tasks, providing a space to literally share the load and enabling newer staff to lead. Pathways could potentially be seen as a kind of ‘mindtool’ [79] (pp. 108, 110), i.e., digital applications and “critical thinking devices” that “facilitate meaningful professional thinking and working” and “help users to think for themselves, make connections between concepts and create new knowledge”. Although a platform such as Pathways may provide structures to mediate the autonomy of newer leaders and potentially improve a school’s leadership capacity, it should also be recognised that using such a platform may make practitioners’ work more visible, transparent and possibly open to debate, which can be perceived both positively and negatively. A large part of computer-supported cooperative work (CSCW) research investigates ways to understand and manage the trade-offs in sharing, for instance, recontextualising information to “reuse experience or knowledge” across groups that may lack shared histories and meanings [80] (pp. 68–69). The professional environment observed in Magpie School came across as more ‘amenable’ to such open working practices, compared to Sparrow School, where it took some time for staff to see the potential merits of Pathways in their individual and collective development. One teacher in Sparrow had also expressed concerns over what Pathways would ‘add’ to their current practice, which is particularly pertinent to current pandemic conditions where teachers have reported increased workload and decreased work-life balance [81].
The concept of ‘collective school improvement’ raises a topic discussed more in the learning technologies literature: the idea of ‘affinity spaces’. These are spaces organised around content, content generators and portals through which content can be produced and communicated. Gee [82] (p. 71) differentiates between ‘communities of practice’ [83] and ‘spaces’. CoPs denote participants becoming members of a group, whereas a space is perceived as more fluid and open. In this research study, spaces spanned interactions beyond the online environments where “teachers participate in ways their knowledge and skills allow (e.g., generating and distributing content, posing questions, advising others) and shift their primary aims from teaching to learning and back again as warranted” [84] (p. 139). Pathways and the conversations that happened around its tools and templates seemed to form a space for practitioners to critically reflect on their own and others’ practice. Research evidence was sometimes referenced (which links with the DASI), e.g., Raven’s use of the MAID tool to review strategies (maintain, amend, introduce, discard) and of coaching and modelling approaches. Sparrow seemed more comfortable with the holistic approach to reflection [85], based more on personal experiences, as opposed to the more evidence-based model that can be exemplified by the DASI.

5.2. Pathways’ Potential Relationship with Diagnosis and School Self-Evaluation

In Magpie and Raven, much of the solution mapping [26] (p. 678), problem identification [13] (p. 105) and diagnosis of the knowledge and skills pupils needed to meet specific goals [86] (p. 247) seemed to have happened before the schools began Pathways. This probably shaped the initial decision to engage with Pathways as well as setting the tone for how Pathways was viewed collectively in both Magpie and Raven Schools. A clearer space was formed to strengthen diagnoses and carry out tasks around a shared goal (linking prior engagement with OUP resources to Pathways’ action points in Magpie School and, for example, using the suggested MAID analysis tool in Raven), and, thus, a focused way to plan and engage in improvement practice. In Sparrow, the diagnosis stage seemed to still be underway after Pathways had been started, which may have confused practitioners’ ideas of what Pathways was and what it was for. Sparrow, with its small number of pupils on roll, seemed less comfortable with the idea of temporary destabilisation through experimenting with an unfamiliar resource such as Pathways, and may have also lacked confidence, and perhaps the immediate capacity, for such a focused approach.
Initial impressions of Pathways may be linked to how schools approach self-evaluation. Different portions of Pathways are relevant to each of MacGilchrist’s [87] three dimensions of SSE: the micro, meso and macro, sometimes in combination. However, if a practitioner is working in a dimension that does not match the dimension at which Pathways is perceived to operate, a misfit may occur. For example, the classroom teacher (CT) in Sparrow School who voiced nervousness at the prospect of the school adopting Pathways explained that she was picturing more ‘micro’, classroom-level responsibilities, whereas Pathways looked to her to be very much ‘macro’ and at the whole school level. In addition, it may not have been clear to this member of staff whether the process of self-evaluation advocated by the Pathways approach to audit was taking place for development or accountability purposes [87,88,89]. In practice, of course, it is quite likely that both purposes were influential, at least in some schools. For example, Raven had received a poor Ofsted judgment, which may have been a strong catalyst for school improvement activity, and Pathways was seen as a useful way to focus the school’s efforts, at least initially. In analysing the attributes of successful schools that they had worked with and in, MacGilchrist, Myers and Reed [90] (p. 104) characterised schools as learning organisations and defined their intelligence as:
“[A] range of collective capacities schools have that enable them to achieve their goals successfully. It involves the use of wisdom, insight, intuition and experience as well as knowledge, skills and understanding.”
This highlights the potential importance of recognising and documenting the conversations that take place around school improvement tasks in this DBR-embedded case study, and how these conversations might be mediated and further developed through specific artefacts and the way a task is situated over time. Parallels with project-based learning emerged in Pathways’ design of tasks and issues. For example, there seemed to be elements of ‘formative feedback’ provided by Magpie’s headteacher to her early career teachers managing Pathways; they ran ideas and plans past Mrs. Magpie and she helped to refine them and put matters in context, both for individual teachers and for the whole school. Feedback without appropriate guidance may mean that “participants lack the opportunity to frame their learning and development” [91] (p. 216). Hattie and Timperley [92] (p. 87) present four levels of feedback: (1) whether a task is understood/performed successfully; (2) whether the process needed to carry out a task has been understood/executed successfully; (3) self-regulation, monitoring and directing of actions; and (4) learner self-evaluations, attitudes and motivations. In online contexts that could facilitate discussion and supportive ways to suggest improvements to practitioners’ professional learning experiences, Tinoca and Oliveira [91] (p. 216) also point to the potential “cognitive gains fueled by feedback [that] may enable reflection and an expansion of the knowledge to other situations”, as well as a wider awareness around metacognition and self-assessment [93]. In the context of COVID-19, when in-person professional learning opportunities may be few and far between, Brown, Correll and Stormer [94] suggest that feedback may be supplemented by virtual interactions supporting personal coaching, self-reflection and classroom observations.

5.3. Pathways’ Potential Relationship with Schools’ Efforts towards Developing the Capacity for Organisational Learning and Accountability

The types of artefacts generated in Magpie, Raven and Sparrow Schools across their varied school improvement journeys can be seen as contextualised outputs that contribute towards notions of ‘collective school improvement’ and, by extension, the capacity for organisational learning. Bereiter [95] (p. 7) argues that “the phrase ‘theory into practice’ implies a unidirectional and sequential process, with basic research and theory coming first.” Although, in more informal settings, the more tacit understandings of how decisions might have been made may remain hidden, documenting practitioners’ different types of engagement in a task via surveys, presentations of research evidence and collectively generated audit data may illuminate some of the school improvement processes that influence a school’s effectiveness. This brings to mind the idea of “situated generalisation”, which in the early 2000s signalled a move from evidence-based practice to evidence-informed practice and towards practice-based evidence [96] (p. 351).
Bereiter [95] (p. 5) proposes that more than a ‘bridge’ be built between research and practice: rather, a ‘ladder’ in the form of “principled practical knowledge” (PPK), defined as know-how combined with “know-why”. Engagement with different issues via Pathways may provide a means to document the PPK observed, communicated or apparent in a specific improvement task. Pathways’ tools seem to have provided users with new opportunities to implicitly engage with knowledge ‘conversions’ and ‘transformations’ that support improvement efforts:
“Principled practical knowledge is both procedural and declarative. It is knowledge of how to achieve practical objectives but it is also knowledge that can be communicated symbolically, argued about, combined with other propositions to form larger structures, and so on.”
[95] (p. 5)
Moreover, OUP, the publisher of Pathways, sought from its inception to use research and professional evidence to underpin its approaches, hence the role of its steering group and commissioning processes. Materials in Pathways come from a variety of sources, including academia, teachers and leaders, former Ofsted HMI, educational consultants and authors in educational publishing. At times, there seemed to be tensions between how much information should be displayed for the purposes of understanding and how much for completeness, which is perhaps a research topic in itself. Brown and Rogers [97] (p. 81) sum up at least three key questions around this problem: (1) what represents a ‘successful combination’ of formalised and practitioner-held knowledge (and to whom); (2) when the right calibration of ‘evidence-informed practice’ has been reached; and (3) how to devise ways to measure the first two. All three questions apply in this study and would benefit from further, larger scale research.
Returning to the idea of schools potentially building capacity for OL, the use of Pathways may support practitioners to question the assumptions underpinning certain types of knowledge and evidence, which influence practitioners’ responses to external accountability measures and to leaders’ and schools’ own priorities and visions for school improvement (which may overlap but also be distinct). Certainly, Sydow, Lindkvist and Defillippi [98] warn of the tensions between participants’ autonomy and the demands of within-organisation coordination and integration. The argument here is not that teachers’ interactions with Pathways necessarily permeated all echelons of schooling but that Pathways’ resources, combined with leadership support and trust, may have helped to create favourable conditions within which organisational learning may be encouraged and nurtured. An example of this is providing a space for the preservation of ‘organisational memory’ [99] through the accumulation of information in Pathways’ templates and document exemplars that may previously have been handled more informally. Links can be drawn with Magpie School’s engagement with Pathways in questioning what the problems with Pupil Premium (PP) children’s attainment might have been, and in opening up the scope of the enquiry to other pupils to elucidate possible underlying mechanisms in teachers’ expectations and in engagement with parents more widely. Mrs. Raven expressed how she started questioning and considering how to further support the role of fathers in pupils’ learning experiences. Huber [99] (p. 93) argues that “experimenting organisations” may be amenable to adopting more unfamiliar points of view and, thus, experience increased adaptability. In addition, Raven School’s leadership structures and scaffolding indicated efforts to protect classroom teachers’ and senior leaders’ workload. This ties in with Altrichter and Kemethofer’s [100] (pp. 32, 34) argument that “accountability pressure” from inspections may fine-tune principals’ attentiveness towards stakeholder reactions and quality expectations. This point may be relevant to the current pandemic era, in which teachers’ well-being in England reportedly decreased while workloads increased after the first round of school closures [81,101].
In this paper, the case studies reveal that Pathways’ resources could be used as tools for stimulating more structured critical reflection and as part of teachers’ ongoing learning. However, as was noted in Sparrow School, reflection without a foundation in what being an effective teacher might mean is unlikely to promote improvement by itself. Antoniou and Kyriakides [102] (p. 308) argue that reflection is more effective if it also takes into account individual teachers’ own improvement needs and priorities and is linked to using research evidence accumulated from effectiveness and improvement studies in CPD:
“[T]o establish an effective approach to teacher professional development, the knowledge base of EER should be taken into account. Every effort to train teachers inevitably should refer to what an effective teacher is or how an effective teacher should behave in the classroom in order to maximize the learning potential of the students. That is exactly the reason why teacher professional development programs should be linked with the results of EER.”
[102] (p. 308)
Pathways may be seen as a way for school teams to gain feedback on leadership practices [103] and to ‘translate’ knowledge between theory and practice, by making more explicit links between the two domains and applying the ‘new’ knowledge to a practical problem. Might the space between theory and practice be regarded as a ‘third space’, i.e., one that situates potential professional learning communities (PLCs) in a specific context and in multiple environments, on- and offline [104] (p. 204)? Crossan, Lane and White [105] (p. 522), for example, argue that “for renewal to be strategic it should encompass the entire enterprise”, which is enhanced by looking beyond internal perspectives and seeking out the external. This may take the form of research evidence within Pathways (very similar to the ‘Advisory and Research’ role in the DASI) or information gleaned from Pathways’ surveys and self-evaluation templates. Lave and Wenger [83] discuss the idea of situated learning within the concept of communities of practice, where learning is experienced as context specific. Recent work on assessment for learning supports this idea of shared context. Stobart and Hopfenbeck [106] (p. 39) suggest that the emphasis in the development of ‘learner identities’ is:
“[O]n the processes of negotiating roles and active social interaction, which in turn modify the teacher-learner contract, with assessment being done with the learner rather than to the learner. Motivation is not seen as separate from learning but as part of developing a learner’s identity…”
Besides taking into account individual teachers’ improvement priorities, EER, particularly the dynamic model and the DASI [1], could be used to develop a more integrated approach to teacher professional development:
“There needs to be both knowledge and bodies of intellectual and performance skills that form the basis for critical analysis. Without these, and the ability to translate the critical analysis into action to improve performance, there is little overt social benefit to be gained from engagement in critical analysis and reflection approaches more generally [107].”
[102] (p. 294)
Experiences which help teachers progress in their professional learning and practices include experimentation, collaboration with external parties, taking appropriate risks and “analysing and critically reflecting on the evidence and on practice” [108] (p. 12). The idea of experimentation links well with the potential affordances of Pathways as a space for professional learning. This can be seen in Mrs. Raven’s approach of mediating teachers’ autonomy to improve Pupil Premium children’s attainment using highly structured tools such as the MAID. Returning to my original argument about Pathways’ artefacts potentially supporting practitioners’ efforts towards ‘collective school improvement’, Bell et al. [109] (p. 40) advance the idea that when research engagement involves multiple practitioners, “it becomes more important to identify key principles and the underlying theory in order to adapt research findings from elsewhere or from micro-inquiry safely”.

6. Concluding Remarks and Study Limitations

My research investigated, described and analysed how Pathways’ approaches to school improvement were perceived, interpreted and used in five different primary schools across England, and how educational theory and practice were assembled and communicated by OUP and within the research project itself. The findings outlined here provide examples of practitioners’ use of digital technology to facilitate whole school improvement tasks and their own professional learning. Schools’ efforts to build collective capacity for improvement and organisational learning were also discussed. Hopkins, Ainscow and West [14] suggest that greater reconciliation between approaches to school improvement and school effectiveness research may be reached through concerted efforts to further develop theoretical frameworks. Although this study of Pathways does not profess to create a new grand theory of school improvement, one of the aims of the study’s DBR approach, in using an empirically validated theory such as the DASI, is to formulate proto-theories and ‘potentialities’ that may help to enhance understanding of the relationship between school improvement processes and effective outcomes. Bringing together principled practical knowledge (PPK), school improvement and accountability, and considering how they might work together to engage practitioners in reflective, critical and evidence-informed enquiry, is one means of achieving this aim.
The study is limited, firstly, by its timing, as COVID-19 has undoubtedly altered schooling conditions and contexts for teachers, students, and society in general. Secondly, even though a rich set of data was collected, the experiences of only five schools were explored. The fields of school improvement and digital technology would benefit from a larger project investigating the journeys of more schools in varied contexts, taking into account the changing concerns and demands of the current COVID-19 era.

Funding

My PhD was funded by the Economic and Social Research Council (ESRC).

Institutional Review Board Statement

This research followed BERA’s ethical guidelines and was approved by the University of Oxford’s Central University Research Ethics Committee (CUREC).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Research data are not shared.

Acknowledgments

This project would not have been possible without the support of my doctoral advisor Pamela Sammons, the OUP and all participant schools, teachers and senior leaders.

Conflicts of Interest

The author declares no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

References

  1. Creemers, B.P.M.; Kyriakides, L. Improving Quality in Education: Dynamic Approaches to School Improvement; Routledge: Oxford, UK, 2012. [Google Scholar]
  2. School Improvement Pathways (Oxford University Press). Available online: https://global.oup.com/education/content/primary/series/oxford-owl/school-improvement-pathways/?region=uk#:~:text=Developed%20by%20educational%20experts%20and,adaptable%20templates%20and%20practical%20guidance (accessed on 12 November 2021).
  3. Levin, B.B.; Schrum, L. Lessons Learned from Secondary Schools Using Technology for School Improvement: It’s Just Not That Simple! J. Sch. Leadersh. 2014, 24, 640–665. [Google Scholar] [CrossRef]
  4. Bakkenes, I.; Vermunt, J.D.; Wubbels, T. Teacher Learning in the Context of Educational Innovation: Learning Activities and Learning Outcomes of Experienced Teachers. Learn. Instr. 2010, 20, 533–548. [Google Scholar] [CrossRef]
  5. Messmann, G.; Mulder, R.H. Innovative Work Behaviour in Vocational Colleges: Understanding How and Why Innovations Are Developed. Vocat. Learn. 2011, 4, 63–84. [Google Scholar] [CrossRef]
  6. Ilomäki, L.; Lakkala, M.; Toom, A.; Muukkonen, H. Teacher Learning within a Multinational Project in an Upper Secondary School. Educ. Res. Int. 2017, 2017, 1614262. [Google Scholar] [CrossRef] [Green Version]
  7. Weiner, J.; Francois, C.; Stone-Johnson, C.; Childs, J. Keep Safe, Keep Learning: Principals’ Role in Creating Psychological Safety and Organizational Learning During the COVID-19 Pandemic. Front. Educ. 2021, 5, 1–17. [Google Scholar] [CrossRef]
  8. Williamson, B.; Eynon, R.; Potter, J. Pandemic politics, pedagogies and practices: Digital technologies and distance education during the coronavirus emergency. Learn. Media Technol. 2020, 45, 107–114. [Google Scholar] [CrossRef]
  9. Bubb, S.; Jones, M.A. Learning from the COVID-19 home-schooling experience: Listening to pupils, parents/carers and teachers. Improv. Sch. 2020, 23, 209–222. [Google Scholar] [CrossRef]
  10. Oxford Owl (Oxford University Press). Available online: https://www.oxfordowl.co.uk/ (accessed on 12 November 2021).
  11. Collins, A.; Joseph, D.; Bielaczyc, K. Design Research: Theoretical and Methodological Issues. J. Learn. Sci. 2004, 13, 15–42. [Google Scholar] [CrossRef]
  12. Chapman, C.; Reynolds, D.; Muijs, D.; Sammons, P.; Stringfield, S.; Teddlie, C. (Eds.) Educational effectiveness and improvement research and practice: The emergence of the discipline. In The Routledge International Handbook of Educational Effectiveness and Improvement: Research, Policy, and Practice; Routledge: Oxfordshire, UK, 2016; pp. 1–24. [Google Scholar]
  13. Harris, A. School Improvement: What’s in it for Schools? RoutledgeFalmer: New York, NY, USA, 2002. [Google Scholar]
  14. Hopkins, D.; Ainscow, M.; West, M. School Improvement in an Era of Change; Cassell: London, UK, 1994. [Google Scholar]
  15. Mortimore, P.; MacBeath, J. School Effectiveness and Improvement: The story so far. In Strategic Leadership and School Improvement; Preedy, M., Glatter, R., Eds.; Sage: Thousand Oaks, CA, USA, 2003; pp. 233–251. [Google Scholar]
  16. Hargreaves, D.H. Creative Professionalism: The Role of Teachers in the Knowledge Society; Demos: London, UK, 1998. [Google Scholar]
  17. Creemers, B.P.M.; Kyriakides, L. The Dynamics of Educational Effectiveness: A Contribution to Policy, Practice and Theory in Contemporary Schools; Routledge: London, UK, 2008. [Google Scholar]
  18. Creemers, B.P.M.; Kyriakides, L. Developing, testing, and using theoretical models for promoting quality in education. Sch. Eff. Sch. Improv. 2015, 26, 102–119. [Google Scholar] [CrossRef]
  19. Creemers, B.P.M.; Kyriakides, L. Explaining stability and changes in school effectiveness by looking at changes in the functioning of school factors. Sch. Eff. Sch. Improv. 2010, 21, 409–427. [Google Scholar] [CrossRef]
  20. Creemers, B.P.M.; Kyriakides, L. Situational effects of the school factors included in the dynamic model of educational effectiveness. S. Afr. J. Educ. 2009, 29, 293–315. [Google Scholar] [CrossRef]
  21. Creemers, B.P.M.; Kyriakides, L. Using the dynamic model to develop an evidence based and theory-driven approach to school improvement. Ir. Educ. Stud. 2010, 29, 5–23. [Google Scholar] [CrossRef]
  22. Silins, H.; Mulford, B. Schools as learning organisations: The case for system, teacher and student learning. J. Educ. Adm. 2002, 40, 425–446. [Google Scholar] [CrossRef]
  23. Louis, K.S.; Toole, J.; Hargreaves, A. Rethinking school improvement. In Handbook of Research on Educational Administration, 2nd ed.; Murphy, J., Louis, K.S., Eds.; Jossey-Bass: San Francisco, CA, USA, 1999; pp. 251–276. [Google Scholar]
  24. Argyris, C.; Schön, D. Organizational Learning: A Theory of Action Perspective; Addison Wesley: Reading, MA, USA, 1978. [Google Scholar]
  25. Spector, J.M.; Edmonds, G.S. Knowledge Management in Instructional Design. ERIC Digest EDO-IR-2002-02; ERIC Information Technology Clearinghouse: Syracuse, NY, USA, 2002. [Google Scholar]
  26. Bain, A.; Swan, G. Technology enhanced feedback tools as a knowledge management mechanism for supporting professional growth and school reform. Educ. Technol. Res. Dev. 2011, 59, 673–685. [Google Scholar] [CrossRef]
  27. Huber, S.G.; Helm, C. COVID-19 and schooling: Evaluation, assessment and accountability in times of crises—Reacting quickly to explore key issues for policy, practice and research with the school barometer. Educ. Assess. Eval. Account. 2020, 32, 237–270. [Google Scholar] [CrossRef]
  28. Chapman, C. School improvement research and practice: A case of back to the future? In School Effectiveness and Improvement Research, Policy and Practice: Challenging the Orthodoxy? Chapman, C., Armstrong, P., Harris, A., Muijs, D., Reynolds, D., Sammons, P., Eds.; Routledge: Oxfordshire, UK, 2012; pp. 27–43. [Google Scholar]
  29. MacBeath, J. School Inspection and Self-Evaluation: Working with the New Relationship; Routledge: London, UK, 2006. [Google Scholar]
  30. MacGilchrist, B.; Myers, K.; Reed, J. The Intelligent School, 2nd ed.; Sage: London, UK, 2004. [Google Scholar]
  31. Antoniou, P.; Myburgh-Louw, J.; Gronn, P. School Self-Evaluation for School Improvement: Examining the Measuring Properties of the LEAD Surveys. Aust. J. Educ. 2016, 60, 191–210. [Google Scholar] [CrossRef] [Green Version]
  32. Reed, J.; Street, H. School Self Evaluation: A Process to Support Pupil and Teacher Learning. Research Matters, 18; National School Improvement Network, Institute of Education: London, UK, 2002. [Google Scholar]
  33. Harris, A.; Jamieson, I.; Russ, J. School Effectiveness and School Improvement: A Practical Guide; Pitman: London, UK, 1996. [Google Scholar]
  34. Matthews, P.; Sammons, P. Improvement through Inspection: An Evaluation of the Impact of Ofsted’s Work; Ofsted; Institute of Education: London, UK, 2004.
  35. Matthews, P.; Sammons, P. Survival of the weakest: The differential improvement of schools causing concern in England. Lond. Rev. Educ. 2005, 3, 159–176. [Google Scholar] [CrossRef]
  36. Reeves, T. Design research from a technology perspective. In Educational Design Research; Van Den Akker, J., Gravemeijer, K., McKenney, S., Nieveen, N., Eds.; Routledge: Oxfordshire, UK, 2006; pp. 52–66. [Google Scholar]
  37. Donaldson, L. The Contingency Theory of Organizations: Foundations for Organisational Science; Sage: Thousand Oaks, CA, USA, 2001. [Google Scholar]
  38. Tushman, M.L.; O’Reilly, C.A., III. Building ambidextrous organizations: Forming your own “skunk works”. Health Forum J. 1999, 42, 20–24. [Google Scholar]
  39. Baldwin, C.Y.; Clark, K.B. Design Rules. Volume 1. The Power of Modularity; MIT Press: Cambridge/London, UK, 2000. [Google Scholar]
  40. Nissen, M. Organization Design for Dynamic Fit: A Review and Projection. J. Organ. Des. 2014, 3, 30–42. [Google Scholar] [CrossRef] [Green Version]
  41. Leask, M.; Younie, S. National models for continuing professional development: The challenges of twenty-first-century knowledge management. Prof. Dev. Educ. 2013, 39, 273–287. [Google Scholar] [CrossRef]
  42. Leask, M.; Preston, C. ICT Tools for Future Teachers. Available online: http://www.beds.ac.uk/__data/assets/pdf_file/0010/19459/ict-tools2009.pdf (accessed on 18 June 2012).
  43. EEF. Teaching and Learning Toolkit. Available online: https://educationendowmentfoundation.org.uk/evidence-summaries/teaching-learning-toolkit (accessed on 3 July 2018).
  44. EPPI Centre. Available online: https://eppi.ioe.ac.uk/cms/ (accessed on 3 July 2018).
  45. Institute for Effective Education, IEE. Available online: https://www.york.ac.uk/iee/ (accessed on 3 July 2018).
  46. Evidence Based Teaching Network, EBTN. Available online: https://ebtn.org.uk/ (accessed on 12 November 2021).
  47. Hord, S.M. Professional Learning Communities: Communities of Continuous Inquiry and Improvement; Southwest Educational Development Laboratory: Austin, TX, USA, 1997. [Google Scholar]
  48. DuFour, R.; Eaker, R. Professional Learning Communities at Work: Best Practices for Enhancing Student Achievement; Solution Tree: Bloomington, IL, USA, 1998. [Google Scholar]
  49. Martindale, T.; Dowdy, M. Personal learning environments. In Emerging Technologies in Distance Education; Veletsianos, G., Ed.; Athabasca University Press: Edmonton, AB, USA, 2010; pp. 177–193. [Google Scholar]
  50. Wenger, E. Communities of Practice: Learning, Meaning and Identity; Cambridge University Press: New York, NY, USA, 1998. [Google Scholar]
  51. Trikoilis, D.; Papanastasiou, E. The potential of research for professional development in isolated settings during the COVID-19 crisis and beyond. J. Technol. Teach. Educ. 2020, 28, 295–300. [Google Scholar]
  52. Mintzberg, H. The Structuring of Organizations; Prentice Hall: Englewood Cliffs, NJ, USA, 1979. [Google Scholar]
  53. Scheerens, J. The use of theory in school effectiveness research revisited. Sch. Eff. Sch. Improv. 2013, 24, 1–38. [Google Scholar] [CrossRef]
  54. Day, C.; Sammons, P.; Hopkins, D.; Harris, A.; Leithwood, K.; Gu, Q.; Brown, E.; Ahtaridou, E.; Kington, A. The Impact of School Leadership on Pupil Outcomes. Final Report. DCSF-RR108; Department for Children Schools and Families, DCSF: London, UK, 2009.
  55. Sammons, P.; Davis, S.; Day, C.; Gu, Q. Using mixed methods to investigate school improvement and the role of leadership: An example of a longitudinal study in England. J. Educ. Adm. 2014, 52, 565–589. [Google Scholar] [CrossRef]
  56. Scheerens, J. Theories on educational effectiveness and ineffectiveness. Sch. Eff. Sch. Improv. 2015, 26, 10–31. [Google Scholar] [CrossRef] [Green Version]
  57. Luyten, H.; Visscher, A.; Witziers, B. School Effectiveness Research: From a review of the criticism to recommendations for further development. Sch. Eff. Sch. Improv. 2005, 16, 249–279. [Google Scholar] [CrossRef]
  58. Edmondson, A.C. Psychological safety and learning behavior in work teams. Adm. Sci. Q. 1999, 44, 350–383. [Google Scholar] [CrossRef] [Green Version]
  59. Hallinger, P.; Heck, R.H. Collaborative leadership and school improvement: Understanding the impact on school capacity and student learning. Sch. Leadersh. Manag. 2010, 30, 95–110. [Google Scholar] [CrossRef]
  60. Hargreaves, D. A Self-Improving School System: Towards Maturity. Available online: http://dera.ioe.ac.uk/15804/1/a-self-improving-school-system-towards-maturity.pdf (accessed on 3 July 2018).
  61. Pajares, M.F. Teachers’ Beliefs and Educational Research: Cleaning Up a Messy Construct. Rev. Educ. Res. 1992, 62, 307–332. [Google Scholar] [CrossRef]
  62. Nespor, J. The role of beliefs in the practice of teaching. J. Curric. Stud. 1987, 19, 317–328. [Google Scholar] [CrossRef]
  63. Botha, R.J. Excellence in leadership: Demands on the professional school principal. S. Afr. J. Educ. 2004, 24, 239–243. [Google Scholar]
  64. Sammons, P.; Power, S.; Elliot, K.; Robertson, P.; Campbell, C.; Whitty, G. New Community Schools in Scotland Final Report of the National Evaluation of the Pilot Phase. Insight 7; Scottish Executive Education Department: Edinburgh, UK, 2003.
  65. Caldwell, B.J. Leadership and Innovation in the Transformation of Schools; NCSL—National College for School Leadership: Nottingham, UK, 2001.
  66. Vangrieken, K.; Dochy, F.; Raes, E.; Kyndt, E. Teacher collaboration: A systematic review. Educ. Res. Rev. 2015, 15, 17–40. [Google Scholar] [CrossRef]
  67. Brouwer, P.; Brekelmans, M.; Nieuwenhuis, L.; Simons, R.J. Community development in the school workplace. Int. J. Educ. Manag. 2011, 26, 403–418. [Google Scholar] [CrossRef]
  68. Darling-Hammond, L.; Hyler, M.E. Preparing educators for the time of COVID … and beyond. Eur. J. Teach. Educ. 2020, 43, 457–465. [Google Scholar] [CrossRef]
  69. Sleegers, P.; Leithwood, K. School development for teacher learning and change. In International Encyclopaedia of Education; Peterson, P., Baker, E., McGaw, B., Eds.; Elsevier: Oxford, UK, 2010; Volume 7, pp. 557–562. [Google Scholar]
  70. Thoonen, E.E.J.; Sleegers, P.J.C.; Oort, F.J.; Peetsma, T.T.D. Building school-wide capacity for improvement: The role of leadership, school organizational conditions, and teacher factors. Sch. Eff. Sch. Improv. 2012, 23, 441–460. [Google Scholar] [CrossRef]
  71. Chin, R.; Benne, K. General strategies for effecting changes in human systems. In The Planning of Change, 2nd ed.; Bennis, W., Benne, K., Chin, R., Eds.; Holt, Rinehart & Winston: New York, NY, USA, 1969; pp. 32–59. [Google Scholar]
  72. Miles, M.B.; Huberman, A.M.; Saldaña, J. Qualitative Data Analysis: A Methods Sourcebook, 3rd ed.; Sage: Thousand Oaks, CA, USA, 2014. [Google Scholar]
  73. Statistics at Department for Education (DfE), UK. 2016. Available online: https://www.gov.uk/government/organisations/department-for-education/about/statistics (accessed on 31 December 2016).
  74. Department for Education (DfE), UK. Teachers’ Pay: Changes Since September 2013. Available online: https://www.gov.uk/government/news/teachers-pay-changes-since-september-2013 (accessed on 31 December 2016).
  75. The Key, Welcome to the Key for School Leaders. Available online: https://schoolleaders.thekeysupport.com/ (accessed on 3 July 2018).
  76. National College for Teaching and Leadership—NCTL. 2017. Available online: https://www.gov.uk/government/news/local-leaders-of-education-lle (accessed on 12 November 2021).
  77. Dorset Council. Available online: https://www.dorsetnexus.org.uk/Page/9602 (accessed on 12 November 2021).
  78. Hopkins, D.; Reynolds, D.; Gray, J. Moving on and Moving up: Confronting the Complexities of School Improvement in the Improving Schools Project. Educ. Res. Eval. 1999, 5, 22–40. [Google Scholar] [CrossRef]
  79. Kirschner, P.A.; Wopereis, I.G.J.H. Mindtools for Teacher Communities: A European Perspective. Technol. Pedagog. Educ. 2003, 12, 105–124. [Google Scholar] [CrossRef] [Green Version]
  80. Ackerman, M.S. The intellectual challenge of CSCW: The gap between social requirements and technical feasibility. In Online Communication and Collaboration; Donelan, H., Kear, K., Ramage, M., Eds.; Routledge: Oxford, UK, 2010; pp. 66–72. [Google Scholar]
  81. Müller, L.-M.; Goldenberg, G. Education in Times of Crisis: The Potential Implications of School Closures for Teachers and Students; Chartered College of Teaching: London, UK, 2020; Available online: https://my.chartered.college/wp-content/uploads/2020/05/CCTReport070520_FINAL.pdf (accessed on 23 December 2021).
  82. Gee, J.P. Situated Language and Learning: A Critique of Traditional Schooling; Routledge: New York, NY, USA, 2004. [Google Scholar]
  83. Lave, J.; Wenger, E. Situated Learning: Legitimate Peripheral Participation; Cambridge University Press: Cambridge, UK, 1991. [Google Scholar]
  84. Rodesiler, L. For Teachers, by Teachers: An Exploration of Teacher-Generated Online Professional Development. J. Digit. Learn. Teach. Educ. 2017, 33, 138–147. [Google Scholar] [CrossRef]
  85. Golby, M.; Viant, R. Means and ends in professional development. Teach. Dev. 2007, 11, 237–243. [Google Scholar] [CrossRef]
  86. Muijs, D.; Kyriakides, L.; van der Werf, G.; Creemers, B.; Timperley, H.; Earl, L. State of the art—Teacher effectiveness and professional learning. Sch. Eff. Sch. Improv. 2014, 25, 231–256. [Google Scholar] [CrossRef] [Green Version]
  87. MacGilchrist, B. Improving self-improvement? Res. Pap. Educ. 2000, 15, 325–338. [Google Scholar] [CrossRef]
  88. MacBeath, J. Background, Principles and Key Learning in Self-Evaluation: A Guide for School Leaders; National College for School Leadership: Nottingham, UK, 2005.
  89. Rogers, G.; Badham, L. Evaluation in Schools; Routledge: London, UK, 1992. [Google Scholar]
  90. MacGilchrist, B.; Myers, K.; Reed, J. The Intelligent School; Sage: London, UK, 1997. [Google Scholar]
  91. Tinoca, L.; Oliveira, I. Formative assessment of teachers in the context of an online learning environment. Teach. Teach. 2013, 19, 221–234. [Google Scholar] [CrossRef]
  92. Hattie, J.; Timperley, H. The power of feedback. Rev. Educ. Res. 2007, 77, 81–112. [Google Scholar] [CrossRef]
  93. Herrington, J.; Herrington, A. Authentic assessment and multimedia: How university students respond to a model of authentic assessment. High. Educ. Res. Dev. 1998, 17, 305–322. [Google Scholar] [CrossRef]
  94. Brown, C.; Correll, P.; Stormer, K.J. The “new” normal: Re-imagining professional development amidst the COVID-19 pandemic. Middle Sch. J. 2021, 52, 5–13. [Google Scholar] [CrossRef]
  95. Bereiter, C. Principled Practical Knowledge: Not a Bridge but a Ladder. J. Learn. Sci. 2014, 23, 4–17. [Google Scholar] [CrossRef]
  96. Simons, H.; Kushner, S.; Jones, K.; James, D. From evidence-based practice to practice-based evidence: The idea of situated generalisation. Res. Pap. Educ. 2003, 18, 347–364. [Google Scholar] [CrossRef]
  97. Brown, C.; Rogers, S. Knowledge Creation as an Approach to Facilitating Evidence Informed Practice: Examining Ways to Measure the Success of Using This Method with Early Years Practitioners in Camden (London). J. Educ. Chang. 2015, 16, 79–99. [Google Scholar] [CrossRef] [Green Version]
  98. Sydow, J.; Lindkvist, L.; Defillippi, R. Project-Based Organizations, Embeddedness and Repositories of Knowledge: Editorial. Organ. Stud. 2004, 25, 1475–1489. [Google Scholar] [CrossRef] [Green Version]
  99. Huber, G.P. Organizational Learning: The Contributing Processes and the Literatures. Organ. Sci. 1991, 2, 88–115. [Google Scholar] [CrossRef]
  100. Altrichter, H.; Kemethofer, D. Does accountability pressure through school inspections promote school improvement? Sch. Eff. Sch. Improv. 2015, 26, 32–56. [Google Scholar] [CrossRef]
  101. Ofsted. What’s Working Well in Remote Education. 2021. Available online: https://www.gov.uk/government/publications/whats-working-well-in-remote-education/whats-working-well-in-remote-education (accessed on 23 December 2021).
  102. Antoniou, P.; Kyriakides, L. The impact of a dynamic approach to professional development on teacher instruction and student learning: Results from an experimental study. Sch. Eff. Sch. Improv. 2011, 22, 291–311. [Google Scholar] [CrossRef]
  103. Kelley, C.; Dikkers, S. Framing Feedback for School Improvement around Distributed Leadership. Educ. Adm. Q. 2016, 52, 392–422. [Google Scholar] [CrossRef]
  104. Riveros, A.; Newton, P.; Burgess, D. A situated account of teacher agency and learning: Critical reflections on professional learning communities. Can. J. Educ. 2012, 35, 202–216. [Google Scholar]
  105. Crossan, M.M.; Lane, H.W.; White, R.E. An Organizational Learning Framework: From Intuition to Institution. Acad. Manag. Rev. 1999, 24, 522–537. [Google Scholar] [CrossRef]
  106. Stobart, G.; Hopfenbeck, T.N. Assessment for Learning and formative assessment. In State of the Field Review of Assessment and Learning. Norwegian Knowledge Centre for Education Study 13/4697; Baird, J.-A., Hopfenbeck, T., Newton, P., Stobart, G., Steen-Utheim, A.T., Eds.; Oxford University Centre for Educational Assessment: Oxford, UK, 2014. [Google Scholar]
  107. Cornford, I.R. Reflective teaching: Empirical research findings and some implications for teacher education. J. Vocat. Educ. Train. 2002, 54, 219–236. [Google Scholar] [CrossRef]
  108. CUREE—Centre for the Use of Research Evidence in Education. Understanding What Enables High Quality Professional Learning: A Report on the Research Evidence; Pearson School Improvement: London, UK, 2012. [Google Scholar]
  109. Bell, M.; Cordingley, P.; Isham, C.; Davis, R. Report of Professional Practitioner Use of Research Review: Practitioner Engagement in and/or with Research; CUREE; GTCE; LSIS; NTRP: Coventry, UK, 2010. [Google Scholar]
Table 1. Pupil populations in Pathways’ case study schools [73].

| Measure (2015/16) | Magpie | Raven | Sparrow | Jackdaw | Jay | National |
|---|---|---|---|---|---|---|
| Age range | 3–11 | 7–11 | 4–11 | 5–7 | 5–11 | |
| Total number of pupils on roll (all ages) | 479 | 313 | 120 | 383 | 212 | 4,679,382 |
| Girls on roll | 42.8% | 53.0% | 42.5% | 46.2% | 54.7% | 48.8% |
| Boys on roll | 57.2% | 47.0% | 57.5% | 53.8% | 45.3% | 51.2% |
| Pupils with a statement of special educational needs (SEN) or education, health and care (EHC) plan 1 | 4.4% | 0.3% | 0.0% | 2.1% | 1.4% | 2.6% |
| Pupils whose first language is not English (EAL) | 33.9% | 3.8% | 3.1% | 6.5% | 19.3% | 20.0% |
| Pupils eligible for free school meals at any time during the past 6 years (Pupil Premium) | 17.9% | 32.6% | 11.7% | 13.1% | 30.2% | 25.4% |
| Absence | 3.9% | 4.0% | 4.7% | 3.3% | 3.3% | 4.0% |
| Ofsted last inspection date | 2 October 2013 | 3 July 2015 | 9 February 2011 | 27 April 2010 | 29 January 2016 | - |
| Ofsted last inspection grade | Good | Good | Good | Outstanding | Requires improvement | - |

1 Does not include pupils ‘with SEN’ or ‘with SEN but without statements or EHC plans’.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
