Article

Citizenship Education for Political Engagement: A Systematic Review of Controlled Trials

Department of Social Science, University of Roehampton, London SW15 5SL, UK
* Author to whom correspondence should be addressed.
Soc. Sci. 2021, 10(5), 151; https://doi.org/10.3390/socsci10050151
Submission received: 26 February 2021 / Revised: 29 March 2021 / Accepted: 12 April 2021 / Published: 25 April 2021

Abstract

Citizenship Education could play a pivotal role in creating a fairer society in which all groups participate equally in the political process. However, strong causal evidence of which educational techniques work best to create political engagement is lacking. This paper presents the results of a systematic review of controlled trials within the field, based on transparent search protocols. It finds 25 studies which use controlled trials to test causal claims linking Citizenship Education programs to political engagement outcomes. The studies identified largely confirm accepted ideas, such as the importance of participatory methods, whole school approaches, and teacher training, and doubts over whether knowledge alone or online engagement necessarily translates into behavioral change. However, the paucity of identified studies also points both to the difficulties of attracting funding for controlled trials which investigate Citizenship Education as a tool for political engagement and to real epistemological tensions within the discipline itself.

1. Introduction

Despite the critical democratic role Citizenship Education could and should play in encouraging and enabling political engagement, there remains a dearth of robust evidence as to “what works” (Geboers et al. 2013). Whilst academic interest in approaching the issue through robust methodologies is growing, as this Special Issue is testament to, the field lacks a sense of how many of the multitude of available evaluations can truly be considered reliable members of the evidence base. This paper is therefore the beginning of an attempt to consolidate controlled trial evidence of the causal efficacy of Citizenship Education to produce politically active citizens. This review focuses exclusively on controlled trials (ideally randomized) as a robust method for measuring cause and effect. This is not to suggest that other methods have no value in understanding Citizenship Education, for controlled trials are certainly limited in their explanatory power, scope, and scalability, but controlled trials represent a frequent omission in the current evidence base which is difficult to compensate for through other methods. Campbell (2019) points precisely to this in a recent literature review entitled “What Social Scientists Have Learned About Civic Education”, and similar reviews by Bramwell (2020) and Manning and Edwards (2014) are also suggestive of a lack of controlled trials. As no systematic review of controlled trials within this area has yet been undertaken, we do so here for explorative purposes, to see how many studies of this kind exist and what aspects of Citizenship Education they address. The aims of this review are therefore two-fold: scoping and mapping, as described by Grant and Booth (2009) in their typology of reviews. These translate into two simple research questions:
  • What is the size and scope of the available research literature documenting controlled trials of Citizenship Education for political engagement?
  • What types of education initiatives are described in the literature identified in (1), and what do their findings show?
Whilst we offer some discussion of pedagogical approaches, program delivery methods, and the political outcomes realized, we do not attempt a meta-analysis or offer grand conclusions on the narrower question of exactly what the causal relationship between Citizenship Education and political engagement is, as doing so would require us to go well beyond the capacity of the evidence we found.

2. Citizenship Education for Political Engagement

As far back as Addams ([1902] 2002), Dewey (1923), and Marshall (1950), thinkers have recognized that social justice is not guaranteed by mere legal rights but requires active and informed participation in decision making. In other words, social justice must be asserted through the ballot box and an active civil society. A strong participatory democracy (Barber 2003) grounded in equality in political engagement (Dahl 2008; Verba et al. 1995) is therefore a prerequisite for a truly inclusive society. In such a democracy, individuals from all parts of society vote and express their views within their communities to promote the kind of society they wish to see. Crucially, the health of democracies relies on political engagement from citizens of all social backgrounds. Yet in western democracies, in particular in the UK and the US, we see a recurring pattern in which the most privileged social groups are also the most politically active, and consequently able to direct political decision making toward their own interests and priorities (Dalton 2017; Verba et al. 1995). Conversely, disadvantaged groups, which should have the most to gain from asserting their democratic power, have become alienated from a political realm which is not seen as addressing their concerns or speaking their language (Bovens and Wille 2017). One hope of disrupting this vicious circle of political socialization, which reproduces and exacerbates inequalities, is to use education to politically engage all young people, regardless of social backgrounds, during their formative years (Hoskins et al. 2017; Hoskins and Janmaat 2019).
In principle, the subject in which to address young people’s political engagement at school is Citizenship Education (known in the US as Civic Education or Civics). However, not every conception of citizenship promoted by national education systems encourages active political engagement. In some cases, the co-option of the subject by nationalistic agendas (Starkey 2018) might stress compliance, quiet obedience, or intolerance, whilst in others the subject is simply deprioritized (Burton et al. 2015) or depoliticized (from the students’ point of view, at least) through the use of a thin liberal conception of citizenship which protects the status quo. As an example of the latter, government policy on Citizenship Education in England has departed in recent years from an agenda of political participation toward character education and moral responsibilities (Weinberg 2020). Despite this, many teachers and third-sector Citizenship Education organizations have tried to keep the original political focus alive, and it is this interpretation of Citizenship Education as a tool for encouraging political engagement which is of interest to us in this paper.

3. Why Controlled Trials?

Empirical research on Citizenship Education for political engagement has advanced rapidly in recent years, particularly in relation to the analysis of large international datasets such as the IEA International Civic and Citizenship Education Study (cross-sectional and comparative data) and even some longitudinal datasets at the national level, such as the Citizenship Education Longitudinal Study (England). This allows for the analysis of varying degrees of exposure to diverse forms of learning citizenship across educational pathways and different education systems. For example, Hoskins and Janmaat (2019) find an association between exposure to Citizenship Education in schools in England and voting intentions at age 16 and, particularly encouragingly, some indication that disadvantaged students appear to benefit the most. However, Hoskins et al. (2012) also warn that Citizenship Education does not always have this positive effect, and it is in establishing exactly “what works” that the picture becomes far less clear, not least because modes of delivery, program design, and implementation can vary considerably within the same education system. One attempt to tease apart different pedagogical approaches to Citizenship Education has been the conceptual distinction between acquisition and participatory models of learning (Sfard 1998), with some within the field suggesting that the evidence weighs more heavily in favor of participatory approaches (Hoskins et al. 2021). For example, there is very strong evidence that an open classroom method of learning, which would be considered an inherently participatory approach, is associated with political engagement (Torney-Purta 2002; Campbell 2008; Hoskins et al. 2012; Quintelier and Hooghe 2012; Keating and Janmaat 2016; Knowles et al. 2018), positive attitudes towards political engagement (Hoskins et al. 2021; Geboers et al. 2013, p. 164), critical thinking (Ten Dam and Volman 2004), citizenship skills (Finkel and Ernst 2005), political knowledge (Hoskins et al. 2021; McDevitt and Kiousis 2006), and political efficacy (Hoskins et al. 2021). Such evidence is certainly highly suggestive, but does it demonstrate causation?
In reality, convincingly establishing causation between different types of Citizenship Education programs and political engagement outcomes is something that can only be approached by degree. There is no panacea, and the notion of unequivocal demonstrable causality falls apart on metaphysical as well as methodological grounds. Nevertheless, there are pragmatic criteria (such as the Bradford Hill criteria (Hill 2015)) which can be turned to when making a case for or against the existence of a causal relationship. Different methodological approaches allow for different elements of such criteria to be invoked. For example, theory-led approaches may allow for plausible causative mechanisms to be revealed, whilst longitudinal data may allow one to show that the suspected cause temporally precedes the implied outcome. Away from the analysis of secondary data, many small-scale evaluations of specific Citizenship Education initiatives combine these two principles by explaining the theoretical basis of the program and then administering surveys to participants before and after the program. A successful example of this is Oberle and Leunig (2016), who used this approach to suggest that using simulation games in Citizenship Education classes can lead to improved knowledge about the European Union’s political processes and increased levels of trust, in particular for more socioeconomically deprived groups.
But controlled trials can add unique value to this mix of methods, as they have a characteristic not available to other methods (we should note here that Oberle and Leunig themselves acknowledge that control groups would have strengthened their study). For whilst statistical techniques applied to observational data may attempt to retrospectively estimate the effect of both observed and omitted variables (i.e., unobserved heterogeneity), they cannot be expected to satisfactorily reconstruct the counterfactual: what would have happened if the participants had not received the educational treatment? By comparison, a randomized controlled trial (RCT) comes as close as is possible outside of laboratory experiments to reconstructing the counterfactual by introducing a control group whose members are subject to the same measurements (normally pre- and post-intervention) as the treatment group but are not exposed to the treatment itself. Given sufficient numbers, the statistical expectation is that the random allocation of individuals to the control or treatment group removes any systematic difference between the groups other than their exposure to the treatment, with the highest level of confidence requiring multiple trials carried out by independent research teams, each with large numbers of participants. In this review we also include studies in which the allocation of participants or participant-groups is not strictly random, as Citizenship Education initiatives are frequently compelled to make use of existing organizational structures, such as classes within schools. This clearly weakens the method to some extent but can still be a useful step toward making a causal argument if the groups have comparable baseline characteristics and are in the same environment.
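To make the logic of this comparison concrete, the short Python sketch below simulates pre- and post-intervention engagement scores for a hypothetical treatment and control arm and compares their gains with an independent-samples t-test. All numbers, group sizes, and effect sizes are invented for illustration and are not drawn from any of the studies reviewed here.

```python
# Minimal sketch of the analytical logic of a two-arm controlled trial.
# All data are simulated; nothing here comes from the reviewed studies.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 200  # hypothetical participants per arm

# Baseline political-engagement scores; random allocation means both arms
# are drawn from the same distribution in expectation.
pre_treatment = rng.normal(50, 10, n)
pre_control = rng.normal(50, 10, n)

# Post-intervention scores; the treatment arm is given a simulated +3 point effect.
post_treatment = pre_treatment + rng.normal(3, 5, n)
post_control = pre_control + rng.normal(0, 5, n)

# The control arm's gain stands in for the counterfactual "no treatment" outcome.
gain_treatment = post_treatment - pre_treatment
gain_control = post_control - pre_control
t_stat, p_value = stats.ttest_ind(gain_treatment, gain_control)

print(f"Mean gain (treatment): {gain_treatment.mean():.2f}")
print(f"Mean gain (control):   {gain_control.mean():.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

In the school-based trials reviewed below, the analysis would also need to account for the clustering of students within classes and schools, for example through multilevel models or cluster-robust standard errors, a point returned to in Section 6.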
As Connolly et al. (2017, p. 14) put it, “What RCT’s offer, therefore, is not just the opportunity to provide robust evidence relating to whether a particular program is effective or not, but also—and over time—the creation of a wider evidence base that allows for not only the comparison of the effectiveness of one program or educational approach over another but also for how well any particular program works in specific contexts and for differing subgroups of learners”. Yet controlled trials have also been contested within education research. Connolly et al. (2018a) identify four underlying criticisms: (1) that RCTs are not possible, on a practical level, to undertake; (2) that they ignore context; (3) that they seek to generate universal laws of cause and effect; and (4) that they are inherently descriptive and do not advance theoretical understanding. But the subsequent analysis by these authors of over 1000 RCTs of educational initiatives casts doubt over each of these criticisms, demonstrating that controlled trials can be undertaken, can acknowledge context by including process evaluation and differentiating effects on subgroups, can discuss the limitations of the generalizability of findings, and can be both rooted in theory and make arguments for the future development of theory. Connolly et al. (2018a) do, however, also note that the extent to which particular studies address these concerns varies, and the debate within educational research continues. Each of these points of contestation is as applicable to Citizenship Education research as it is to educational research in general, to which might be added the particularly acute influence of Paulo Freire’s critical pedagogy (Freire 1996) on Citizenship Education for political engagement (Crawford 2010) and, by association, his scrutiny of research power dynamics and wariness of techniques associated with positivism and the reinforcement of structures of control (Freire 1982; Brydon-Miller 2001). We do not resolve these debates in this paper, but simply note them as an important context prior to presenting the results of the systematic review.

4. Method

4.1. Search Protocols

Our approach is similar to that of Sant (2019), who recently undertook an exploratory systematic review within a related field, though focusing on conceptualizations rather than controlled trials. The systematic review began with searches for standardized terms (known as protocols) in all appropriate academic databases before the articles were screened manually. We operationalized our focus on controlled trials within the search protocol through the inclusion of the term “controlled trial” as well as the common variant “control trial”. The abbreviation for randomized controlled trials, “RCT”, was found to be largely redundant given the previous terms and was left out, not least because it leads to the inclusion of studies on Rational Choice Theory. We also included the terms “citizenship” or “civic” along with “education”, capturing what we believe to be the most common signifiers within the field. Admittedly, there is now a proliferation of different terminology used for Citizenship Education, both in schools and in non-formal learning within the youth and third sectors, so our coverage cannot be considered complete. Variants such as Global Citizenship Education, Education for European Citizenship, and Education for Democratic Citizenship each have slightly different meanings and associations, but by including the words “education” and “citizenship/civic” as free-floating search terms rather than joining them into a phrase (i.e., “citizenship education” or “civic education”), our searches should at least include studies which use alternative phrasings of this type.
The word “political” is also included to narrow the results to those studies concerned with education as a route to political engagement rather than the nationalistic or liberal (depoliticized) conceptions of Citizenship Education described previously. As with all the qualifier terms, and no more so than with the word “political”, mere use within a search protocol does not guarantee that the resulting articles reflect the meaning of the words in the way we would wish them to. The false inclusion of articles by the protocol, whereby studies do not, for example, measure what we consider to be political outcomes, is dealt with during the manual screening process explained in the next section and is only problematic insofar as it necessitates subjectivity and injects some inefficiency into the review process. Of far more concern were false exclusions, whereby the protocol, when applied to a database, does not return articles which actually do describe controlled trials of Citizenship Education for political engagement.
Indeed, it soon became apparent that searching conventional academic databases and indexes was producing sparse results. To give one example, the Web of Science produced only three results which fulfilled the criteria, two of which passed the manual screening. This trend was widely repeated across the 40 other relevant databases, which yielded just 125 results, with only four of these passing the manual screening. This appears to be due to the inability of most databases to perform full-text searches on many of the articles and our requirement of four search terms to properly specify what we were looking for. We therefore turned to Google Scholar, where the same protocol produced results of an entirely different order of magnitude (>13,000) and included all the studies from the conventional academic databases previously searched that had successfully passed the manual screening stage. Although Google Scholar is far more restrained in the sources it draws from than a conventional Google search, its coverage is much wider than curated academic databases, it is inherently multidisciplinary, and it makes use of semantic search algorithms which attempt to return results corresponding to the meaning of the search terms rather than only literal matches. All of this contributes to a liberal return of results, but with a trade-off in accuracy and reproducibility, and makes Google Scholar rather less systematic than is ideal for a systematic review, as documented by Gusenbauer and Haddaway (2020). But these same authors note the popularity of semantic search engines for exploratory research. Moreover, despite the shortcomings, this study is illustrative of their undoubted appeal in this regard, as it was only Google Scholar that surfaced the studies we eventually selected, albeit combined with considerable manual screening. An immediate problem which arises when taking this more inclusive route is that the number of results can exceed the capacity for manual screening. In our case, inspection of the results showed them to be dominated by medical studies of little relevance, RCTs being far more prevalent within medical research. Therefore, after some experimentation, we found that by using some medical terms as disqualifiers we were able to reduce the search results to a manageable number of 2620 articles, which progressed to the manual screening stage.

4.1.1. Search Protocol

“education” AND “political” AND (“citizenship” OR “civic”)
AND (“control trial” OR “controlled trial”)

4.1.2. List of Academic Databases Searched

ACM Digital Library, Annual Reviews, Bloomsbury Collections, BMJ Journals, Brill Journals, Cambridge Companions Online, Cambridge University Press Journals, Directory of Open Access Journals, EBSCO Child Development & Adolescent Studies, Education Index Retrospective: 1929–1983, Education Research Complete, Educational Administration Abstracts, Emerald Group Publishing Limited, Emerald Social Sciences eBook Series Collection, ERIC (Educational Research Information Centre), Google Scholar, Ingenta Connect, JSTOR Arts & Sciences I Collection, JSTOR Arts & Sciences II Collection, JSTOR Arts & Sciences III Collection, JSTOR Arts & Sciences IV Collection, JSTOR Arts & Sciences V Collection, JSTOR Arts & Sciences VI Collection, JSTOR Arts & Sciences VII Collection, JSTOR Current Scholarship Journals, JSTOR Life Sciences Collection, Linguistics and Language Behavior Abstracts (LLBA), Oxford Journals, Project Muse, ProQuest Ebook Central, PsycARTICLES, PsycBOOKS, PsycTESTS, SAGE Research Methods (SRM), ScienceDirect, Social Theory: First Edition, SpringerLink, Taylor and Francis Journals, Wiley Online Library, WorldCat OCLC, WorldCat.org

4.1.3. Amended Search Protocol for Google Scholar

(“citizenship” OR “civic”) AND “political”
AND (“control trial” OR “controlled trial”) AND -“HIV” AND -“illness” AND -“nursing” AND -“medical”
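Purely as an illustration of the Boolean logic of the amended protocol, the sketch below expresses the inclusion and exclusion terms as a filter over candidate result records. The record texts and the `matches_protocol` helper are hypothetical; in practice, the protocol was entered directly into the Google Scholar interface and the results were screened manually.

```python
# Minimal sketch (hypothetical records): applying the amended protocol's
# inclusion and exclusion terms to candidate result texts.
INCLUDE_ANY = ["citizenship", "civic"]
INCLUDE_ALL = ["political"]
TRIAL_TERMS = ["control trial", "controlled trial"]
EXCLUDE = ["hiv", "illness", "nursing", "medical"]

def matches_protocol(text: str) -> bool:
    """Return True if a record's text satisfies the Boolean protocol."""
    t = text.lower()
    return (
        any(term in t for term in INCLUDE_ANY)
        and all(term in t for term in INCLUDE_ALL)
        and any(term in t for term in TRIAL_TERMS)
        and not any(term in t for term in EXCLUDE)
    )

# Hypothetical examples, for illustration only
records = [
    "A randomized controlled trial of a civic education programme and political participation",
    "A controlled trial of nursing education on HIV awareness",
]
for r in records:
    print(matches_protocol(r), "-", r)
```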

4.2. Manual Screening

All 2620 articles identified by the amended search protocol were then screened manually, first by title, then by abstract, and then by full text where necessary. The process through which a decision was made as to whether to include an article in the final list can be conceived of as a set of criteria, some of which are objective in nature and therefore simple to apply, and some of which unavoidably require more subjective judgments. We briefly list the criteria below, provide some examples of the more subjective judgments which were made in implementing the final two criteria, and give a schematic sketch of the decision logic after the list.
  • Article returned by search protocol. Results were not filtered by date, though the oldest study identified as fulfilling all of the criteria below was published in 2006.
  • Article provides sufficient detail in English (or has an accessible English translation available) on which to make assessments for all other criteria. A certain amount of detail of the study is required in order to make an informed judgment. If a study was briefly outlined in an article with references to a more adequate description elsewhere, then it was included on the basis of the secondary source. It should be noted that the search itself biases results toward English language articles, as the search terms entered were in English.
  • Article is not a representation of a study which has already been identified. Although Google Scholar is efficient in nesting multiple versions of the same article within a single result item, occasionally multiple accounts of the same evaluation were found (e.g., a policy paper and an academic article), in which case the most complete account was selected.
  • Study uses control groups to produce quantitative data to which statistical testing is applied. Studies which do not use control groups, use comparison groups only for qualitative purposes, or do not deploy statistical testing on results were excluded. However, no stipulations were made on sample size, and allocation to control groups did not need to be random.
  • Study evaluates an education scheme. Whilst the interdisciplinary nature of Google Scholar allows for studies to be included which have not been published in education journals, it creates a slight issue during screening in having to decide what represents an educational program. In the case of Citizenship Education, it is not appropriate to limit a review to initiatives which take place within formal learning environments such as schools. Rather, we must make a wider but more subjective judgment as to whether the scheme involved a process of systematic formative instruction rooted in pedagogy. In practice, this meant the exclusion of short-term positive reinforcement or suggestive “nudge” mechanisms such as those studied by Aker et al. (2011), Bond et al. (2012), and Costa et al. (2018). Similarly, real-life exposure to political events outside of a learning framework was also excluded, though some studies of this type may nevertheless be instructive for the design of future educational programs. For example, Wong and Wong (2020) undertook an interesting RCT involving exchange students during the Umbrella Movement in Hong Kong, but the experience was not situated within an educational framework, and to include such studies would imply that the review should also look at the effect of other life experiences on politicization, broadening the topic away from our core concern.
  • Study measures political outcomes. Given that one of the gaps in the evidence base is an accepted theory of change for instigating political participation, we take a broad approach to political engagement that includes both political actions (protesting in the diverse ways this occurs, both online and offline; voting in elections at different levels; and contacting and volunteering for political parties) and the competences (attitudes, values, knowledge, and skills) that enhance the quality of the engagement and enable competent political behavior. The list of possible knowledge, skills, attitudes, and values that this could encompass is vast, but a useful delineation which resonates with our own understanding is the Council of Europe (2018) reference framework for democratic culture. In practice, this amounted to the exclusion of initiatives aimed at developing teamwork or individual character traits, which featured prominently in the search results but for the most part had little direct relevance to political engagement (e.g., Siddiqui et al. 2019; Connolly et al. 2018b; Silverthorn et al. 2017; Siddiqui et al. 2017; Kang 2019). We also found several studies dealing with conflict resolution, community cohesion, and reducing violent behavior, but these were again screened out, as their concern was generally restricted to harmonious societal relations rather than active political behavior (e.g., Niens et al. 2013; Chaux et al. 2017; Enos 2013), though we acknowledge that counter-arguments could be made here.
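As promised above, the following sketch encodes the screening decision as a filter over hypothetical article metadata. It is a schematic aid only: the `Candidate` record and its flags are invented for illustration, and the final two criteria in particular rest on subjective judgments made by the authors rather than by code.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    """Hypothetical metadata recorded for each screened article."""
    title: str
    in_english: bool                    # sufficient detail available in English
    duplicate_of_included: bool         # another account of an already-included study
    has_control_group: bool
    statistical_testing: bool
    is_education_scheme: bool           # subjective judgment, recorded as a flag
    measures_political_outcomes: bool   # subjective judgment, recorded as a flag

def passes_screening(c: Candidate) -> bool:
    """Apply the screening criteria in the order listed above."""
    return (
        c.in_english
        and not c.duplicate_of_included
        and c.has_control_group
        and c.statistical_testing
        and c.is_education_scheme
        and c.measures_political_outcomes
    )

# Illustrative example (invented)
example = Candidate(
    title="A controlled trial of a mock-election programme",
    in_english=True,
    duplicate_of_included=False,
    has_control_group=True,
    statistical_testing=True,
    is_education_scheme=True,
    measures_political_outcomes=True,
)
print(passes_screening(example))  # True
```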

5. Results

5.1. What Types of Programs Have Been Tested by Controlled Trials?

In total, 25 controlled trials which test political outcomes deriving from educational initiatives have been identified (Table 1). To structure the discussion of these studies we group the articles based on different approaches that have been considered successful in teaching Citizenship Education within the international practitioner field of citizenship educators (UNESCO 2015). The first three categories describe different strategies for delivering Citizenship Education within schools. School-based Citizenship Education can be delivered as a stand-alone program, as a cross-curricular approach, or as a holistic whole school approach which influences multiple aspects of school life under a guiding ethos. Underpinning each of these three is teacher training, which can itself be the focus of initiatives and therefore represents our fourth category. However, Citizenship Education does not only happen within schools, and any initiative outside of the education system (e.g., by NGOs or community groups) is referred to as “non-formal”; articles on such programs comprise our fifth category. Our final two categories can occur both in non-formal programs and in the various aspects of school life. These two themes describe initiatives with a clear participatory learner-centered approach (category six) and those looking to unlock the potential of digital techniques, generally within online environments (category seven). Our categories should not be considered mutually exclusive parts of a comprehensive typology, but rather as useful ways to present the results which reflect common practitioner vocabulary. To avoid repetition in the discussion below, we focus upon the most illustrative studies for each category, with Table 1 representing a more thorough categorization, in which some articles are tagged as belonging to more than one category.

5.1.1. School-Based Program (Stand-Alone)

The classroom is the theatre in which specific teaching practices play out, and it is the activities within the classroom which most immediately come to mind when thinking of Citizenship Education. Representative of this is the Student Voice program (Syvertsen et al. 2009), in which students practice civic skills, debate political issues, and connect their own community interests to the platforms of candidates before simulating the process through mock elections. Teachers invite local candidates and journalists into the schools for question-and-answer sessions with students. The RCT involved 1670 high school students in 80 social studies classrooms and found significant effects of the program on various self-reported political measures, such as the ability to cast an informed vote, knowledge of the voter registration process, belief that their vote matters, communication with others at school about politics, sense of civic obligation, and media use and analysis. This alone is quite persuasive evidence that the type of basic participatory good practice long spoken about in the field (Hoskins et al. 2012) can show signs of causal efficacy under controlled trial conditions.
Yet some programs have gone beyond this standard good practice and produced intriguing results in doing so. Notably, the study by McDevitt and Kiousis (2006) of the Kids Voting program appears to show that incorporating the students’ home environments as part of the learning environment may bring an added effect. The Kids Voting program included experiential learning based on group problem-solving, peer discussion, and cooperative activities, and is in many ways analogous to the previously described Student Voice program. However, what seems to be unique to this program is that it includes activities for the children to complete with their families, such as creating a family election album, roleplaying in which students act as political reporters interviewing family members, and a children’s ballot through which students can cast a vote at the same polling stations as their parents. The analysis of 491 students aged 16–18 suggests that the interplay of influences from school and family magnified the effects of the election-based curriculum and sustained them in the long term, resulting in an increased probability of voting for students when they reached voting age.
However, not all school-based activities will be as successful as hoped, and given the publication bias toward positive results, it is extremely useful to have controlled trial evidence of the possible limits of some approaches. For example, a promising interactive environmental program which, as in the previous study, involved activities for children to complete with their own families, was run in the UK. Yet Goodwin et al. (2010) found in their study of 448 students in 27 primary schools that there were no effects on behavior compared with the control group, and an extended version of the program did not yield positive results. There is no clear reason why the program did not produce better results, though the vagaries of context and implementation can be difficult to appreciate from a distance. The authors themselves note that the awareness of the control group also rose during this period, which would seem to suggest contextual complications.
Continuing on a cautionary note is the study by Green et al. (2011), who strongly question the assumption that knowledge alone leads to attitudinal or behavioral change. They undertook an RCT of an enhanced civics curriculum involving 1000 15 to 16-year-old students in 59 high schools. The curriculum aimed to increase their awareness and understanding of constitutional rights and civil liberties, and although the students displayed significantly more knowledge, no corresponding changes in their support for civil liberties were found. The association between knowledge and behavior change has been critiqued before, not least from the stance of critical pedagogy, which suggests that the assimilation of knowledge can lead to a passive acceptance of the status quo, but to have such clear controlled trial evidence of the inability of knowledge alone to lead to political behaviors is of real value.

5.1.2. Cross-Curricular Approach

Whilst the efficacy of the acquisition of knowledge alone is widely doubted, the significance of skills development is a much more contested area, and one study provides evidence that learning environments which consistently encourage social skills can foster political engagement. Holbein (2017) addresses this by testing the hypothesis that the targeted development of social and emotional skills can in itself lead to behavioral changes in political engagement. The study looks at the impact of a wide program of interventions to develop social and emotional skills, including parent training, peer training, stories, films, games, roleplays, and joint reading activities. The study, involving 812 students across 55 schools, seems to point toward the importance of the quality of social interaction within the learning environment for the development of these skills rather than the valorizing of a single activity. The finding is quite striking, as it seems to indicate that the early development of psychosocial skills leads to a noticeable increase in long-term voter turnout.
In some jurisdictions, schools can decide to run Citizenship Education itself across the curriculum, traversing traditional subject areas. One successful example of this is the combined science and civics instruction used to promote sustainable development in the article by Condon and Wichowsky (2018), who studied a program for 11–14 year-olds aimed at developing citizen-scientists in the US. The program was based on real-world, community-improvement, problem-based inquiry focused on reducing the unnecessary use of resources. It had students monitor the use of gas, electricity, and water in their homes and school and conduct experiments to see whether they could reduce consumption. The clustered RCT included 551 students across 13 schools and found that integrating science and civics into a unit about community water conservation improved engagement in both areas.

5.1.3. Whole School Approach

The ultimate elevation of school-based Citizenship Education beyond a single subject, and even beyond a cross-curricular approach, is the whole school approach (Gibb 2016). Given that the practice itself is less common, we are fortunate to have the experiment by Gill et al. (2018) involving a US charter school, which uses controlled trial principles to evaluate the effects of a whole school approach driven by the unique mission and strategy of the organization. One of the distinctive features of these types of schools in the US context is that they are publicly funded but operated independently of officials, and this school nevertheless has a core civic mission. The specific school studied, “Democracy Prep”, has educated more than 5000 students across multiple campuses in New York, and its mission statement is “to educate responsible citizen scholars for success in the college of their choice and a life of active citizenship”. The school facilitates the learning of citizenship throughout its curricula, including experiential learning (visiting legislators, attending public meetings, testifying before legislative bodies, and running get-out-the-vote campaigns during elections) and more traditional knowledge-based activities such as writing essays on civics and governance. To give just one specific example, during the final year students develop a “change the world” project in which they investigate a real-world social problem, design a method for addressing the issue, and then implement their plan. By taking advantage of the random allocation of 1060 students (due to oversubscription) into the charter school, Gill et al. (2018) found that those who were admitted went on to have an increased probability of future voting. This is very important evidence that a school which adopts a civic mission and civic ethos, allowing citizenship to flow into all aspects of school life, can motivate tangible differences in political behaviors.

5.1.4. Teacher Training

Any Citizenship Education scheme is only as good as its implementation, and it can be easy to overlook differences in the capabilities and enthusiasm of teachers to deliver programs. Indeed, two studies provide some indication that investing in the development of teachers really can make a difference. For example, Andersson et al. (2013) showed that initial teacher training on education for sustainable development (ESD) led to positive effects on the attitudes, perceptions, felt personal responsibility, and desire to contribute toward sustainable development among the student-teachers. This comes from an analysis of parallel-panel survey data from 404 student-teachers which included a control group but was not randomized.
Whether or not well-intentioned teachers are then able to pass this on to their own students is of course another question. But the Facing History program studied by Barr et al. (2015) suggests that arming teachers with conceptual tools and teaching materials can result in observable changes in the students. The program was evaluated in the US through an RCT amongst 14–16 year-olds (n = 1371), which found that when teachers who had received this training and been given the materials brought the program into the classroom, it promoted respect and tolerance for the rights of others among the students, an increased awareness of prejudice and discrimination, and a sense of civic efficacy. Whilst untangling the training of the teachers from the classroom methods they then implement is difficult, these examples provide some evidence that quality teacher training should at least be a component of introducing effective political Citizenship Education into the classroom.

5.1.5. Non-Formal Education

Stepping momentarily away from schools, we now consider some controlled trials which looked at interventions outside of the formal education system and are therefore referred to as “non-formal”. Some of these non-formal programs look at the effect of community or group-level initiatives on the political engagement of the individual. For example, Blattman et al. (2011) used a clustered RCT to evaluate a community empowerment program in Liberia across over 230 communities. Their study measured respect for human rights, equality, civic participation, and community cohesion, and the findings showed modest increases in the first two but little change in the latter two. The authors also stress that the observed impacts were not always in expected directions, which perhaps highlights the complexity of operating in the community and the relative lack of control organizers have over such socially dynamic environments when compared to a school setting.
More encouragingly, in the UK, a Cabinet-Office-funded evaluation of the National Citizen Service program by Booth et al. (2014) yielded some positive results. The National Citizen Service runs over five phases, from residential inductions to community-based action projects. Though initially restricted to self-reported attitudes, the quasi-experimental study goes on to measure overall increases in community engagement, volunteering, and intention to vote amongst 7379 of the 15 to 17-year-olds in the study.
Other non-formal initiatives looked at the effect of providing basic information to adults. For example, Pang et al. (2013) investigated the effects of training women in China on their voting rights for village committee elections. Involving 700 adults, the RCT demonstrated that the women who had received the training not only had a greater knowledge of their rights but were also more likely to exercise these rights. The authors are clear that the study shows that the lack of basic knowledge in rural villages is a barrier to voting in village committee elections. Barros (2017) also looked at the effect of providing basic information on the importance of voting, concluding that the participants studied in Portugal could be encouraged to vote if this led to their valuing the act itself, a phenomenon the author terms warm glow voting. These results appear to nuance the previous observation that knowledge does not lead to action, by showing that, in specific contexts, and in applied settings rather than in the classroom, basic timely information can make a difference. However, as acknowledged in the latter experiment, it is the value placed on the act as a result of a greater understanding, rather than merely the knowledge itself, which is ultimately responsible for motivating the action.
An interesting project that operated as a hybrid between formal and non-formal education and combined knowledge acquisition with participatory approaches was conducted in Peru (Agurto and Torres 2020). This project combined knowledge acquisition on financial literacy and life skills training on leadership, public speaking, and teamwork with sending students into the community as ambassadors and change-makers to support the provision of basic bank accounts and financial inclusion for disadvantaged communities. The project involved 131 students from a university scholarship program and led to increased levels of self-efficacy, empowerment, and community engagement for female students.
Finally, Bowen and Kisida (2018) looked at different perceptions of civil rights after Holocaust museum visits. They report a positive impact on students’ desire to protect civil rights and liberties across 865 students participating in an RCT in 15 middle and high schools. However, the effects are limited and seem to stop short of behavioral change, with no significant evidence that the intervention affected students’ sense of civic obligation, empathy, willingness to take on roles as upstanders, or inclination toward civil disobedience. This study is therefore more consistent with the notion that knowledge alone, even when it affects students, has its limits in triggering political mobilization. There are also notable interactions with gender, ethnicity, and social class, which should serve as a warning against drawing universal conclusions from controlled trials and a reminder of the benefit of obtaining large sample sizes so that these finer-grained analyses can be conducted.

5.1.6. Participatory Approaches

Many of the initiatives described by the articles identified in this review have made use of participatory techniques to a greater or lesser extent, among which we can include regular discussions, debates, and simulation exercises (such as mock elections and trials) (Hoskins et al. 2012). For example, Kawashima-Ginsberg (2013) found evidence for the efficacy of exactly these practices in a controlled trial analysis of 10 to 16-year-old pupils’ scores on the national civics assessment test. For brevity, we will not repeat the description of other studies with common participatory elements described under different headings, but would encourage readers interested in this theme to look at the studies by Syvertsen et al. (2009); McDevitt and Kiousis (2006); Gill et al. (2018); and Condon and Wichowsky (2018).
That said, special attention under this heading is given to a couple of articles which are particularly instructive. Firstly, a very thorough participatory approach was studied by Ozer and Douglas (2013). This program in the US tested the difference that participating in youth-led research makes for the young people involved. The approach is learner-centered at every stage, with the research topics selected by the students themselves, and consequently included a diverse range of topics, such as: prevention of school drop-out; stress related to family, academics, or peers; improving the school lunch; cyber-bullying; improving teaching practices to engage diverse students; and improving inter-ethnic friendships at the school. The RCT involved 401 students at five high schools and found that attending these participatory research elective classes during the school day was associated with increases in the students’ sociopolitical skills and motivation to influence their schools and communities. The indication that learner-led approaches such as this may circumvent the previously discussed disconnect between knowledge and motivation to act is a primary attraction of participatory methods over more acquisition-based approaches.
Secondly, the study by Feldman et al. (2007) is distinctive in that it was able to isolate the effects of various elements of a Student Voice program. The program as a whole was quite participatory in that teachers were given a framework of election-based activities but could deviate significantly based on student interests. Overall, the program produced increased interest, knowledge, and efficacy with regard to politics, as measured across 22 US high schools, each of which had a control group. But the authors were also able to show that it was political discussion within classrooms which was the primary driver of this change, more so than other eye-catching activities within the program, such as actually meeting the election candidates.

5.1.7. Digital

Perhaps the timeliest studies are those which evaluate the emergence of online learning environments. The findings across this section suggest that the digital world is similar to the offline world in that it is high-engagement actions, in this case the student-led creation of content, that lead to changes in attitudes and behavior.
A study by Smith et al. (2009) stresses the importance of active participation in online environments. They conducted a novel RCT of online discussions in moderated chatrooms using a large mixed-age market research panel in the UK (n = 6009) and found that only those who posted content showed evidence of developing their opinions through discussion. This is contrasted with those who spent time reading the message boards but did not actively post themselves, and who subsequently showed no discernible change in opinions. Strandberg (2015) carried out a similar online deliberation RCT with 70 adults in Finland, finding some alleviation of the polarization of opinion as well as effects on the participants’ feelings of efficacy.
The importance of social support within online environments is taken up by Levy et al. (2015), who studied a sample of 309 US high school students, among whom one class was instructed to keep political blogs to document their thoughts on the unfolding election. The authors find that the “bloggers” developed greater political interest and confidence in their political skills and knowledge, even when compared to their peers in other government courses. However, the authors also note that some students became frustrated at the lack of responses to their blog posts, pointing toward the importance of a receptive audience within a community of learning if this technique is to be further developed. On this same point, Margetts et al. (2009) showed that a mechanism can be built into online environments which simulates the social support and pressure of collective action. Their controlled trial found that, among 668 adults, it was those who had received positive feedback from supportive participants who were more likely to go on to sign more online petitions. But a note of caution is sounded by Vissers et al. (2012) for those who assume that online political activity necessarily translates into offline action. Their RCT study of Belgian university students found that online learning activities on climate change only influenced online behavior and did not change offline behavior.
Yet the evidence that an online environment can develop core political skills is stronger. A study from Hong Kong, China (Chan 2019), looked specifically at the use of a digital storytelling program run through the online platform Facebook for the development of civic identity and skills. Though not explicitly political, we include this RCT, involving 87 16 to 24-year-olds outside the formal education system, as it showed evidence of improvements in relevant skills and dispositions, namely enhanced critical thinking, along with an accompanying decline in ethnocentric views. The article by Kawashima-Ginsberg (2012) also demonstrates how online methods can stimulate political skill development. Using assessment scores to evaluate an iCivics computer-based teaching module, the author showed through a clustered RCT of 1526 students aged 12 to 15 in 42 schools in the US that the program was effective in improving the grades students received for writing a persuasive letter to a newspaper.

6. Why Aren’t There More Controlled Trials?

During the course of our searches, we also came across several papers which help to explain why there have not been more RCTs in Citizenship Education. These largely reflect the more general concerns about applying RCTs to education research discussed previously, but with specific reference to Citizenship Education. Some of them underline the valid, practical concern that the demands of running a satisfactory RCT are too exacting and expensive. Bakker and Denters (2012) point out that the ideal of a classical experiment is generally unachievable, as the number of subjects in each of the treatment and control groups really has to be quite large to even out the variance in all relevant characteristics, and this is without considering whether the true unit of analysis should be the collective rather than the individual (the clustering of students within classes and schools should at least be taken into account). In a similar vein, Shek et al. (2012) note that it is very expensive to conduct randomized group trials in an adequate variety of settings to demonstrate the generalizability of a program outside a specific set of conditions. Yet there is also more fundamental epistemological and ontological resistance. Mathison (2009) questions whether certain assumptions might be part of a neoliberal ideology of efficiency and commoditization within education, including the notion that accountability is necessarily good if linked to competitive marketplace practices or narrow econometric thinking. Postcolonial critiques, such as that given by Singh et al. (2018), point out that the relationship of researcher to researched has been compared to that of colonizer to colonized, particularly when reliant on the types of standardized measures which are a feature of all RCTs. For such reasons, decoloniality has tended to favor the transparency and inclusiveness of qualitative or participatory research praxis. Yet we also found the argument that RCTs can be a part of progressive post-positivism. Shek et al. (2012) suggest in their evaluation of a youth development course in Hong Kong that post-positivism can be understood as embracing the multiplicity of available methods, rather than valorizing certain qualitative approaches, whilst Singh et al. (2018) go on to reject the idea that quantitative paradigms are impermeable to reflexivity and decoloniality and begin to demonstrate how the methodological principles of controlled trials can be more reflectively administered so as to properly acknowledge oppression. Bakker and Denters (2012) note the parallels between experiments and action research as a reason for optimism, in that both actively interfere in reality. This points to a possible path toward rehabilitation for controlled trials if they follow action research tenets to place the disadvantaged group as the primary stakeholder and client, which may involve minimizing the influence of preconceived policy and academic agendas. Bakker and Denters (2012) go on to suggest that the design experiment methodology represents a way forward (the term “design” referring to the blueprint of a new instrument that is to be developed during the research process). Stoker and John (2009) similarly indicate that if experimentation is stripped of its black box dogmatism and researchers try to directly observe and understand apparent change, then comparison groups can still play an important role in providing policy makers with the type of evidence they respond to.
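To give a rough sense of the scale implied by the sample-size concern raised by Bakker and Denters (2012), the sketch below runs a conventional power calculation with hypothetical parameters (a small standardized effect of 0.2, 5% significance, 80% power). The figures are illustrative only and ignore the clustering of students within classes and schools, which would push the required numbers higher still.

```python
# Illustrative power calculation for a simple two-arm trial; parameters are
# hypothetical and clustering within classes and schools is ignored.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_arm = analysis.solve_power(effect_size=0.2,  # a "small" standardized effect
                                 alpha=0.05,       # 5% significance level
                                 power=0.8)        # 80% power
print(f"Approximate participants required per arm: {n_per_arm:.0f}")  # ~394
```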

7. Conclusions

The number of controlled trials which truly address Citizenship Education for political engagement is unsurprisingly small. Not only does the field have a history of institutional abandonment and co-option, but there is some reluctance within the research community to fully embrace controlled trials. This reluctance is based on a desire to promote the interests of powerless and unrepresented groups, but those who champion controlled trials also share that goal and see those groups as poorly served by a lack of understanding as to which educational methods really do work to break the cycle of political socialization which reproduces and exacerbates inequalities. Reconciling these epistemological tensions within the field will doubtless be an ongoing theme over the coming years.
It would be premature to draw overly concrete conclusions given the very limited evidence base, but the general picture is one which appears to broadly confirm existing knowledge in the field rather than revealing new findings, underlining the role of controlled trials in confirming that an existing educational method is effective. The starkest gap in the evidence base is geographical, with 17 out of the 25 studies coming from the US or UK and only four studies evaluating projects from the global south, two of these from China. This is particularly important, given that there can be no safe assumption that findings in one cultural context will hold in another.
The studies identified are quite evenly split between those which aim to improve knowledge and skills and those which seek to change attitudes or behaviors. These two domains do not necessarily cross-pollinate, and many of the studies which showed enhanced cognitive learning did not show corresponding behavioral change, a point made most explicitly by Green et al. (2011). However, the studies do suggest that some nuance is necessary here, as it seems that the provision of basic knowledge on civic duties, such as how to vote and why it is important, may initiate changes in attitudes and behaviors in circumstances in which this base awareness is lacking (Pang et al. 2013; Syvertsen et al. 2009). Likewise, the teaching of psychosocial or noncognitive skills, even when separated from political education, appears to yield promising results (Holbein 2017). But most of the studies which led to changes in attitudes or behaviors were essentially participatory. The clearest examples of this participatory approach are perhaps Ozer and Douglas’ (2013) study of a participatory research class and McDevitt and Kiousis’ (2006) study of simulated political discussions within families. There are also signs that the participatory approach to attitudinal and behavioral change is applicable to online interventions, with the evidence being that active engagement (as opposed to passive viewing) and peer feedback mechanisms play a similarly critical role online as they do offline (Smith et al. 2009; Strandberg 2015; Margetts et al. 2009), though whether online engagement translates into offline action remains in doubt (Vissers et al. 2012). The evidence also supports the effectiveness of a whole school approach (Gill et al. 2018) and the necessity of quality teacher training (Barr et al. 2015).

Author Contributions

Writing—Original Draft: S.D., B.H. Both authors have participated in all sections. Both authors have read and agreed to the published version of the manuscript.

Funding

This report was supported by the Robert Bosch Stiftung.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Addams, Jane. 2002. Democracy and Social Ethics. Urbana: University of Illinois Press. First published 1902. [Google Scholar]
  2. Agurto, Marcos, and Sandra Buzinsky-Fernando Fernandez-Javier Torres. 2020. Empowering College Students through Community Engagement: Experimental Evidence from Peru. No. 2020-7. Miraflores: Lima School of Economics. [Google Scholar]
  3. Aker, Jenny C., Paul Collier, and Pedro C. Vicente. 2011. Is Information Power? A Study of Voter Education Using Cell. London: Department for International Development. [Google Scholar]
  4. Andersson, Klas, Sverker C. Jagers, Annika Lindskog, and Johan Martinsson. 2013. Learning for the future? Effects of education for sustainable development (ESD) on teacher education students. Sustainability 5: 5135–52. [Google Scholar] [CrossRef] [Green Version]
  5. Bakker, Judith, and Bas Denters. 2012. Experimentando con un diseño experimental [Experimenting with an experimental design]. Revista Internacional de Sociologia 70: 121–41. [Google Scholar]
  6. Barber, Benjamin. 2003. Strong Democracy: Participatory Politics for a New Age. Berkeley: Univ of California Press. [Google Scholar]
  7. Barr, Dennis J., Beth Boulay, Robert L. Selman, Rachel McCormick, Ethan Lowenstein, Beth Gamse, Melinda Fine, and M. Brielle Leonard. 2015. A randomized controlled trial of professional development for interdisciplinary civic education: Impacts on humanities teachers and their students. Teachers College Record 117: 1–52. [Google Scholar]
  8. Barros, Henrique P. 2017. Warm Glow Voting? An Analysis of Turnout in Portugal. SSRN Electronic Journal. [Google Scholar] [CrossRef] [Green Version]
  9. Blattman, Christopher, Alexandra Hartman, and Robert Blair. 2011. Can We Teach Peace and Conflict Resolution?: Results from a Randomized Evaluation of the Community Empowerment Program (CEP) in Liberia: A Program to Build Peace, Human Rights, and Civic Participation. New Haven: Innovations for Poverty Action. [Google Scholar]
  10. Bond, Robert M., Christopher J. Fariss, Jason J. Jones, Adam D. I. Kramer, Cameron Marlow, Jaime E. Settle, and James H. Fowler. 2012. A 61-million-person experiment in social influence and political mobilization. Nature 489: 295–98. [Google Scholar] [CrossRef] [Green Version]
  11. Booth, Caroline, Daniel Cameron, Lauren Cumming, Nicholas Gilby, Chris Hale, Finn Hoolahan, and Jayesh Navin Shah. 2014. National Citizen Service 2013 Evaluation: Main Report. Resource document. London: Ipsos MORI Social Research Institute. [Google Scholar]
  12. Bovens, Mark, and Anchrit Wille. 2017. Diploma Democracy: The Rise of Political Meritocracy. Oxford: Oxford University Press. [Google Scholar]
  13. Bowen, Daniel, and Brian Kisida. 2018. An experimental investigation of the Holocaust educational impacts on students’ civic attitudes and behaviors. Paper presented at the Meeting of the American Educational Research Association, New York, NY, USA, April 13–17. [Google Scholar]
  14. Bramwell, Daniela. 2020. Systematic review of empirical studies on citizenship education in Latin America 2000 to 2017 and research agenda proposal. Citizenship, Social and Economics Education 19: 100–17. [Google Scholar] [CrossRef]
  15. Brydon-Miller, Mary. 2001. Education, research, and action theory and methods of participatory. In From Subjects to Subjectivities: A Handbook of Interpretive and Participatory Methods. New York: New York University Press, p. 76. [Google Scholar]
  16. Burton, Diana, Stephanie May, and Liverpool John. 2015. Citizenship education in secondary schools in England. Educational Futures 7: 76–91. [Google Scholar]
  17. Campbell, David E. 2008. Voice in the classroom: How an open classroom climate fosters political engagement among adolescents. Political Behavior 30: 437–54. [Google Scholar] [CrossRef]
  18. Campbell, David E. 2019. What social scientists have learned about civic education: A review of the literature. Peabody Journal of Education 94: 32–47. [Google Scholar] [CrossRef]
  19. Chan, Chitat. 2019. Using digital storytelling to facilitate critical thinking disposition in youth civic engagement: A randomized control trial. Children and Youth Services Review 107: 104522. [Google Scholar] [CrossRef]
  20. Chaux, Enrique, Madeleine Barrera, Andrés Molano, Ana María Velásquez, Melisa Castellanos, Maria Paula Chaparro, and Andrea Bustamante. 2017. Classrooms in peace within violent contexts: Field evaluation of Aulas en Paz in Colombia. Prevention Science 18: 828–38. [Google Scholar] [CrossRef]
  21. Condon, Meghan, and Amber Wichowsky. 2018. Developing citizen-scientists: Effects of an inquiry-based science curriculum on STEM and civic engagement. The Elementary School Journal 119: 196–222. [Google Scholar] [CrossRef]
  22. Connolly, Paul, Andy Biggart, Sarah Miller, Liam O’Hare, and Allen Thurston. 2017. Using Randomised Controlled Trials in Education. Thousand Oaks: Sage. [Google Scholar]
  23. Connolly, Paul, Ciara Keenan, and Karolina Urbanska. 2018a. The trials of evidence-based practice in education: A systematic review of randomised controlled trials in education research 1980–2016. Educational Research 60: 276–91. [Google Scholar] [CrossRef] [Green Version]
  24. Connolly, Paul, Sarah Miller, Frank Kee, Seaneen Sloan, Aideen Gildea, Emma McIntosh, and John Martin Bland. 2018b. A Cluster Randomised Controlled Trial and Evaluation and Cost-Effectiveness Analysis of the Roots of Empathy Schools-Based Programme for Improving Social and Emotional Well-Being Outcomes among 8-to 9-Year-Olds in Northern Ireland. Public Health Research. York: The University of York. [Google Scholar]
  25. Costa, Mia, Brian F. Schaffner, and Alicia Prevost. 2018. Walking the walk? Experiments on the effect of pledging to vote on youth turnout. PLoS ONE 13: e0197066. [Google Scholar] [CrossRef]
  26. Council of Europe. 2018. Reference Framework for Democratic Culture: Vol. 1. Contexts, Concepts and Model. Strasbourg: Council of Europe Publishing. [Google Scholar]
  27. Crawford, Keith. 2010. Active citizenship education and critical pedagogy. In International Research Handbook on Values Education and Student Wellbeing. Dordrecht: Springer, pp. 811–23. [Google Scholar]
  28. Dahl, Robert A. 2008. On Political Equality. New Haven: Yale University Press. [Google Scholar]
  29. Dalton, Russell J. 2017. The Participation Gap: Social Status and Political Inequality. Oxford: Oxford University Press. [Google Scholar]
  30. Dewey, John. 1923. Democracy and Education: An Introduction to the Philosophy of Education. New York: Macmillan. [Google Scholar]
  31. Enos, Ryan D. 2013. The Causal Effect of Prolonged Intergroup Contact on Exclusionary Attitudes: A Test Using Public Transportation in Homogeneous Communities. Working Paper. Cambridge: Harvard University. [Google Scholar]
  32. Feldman, Lauren, Josh Pasek, Daniel Romer, and Kathleen Hall Jamieson. 2007. Identifying best practices in civic education: Lessons from the student voices program. American Journal of Education 114: 75–100. [Google Scholar] [CrossRef] [Green Version]
  33. Finkel, Steven E., and Howard R. Ernst. 2005. Civic education in post-apartheid South Africa: Alternative paths to the development of political knowledge and democratic values. Political Psychology 26: 333–64. [Google Scholar] [CrossRef]
  34. Freire, Paulo. 1982. Creating alternative research methods: Learning to do it by doing it. In Creating Knowledge: A Monopoly. Edited by Budd Hall, A. Gillette and Rajesh Tandon. New Delhi: Society for Participatory Research in Asia. [Google Scholar]
  35. Freire, Paulo. 1996. Pedagogy of the Oppressed (Revised). New York: Continuum. [Google Scholar]
  36. Geboers, Ellen, Femke Geijsel, Wilfried Admiraal, and Geert ten Dam. 2013. Review of the effects of citizenship education. Educational Research Review 9: 158–73. [Google Scholar] [CrossRef] [Green Version]
  37. Gibb, Natalie. 2016. Getting Climate Ready: A Guide for Schools on Climate Action and the Whole-School Approach. Paris: UNESCO Publishing. [Google Scholar]
  38. Gill, Brian, Charles Tilley, Emilyn Whitesell, Mariel Finucane, Liz Potamites, and Sean Corcoran. 2018. The Impact of Democracy Prep Public Schools on Civic Participation. Cambridge: Mathematica Policy Research. [Google Scholar]
  39. Goodwin, Matthew J., Stephen Greasley, Peter John, and Liz Richardson. 2010. Can we make environmental citizens? A randomised control trial of the effects of a school-based intervention on the attitudes and knowledge of young people. Environmental Politics 19: 392–412. [Google Scholar] [CrossRef]
  40. Grant, Maria J., and Andrew Booth. 2009. A typology of reviews: An analysis of 14 review types and associated methodologies. Health Information & Libraries Journal 26: 91–108. [Google Scholar]
  41. Green, Donald P., Peter M. Aronow, Daniel E. Bergan, Pamela Greene, Celia Paris, and Beth I. Weinberger. 2011. Does knowledge of constitutional principles increase support for civil liberties? Results from a randomized field experiment. The Journal of Politics 73: 463–76. [Google Scholar] [CrossRef] [Green Version]
  42. Gusenbauer, Michael, and Neal R. Haddaway. 2020. Which academic search systems are suitable for systematic reviews or meta-analyses? Evaluating retrieval qualities of Google Scholar, PubMed, and 26 other resources. Research Synthesis Methods 11: 181–217. [Google Scholar]
  43. Hill, Austin Bradford. 2015. The environment and disease: Association or causation? Journal of the Royal Society of Medicine 108: 32–37. [Google Scholar] [CrossRef]
  44. Holbein, John B. 2017. Childhood Skill Development and Adult Political Participation. American Political Science Review 111: 572–83. [Google Scholar] [CrossRef]
  45. Hoskins, Bryony, and Jan Germen Janmaat. 2019. Education, Democracy and Inequality: Political Engagement and Citizenship Education in Europe. Cham: Springer. [Google Scholar]
  46. Hoskins, Bryony, Jan Germen Janmaat, and Ernesto Villalba. 2012. Learning citizenship through social participation outside and inside school: An international, multilevel study of young people’s learning of citizenship. British Educational Research Journal 38: 419–46. [Google Scholar] [CrossRef]
  47. Hoskins, Bryony, Jan Germen Janmaat, and Gabriella Melis. 2017. Tackling inequalities in political socialisation: A systematic analysis of access to and mitigation effects of learning citizenship at school. Social Science Research 68: 88–101. [Google Scholar] [CrossRef] [PubMed]
  48. Hoskins, Bryony, Lihong Huang, and Cecilia Arensmeier. 2021. Socioeconomic Inequalities in Civic Learning in Nordic Schools: Identifying the Potential of In-School Civic Participation for Disadvantaged Students. In Northern Lights on Civic and Citizenship Education: A Cross-National Comparison of Nordic Data from ICCS. Cham: Springer, pp. 93–122. [Google Scholar]
  49. Kang, Oh-Han. 2019. Analysis of the sociality and democratic-citizenship changes from the application of the Scratch remix function in cooperative learning. Journal of Information Processing Systems 15: 320–30. [Google Scholar]
  50. Kawashima-Ginsberg, Kei. 2012. Summary of Findings from the Evaluation of iCivics’ Drafting Board Intervention. CIRCLE Working Paper. Medford: Tufts University. [Google Scholar]
  51. Kawashima-Ginsberg, Kei. 2013. Do Discussion, Debate, and Simulations Boost NAEP Civics Performance. CIRCLE Fact Sheet. Medford: Tufts University. [Google Scholar]
  52. Keating, Avril, and Jan Germen Janmaat. 2016. Education through citizenship at school: Do school activities have a lasting impact on youth political engagement? Parliamentary Affairs 69: 409–29. [Google Scholar] [CrossRef] [Green Version]
  53. Knowles, Ryan T., Judith Torney-Purta, and Carolyn Barber. 2018. Enhancing citizenship learning with international comparative research: Analyses of IEA civic education datasets. Citizenship Teaching & Learning 13: 7. [Google Scholar]
  54. Levy, Brett L., Wayne Journell, Yi He, and Brian Towns. 2015. Students blogging about politics: A study of students’ political engagement and a teacher’s pedagogy during a semester-long political blog assignment. Computers & Education 88: 64–71. [Google Scholar]
  55. Manning, Nathan, and Kathy Edwards. 2014. Does civic education for young people increase political participation? A systematic review. Educational Review 66: 22–45. [Google Scholar] [CrossRef]
  56. Margetts, Helen, Peter John, Tobias Escher, and Stephane Reissfelder. 2009. Can the internet overcome the logic of collective action? An experimental approach to investigating the impact of social pressure on political participation. In Paper to the Political Studies Association Annual Conference. Manchester: University of Manchester, pp. 7–9. [Google Scholar]
  57. Marshall, Thomas H. 1950. Citizenship and Social Class. New York: Cambridge, vol. 11. [Google Scholar]
  58. Mathison, Sandra. 2009. Serving the public interest through educational evaluation: Salvaging democracy by rejecting neo-liberalism. In The SAGE International Handbook of Educational Evaluation. Edited by Katherine E. Ryan and J. Bradley Cousins. Thousand Oaks: Sage. [Google Scholar] [CrossRef]
  59. McDevitt, Michael, and Spiro Kiousis. 2006. Experiments in Political Socialization: Kids Voting USA as a Model for Civic Education Reform. CIRCLE Working Paper 49. College Park: Center for Information and Research on Civic Learning and Engagement (CIRCLE), University of Maryland. [Google Scholar]
  60. Niens, Ulrike, Karen Kerr, and Paul Connolly. 2013. Evaluation of the Effectiveness of the ‘‘Promoting Reconciliation through a Shared Curriculum Experience’’ Programme. Belfast: Centre for Effective Education, Queen’s University Belfast. [Google Scholar]
  61. Oberle, Monika, and Johanna Leunig. 2016. Simulation games on the European Union in civics: Effects on secondary school pupils’ political competence. Citizenship, Social and Economics Education 15: 227–43. [Google Scholar] [CrossRef]
  62. Ozer, Emily J., and Laura Douglas. 2013. The impact of participatory research on urban teens: An experimental evaluation. American Journal of Community Psychology 51: 66–75. [Google Scholar] [CrossRef]
  63. Pang, Xiaopeng, Junxia Zeng, and Scott Rozelle. 2013. Does Women’s Knowledge of Voting Rights Affect their Voting Behaviour in Village Elections? Evidence from a Randomized Controlled Trial in China. The China Quarterly 213: 39–59. [Google Scholar] [CrossRef] [Green Version]
  64. Quintelier, Ellen, and Marc Hooghe. 2012. Discussing Politics at School. Does an open classroom climate contribute to the willingness to participate in political life? In Transformations of the State. Oxford: Centre for Political Research. [Google Scholar]
  65. Sant, Edda. 2019. Democratic education: A theoretical review (2006–2017). Review of Educational Research 89: 655–96. [Google Scholar] [CrossRef] [Green Version]
  66. Sfard, Anna. 1998. On two metaphors for learning and the dangers of choosing just one. Educational Researcher 27: 4–13. [Google Scholar] [CrossRef]
  67. Shek, Daniel T. L., Rachel C. F. Sun, Y. H. Chui, S. W. Lit, Walter W. Yuen, Yida Y. H. Chung, and S. W. Ngai. 2012. Development and evaluation of a positive youth development course for university students in Hong Kong. The Scientific World Journal 2012: 263731. [Google Scholar] [CrossRef] [Green Version]
  68. Siddiqui, Nadia, Stephen Gorard, and Beng Huat See. 2017. Non-Cognitive Impacts of Philosophy for Children. Durham: School of Education, Durham University. [Google Scholar]
  69. Siddiqui, Nadia, Stephen Gorard, and Beng Huat See. 2019. Can learning beyond the classroom impact on social responsibility and academic attainment? An evaluation of the Children’s University youth social action programme. Studies in Educational Evaluation 61: 74–82. [Google Scholar] [CrossRef]
  70. Silverthorn, Naida, David L. DuBois, Kendra M. Lewis, Amanda Reed, Niloofar Bavarian, Joseph Day, Peter Ji, Alan C. Acock, Samuel Vuchinich, and Brian R. Flay. 2017. Effects of a school-based social-emotional and character development program on self-esteem levels and processes: A cluster-randomized controlled trial. Sage Open 7: 2158244017713238. [Google Scholar] [CrossRef]
  71. Singh, Sukhmani, Megan Granski, Maria del Pilar Victoria, and Shabnam Javdani. 2018. The praxis of decoloniality in researcher training and community-based data collection. American Journal of Community Psychology 62: 385–95. [Google Scholar] [CrossRef] [PubMed]
  72. Smith, Graham, Peter John, Patrick Sturgis, and Hisako Nomura. 2009. Deliberation and internet engagement: Initial findings from a randomised controlled trial evaluating the impact of facilitated internet forums. Paper presented at ECPR General Conference 2009, Potsdam, Germany, September 10–12. [Google Scholar]
  73. Starkey, Hugh. 2018. Fundamental British values and citizenship education: Tensions between national and global perspectives. Geografiska Annaler: Series B, Human Geography 100: 149–62. [Google Scholar] [CrossRef]
  74. Stoker, Gerry, and Peter John. 2009. Design experiments: Engaging policy makers in the search for evidence about what works. Political Studies 57: 356–73. [Google Scholar] [CrossRef]
  75. Strandberg, Kim. 2015. Designing for democracy?: An experimental study comparing the outcomes of citizen discussions in online forums with those of online discussions in a forum designed according to deliberative principles. European Political Science Review 7: 451–74. [Google Scholar] [CrossRef]
  76. Syvertsen, Amy K., Michael D. Stout, Constance A. Flanagan, Dana L. Mitra, Mary Beth Oliver, and S. Shyam Sundar. 2009. Using elections as teachable moments: A randomized evaluation of the student voices civic education program. American Journal of Education 116: 33–67. [Google Scholar] [CrossRef]
  77. Ten Dam, Geert, and Monique Volman. 2004. Critical thinking as a citizenship competence: Teaching strategies. Learning and Instruction 14: 359–79. [Google Scholar] [CrossRef] [Green Version]
  78. Torney-Purta, Judith. 2002. The school’s role in developing civic engagement: A study of adolescents in twenty-eight countries. Applied Developmental Science 6: 203–12. [Google Scholar] [CrossRef]
  79. UNESCO. 2015. Global Citizenship Education: Topics and Learning Objectives. Paris: UNESCO. Available online: https://unesdoc.unesco.org/ark:/48223/pf0000232993/PDF/232993eng.pdf.multi (accessed on 16 December 2020).
  80. Verba, Sidney, Kay Lehman Schlozman, and Henry E. Brady. 1995. Voice and Equality: Civic Voluntarism in American Politics. Cambridge: Harvard University Press. [Google Scholar]
  81. Vissers, Sara, Marc Hooghe, Dietlind Stolle, and Valerie-Anne Maheo. 2012. The impact of mobilization media on off-line and online participation: Are mobilization effects medium-specific? Social Science Computer Review 30: 152–69. [Google Scholar] [CrossRef] [Green Version]
  82. Weinberg, James. 2020. Politics in Schools: ‘What Exists’ and ‘What Works’? Research Evaluation Report. London: Joseph Rowntree Reform Trust. [Google Scholar]
  83. Wong, Stan Hok-Wui, and Mathew Y. H. Wong. 2020. Distant Participation and Youth Political Attitudes: Evidence from a Natural Experiment. Social Science Quarterly 101: 1489–512. [Google Scholar] [CrossRef]
Table 1. Identified controlled trials of citizenship education for political engagement.
Education Program | Political Outcome | Authors | Country | Type
Students as ambassadors supporting the provision of basic bank accounts in the community | Treated female students show positive effects regarding attitudes of empowerment, self-efficacy, motivation, and community engagement. | Agurto and Torres (2020) | Peru | Non-formal, Participatory
Teacher training on Education for Sustainable Development | Positive attitudes, personal responsibility, and willingness to contribute to sustainable development. | Andersson et al. (2013) | Sweden | Teacher training
Facing History | Respect and tolerance for the rights of others with differing views, awareness of the danger of prejudice and discrimination, and increased sense of civic efficacy. | Barr et al. (2015) | US | Teacher training, School-based program
Providing information about the importance of voting | Increased voter turnout. | Barros (2017) | Portugal | Non-formal
Community empowerment program | Little impact on specific measures of civic participation and community cohesion; modest increases in respect for human rights and equality; and large impacts on conflict and conflict resolution, though not always in expected ways. | Blattman et al. (2011) | Liberia | Non-formal
National Citizen Service | Improved attitudes toward social mixing in the local area, community engagement, and intention to vote. | Booth et al. (2014) | UK | Non-formal
Holocaust Museum visits | Positive impact on students' desire to protect civil rights and liberties. | Bowen and Kisida (2018) | US | Non-formal
Digital storytelling | Increased self-esteem and critical thinking disposition; ethnocentric views declined. | Chan (2019) | China | Digital
Intervention integrating science and civics instruction in a unit about community and family water conservation | Engagement (including self-efficacy) in both areas was positively affected. | Condon and Wichowsky (2018) | US | Cross-curricular, Participatory, School-based program
Student Voices Program | Class deliberative discussions, community projects, and informational use of the Internet increased political participation. | Feldman et al. (2007) | US | Participatory
Charter school | Increased probability of future voting. | Gill et al. (2018) | US | Whole school approach, Participatory
Two types of class-based instruction on environmental issues, one long and one short, designed to increase environmental awareness | No statistically significant differences in awareness or behavior between intervention schools and control schools. | Goodwin et al. (2010) | UK | School-based program
Enhanced civics curriculum designed to promote awareness and understanding of constitutional rights and civil liberties | More knowledge in this domain than students in conventional civics classes, but no corresponding change in the treatment group's support for civil liberties. | Green et al. (2011) | US | School-based program
Psychosocial skills | Noticeable long-run impact on voter turnout. | Holbein (2017) | US | Cross-curricular
Exposure to climate change information either through face-to-face interaction or via a website | Web-based mobilization only has a significant effect on online participation. | Vissers et al. (2012) | Belgium | Digital
iCivics computer-based teaching module | Higher grades on writing a persuasive letter to a school newspaper. | Kawashima-Ginsberg (2012) | US | Digital
Regular discussions, debates, and simulations | Higher National Assessment of Educational Progress (NAEP) Civics test scores. | Kawashima-Ginsberg (2013) | US | Participatory
The creation of political blogs during an election | Students in the blog-focused class had greater gains in political interest, self-efficacy, and confidence, but a lack of online interaction could limit gains. | Levy et al. (2015) | US | Digital
Social pressure/critical mass of support | People who received positive feedback from small numbers of supportive participants signed more petitions. | Margetts et al. (2009) | UK | Digital
An interactive, election-based curriculum | Political communication in the home increased the probability of voting for students when they reached voting age. | McDevitt and Kiousis (2006) | US | Participatory
Youth-led participatory research class | Increases in sociopolitical skills, motivation to influence schools and communities, and participatory behavior. | Ozer and Douglas (2013) | US | Participatory
Voting training for women | Scores on a test of voting knowledge increased and women more fully exercised their voting rights. | Pang et al. (2013) | China | Non-formal
Online moderated asynchronous discussion | Actively contributing to deliberation in the form of posting has the most significant impact on opinion change; lurking has little effect. | Smith et al. (2009) | UK | Digital
Online forums designed according to deliberative principles to produce better 'democratic outcomes' | The effects of designing for deliberation were generally positive, albeit not for all of the democratic outcomes. | Strandberg (2015) | Finland | Digital
Election-based civics program | Increased self-reported ability to cast an informed vote, knowledge of the voter registration process, belief that their vote matters, communication with others at school about politics, sense of civic obligation, and media use and analysis. | Syvertsen et al. (2009) | US | Participatory, School-based program
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
