Article

Dynamics of the Community of Inquiry (CoI) within a Massive Open Online Course (MOOC) for In-Service Teachers in Environmental Education

1 LUMA Centre Finland, University of Helsinki, 00100 Helsinki, Finland
2 Graduate School of Education, Peking University, Beijing 100871, China
* Author to whom correspondence should be addressed.
Educ. Sci. 2018, 8(2), 40; https://doi.org/10.3390/educsci8020040
Submission received: 2 February 2018 / Revised: 19 March 2018 / Accepted: 22 March 2018 / Published: 24 March 2018

Abstract
One of the greatest ways to transform education systems is to develop community-centered professional supports for in-service teachers. Given the rise of distance learning platforms such as massive open online courses (MOOCs), there is a growing potential to deliver such supports at scale. The community of inquiry (CoI) framework models the asynchronous, text-based communication that defines educational experiences within such collaborative learning environments; however, methods of CoI transcript analysis must be improved. This paper uses the University of Helsinki’s 2016 MOOC, Sustainable Energy in Education, as a case study on how the CoI framework can be used to characterize the educational experience of in-service teachers in distance learning environments. Using the CoI coding protocol, this paper employs a transcript analysis of the discussion forum posts on the MOOC (n = 78), and applies improved measures of reliability in order to understand the capacity of CoI transcript analysis to reliably define online learning experiences. The findings suggest that, while the CoI framework is able to characterize some elements of online learning communities, more work needs to be done to ensure the framework captures the more nuanced elements of such educational experiences, such as the effects of course design and the relative engagement of course participants.

1. Introduction

The understanding that learning is a necessarily collaborative, social exercise is arguably older than the enterprise of formal education itself; however, the introduction of technology into formal and non-formal learning channels has raised serious questions regarding the viability of supporting such collaboration within emerging virtual contexts [1]. Massive open online courses (MOOCs), in particular, were first conceived of as a way to test the potential of connective learning in a digital age [2]. The question remains, however, whether or not MOOCs can effectively support and/or augment the community-centered learning that is possible within in-person contexts.
Building on John Dewey’s theory that the process of inquiry is inseparable from its social environment [3], the community of inquiry (CoI) framework situates this understanding of learning within the context of computer-mediated communication (CMC) and models the asynchronous, text-based group discussions that occur within spaces such as MOOC forums. While a reliable survey has been developed to instrumentalize the framework [4], applying the CoI framework to the sort of informal, text-based communication that occurs between course participants in online MOOC forums still requires the development of a more reliable methodology [5]. Given the CoI framework’s premise that reflective communication is critical to the educational experience [6], the framework has considerable potential to model collaborative learning in massive virtual contexts and to guide online course design and pedagogy.
The aim of this paper is to expand CoI transcript analysis methods in order to better understand the potential of the framework to effectively characterize educational experiences within online learning communities, such as MOOCs. The University of Helsinki’s 2016 MOOC, Sustainable Energy in Education, serves as a relevant case study for these purposes, given the focus on reflective practice and community-formation within professional development for in-service teachers. This paper will proceed by providing a background on the use of MOOCs for the professional development of teachers, and will then review the components of the CoI theoretical framework. Next, the transcript data from the course will be described, analyzed using the existing CoI coding protocol, and tested for reliability, before the findings of the analysis are presented.

2. Background

2.1. Massive Open Online Courses (MOOCs) for the Professional Development of In-Service Teachers

In 2013, the leading MOOC provider Coursera announced that it would launch a new series of free online courses to support elementary and secondary school teachers’ pedagogy, classroom management, and subject-specific content knowledge [7]. Though the use of MOOCs by college students has skyrocketed in recent years [8], a less explored area of the empirical literature is how MOOCs might be designed to support the professional development of in-service teachers.
The need for such professional development stems from an understanding of teachers as “reflective practitioners” who are constantly in the process of learning and improving both their content knowledge and pedagogy [9], and who teach through “deliberative action that results from the active and thoughtful consideration of specific beliefs and knowledge in relation to past and future consequences” [10]. Given findings that teachers’ subject matter and pedagogical knowledge are correlated with their students’ learning outcomes [11], the value of supporting professional development for in-service teachers cannot be overstated.
The potential of MOOCs to serve as a platform for reflective professional development for in-service teachers is supported by existing research [12,13]. In particular, teachers’ participation in online professional development models has been found to increase their willingness to adopt similar progressive, 21st century teaching practices within their own classrooms [14]. Additionally, although low retention and completion rates are challenges for MOOCs in general, teachers who take a MOOC for professional development may be more likely to complete it than other students [15].
Despite the large body of evidence detailing the necessity of professional development for in-service teachers [16,17], the prevailing barrier to providing such supports in-person is cost [18]. If MOOCs are able to foster learning communities that are comparable to those within in-person professional development, however, then they have the potential to overcome the barriers of cost and reach more teachers. The primary challenge of employing MOOCs as a means of professional development is “design[ing] the social supports within the MOOC structure to sustain collaboration, dialogue and ongoing reflection that is necessary for the changes in pedagogical orientation and classroom practices” [14]. Such professional communities have been found to be the preferred mode of teachers’ professional development within both online and offline contexts [19,20,21]. Therefore, if MOOCs can be employed at scale as an effective means of community-centered professional development for teachers, they could have a transformative impact on education at large.

2.2. Community of Inquiry (CoI) Theoretical Framework

The phrase “community of inquiry” comes from Lipman, who characterized such learning environments as those in which “students and teachers are co-inquirers engaged in deliberating together about the issues or problems at hand” [22]. The goal of the CoI framework is to “define, describe, and measure the elements of a collaborative and worthwhile educational experience” [1].
The CoI model characterizes “educational experiences” by examining the interaction of the three elements, or “presences”: cognitive presence, social presence, and teaching presence. According to the model, an educational experience is defined by the ways in which these three presences overlap with and work alongside each other (see Figure 1). For example, the process of “Selecting Content”—i.e., determining which topics to cover in a course—is a function of the cognitive and teaching presences within that course, but not the social presence (Figure 1). The ways in which selecting content would fully affect the educational experience of participants in the course would also depend on aspects of the social presence, such as the nature of the relationships between course members. To better understand the dynamics of these interactions, it is critical to look more closely at each of the presences that the model is built upon.

2.2.1. Social Presence

Social presence is defined as “the degree to which a person is perceived as a ‘real person’ in mediated communication” [23]. The methods for understanding how social presence is manifested in distance learning platforms have been adapted according to the nature of text-based communication. As such, the three indicators of social presence are emotional expression, open communication, and group cohesion [24].
Emotional expression is reflected and measured by features of a text-based conversation such as emoticons and expressive punctuation [24]. There are inherent challenges to measuring emotions through text-based communication, given the role of body language and non-verbal cues in face-to-face communication; however, participants in such learning environments have been found to “compensate for this loss of physical presence by using unconventional symbolic representations, such as emoticons, to facilitate their expressiveness in the medium” [24].
The second and third indicators had not been conceptualized as part of the social presence literature prior to the inception of the CoI framework [1]. Open communication is defined as “reciprocal and open exchanges,” as measured by the presence of “mutual awareness and recognition of each other’s contributions” [24]. The third indicator, group cohesion, was introduced to highlight the role of collaboration within learning communities [1] and is measured by the presence of “activities that build and sustain a sense of group commitment” [24]. All three of these indicators apply existing pedagogical theory to the context of social relations within virtual contexts, wherein communication is primarily, if not entirely, text-based.

2.2.2. Cognitive Presence

Cognitive presence is defined as “the extent to which the participants in any particular configuration of a community of inquiry are able to construct meaning through sustained communication” [24]. This understanding builds on John Dewey’s concept of practical inquiry and adopts the four stages of practical inquiry in Dewey’s framework—i.e., triggering event, exploration, integration, and resolution—as the indicators of a cognitive presence.
According to this model (Figure 2), inquiry begins with a triggering event, which one perceives and then explores in one’s private world, before integrating that knowledge into one’s existing knowledge, and then finally resolving that idea through discourse [24]. By situating the process of inquiry within both the private and shared worlds (Figure 2), critical inquiry is understood as a necessarily social process [24]. While there is evidence of all stages of practical inquiry being explored through computer-mediated communication, the majority of empirical findings reveal that “inquiry invariably has a great difficulty of moving beyond the information exchange or exploration phase” [25].

2.2.3. Teaching Presence

Teaching presence is defined as “the design, facilitation, and direction of cognitive and social processes for the purpose of realizing personally meaningful and educationally worthwhile learning outcomes” [26]. The three indicators of a teaching presence are instructional management, building understanding, and direct instruction [24].
Instructional management (or, instructional design and organization) includes activities such as setting a curriculum, designing methods, establishing time parameters (i.e., course deadlines), utilizing the online medium effectively, and establishing online course etiquette [26]. Given that many of these sub-indicators for instructional management are measures of back-end course design, they might be less evident, or at least more difficult to measure, in asynchronous text-based communication in spaces such as MOOC forums, especially if course instructors are not directly participating in discussions.
Building understanding is defined as the “process of creating an effective group consciousness for the purpose of shared meaning, identifying areas of agreement and disagreement, and generally seeking to reach consensus and understanding” [24]. An example of what this indicator may actually look like in practice is the course instructor (or another course participant) intervening to encourage the participation of less-active course participants [24].
Direct instruction is concerned with “the discourse and the efficacy of the educational process” [24]. More specifically, this indicator considers the quality of teacher feedback and how well the content and course directions encourage reflection [24]. The CoI model recognizes that text-based communication will not be the primary means of instruction by course teachers, as that instruction likely happens through means such as video instruction. Therefore, this indicator instead reflects the idea that “teachers do have a responsibility for providing contextualized knowledge relevant to subject domain […] including personal knowledge derived from the teacher’s experiences” [24].

2.3. CoI Coding Protocol

This theoretical framework has been operationalized with all of the aforementioned indicators (i.e., categories) into a coding protocol for qualitative transcript analysis (see Table 1). The validity of this coding protocol has been empirically tested, and the protocol has become the basis of much of the existing empirical literature on CoI [4]. The indicators provide examples of the nature of comments that might be characterized as each of the categories respectively. For example, if a course participant were to take material and connect it to experiences they had outside of the course (i.e., “connecting ideas”), then that would be a measure of integration, and therefore part of the educational experience’s cognitive presence (Table 1).
Much of the empirical research on the CoI framework in recent years employs the 34-item CoI framework survey instrument to characterize communities of inquiry [27,28,29,30], and the use of the coding protocol within CoI transcript analysis has been critiqued for not revealing more complex features of the educational experience [1]. There are also methodological concerns regarding the capacity of the protocol to measure deep and meaningful learning [31]. For example, a recent critical review of CoI literature documents the failure of the framework to effectively identify instances of cognitive presence [31]. Despite these methodological concerns, informal forum discussions are often the primary, if not sole, means of direct interaction between course participants and instructors [32], and forum participation has been found to be a predictor of course retention [33]. They should therefore not be excluded from CoI analyses; improving methods of transcript analysis of such communication may prove critical in characterizing the educational experience within online learning communities. Such findings could have implications for future studies exploring the best methods for designing such learning communities in the first place.

3. Materials and Methods

3.1. Research Setting

The data collected for this analysis comes from postings in the online forums of the Sustainable Energy in Education course that was hosted on the University of Helsinki MOOC website during the spring of 2016. This course was offered a few months after an equivalent course was hosted for students [34]. The MOOC had four stated goals, as expressed to registered participants:
  • Give an overview on how our energy production and use could be more sustainable
  • Provide ideas, support, and stimulation to handle the topic in teaching
  • Foster the interaction between the experts in the field and share good practice internationally, as well as support learning together
  • Bring up Finnish expertise and studying possibilities within the field.
Course participants completed a questionnaire that asked them to select up to three of the goals that were most important to them in their decision to take the course. While the creation of a supportive learning environment was important to 50% of the course participants (Goal 3), 66% of participants expressed more interest in receiving direct support in their own teaching practice (Goal 2). The delivery of content knowledge related to sustainable energy (Goal 1) was important to 52% of course participants, whereas bringing up Finnish expertise and studying opportunities within the field (Goal 4) was a primary goal for only 33% of participants. Though the course was free and open to anyone, it was designed to specifically support in-service teachers seeking up-to-date pedagogical content knowledge on topics in sustainable energy.
The course included a variety of content, such as instructional videos from expert researchers in the field, quizzes and forum discussions designed to have course participants process and reflect on the material, and a final assignment in which course participants designed their own lesson plans integrating what they learned in the MOOC. All responses to assignments on the discussion forums were public to all course participants.

3.2. Study Participants

There were a total of 149 students enrolled in the Sustainable Energy in Education MOOC, but the sample considered within this study (n = 78) was composed only of the subset of active members. To determine the relevant sample for this study, “active” course participants were limited to those who responded to the initial questionnaire, because accessing the rest of the course materials was conditional on this task. The questionnaire collected basic information from participants, such as their level of teaching, country of residence, goals in taking the course, etc. Course material was progressively made available to participants, contingent upon their completion of relevant assignments. The discrepancy between the number of registered participants and the number of active participants who stayed engaged with the course is consistent with documented patterns of MOOC engagement at large [35,36,37].
All course participants were enrolled voluntarily in the MOOC, and those who completed all of the assignments were eligible to download a Certificate of Achievement. The course was publicized online, and course participants reported that they found the course via Facebook, email, Twitter, or from their colleagues. The initial questionnaire revealed that the course was composed of a mixed group of teachers living in twelve different countries. The levels at which they taught ranged widely, from the primary school level through the doctoral level. With the exception of a few posts in forums in Finnish (between Finnish-speaking course participants), all communication and instruction in the course was in English. The majority of participants (64.1%) had never previously participated in a MOOC.

3.3. Materials

As this study is concerned with the transcript analysis of asynchronous, text-based communication within MOOC courses, the materials considered within this analysis were forum discussions on the MOOC. Such communication channels can be critical for generating new ideas, forming community, and co-creating knowledge [38], so they are a useful platform for characterizing the nature of the educational experience within the MOOC.
There were seven forums on the MOOC course, but the transcript analysis involved only the three forums that specifically required the course participants to reflect on course material. Of the forums that were excluded, one served as a course introduction and primarily involved participants sharing their names and backgrounds, the second involved participants submitting full original lesson plans that integrated what they learned, the third asked participants to share their thoughts on the course content, and the final forum was a space to submit feedback on the course. The excluded forums would have been more difficult to code within the CoI framework, as their postings (the units of analysis), especially in the forum that involved the creation of lesson plans, varied greatly in length and content.
Accordingly, the three forums analyzed were those that included prompts that asked the participants to directly reflect on how the course material could be connected to their teaching practice. Though the other discussions also revealed communication relevant to the CoI framework, not all of them could be included, given the time and resources available for this study. The selected discussions were also the most consistent in post length (i.e., unit of analysis) relative to each other—the first discussion consisted of 56 posts, the second discussion consisted of 58 posts, and the final discussion consisted of 24 posts (138 posts total).
For the purposes of the Discussion section, the CoI code findings from the Results section will also be compared to the final questionnaire that course participants filled out to provide feedback and reflection on the structure and content of the MOOC upon completing the course.

3.4. Methodology

The transcript analysis applied in this study builds on previous methods in order to develop a more reliable means of applying the CoI coding protocol. The analysis was grounded in existing practices, whereby two coders receive the transcripts of the forums and are asked to code each unit of analysis—in this case, the individual posts in the MOOC forums—based on the category it exhibits within the CoI coding protocol (Table 1). Though these posts ranged in length, using individual postings as the unit of analysis allowed both coders to reliably identify which text to include in their coding [5].
Neither coder had a role in the design or implementation of the course, and all names of course participants were removed from the transcripts they coded. The postings were presented in order, and posts that were responses to previous threads in the forums were designated as such to the coders. In order not to bias the coders’ decisions regarding teaching presence, the coders were not told which posts were from the course instructors rather than from students in the course. After this initial process of coding, reliability measures were introduced.
Building off of methods employed by Garrison et al. [5], the coders met after independently coding the transcripts to discuss the discrepancies between their coding and bring as much of their coding into agreement as possible through the process of negotiation. Agreement rates were then calculated, before and after negotiation, for each discussion or transcript evaluated. Garrison et al. argue that this process of negotiation takes the inter-rater reliability a step further “into a state of intersubjectivity, where raters discuss, present and debate interpretations to determine whether agreement can be reached” [5].
In order to take this method further, the analysis integrates these negotiated agreement findings into Holsti’s C.R. and Cohen’s kappa reliability measures. Following the precedent of Shea et al. [3], Holsti’s C.R. and Cohen’s kappa values are calculated for both the initial and negotiated agreement values. After omitting the coding units on which no agreement could be negotiated, the frequencies of all of the posts the coders agreed on (initially and through negotiation) are reported for each category of the CoI framework.
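To make the reliability calculations concrete, the sketch below (in Python; not part of the original study) shows how percent agreement, which is equivalent to Holsti’s C.R. when both coders rate the same set of units, and Cohen’s kappa can be computed before and after negotiation. The post codes shown are hypothetical illustrations, not data from the MOOC.

```python
from collections import Counter

def percent_agreement(codes_a, codes_b):
    """Simple percent agreement; equivalent to Holsti's C.R. when both
    coders rate the exact same set of units, as in this study."""
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return matches / len(codes_a)

def cohens_kappa(codes_a, codes_b):
    """Chance-corrected agreement between two coders over nominal codes."""
    n = len(codes_a)
    p_o = percent_agreement(codes_a, codes_b)
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    categories = set(codes_a) | set(codes_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical presence-level codes for five posts, before negotiation.
coder_1 = ["cognitive", "cognitive", "social", "teaching", "cognitive"]
coder_2 = ["cognitive", "social", "social", "teaching", "cognitive"]
print(percent_agreement(coder_1, coder_2))  # initial agreement (Holsti's C.R.)
print(cohens_kappa(coder_1, coder_2))       # initial Cohen's kappa

# After negotiation, codes that the coders reconcile are revised and the
# same two measures are recomputed; units with unresolved disagreements
# are omitted from the frequency counts reported later.
coder_1_final = ["cognitive", "social", "social", "teaching", "cognitive"]
coder_2_final = ["cognitive", "social", "social", "teaching", "cognitive"]
print(percent_agreement(coder_1_final, coder_2_final))  # negotiated agreement
print(cohens_kappa(coder_1_final, coder_2_final))       # negotiated kappa
```

In this study the kappa values were computed at the presence level, whereas Holsti’s C.R. was computed at the indicator level, as described in the Results section.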
While the frequencies of each category will be considered, this methodology remains a qualitative, rather than quantitative, analysis. Additionally, the results of this methodology are descriptive, rather than inferential [5]. The frequencies are presented as a means of generally characterizing the community of inquiry that may be present within the MOOC, not as the basis for any quantitative analysis.

4. Results

4.1. Reliability Findings

Prior to reporting the frequencies of each CoI indicator observed in the discussion forums of the MOOC, it is first necessary to ensure that these findings are reliable. As discussed in the Methodology section, this analysis first determines the initial and the negotiated percentages of agreement between the two coders. These values are reported in Table 2 for each of the discussion forums respectively. Because both of the coders evaluated the same units of analysis (i.e., individual postings on the MOOC forums), the percentage agreements reported are equivalent to the Holsti’s C.R. values [39].
This analysis took the negotiated agreement framework of Garrison et al. one step further by integrating Cohen’s kappa as an added measure of reliability. However, given the challenge of calculating Cohen’s kappa for more than three categories, and the fact that all remaining disagreements after the process of negotiation were due to disagreements between presences rather than within them, these kappa values were calculated at the category, rather than the indicator, level. This methodological decision is also supported by concerns regarding whether coding should be conducted at the indicator level at all [40]. Therefore, the Holsti’s C.R. values are reported at the indicator level (i.e., triggering event, exploration, integration, etc.), whereas the Cohen’s kappa values are reported at the category level (i.e., cognitive presence, social presence, teaching presence) in Table 3.
Given the interpretation of kappa values proposed by Landis and Koch, the initial total value reveals good agreement, and the negotiated total value reveals very good agreement; however, the initial kappa value for Discussion 3 demonstrates a poor level of agreement [41]. In part, this may have to do with the relatively small sample size of that forum, but the coders also noted during the negotiation process that this was the hardest discussion to code: participants were synthesizing large amounts of material (i.e., the course materials at large), and their posts exhibited a larger range in content and length, such that the unit of analysis became more inconsistent for this particular discussion. This may reveal the need to employ more flexible units of analysis, such that coders are not left to subjectively interpret which parts of a transcript to prioritize in their coding.
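As a point of reference, the snippet below (illustrative only, not part of the original analysis) maps kappa values onto agreement bands using the cutoffs commonly attributed to Landis and Koch [41]; the band labels follow the wording used in this section.

```python
def interpret_kappa(kappa):
    """Map a kappa value onto agreement bands. Cutoffs follow the scale
    commonly attributed to Landis and Koch (1977); labels follow the
    wording used in the text ("poor", "good", "very good")."""
    if kappa < 0.21:
        return "poor"
    if kappa < 0.41:
        return "fair"
    if kappa < 0.61:
        return "moderate"
    if kappa < 0.81:
        return "good"
    return "very good"

# Category-level kappa values from Table 3.
for label, value in [("Discussion 3, initial", 0.14),
                     ("Total, initial", 0.70),
                     ("Total, negotiated", 0.82)]:
    print(f"{label}: {interpret_kappa(value)}")
# Discussion 3, initial: poor
# Total, initial: good
# Total, negotiated: very good
```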

4.2. Presence of a CoI

Given these reliability findings, the presence of a community of inquiry can be considered in the MOOC. After omitting the 13 forum posts that the coders were unable to agree upon, the frequencies of each of the categories for the remaining 125 forum posts are reported in Table 4, where the reported frequencies include the combined raw and negotiated agreement for each category. Examples for MOOC posts that were classified under each category are also provided in Table 4 so as to better contextualize how posts were coded within this analysis, and to reveal the nature of communication that occurred within this particular MOOC.
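For illustration, the short sketch below (with hypothetical codes, not the actual study data) shows how the category-level frequencies reported in Table 4 can be tallied from the final agreed codes, and how the categories roll up to the three presences following Table 1.

```python
from collections import Counter

# Mapping of CoI categories to their parent presences, following Table 1.
CATEGORY_TO_PRESENCE = {
    "triggering event": "cognitive presence",
    "exploration": "cognitive presence",
    "integration": "cognitive presence",
    "resolution": "cognitive presence",
    "emotional expression": "social presence",
    "open communication": "social presence",
    "group cohesion": "social presence",
    "instructional management": "teaching presence",
    "building understanding": "teaching presence",
    "direct instruction": "teaching presence",
}

# Hypothetical final codes for the retained posts (posts on which no
# agreement could be negotiated have already been dropped).
final_codes = ["exploration", "integration", "open communication",
               "exploration", "triggering event", "direct instruction"]

category_counts = Counter(final_codes)
presence_counts = Counter(CATEGORY_TO_PRESENCE[code] for code in final_codes)

print(category_counts)   # category-level frequencies, as reported in Table 4
print(presence_counts)   # presence-level totals
```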

5. Discussion

5.1. Community of Inquiry within Sustainable Energy in Education MOOC

Given this characterization of the Sustainable Energy in Education MOOC as a community of inquiry, it is possible to understand the broader potential of the CoI framework to explore and define the dynamics of online learning communities. The following section examines the findings from the transcript analysis alongside the feedback presented by course participants in the final questionnaire of the course, in order to better understand whether there were relevant features of the educational experience within the MOOC that were not picked up by the CoI framework but were expressed within the direct feedback of course participants or revealed by the experiences of the coders during the negotiation process. Based on these findings, broader considerations of how to improve the CoI framework and its methodologies will be suggested.

5.1.1. Social Presence

While there was some social presence documented in the MOOC, the observed frequencies were not as high as one might expect for a course with the specific goal of connecting teachers with one another. Part of the reason for this may have to do with which discussions were selected for this analysis. For example, if this analysis had included the first forum discussion on the MOOC, then postings where participants were introducing themselves and commenting on each other’s reflections would have also been coded. The three discussions that were selected for this analysis were chosen because they documented the course participants’ direct engagement with the course content, rather than their feedback on the MOOC itself, and so any social networking documented in this analysis would have had to occur within these sections of the course.
During the negotiation process, both coders also noted that many of the postings that revealed a social presence also revealed a teaching or cognitive presence, and they had difficulty in deciding which was the dominant presence. Their deliberations were largely subjective, given that the coding protocol did not provide the coders any real way to determine which presence was most prominent in such situations. This reveals a limitation of existing methods of CoI transcript analysis: though the framework highlights the multi-dimensional nature of learning, the CoI coding protocol has not developed a way to reliably measure all of the nuances of these dynamics. This is a unique challenge of transcript analysis, and it has been highlighted by existing literature that warns of the methodological challenges of determining an appropriate unit of analysis [1].
That being said, feedback from course participants was consistent with the findings of a minimal social presence. Participants commented that the course would have benefited from having more students enrolled, suggesting that the course administrators should have advertised it more effectively to reach a greater number of students. However, beyond simply the number of students being an issue, one course participant noted that the overall group dynamics could have been improved, both between participants and with the course coordinators:
“I hope [for] even more interaction with the other students. When the course is international, it would be nice to include one assignment with national aspect. Instructions for contacting the coordinator of the course were a little bit confusing for a first timer.”
Therefore, the low frequency of social presence findings in the transcripts from the forums may indeed capture how course participants felt regarding the nature of social interactions in the course. Given the coders’ comments regarding their potential bias against coding social presence in situations where a comment exhibited multiple presences, however, it is worth further investigating the coding methods for social presence, as such a bias may have broader implications on the framework as a whole.

5.1.2. Cognitive Presence

Because of the focus on reflection in professional development settings, the role of cognitive presence in a course such as this one is of special interest. The findings regarding the frequency of each CoI category reveal that course participants’ postings engaged with the content primarily at the level of exploration and integration (see Table 4). This indicates that course participants moved past simply asking questions, and critically engaged with the content, brainstormed ideas, converged with each other, and even started to create practical solutions to problems (such as how to engage with a particular topic in their classrooms) [3]. Though it was less common, some participants took their inquiry even further by systematically defending their proposals, suggesting the strengths of their approach and why their solution would be effective [3].
One reason why course participants demonstrated higher levels of integration in this course than those empirically observed in other courses [25] may be the nature of the teaching presence—i.e., the ways in which course discussions were framed and the prompts participants were asked to respond to [40]. For example, the prompt for Discussion 1 asked participants to provide an example of the content covered that they observed within their subject field, reflect on how to introduce this concept to their students, and defend their approach to teaching that material. Though not all participants engaged with the prompt fully in their responses, this prompt specifically asked participants to develop the level of inquiry exhibited in their responses to a point of integration and resolution. Even so, course participants often brainstormed such plans, but their postings in the forums did not move beyond that level of inquiry.
A critical shortcoming of the CoI methodology in this case is that participants may well have been applying their knowledge in their teaching practices without documenting this behavior in their forum postings. Even though the CoI framework is specifically interested in the nature of the educational experience, rather than its outcomes, an important measure of how in-service teachers engage with such professional development training is how they integrate what they are learning within their actual teaching practices. This is not something that can be directly measured by looking at the forum posts alone, so it may contribute to the framework’s difficulty in documenting deep and meaningful learning [31]. It further highlights the challenge of measuring the stages of practical inquiry through transcript analysis [1].

5.1.3. Teaching Presence

As discussed above, the teaching presence was manifested in many ways outside of the forum conversations; nevertheless, participants’ feedback on the course indicates that the community of inquiry may have also benefited from a stronger teaching presence in the form of more communication on the forums. For example, one course participant wrote:
“Although perhaps it’s not the aim of the course, I think that course contacts should take part more in discussions.”
Other students also noted that they would have wanted more comments and feedback from moderators, and an area of improvement expressed by course participants would be to have “more expert involvement in the interactive discussions.” Given findings that “teaching presence is a significant determinant of student satisfaction, perceived learning, and sense of community” [40], these student comments should be taken seriously. However, as the findings from this MOOC reveal, “depending on the design and pedagogical approach, students may not differentiate between design and direction or facilitation and direction” [1]. In other words, though teaching presence may take on less direct forms in the context of online learning environments, that may not always be apparent to course participants, making teaching presence difficult to measure with the CoI framework. This holds true with both transcript analysis and the survey instrument, given that both approaches rely solely on the students’ impressions of the learning environment.
This may also reveal that the transcript selection for this analysis was biased, in that it excluded the primary means of communication used by instructors in the course (i.e., instructional videos, instructions for assignments, etc.), because the methodology employed focuses on the less structured channel of asynchronous, text-based communication in the forums. In the context of this MOOC, most of the instructors took a more “hands off” approach when it came to forum discussions and let the students self-moderate their discussions. From a methodological standpoint, it is worth further exploring the value of instructor engagement in spaces like MOOC forums. For example, it may be that course participants benefit from some participation of course instructors in forums, but that if the teaching presence becomes too overpowering, they become less engaged within forum discussions. Given the current framework, however, this lack of a reported teaching presence may be more telling of the ways that the three presences may manifest in different forms of communication within the MOOC platform. Future CoI research needs to take seriously the distinction between teaching presence as perceived by students versus by teachers (or even by a third-party observer), as there may be differences that have bearings on the nature of the community of inquiry framework at large.

5.2. Improving the CoI Transcript Analysis Methods

The aim of this analysis was to revisit methods of CoI transcript analysis in order to consider whether or not such methods can effectively and reliably be used to characterize the nature of educational experiences within online learning communities. While this analysis was able to progress methodologies for transcript analysis in terms of improving their reliability, questions still remain regarding the capacity of the CoI framework to effectively document all of the relevant aspects of an educational experience. In order to explore the power of the CoI framework in characterizing dynamics of learning within online communities, it is important to synthesize what the findings from this case study reveal about CoI transcript analysis methods more broadly.
The process of negotiation, in particular, revealed several limitations of the CoI coding protocol and of the CoI framework at large. For one, the majority of initial disagreements between the coders were due to their difficulty in distinguishing between the levels of practical inquiry within cognitive presence. This is consistent with criticisms that the framework fails to specify what instances of these categories actually look like [31]. In the case of this analysis, the coders had to define their own instances for these categories and then systematically apply these understandings across the coding to resolve their disagreements. While the process of negotiation made the coding within this analysis reliable, it raises serious concerns regarding the reliability of comparing findings between different CoI studies that involve transcript analysis. It may, in fact, explain why this MOOC showed higher levels of integration than have been documented in previous empirical studies [25]. If each set of coders is left to resolve its own understanding of instances of cognitive presence, then it will be difficult to compare the frequencies of categories observed in various studies in any meaningful capacity.
Another shortcoming of the CoI framework revealed by this analysis is the difficulty in capturing the multi-dimensional nature of the educational experience. During the negotiation process, the coders’ codings differed because they observed multiple presences and had to make subjective deliberations regarding which was the most dominant. This problem particularly affected measures of social presence; in cases where a forum post revealed both a social presence and another presence, the coders often did not code the posting with a social presence category. Given how central the notion of mutually reinforcing presences is to the core of the CoI framework, this concern needs to be taken seriously in CoI transcript analysis moving forward. While the CoI survey instrument has been used to study the interdependent nature of the presences [1], observing the actual communication between course participants and moderators may reveal features of the social presence that the more structured survey instrument cannot.
An additional problem related to social presence is that the existing CoI methods of transcript analysis do not take into account the relative engagement of course participants. For example, one or two so-called “super-posters” might dominate communication within a course, as is common within MOOC settings [42], but reporting the frequencies of each category alone fails to capture these dynamics. While the frequencies are not to be employed in any sort of quantitative analysis, they are still the primary means of understanding the nature of social presence within the transcript analysis method of CoI analysis. Empirically, these dynamics affect the overall educational experience of the course, as the presence of “super-posters” is positively correlated with higher-quality postings from other course participants [42]. Therefore, this is a potential area for growth for CoI transcript analysis methods—and the framework at large—because it points to a relevant feature of an online learning community’s social dynamics that the framework is not yet designed to capture.
All of these challenges reflect the more significant problem that transcript analysis alone may be unable to capture the full nature of an educational experience in an online learning community. As was particularly the case with teaching presence in this case study’s MOOC, there are aspects of such courses that affect the nature of an online community’s educational experience but that are not necessarily observable in the content of asynchronous communication. For example, the teaching presence may manifest primarily in back-end course design, rather than in the communication that transcript analysis measures. In the case of this MOOC, it appeared that the course design, specifically the way that discussion forum prompts were worded, was associated with more reflective postings by course participants; however, it would be more useful if the framework itself were able to directly account for the design of the course in its characterization of the educational experience. Alternatively, transcripts from instructional videos could also be coded as part of the analysis. Given the multi-dimensional considerations that go into fostering any learning space, especially an online one, as a community, developing a more holistic understanding of course dynamics would benefit the CoI framework.
Finally, a continued area for growth in the CoI framework is to transition from being a descriptive tool to being a valid tool of inference [40]. While it is helpful to characterize the MOOC in this analysis as a community of inquiry, it would be even more powerful if the framework could provide grounds for inferring which characteristics of the course would be best built upon for future professional development MOOCs. If this methodology is unable to transition from being merely descriptive to being predictive, then it may soon lose relevance to theories that are capable of providing this sort of analysis.

6. Conclusions

There is a great need to build reliable frameworks for characterizing the educational experiences within distance learning contexts; however, existing methods remain inadequate for capturing all relevant dimensions of such learning environments. This analysis develops methods of CoI transcript analysis in order to address concerns regarding the methodology’s reliability, but there is still a need to develop the CoI framework further, so as to ensure that the framework is capable of effectively characterizing the nuances of educational experience in online settings. Though this analysis relied on descriptive, rather than inferential, methods of analysis, future studies might explore the potential of the CoI framework to be used to develop models of collaborative learning in distance learning channels. There is tremendous value in developing such models for effective online learning community-building, because improving the ability to deliver effective, community-centered education at scale to groups such as in-service teachers could have a profound impact on the landscape of education at large.

Acknowledgments

This work was supported in part by the Fulbright Finland Foundation [Fulbright Student Grant]. We would also like to thank Sakari Tolppanen and Marianne Juntunen for their role in planning the Sustainable Energy in Education MOOC, Lauri Vihma for being the course coordinator of the MOOC, Julia Halonen for her role in organizing the data, and Veera Sinikallio for her research assistance.

Author Contributions

Maya Kaul, as primary author, led the design, implementation, and writing of this study, under the supervision of Principal Investigator, Maija Aksela. Xiaomeng Wu also advised the writing of this paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Garrison, D.R.; Anderson, T.; Archer, W. The first decade of the community of inquiry framework: A retrospective. Internet High. Educ. 2010, 13, 5–9. [Google Scholar] [CrossRef]
  2. Kesim, M.; Altinpulluk, H. A Theoretical Analysis of Moocs Types from a Perspective of Learning Theories. Procedia Soc. Behav. Sci. 2015, 186, 15–19. [Google Scholar] [CrossRef]
  3. Shea, P.; Hayes, S.; Vickers, J.; Gozza-Cohen, M.; Uzuner, S.; Mehta, R.; Valchova, A.; Rangan, P. A re-examination of the community of inquiry framework: Social network and content analysis. Internet High. Educ. 2010, 13, 10–21. [Google Scholar] [CrossRef]
  4. Arbaugh, J.B.; Cleveland-Innes, M.; Diaz, S.R.; Garrison, D.R.; Ice, P.; Richardson, J.C.; Swan, K.P. Developing a community of inquiry instrument: Testing a measure of the Community of Inquiry framework using a multi-institutional sample. Internet High. Educ. 2008, 11, 133–136. [Google Scholar] [CrossRef]
  5. Garrison, D.R.; Cleveland-Innes, M.; Koole, M.; Kappelman, J. Revisiting methodological issues in transcript analysis: Negotiated coding and reliability. Internet High. Educ. 2006, 9, 1–8. [Google Scholar] [CrossRef]
  6. Garrison, D.R.; Akyol, Z. Toward the development of a metacognition construct for communities of inquiry. Internet High. Educ. 2013, 17, 84–89. [Google Scholar] [CrossRef]
  7. Anderson, N. “Coursera to offer MOOCs for Teachers”. Available online: https://www.washingtonpost.com/local/education/coursera-to-offer-moocs-for-teachers/2013/05/02/d7978988-b35a-11e2–9a98–4be1688d7d84_story.html?utm_term=.a374411eec09 (accessed on 20 November 2017).
  8. Allen, I.E.; Seaman, J. Grade change: Tracking online education in the United States. Babson Survey Research Group and Quahog Research Group. Available online: https://www.onlinelearningsurvey.com/reports/gradechange.pdf (accessed on 28 November 2017).
  9. Schön, D.A. Educating the Reflective Practitioner: Toward a New Design for Teaching and Learning in the Professions, 1st ed.; Jossey-Bass Publishers: San Francisco, CA, USA, 1987; ISBN 1-55542-220-9. [Google Scholar]
  10. Fien, J.; Rawling, R. Reflective Practice: A Case Study of Professional Development for Environmental Education. J. Environ. Educ. 1996, 27, 11–20. [Google Scholar] [CrossRef]
  11. Darling-Hammond, L. Teacher Quality and Student Achievement: A Review of State-Policy Evidence. Educ. Policy Anal. Arch. 2000, 8. Available online: https://epaa.asu.edu/ojs/article/viewFile/392/515 (accessed on 21 January 2018). [CrossRef]
  12. Laurillard, D. The educational problem that MOOCs could solve: Professional development for teachers of disadvantaged students. Res. Learn. Technol. 2016, 24. [Google Scholar] [CrossRef]
  13. Jobe, W.; Östlund, C.; Svensson, L. MOOCs for Professional Teacher Development. In Proceedings of the Society for Information Technology and Teacher Education International (SITE) Conference, Chesapeake, VA, USA, 17 March 2014; Association for the Advancement of Computing in Education (AACE): Chesapeake, VA, USA, 2014. [Google Scholar]
  14. Butler, D.; Leahy, M.; Hallissy, M.; Brown, M. Scaling a Model of Teacher Professional Learning—To MOOC or Not to MOOC? In Proceedings of the International Conferences on Internet Technologies & Society (ITS), Education Technologies (ICEduTECH), and Sustainability, Technology and Education (STE), Melbourne, Australia, 6–8 December 2016. [Google Scholar]
  15. Hodges, C.; Lowenthal, P.; Grant, M. Teacher Professional Development in the Digital Age: Design Considerations for MOOCs for Teachers. In Proceedings of the Society for Information Technology & Teacher Education International Conference, Savannah, GA, USA, 21 March 2016; Chamblee, G., Langub, L., Eds.; Association for the Advancement of Computing in Education (AACE): Chesapeake, VA, USA, 2016; pp. 2076–2081. [Google Scholar]
  16. Borko, H. Professional Development and Teacher Learning: Mapping the Terrain. Educ. Res. 2004, 33, 3–15. [Google Scholar] [CrossRef]
  17. Van Driel, J.; Beijaard, D.; Verloop, N. Professional development and reform in science education: The role of teachers’ practical knowledge. J. Res. Sci. Teach. 2001, 38, 137–158. [Google Scholar] [CrossRef]
  18. Garet, M.S.; Porter, A.C.; Desimone, L.; Birman, B.F.; Yoon, K.S. What Makes Professional Development Effective? Results from a National Sample of Teachers. Am. Educ. Res. J. 2001, 38, 915–945. [Google Scholar] [CrossRef]
  19. Darling-Hammond, L.; Hyler, M.E.; Gardner, M. Effective Teacher Professional Development. Learning Policy Institute. Available online: https://learningpolicyinstitute.org/product/effective-teacher-professional-development-report (accessed on 5 March 2018).
  20. Hendricks, M.; Luyten, H.; Scheerens, J.; Gleegers, P.; Steen, R. Teachers’ Professional Development. Europe in International Comparison; Scheerens, J., Ed.; Office for Official Publications of the European Union: Luxembourg, 2010. [Google Scholar] [CrossRef]
  21. Ranieri, M.; Manca, S.; Fini, A. Why (and how) do teachers engage in social networks? An exploratory study of professional use of Facebook and its implications for lifelong learning. Br. J. Educ. Technol. 2012, 43, 754–769. [Google Scholar] [CrossRef]
  22. Lipman, M. Thinking in Education, 2nd ed.; Cambridge University Press: Cambridge, UK, 2003; pp. 81–121. [Google Scholar]
  23. Gunawardena, C.N.; Zittle, F.J. Social presence as a predictor of satisfaction within a computer-mediated conferencing environment. Am. J. Distance Educ. 1997, 11, 8–26. [Google Scholar] [CrossRef]
  24. Garrison, D.R.; Anderson, T.; Archer, W. Critical inquiry in a text-based environment: Computer conferencing in higher education. Internet High. Educ. 2000, 2, 87–105. [Google Scholar] [CrossRef]
  25. Garrison, D.R.; Arbaugh, J.B. Community of inquiry framework: Review, issues, and future directions. Internet High. Educ. 2007, 10, 157–172. [Google Scholar] [CrossRef]
  26. Anderson, T.; Rourke, L.; Garrison, D.R.; Archer, W. Assessing Teaching Presence in a Computer Conferencing Context. J. Asynchronous Learn. Netw. 2001, 5, 1–17. Available online: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.409.9114&rep=rep1&type=pdf (accessed on 3 January 2018).
  27. Garrison, D.R.; Cleveland-Innes, M.; Fung, T.S. Exploring the causal relationships among teaching, cognitive and social presence: Student perceptions of the community of inquiry framework. Internet High. Educ. 2010, 13, 13–36. [Google Scholar] [CrossRef]
  28. Archibald, D. Fostering the development of cognitive presence: Initial findings using the community of inquiry survey instrument. Internet High. Educ. 2010, 13, 73–74. [Google Scholar] [CrossRef]
  29. Burgess, M.; Slate, J.R.; Rojas-LeBouef, A.; LaPrairie, K. Teaching and learning in Second Life: Using the Community of Inquiry (CoI) model to support online instruction with graduate students in instructional technology. Internet High. Educ. 2010, 13, 84–88. [Google Scholar] [CrossRef]
  30. Arbaugh, J.B.; Bangert, A.; Cleveland-Innes, M. Subject matter effects and the Community of Inquiry (CoI) framework: An exploratory study. Internet High. Educ. 2010, 13, 37–44. [Google Scholar] [CrossRef]
  31. Rourke, L.; Kanuka, H. Learning in Communities of Inquiry: A Review of the Literature. J. Distance Educ. 2009, 23, 19–48. [Google Scholar]
  32. Onah, D.; Sinclair, J.; Boyatt, R. Exploring the use of MOOC discussion forums. In Proceedings of the London International Conference on Education (LICE-2014), London, UK, 10–12 November 2014; pp. 1–4. [Google Scholar]
  33. Onah, D.; Sinclair, J.; Boyatt, R. Dropout rates of massive open online courses: Behavioral patterns. In Proceedings of the 6th International Conference on Education and New Learning Technologies, Barcelona, Spain, 7–9 July 2014; pp. 5825–5834. [Google Scholar]
  34. Aksela, M.; Wu, X.; Halonen, J. Relevancy of the Massive Open Online Course (MOOC) about Sustainable Energy for Adolescents. Educ. Sci. 2016, 6, 40. [Google Scholar] [CrossRef]
  35. Greene, J.A.; Oswald, C.A.; Pomerantz, J. Predictors of Retention and Achievement in a Massive Open Online Course. Am. Educ. Res. J. 2015, 52, 925–955. [Google Scholar] [CrossRef]
  36. De Freitas, S.; Morgan, J.; Gibson, D. Will MOOCs transform learning and teaching in higher education? Engagement and course retention in online learning provision. Br. J. Educ. Technol. 2015, 46, 455–471. [Google Scholar] [CrossRef]
  37. Khalil, H.; Ebner, M. MOOCs completion rates and possible methods to improve retention—A Literature Review. In Proceedings of the EdMedia 2014—World Conference on Educational Media and Technology, Tampere, Finland, 23 June 2014; Viteli, J., Leikomaa, M., Eds.; Association for the Advancement of Computing in Education (AACE): Chesapeake, VA, USA, 2014; pp. 1305–1313. [Google Scholar]
  38. Chen, B.; deNoyelles, A.; Patton, K.; Zydney, J. Creating a community of inquiry in large-enrollment online courses: An exploratory study on the effect of protocols within online discussions. Online Learn. 2017, 21, 165–188. [Google Scholar] [CrossRef]
  39. Wang, W. A Content Analysis of Reliability in Advertising Content Analysis Studies. Master’s Thesis, East Tennessee State University, Johnson City, TN, USA, 2011. [Google Scholar]
  40. Garrison, D.R. Online Community of Inquiry Review: Social, Cognitive, and Teaching Presence Issues. J. Asynchronous Learn. Netw. 2007, 11, 61–72. Available online: https://files.eric.ed.gov/fulltext/EJ842688.pdf (accessed on 10 November 2017).
  41. Landis, J.R.; Koch, G.G. The Measurement of Observer Agreement for Categorical Data. Biometrics 1977, 33, 159–174. [Google Scholar] [CrossRef] [PubMed]
  42. Huang, J.; Dasgupta, A.; Ghosh, A.; Manning, J.; Sanders, M. Superposter behavior in MOOC forums. In Proceedings of the Learning at Scale (L@S) Conference, Atlanta, GA, USA, 4–5 March 2014; pp. 117–126. [Google Scholar]
Figure 1. Community of inquiry model [1].
Figure 2. Practical inquiry model [25].
Table 1. Community of Inquiry coding protocol [1].
Elements           | Categories               | Indicators (Examples Only)
Cognitive presence | Triggering event         | Sense of puzzlement
                   | Exploration              | Information exchange
                   | Integration              | Connecting ideas
                   | Resolution               | Applying new ideas
Social presence    | Emotional expression     | Emotions
                   | Open communication       | Risk-free expression
                   | Group cohesion           | Encouraging collaboration
Teaching presence  | Instructional management | Defining/initiating discussion topics
                   | Building understanding   | Sharing personal meaning
                   | Direct instruction       | Focusing discussion
Table 2. Negotiated agreement vs. raw agreement (adapted from Garrison et al. [5]).
                          | Discussion 1 | Discussion 2 | Discussion 3
Agree                     | 33           | 28           | 8
Negotiated agreement      | 11           | 23           | 13
Disagreement              | 9            | 3            | 1
Total messages            | 56           | 58           | 24
Agreement w/o negotiation | 58.2%        | 48.3%        | 33.3%
Negotiated agreement      | 83.6%        | 87.9%        | 87.5%
Table 3. Inter-rater Reliability.
             | Holsti’s C.R. (1)     | Cohen’s Kappa (2)
             | Initial | Negotiated  | Initial | Negotiated
Discussion 1 | 0.58    | 0.84        | 0.60    | 0.64
Discussion 2 | 0.48    | 0.88        | 0.84    | 0.93
Discussion 3 | 0.33    | 0.88        | 0.14    | 0.65
Total        | 0.46    | 0.86        | 0.70    | 0.82
(1) Calculated at the indicator level; (2) calculated at the category level.
Table 4. CoI frequencies and examples from the Sustainable Energy in Education MOOC.
Categories | Frequency | Example Forum Posts
Cognitive Presence
Triggering event | 7 | “How to provide a clean and environmentally friendly energy sources without large increase in energy prices? Will the rich countries become richer, and the poor countries even poorer? Will it increase the economic gap? Students are discussed and decided that the transition to alternative energy sources should go gradually and on a global level. Larger investments are needed in developing countries to meet their energy needs (environmentally-friendly energy sources).”
Exploration | 44 | “Currently Europe and (north) America pushes all the electronic waste to recycling in Ghana, Africa. Recycling and treatment of that harmful computer parts are there cheap. Western companies do not have to protect the health of workers because there are no labor unions and so on. This type of waste treatment costs too much in the Western world... Out of sight, out of mind! It would be great if schools could somehow take part in to this the e-waste topics in teaching! [...] Perhaps this could be processed into a philosophical debate?”
Integration | 30 | “During my classes I use »Under Cover«, a resource book on global dimensions. (http://www.nazemi.cz/sites/default/files/undercover_intro.pdf) where you can find all the topics mentioned in posts below. Kids really like topic called »Virtual water« where they learn how much water is used for production different things e.g., t-shirt, coffee, meat ... Further on we try to find out what happens with wasted water and how is this influenced on environmental foot print of one city. This is long term project and I try to find some eTwinnig partners and children are investigating usually for few months. At the end they have a presentations of their results in front school mates, other school employees, parents and they share their results with eTwinning partners.”
Resolution | 8 | “Deforestation is a huge environmental problem seeing as forests absorb a huge amount of CO2 […] After the students have fully understood the problem, they could be divided into three teams. The first should do a research on the benefits of stopping deforestation, the second should research the negative effects of stopping deforestation and the last one should research the reasons for which deforestation has become such a huge problem. When all teams have finished their researches, students should discuss their findings and have a “play-roll” of a trial where each student defends his/her opinion […] The benefits one student would have from this experience would be that he/she has gained knowledge on an important matter. In addition, he/she has learned that problems like this one are more complex than they seem and we have to take every aspect of that problem into consideration.”
Social Presence
Emotional expression | 1 | “I agree that experiments are important. I found it a bit challenging in innovating those for the life-cycle analysis topic.”
Open communication | 30 | “I agree with you dear colleague. Experiments help students investigate by themselves, decide and find solutions.”
Group cohesion | 2 | “We, as teachers, should try to sensitize children to a major ecological, health and social threat of the century: climate change.”
Teaching Presence
Instructional management | 0 | No example available
Building understanding | 0 | No example available
Direct instruction | 3 | “The best method to teach LCA to students is to make sure they understand the concept that everything we do in a given process, has its own consequences. […] the most important idea is to help students to identify what makes a product.”
