Article

A Systematic Literature Review on the Quality of MOOCs

by Christian M. Stracke 1,* and Giada Trisolini 2
1 Faculty of Educational Sciences, Open Universiteit, 6401 DL Heerlen, The Netherlands
2 Department of Educational Sciences, University of Bologna, 40126 Bologna, Italy
* Author to whom correspondence should be addressed.
Sustainability 2021, 13(11), 5817; https://doi.org/10.3390/su13115817
Submission received: 13 March 2021 / Revised: 8 April 2021 / Accepted: 22 April 2021 / Published: 21 May 2021
(This article belongs to the Section Sustainable Education and Approaches)

Abstract:
This paper presents the findings from a systematic literature review on the quality of massive open online courses (MOOCs). The main research question was “How can the quality criteria for MOOCs identified in the analysed studies from the systematic literature review be best organised in a categorisation scheme?” The systematic literature review was conducted using the PRISMA procedures. After the screening and eligibility analysis according to the pre-defined criteria, 103 studies were finally selected. The analysis was conducted in iterative cycles to continuously improve the assignment and clustering of the quality criteria. The final version was validated by consensus through the consistent categorisation and assignment of all 103 studies to four dimensions (pedagogical, organisational, technological, and social) and their sub-categories. This quality framework can be re-used in future MOOC research, and the discussion of the analysed studies provides a current literature overview on the quality of MOOCs.

1. Introduction

Currently, innovative education is often wrongly characterised as involving continual introduction of new technologies [1,2]. Thus, to understand innovative education, it is necessary to re-visit the social function of pedagogy in the development and growth of individual learners. In the European political context, some educational systems have been adopting an increasingly open approach, in line with the global movement for open education and open educational resources (OER). The OER acronym was first adopted at the 2002 UNESCO forum on the impact of open educational content for higher education in developing countries, funded by the William and Flora Hewlett Foundation [3]. The OER movement can be considered one of the most significant effects of digital innovation in the educational field [1]. The idea behind this movement is to make all created educational resources openly and freely available through appropriate repositories [4]. This allows all learners worldwide to re-use, adapt, and take advantage of these learning opportunities, in order to enable education as a common good [5].
During recent years, massive open online courses (MOOCs) became popular, taking the form of free online courses accessible to everybody, in which large numbers of learners enrolled [6]. Despite being dismissed as a buzzword and hype as early as 2012, the number of MOOCs, learners, and providers is still increasing continually, with a main focus and popularity in educational research [7,8]. Due to this unbroken growth, this special type of OER has prompted a lot of research [9,10,11]. MOOCs led many authors to investigate the potential of such a so-called “disruptive innovation” [12,13,14,15], even though this innovation is not disruptive but rather an adaptation of old and well-established pedagogical approaches in online learning [16]. Ref. [17] defined a MOOC as a didactic product that guarantees a structured learning path, including a syllabus and explicit didactic objectives, materials, and learning support activities, and an evaluation system based on quizzes, exercises or projects, and a certification process. A MOOC has to be usable through an online platform and must be designed and managed in such a way as to guarantee its scalability, so that it can be enjoyed by a large number of learners. In addition, it has to be accessible to all without prerequisites for registration and participation.
Different types of MOOCs and different pedagogical models for MOOCs were developed, including the initial distinction between cMOOCs and xMOOCs. Within that distinction, cMOOCs are mainly designed to support processes of aggregation, relationships, creation, and the sharing of knowledge, which is also known and promoted as a collaborative or connectivist approach [18] (for a critical debate on connectivism, see [19,20]). In contrast, xMOOCs are more institutionalised and rigorously structured, focused on cultural reproduction through videos and quizzes, known as a behaviourist approach [10]. However, research in the last few years has shown that such a separation and distinction cannot be made, due to the increasing overlap and mix of these two extremes [11,20,21]. In order to understand whether MOOCs can actually represent a revolutionary innovation in the educational field, it is necessary to investigate how they have been conceived, developed, and implemented and to analyse their pedagogical quality, learning effects, and impact [22,23,24,25]. As a starting point, the quality of MOOCs is understood as multi-dimensional and not definable in one single phrase; it has to reflect the different views, needs, and interests of the learners and other stakeholders, with specific focus on the pedagogical aspects that are most relevant for the quality of MOOCs, as highlighted in the latest research [1,10,17,20,24,26,27,28,29].
It is complex to set general quality standards for MOOCs, as there are many types of MOOCs, as well as many purposes of MOOC providers, who can offer courses to disseminate knowledge, to attract new students, or to increase learners’ motivation [30,31]. According to many authors, attention has to focus on the process rather than on the outcome, since the MOOC should be seen as a means of discovery and a way of experiencing for the learners. Therefore, current rankings and rating systems for their quality (such as simple five-star scales) are failing [20,21,26,30,31,32,33,34]. Ref. [35] promoted MOOCs as a disruptive technology and proposed different strategies for enhancing the learner experience and quality of MOOCs. The success or failure of a MOOC depends on how specific quality criteria are met, which has also been confirmed by the latest MOOC research, including the findings from the Global MOOC Quality Survey [21,27,28]. Refs. [2,25] pointed out that MOOC research requires taking a new stance towards quality improvement. Many authors have also asked whether high drop-out rates should be interpreted as an indicator of (poor) quality or as an indicator of different preferences and effective choices by the learners [36]. According to [37], analysis of the motivations that push learners to leave a MOOC could improve the teaching quality of MOOCs in general. Ref. [38], on the other hand, considered that the non-completion rate does not determine the quality of a MOOC. If we focus only on a quantitative comparison of the enrolment rate and the completion rate for a course, we lose sight of other elements that should be considered.
First of all, there can be diverse reasons why learners did not complete the MOOC: We must distinguish between those learners who did not intend to try to complete the didactic activities (but are labelled as drop-outs) and those drop-outs who tried to complete the activities but did not meet the criteria [39]. These data are difficult to obtain, since MOOCs do not easily allow this type of monitoring [40].
Beyond the obvious difficulties in mapping thousands of learners, there are different ways for learners to participate in and to benefit from MOOCs, even without certification [41,42]. Therefore, besides the drop-out rates, several other characteristics need to be considered regarding MOOC quality [23]. They include, for example, the impact of the different evaluation approaches adopted, the differences emerging in the implementation phase, the choice of pedagogical approach, and the opportunity to re-design the MOOCs to include additional alternative pathways and materials for learners who could not pass tests. For [35], instructional design should be instrumental in encouraging reflection, facilitating dialogue and collaboration, applying theory to practice, creating peer communities, encouraging creativity, and motivating learners. Research conducted by [24] highlighted the poor teaching quality of MOOCs, in which the principles of instructional design that the authors used for the analysis of the selected MOOCs were rarely respected, resulting in a larger focus on other aspects related purely to the organisation and presentation of the teaching materials.
Do MOOCs risk being a “colorful repetition” [43] of traditional individual learning and representing only cultural reproduction? Although MOOCs represent a new opportunity for higher education, this remains a very important issue to consider. To give an example, there could be significant obstacles to implementing a pedagogical model that intends to enhance some basic principles of social constructivism, as most MOOCs are based on what Horton defined as the logic of “WAVWAVWAVAAQ” (as cited in [44]): Watch a Video, Watch a Video, Watch a Video, And Attempt a Quiz.
A first systematic literature review covered the MOOC research studies published up to the year 2012 [45]. Afterwards, none of the available systematic literature reviews provided a full overview with a focus on the quality of MOOCs: [9] presented an analysis of the data about MOOCs (up to the year 2013), Ref. [46] offered a qualitative analysis of how MOOCs support academic employability (up to the year 2015), and three systematic literature reviews focused their studies up to the year 2018 on selected specific topics: [47] on engagement and retention in vocational education and training MOOCs, [48] on the accessibility of MOOCs, and [49] on research methods and topics in empirical MOOC studies. Given this complex situation, the authors conducted a systematic literature review on the quality of MOOCs in order to collect data about the different studies carried out by researchers during the last seven years (2013 to 2019) and their findings, and to use those data to develop a framework within which to organise the quality categories and criteria of MOOCs.

2. Materials and Methods

This article provides the results from a systematic literature review study that followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) Statement and its procedures [50]. The research question for our systematic literature review was:
  • How can the quality criteria for MOOCs proposed in the analysed studies from the systematic literature review be best organised in a categorisation scheme?
Systematic literature reviews and meta-analyses have become increasingly important and useful in the educational field, as there is a lack of careful validation and review of studies in educational research [51]. A systematic literature review ensures a rigorous and accountable approach to the search for, selection, and critical analysis of studies [52,53]. Systematic reviews summarise the state of the research on a topic using methods that are pre-planned and documented in a systematic review protocol. This type of review is based on the rigorous analysis of the evidence that arises from a careful evaluation of the available literature according to shared criteria. Conducting a systematic review assumes mastery of a complex work methodology. In particular, it requires a well-structured process that defines the question of the review, as well as which studies will be located, evaluated, selected, and then synthesised. To meet and fulfil all these requirements, we chose to use the methodology outlined in the PRISMA statement.

2.1. Systematic Review Process: The Methodology in the PRISMA Statement

The methodology in the PRISMA Statement [50] consists of a 27-item checklist that concerns methods, results, discussion, and funding and a four-phase flow diagram that concerns record identification and screening, study eligibility, and studies included. We adhered to all 27 items on the PRISMA checklist.
Our review takes into account literature published from January 2013 until December 2019. We decided in advance to limit our systematic review to the most recent literature from the last seven years, both to avoid having too many studies included and because previous systematic literature reviews on the quality of MOOCs have covered the time period up to 2013 [9,45]. None of the latest systematic literature reviews on MOOCs [11,21,46,47,48,49] have targeted this specific topic of the quality of MOOCs.
Two inclusion criteria were defined before the first screening, according to formal aspects: first, only studies in English, and second, only studies published in journals or peer-reviewed proceedings of scientific conferences. Next, a manual check for duplicates was carried out. The abstracts of the remaining studies had to be analysed in a second screening for their eligibility according to their relation and relevance to our research question. The exclusion criterion defined for the second screening was relevance, meaning, “Does this study have any relevance for our research question?” This exclusion criterion is very vague; therefore, it was only used to remove studies that obviously addressed different topics and disciplines. In cases of doubt, studies were not excluded but rather included and kept for the following full-text analysis. The final inclusion criterion defined for that analysis was the availability of the full texts. The remaining studies had to undergo quantitative and qualitative synthesis. The results from the quantitative analysis are presented in the Results section below, whereas the Discussion section discusses the results from the qualitative analysis.

2.2. Systematic Review Process: The Realisation of the PRISMA Statement Procedures

Following the PRISMA Statement procedures as presented in the flow chart below (see Figure 1), the first phase regarding record identification was carried out by searching for studies in four electronic databases: Web of Science (Clarivate Analytics), Scopus (Elsevier), Scholar (Google database), and AlmaStart (University of Bologna database, EBSCO host).
First, the broad search term “mooc AND quality” was tested in the four databases, resulting in 22,818 records in total. This large number of records could not be analysed in an appropriate timeframe by the authors. Therefore, the authors decided to use the more restrictive search term combination “mooc AND quality AND pedagog*”, resulting in 749 records. That was considered to be a feasible number of records for a systematic literature review. The search term amendment “AND pedagog*” was selected due to our main interest in the pedagogical aspects of the MOOC quality. Starting with these 749 records, the authors followed the pre-defined PRISMA Statement procedures as described above. First, the formal screening related to non-English studies (-154) and non-peer-reviewed journal and scientific conference publications (-221) left 374 records. Then, 123 duplicates were removed. The remaining 251 records were assessed for eligibility by the pre-defined selection criteria, analysing their titles, keywords, and abstracts (-134), leaving 117 records. These 117 records were treated as collected studies for full-text evaluation. A total of 14 studies could not be found as full text and had to be removed, leading to a final total of 103 selected studies (see the list in Appendix A).
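The record filtering described above reduces to simple arithmetic, which can be sketched as follows (a minimal illustration using only the counts reported in the text; the variable name is ours):

```python
# PRISMA-style record filtering, using the counts reported in the text.
records = 749        # results for the search term "mooc AND quality AND pedagog*"
records -= 154       # formal screening: non-English studies
records -= 221       # formal screening: non-peer-reviewed publications
assert records == 374
records -= 123       # manual removal of duplicates
assert records == 251
records -= 134       # excluded during title/keyword/abstract eligibility assessment
assert records == 117
records -= 14        # full text not available
print(records)       # prints 103, the final number of selected studies
```

The intermediate assertions match the record counts reported at each PRISMA phase, so the reported numbers are internally consistent.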
Figure 1 below presents the flow chart for the PRISMA Statement procedures and the results from our selection of studies for our systematic literature review following the PRISMA methodology.
The datasets of our systematic literature review are publicly available in the annexes to allow other researchers to re-use them and benefit from their collection and data.

3. Results

A total of 103 studies were included in the quantitative and qualitative analysis after the selection process for our systematic literature review, as described above (see Appendix A with the full list of all 103 selected studies and their coding, consisting of three letters and two digits for the publication year and used in the following). This section presents the findings from the quantitative analysis of all 103 selected studies.
The 103 studies were published in various journals and proceedings. Only one publication was the source of four included studies: International Journal for Educational Technology in Higher Education (Cos18, Dan15, Fid16, Fre18). Five publications were the source of three included studies each: the journals Educational Media International (Con15, Liu15, Oss15), European Journal of Open, Distance and E-Learning (Liy19, Pet16, Rol15), Revista Española de Pedagogía (Dua17, Lem19, Vaz17), International Journal of Emerging Technologies in Learning (Faq18, Kha15, Tah17), and the ECEL proceedings (Bar15, Fit14, Mas15). Four publications were the source of two studies each (International Review of Research in Open and Distributed Learning with Fer17, Naj15; Journal of Online Learning and Teaching with Bal14, Mil15; the ICALT proceedings with Lop15, You14; and the EMOOCs proceedings with Alo19, VdP19). The 76 other studies were all from different publications. Out of the 103 studies, one study was from 2013, 13 studies were from 2014, 24 studies were from 2015, 18 studies were from 2016, 17 studies were from 2017, 12 studies were from 2018, and 18 studies were from 2019.
The authors developed and repeatedly applied and improved a categorisation scheme for the quality criteria of MOOCs with four main dimensions: the organisational, technical, social, and pedagogical aspects of MOOCs, as proposed by [54]. The categorisation scheme was initially designed based on the results from our analysis of the previous systematic literature reviews on MOOC research [9,11,21,45], from the Global MOOC Quality Survey, and from the authors’ own experiences in the design, realisation, and evaluation of MOOCs. Furthermore, in order to assure conciseness, robustness, completeness, extendability, and explainability [55], the taxonomy development process followed the “three-level model” by [56], including alternative approaches (e.g., conceptual and empirical) that could be repeated in different combinations. This iterative process allowed the authors to add, split, remove, and merge the categories and hierarchical levels that composed the categorisation scheme [55].
The first draft of the categorisation scheme was designed as a concept map and included 58 categories under the four main dimensions, on five hierarchical levels (including the four dimensions). All 103 studies were initially assigned to the defined categories during the superficial eligibility and inclusion process. During their qualitative full-text analysis, the categories were continually revised and improved by the two co-authors as follows: They independently classified the studies and each developed a new individual draft for potential categories. Afterwards, they presented the two individual drafts to each other and discussed them. Based on these two drafts and the discussion, the first common draft of the categorisation scheme was developed in consensus (first iterative cycle). For repeated improvement, the studies were separated into 10 groups of 10 or 11 studies. After revisiting and analysing the studies of one group, the authors independently revised the categories and their hierarchical levels (the next additional iterative cycle). The results were again discussed, leading to a next categorisation scheme developed in consensus (the next additional iterative cycle). In the beginning, the tendency was towards having more hierarchical levels (up to six levels), but it turned out that it became difficult to differentiate so many hierarchical levels. After 10 rounds with the 10 groups of studies, there was a total of 19 iterative cycles of assignments of the studies and refinements of the categories and their levels. In the end, a final version of the categorisation scheme for the quality criteria of MOOCs was developed, with 71 categories on four levels, including the four dimensions as the highest level (three levels without the dimensions). It was validated, as all 103 studies could be assigned and included in this final version. 
As to be expected from the search terms used, the pedagogical dimension was more split up and differentiated than the other three dimensions, as most studies could be assigned to the pedagogical dimension.
The 103 studies (see the list in the Appendix A) were very unevenly distributed among the four dimensions and their categories (one study could be assigned to more than one category). The vast majority could be assigned to the pedagogical dimension (88 out of 103 studies), whereas fewer studies were assigned to the three other dimensions: 18 studies to the organisational dimension, 25 studies to the technical dimension, and 19 studies to the social dimension (see the Table A1 in the Appendix C for the assignment of all 103 studies to the categories of the Quality Framework for MOOCs).
Along with the presented analytic framework for the studies as the main result, another key finding was that only one study out of all 103 selected addressed three or more dimensions.
Differences also emerged concerning general studies covering the whole dimension. There was only one study focused on the whole social dimension, but several general studies that focused on the pedagogical dimension (6); in particular, there were many general studies for the organisational (5) and the technical (7) dimensions, in relation to the smaller total number of studies for these dimensions compared to the pedagogical dimension. In addition, not only was the pedagogical dimension covered by the greatest number of studies (88), but also within that dimension was the greatest number of assignments to the different sub-categories (207). The other three dimensions were covered by 52 studies addressing 14 sub-categories: 18 studies addressing four sub-categories related to the organisational dimension, 25 studies addressing five sub-categories related to the technical dimension, and 19 studies addressing five sub-categories related to the social dimension.
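The per-dimension counts above come from tallying a many-to-many assignment of studies to dimensions, since one study can be assigned to several dimensions. A hypothetical miniature of such a tally (the study codes and assignments below are illustrative only; the real assignments are in Table A1 in Appendix C) could look like:

```python
from collections import Counter

# Illustrative (not actual) study-to-dimension assignments; because one study
# can be assigned to more than one dimension, the per-dimension counts can sum
# to more than the number of studies.
assignments = {
    "Liu15": ["pedagogical"],
    "Kin14": ["organisational", "technical"],
    "Oss15": ["pedagogical", "social"],
    "Kha15": ["technical"],
}

# Count every (study, dimension) assignment.
counts = Counter(dim for dims in assignments.values() for dim in dims)
print(counts["pedagogical"], counts["technical"])  # prints: 2 2
```

With four studies but six assignments, this miniature reproduces the pattern reported above, where 88 + 18 + 25 + 19 assignments exceed the 103 studies.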
Due to this unbalanced result, the overview in Figure 2 below presents only the titles of the three under-represented dimensions (organisational, technical, and social); only the pedagogical dimension shows the complete version with the three levels of sub-categories (see the full version of the quality framework with all sub-categories and four levels for all four dimensions in Figure A1 in Appendix B).
In summary, the final version of the categorisation scheme, the quality framework for MOOCs, was helpful for the clustering and differentiation of all 103 studies. The categories in the quality framework and the structuring of their levels were improved and validated through repeated assignment of the studies, which led to only three categorisation levels for the pedagogical dimension (presented in Figure 2) and only one level for the other three dimensions.

4. Discussion

In this section, the authors discuss what emerged from the qualitative analysis of all 103 selected studies. This article aims to develop a new framework for analysing current studies focusing on the pedagogical quality of MOOCs. A systematic literature review using pre-defined selection criteria was conducted, in which 103 studies were found (see the list in Appendix A) and assigned to the categories of the Quality Framework for MOOCs (see Table A1 in Appendix C).

4.1. Contribution to the MOOC Research and Literature

This systematic literature review supports and contributes to the research and literature on the quality of MOOCs in three respects.
First, this article aims at the development of a framework for the categorisation and analysis of the selected studies. The categorisation scheme was improved in iterative cycles during the analysis and assignment of the studies, leading to the final version presented in Appendix B. The framework was helpful and could be used to categorise and assign all 103 studies in a consistent way. Thus, it can be re-used in future systematic literature reviews on the quality of MOOCs.
Second, this article provides an overview of the current state of the research and literature on the quality of MOOCs. The results from the quantitative analysis of the 103 studies were reported in the section above. In the following sub-sections, we will discuss the findings from the qualitative analysis separately for each of the four general dimensions: pedagogical, organisational, technical, and social.
Third, we will discuss all of the results from the systematic literature review on a general level. Therefore, the overall findings from the analysis across all four dimensions will be presented in the last sub-section. The key finding here is that only one out of all 103 studies focused on three or more dimensions concerning MOOC quality.

4.2. Discussion of the Pedagogical Dimension

The pedagogical dimension is considered to be the most important dimension related to the quality of MOOCs. Consequently, the vast majority of the studies (88 out of 103 studies) addressed pedagogical aspects of MOOCs. Seven studies provided an analysis of (a larger number of) empirical studies, pointing out that:
  • Good designs can help to promote MOOC participants’ levels of engagement and allow effective pedagogical principles to benefit learners (Liu15);
  • “The majority of MOOCs scored poorly on most instructional design principles” (Mar15, p. 77);
  • The instructional design of MOOCs drives desired learning outcomes (Naj15);
  • It is necessary “to continue placing emphasis on the pedagogical design of MOOCs, as this educational approach can have wider benefits for on-campus learning and scholarly activity, particularly research and dissemination” (Pet16, p. 128);
  • Educational organisations must improve their offerings to better satisfy their learners, especially through better MOOC design processes (Ram15), and respond to the need to specifically scaffold the integration of concepts, beyond their mastery as individual concepts (Fal16); and
  • Learning design principles can foster learning in MOOCs (You14).
The other 81 studies were mainly literature reviews and case studies; due to space limitations, only the most substantial publications will be discussed.
Several studies addressed learning design and the view of MOOC learners. They particularly considered the lack of instructional design (Hli16; Pil17) and the fact that learning design and learning environment are the main factors affecting learner experiences and quality (Con15; Oss15). (Str17) emphasised the need to focus on the learners’ perspective at the micro level, demanding specific learning opportunities and different learning experiences. (Alu16) considered that continual feedback from students is also crucial in creating more effective MOOCs. (Ulr15) reflected on perceptions and expectations regarding MOOCs, whereas (Che19) underlined the impact of misconceptions.
Other studies discussed the pedagogical tools and theories used in MOOCs. The qualitative study by (Tov15) explored the range of pedagogical tools used in 24 MOOCs, finding that the range of pedagogical practices currently used in MOOCs tends toward an objectivist-individual approach. (Oss15) claimed that “MOOCs rely on connectivism and co-construction of knowledge through connections and negotiation” (p. 281), whereas (Amo14) stated that “connectivism does not provide an adequate explanation of learning phenomena in Web 2.0, and therefore it is not able to provide an adequate pedagogy for MOOCs” (p. 79). (Pil17) considered the facilitation of self-directed learning processes and communities to be strategic components and demanded diverse assessment types and personalised feedback as guidance for learners.
Some studies focused on taxonomies and contexts of MOOCs going beyond the classical distinction between cMOOCs and xMOOCs. (Con15) described five different MOOC types: associative, cognitive, constructivist, situative, or connectivist. (Alb18) proposed distinguishing between the three dimensions of presage, process, and product for MOOC research, whereas (Gre18) introduced a differentiation between formal, conventional, and professional MOOCs. (Mar15) and (Ram15) considered the context as a place where learners can apply their knowledge. Research conducted by (Bra15) aimed to understand how students perceive learning outside the classroom through social media, online courses, school websites, and private tutoring. They found that more than 96% agreed or strongly agreed that learning outside the classroom has a positive impact on education, which confirms the potential of MOOCs.
Further studies related the pedagogical dimension to other fields and disciplines. (Hsu16) underlined that a nuanced understanding of the complex relationships among technology, content, and pedagogy is required for high-quality teaching. Some authors considered learning analytics as an effective way to improve the learning design and processes of MOOCs (Amo14; Tah15; You14). (AlI19) underlined that MOOCs are not disruptive innovations, but offer new learning opportunities.
Within the pedagogical dimension, it was surprising to us that many studies focused on attempts to differentiate types of MOOCs and pedagogical methodologies such as connectivism that are not proven and not even well defined (Amo14). Furthermore, most studies assigned to the pedagogical dimension did not present empirical experiments and evidence-based results (only seven out of 88 studies) but addressed theoretical proposals and presentations of potential design conceptions and guidelines: That confirms the concerns about the quality of MOOCs and related research [24,25,26,27]. It seems that studies of the pedagogical dimension are still in the beginning, surface-level stages of analysis of MOOCs. Many used buzzwords such as “learning analytics” and “educational data mining”, promoting their large potential without providing evidence for such claims. In the meantime, the Quality Reference Framework (QRF) was developed and validated in collaboration with the MOOC community; it contains more quality criteria, clustered in three dimensions and perspectives, to be addressed in the design of MOOCs [28].

4.3. Discussion of the Organisational Dimension

The organisational dimension included different types of studies (18 studies in total) that focused on the organisational dimension at the general level in distinctive ways. Three studies included a theoretical analysis considering the organisational dimension to be most fundamental for assuring the quality of MOOCs (McA14; Sto15) or business revenue for educational institutions (Nie16). Two other studies analysed learning strategies and experiences at different levels, addressing the organisational dimension at the macro level (Lop15; Str17). Six other studies focused on specific organisational aspects. There were three studies related to organisational commitment to the quality of MOOCs. (Ghi16) stated in her literature review about enhancing the quality of MOOCs that the MOOC evaluation literature is still undecided between the need to adopt one of the few quality enhancement frameworks specifically created for MOOCs or to re-use the available e-learning quality models. (Kin14) defined an approach called “fit for purpose” that “produced a high quality learning experience that also supported institutional goals” (p. 117). In addition, (Mas15) and (Fit14) considered that MOOCs are potentially highly disruptive and present challenges to institutions to overcome issues connected to staffing tensions and role alterations. As for policies and licenses, (Nyo13) highlighted the importance of this category, especially for MOOC licenses that are “blurring the lines between traditional educational values and the commercial enterprise” (p. 671). As for efficiency and costs, (Dan15) wondered whether the future of MOOCs will support adaptive learning or only serve as a business model for institutions.
For the organisational dimension, it was remarkable that the development and introduction of institutional policies and strategies, including visions and mission statements, as well as organisational evaluation and continuous improvement cycles, were not mentioned or addressed in the studies. Thus, all studies assigned to the organisational dimension ignored research results highlighting the need for iterative cycles in organisational development to achieve quality improvements, as demanded by [1,2].

4.4. Discussion of the Technical Dimension

The technical dimension was strongly addressed by 25 studies in total. Seven studies discussed the technical dimension using different criteria. (Lau16) offered a broad overview of the technical dimension, considering it to be strongly related to the pedagogical approaches in MOOCs, especially in supporting collaborative learning. Three other studies (Gam15; Ram15; Yal17) considered the technical dimension to refer to a group of hardware and software tools, features, and standards that support learning processes and could even improve them. Three other studies (Lop15; Llo16; You14) took the technical dimension into account as an essential basis for MOOC challenges, such as the drop-out rate, lack of human interaction, assessment issues, and pedagogical approaches. The majority of the studies addressing this dimension considered MOOC aggregators and delivery systems as the most important technical aspects for the support of learning processes. Six studies highlighted the choice of the platform as fundamental, but for different reasons: to ensure interaction (Pil17), for decision-making (Kin14), for negotiation (Ebn16), for time management (Naw14), for collaborative learning (Bae16), or in general for MOOC quality (Bab16). The user interface and video development were also discussed in six studies, but from different points of view. (Kha15) and (You14) combined the technical implementation of interface and video to create an evaluation grid for observing xMOOCs, identifying several levels of underlying structures and design concepts in order to assure the pedagogical quality of MOOCs. The empirical study by (Kim14) considered video lectures in particular, analysing learners’ interactions “depending on the visual, pedagogical, and stylistic properties of the video” (p. 8) and video player interfaces as an improvement of navigation.
(Tah17) considered interoperability as a primary need: “To meet this challenge, standards have been proposed to assist interaction flow and implementation of the differentiation process” (p. 210). In addition, (Bro14) stated that a virtual learning environment (VLE/LMS) has to be enhanced with social features or combined with a community or network-like environment to foster relationships and interactions. (Kin14) considered technical design decisions to be important for optimising accessibility and scaffolding. Other authors considered technology strategic for supporting social interactions “in order to convert MOOCs into effective and adaptive learning environment” (Amo14, p. 84), an important way to achieve collaboration in MOOCs (You14), and a way to encourage learners’ participation and engagement (Pil17).
For the technical dimension, it was striking that the challenges involved in offering learning at scale for massive audiences and related technical issues for the online environment and its support were not covered by the studies, in contrast to the results by [22]. In addition, the majority of the studies assigned to the technical dimension did not reflect on the implications and complexity of MOOC integration into existing environments, as indicated by [52,54].

4.5. Discussion of the Social Dimension

The social dimension was addressed a little less often, in a total of 19 studies. (Gar17) pointed out the need for collective intelligence to develop a sense of community: When students learn together intentionally and informally in networked online environments, they form small and temporary communities. The communication aspects included building community and supporting engagement (Bar15; Kin14; Mil15); supporting instructional presence, interaction, and participation (Bro14; Kha15; Naj15); and creating a personal learning environment (Bab16). However, whereas communication referred mostly to aspects of learners’ interactions, collaboration was more related to the pedagogical model adopted. In particular, the empirical studies by (Fid16) and (Gam15) considered collaboration to be a pedagogical resource that allows for the measurement of the achievement and effectiveness of MOOCs. Findings from these studies showed that the number of drop-outs decreases when the cooperation level increases. Furthermore, collaboration is not confined to shared resources; in fact, knowledge sharing underlies the collaborative strategy. Three other studies referred to collaboration as a way to improve the relationship among learners and educators (Mas15; Pet16; Tah17). (Lau16), as well as (Pet16), debated discussion and sharing as strategic issues for enhancing collaborative learning experiences. In addition, networking was considered by (Bas14) as a social tool, whereas other studies defined it as a learning network that generates a community of practice (Amo14; Sto15; Bar15). (Vaz17) defined networking as both a technical social tool and an important way to create learning communities.
For the social dimension, it was notable that the different feedback and interaction types for the support of learners by educators and by their peers were not highlighted as key social demands and instruments as identified by [28,33]. Furthermore, the studies assigned to the social dimension did not mention, and thus neglected, the specific importance of interactions with content by learners, presented as a key success factor for MOOCs by [34].

4.6. Overall Discussion across All Four Dimensions

From the overall analysis of the 103 studies across all four dimensions, it was a surprising finding that only one study addressed three or more dimensions; the other 102 studies all focused on only one or two dimensions. A potential explanation is that during the analysed period (publication years 2013 to 2019), studies were just beginning to analyse the quality of MOOCs and lacked general overviews and validated frameworks such as the Quality Reference Framework (QRF) [57].
Furthermore, the distribution of the 103 studies across the four dimensions was very unbalanced, as already reported above, which indicates specific research interests and efforts. This was also visible in our qualitative analysis of the research focus and study content. It was not unexpected given our main focus on pedagogical aspects and our selected search terms. Therefore, we underline the need for additional research on the three under-represented dimensions.
However, we identified differences in comparison to the quantitative data. The dimension with the lowest number of studies (organisational) was addressed by a relatively high number of general studies (in particular compared to the social dimension, with only one general study). Its flat structure of categories was not surprising given our main focus on pedagogical aspects, and it is in line with the technical and social dimensions.

4.7. Relevance and Future Application

We believe that this systematic literature review and its findings are relevant for MOOCs, as they inform both MOOC designers and learners. They can identify quality categories that are important for their (design as well as learning) objectives and choices of best-fitting MOOC methods. The key result of this systematic literature review, the quality framework for MOOCs, can support future MOOC research in several ways: It can be applied in MOOC design guidelines, used as a basis for discussions and calibration within and among MOOC design teams, adopted for standardised descriptions and quality ratings of MOOCs, and re-used in follow-up systematic literature reviews.

4.8. Limitations

Some limitations must be noted. First, we were not able to conduct a meta-analysis of all 103 selected studies: Too many studies would have needed to be excluded, as they did not provide appropriate effect sizes for a meta-analysis or reported no effect sizes at all. Second, the alternative, a comparison of the provided effect sizes using the Standardized Index of Convergence (SIC) defined by [58], was not possible either. The SIC calculates the degree of consistency in findings across compared studies and can be applied when at least three studies analyse the same relationship; it does not require comparable effect sizes [59,60,61]. However, we could not find three studies with reported effect sizes for even a single sub-category; thus, we could not apply the SIC method. Furthermore, we had to limit our conclusions concerning the relevance of the categories, as the number of studies indicates only the research interest in each category, not necessarily its importance. In addition, the limited search terms and the focus on the years 2013 to 2019 mean that some relevant studies may have been missed in this systematic literature review. Another limitation is the strict focus on peer-reviewed studies published in English. Finally, the quantitative as well as the qualitative analysis of the 103 studies and their assignment to the four dimensions and sub-categories revealed that there is still a great need for future in-depth research on the quality of MOOCs. A future systematic literature review could focus on the three under-represented dimensions by combining broader search terms with the use of only one database, to avoid search results with a number of studies too high to be reviewed and assigned.

5. Conclusions

In this article, the quality of MOOCs was observed and analysed from different points of view based on the results of a systematic literature review analysing 103 selected studies (see Appendix A). Our main result was the development of a quality framework as a categorisation scheme for aspects of the quality of MOOCs, distinguishing four dimensions: organisational, technical, social, and pedagogical. The pedagogical dimension was detailed and explained by a total of 33 categories on two hierarchical levels, consisting of seven first-level categories and 26 second-level categories (see Figure 1 above). The other three dimensions (see Figure A1 in Appendix B) were described by a total of only 14 first-level categories and require further research. Due to our main interest in the pedagogical aspects of MOOC quality, the other dimensions can be considered under-represented in this literature review compared to the pedagogical dimension, to which the vast majority of studies were assigned (see Table A1 in Appendix C for the assignment of all 103 studies to the categories of the Quality Framework for MOOCs).
A surprising key finding from the overall analysis of the 103 studies across all four dimensions is that only one study addressed three or more dimensions; the other 102 studies focused on only one or two dimensions. Another finding from this systematic literature review is that research on the quality of MOOCs needs to consider and address several main indicators for the design and quality of MOOCs from different dimensions. Although the majority of studies focused on the pedagogical dimension, the other three dimensions (organisational, technical, and social) are also relevant and decisive for the design and quality of MOOCs. That is the case in particular for institutional commitment, which is related to the organisational dimension; social communication, belonging to the social dimension; and the provider and selected platform, which is linked to the technical dimension. These aspects were most often addressed in the studies related to each of these dimensions. They represent key research interests, but they can also be interpreted as main categories to be considered for the design and improvement of MOOCs.
MOOCs are becoming increasingly attractive for lifelong learners and, as a consequence, for universities and lifelong learning providers, too. MOOCs give their providers greater visibility and may allow them to reach more potential students and to enhance their purpose and impact in education, research, and engagement with other stakeholders. Therefore, it would be useful to deepen the research on the quality of MOOCs and to design new guidelines that ensure their quality, allowing universities to open up further to new frontiers of education.
Finally, we highlight three promising directions for future research in this area. First, the results from the systematic literature review should be analysed and compared with other on-going research on the quality of MOOCs at a global level, such as the development of the Quality Reference Framework for MOOCs and open education based on the findings from the Global MOOC Quality Survey [29]. Second, the systematic literature review results should be tested and validated through other qualitative and experimental studies involving a broad range of stakeholders and institutions focusing on the design and quality of MOOCs. Third, the results should be used and evaluated in the development of new MOOCs to learn more about the practicability of the four dimensions and their quality indicators. Such mixed-method research can lead to a more comprehensive understanding and better strategies to improve the design, implementation, and evaluation of future MOOCs and therefore their quality.

Author Contributions

Conceptualization, methodology, validation, formal analysis, investigation, data curation, writing—original draft preparation, writing—review and editing, visualization, C.M.S. and G.T. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

All collected data from the systematic literature review are presented in Appendix A.

Acknowledgments

We appreciate the financial support through the Open Access fund from the library committee of the Open Universiteit, the Netherlands (OUNL).

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Selected 103 Studies from the Systematic Literature Review

Ada14
Adams, C.; Yin, Y.; Vargas Madriz, L. F.; Mullen, C. S. (2014). A phenomenology of learning large: the tutorial sphere of xMOOC video lectures. Distance Education, 35(2), 202–216, doi:10.1080/01587919.2014.917701.
AlI19
Al-Imarah, A. A.; Shields, R. (2019). MOOCs, disruptive innovation and the future of higher education: A conceptual analysis. Innovations in Education and Teaching International, 56(3), 258–269, doi:10.1080/14703297.2018.1443828.
Alb18
Albelbisi, N.; Yusop, F. D.; Salleh, U. K. M. (2018). Mapping the Factors Influencing Success of Massive Open Online Courses (MOOC) in Higher Education. Eurasia Journal of Mathematics, Science and Technology Education, 14(7), 2995–3012, doi:10.29333/ejmste/91486.
Alo19
Aloizou, V.; Villagrá Sobrino, S. L.; Martínez Monés, A.; Asensio-Pérez, J. I.; García Sastre, S. (2019). Quality Assurance Methods Assessing Instructional Design in MOOCs that implement Active Learning Pedagogies: An evaluative case study. Proceedings of EMOOCs 2019 (pp. 14–19). [no doi] Retrieved from http://ceur-ws.org/Vol-2356/research_short3.pdf.
Alu16
Alumu, S.; Thiagarajan, P. (2016). Massive open online courses and E-learning in higher education. Indian Journal of Science and Technology, 9(6), 1–10, doi:10.17485/ijst/2016/v9i6/81170. Retrieved from http://www.indjst.org/index.php/indjst/article/view/81170/0.
Amo14
Amo, D.; Casany, M. J.; Alier, M. (2014). Approaches for quality in pedagogical and design fundamentals in moocs. Educación y Cultura en la Sociedad de la Información. 15(1), 70–89. [no doi] Retrieved from http://campus.usal.es/~revistas_trabajo/index.php/revistatesi/article/view/11653/12068.
Ard17
Ardchir, S.; Talhaoui, M. A.; Azzouazi, M. (2017). Towards an Adaptive Learning Framework for MOOCs. In: E. Aïmeur, U. Ruhi, M. Weiss, Eds., E-Technologies: Embracing the Internet of Things. MCETECH 2017. Lecture Notes in Business Information Processing, vol 289 (pp. 236–251), doi:10.1007/978-3-319-59041-7_15.
Avs16
Avshenyuk, N. (2016). Priority Fields of Teachers’ Professional Development in Terms of Open Education Worldwide. Comparative Professional Pedagogy, 6(4), 15, doi:10.1515/rpp-2016-0042. Retrieved from https://content.sciendo.com/view/journals/rpp/6/4/article-p15.xml.
Avs18
Avshenyuk, N. M.; Berezan, V. I.; Bidyuk, N. M.; Leshchenko, M. P. (2018). Foreign experience and Ukrainian realities of mass open online courses use in international education area. Information Technologies and Learning Tools, 68(6), 262–277, doi:10.33407/itlt.v68i6.2407.
Bab16
Babanskaya, O. M.; Mozhaeva, G. V.; Feshchenko, A. V. (2016). Quality Management as a Condition for the Development of E-Learning in a Modern University. In L.G. Chova, A.L. Martinez, I.C. Torres, Eds., EDULEARN16: 8th International Conference on Education and New Learning Technologies (pp. 4367–4373), doi:10.21125/edulearn.2016.2057.
Bae16
Bae, E.; Prasad, P. W. C.; Alsadoon, A.; Bajaj, K. (2016). Framework to improve delivery methods in higher education through online learning. Proceedings 2015 IEEE 7th International Conference on Engineering Education (ICEED 2015), (pp. 130–134), doi:10.1109/ICEED.2015.7451506.
Bai14
Bailey, J.; Cassidy, D.; Breakwell, N. (2014). Keeping Them Clicking: Promoting Student Engagement In MOOC Design. AISHE-J: The All Ireland Journal of Teaching & Learning in Higher Education, 6(2), p. 1972. [no doi and no website].
Bal14
Bali, M. (2014). MOOC Pedagogy: Gleaning Good Practice from Existing MOOCs. Journal of Online Learning & Teaching, 10(1), 44–56. [no doi] Retrieved from http://www.academia.edu/download/33593146/bali_0314.pdf.
Bar15
Barber, W. (2015). Building Community in Flipped Classrooms: A Narrative Exploration of Digital Moments in Online Learning. Proceedings of the European Conference on E-Learning (pp. 24–30). [no doi and no website].
Bas14
Bassi, R.; Daradoumis, T.; Xhafa, F.; Caballé, S.; Sula, A. (2014). Software agents in large scale open e-learning: A critical component for the future of massive online courses (MOOCs). Proceedings 2014 International Conference on Intelligent Networking and Collaborative Systems (IEEE INCoS 2014) (pp. 184–188), doi:10.1109/INCoS.2014.15.
Bon17
Bonafini, F. C.; Chae, C.; Park, E.; Jablokow, K. W. (2017). How much does student engagement with videos and forums in a MOOC affect their achievement? Online Learning, 21(4), 223–240, doi:10.24059/olj.v21i4.1270.
Bor17
Borges, F. R.; Costa, L. C. S.; Avelino, C. C. V.; Nogueira, D. A.; Kirner, C.; Goyatá, S. L. T. (2017). Educational strategy on home visits based on massive open online courses. Revista Mineira de Enfermagem. Nursing Journal of Minas Gerais, 2017,21:e-1038, doi:10.5935/1415-2762.20170048.
Bou17
Bouzayane, S.; Saad, I. (2017). A preference ordered classification to leader learners identification in a MOOC. Journal of Decision Systems, 26(2), 189–202, doi:10.1080/12460125.2017.1252233.
Bra15
Brahimi, T.; Sarirete, A. (2015). Learning outside the classroom through MOOCs. Computers in Human Behavior, 51, 604–609, doi:10.1016/j.chb.2015.03.013.
Bro14
Brouns, F.; Mota, J.; Morgado, L.; Jansen, D.; Fano, S.; Silva, A.; Texeira, A. (2014). A networked learning framework for effective MOOC design: the ECO project approach. In Proceedings 8th EDEN Research Workshop. Challenges for Research into Open & Distance Learning: Doing Things Better: Doing Better Things (pp. 161–171). [no doi] Retrieved from http://dspace.ou.nl/handle/1820/5544.
Cas15
Castaño, C.; Maiz, I.; Garay, U. (2015). Design, motivation and performance in a cooperative mooc course. Comunicar, 22(44), 19–26, doi:10.3916/C44-2015-02.
Che19
Chen, C.; Sonnert, G.; Sadler, P. M.; Sasselov, D.; Fredericks, C. (2019). The impact of student misconceptions on student persistence in a MOOC. Journal of Research in Science Teaching, 2019, 1–32, doi:10.1002/tea.21616.
Cin19
Cinganotto, L.; Cuccurullo, D. (2019). Learning analytics in online social interactions. The case of a MOOC on ‘language awareness’ promoted by the European Commission. Journal of E-Learning and Knowledge Society, 15(3), 263–286, doi:10.20368/1971-8829/1135030.
Coh15
Cohen, A.; Soffer, T. (2015). Academic Instruction in a Digital World: The Virtual TAU Case. Procedia-Social and Behavioral Sciences, 177 (First Global Conference on Contemporary Issues in Education (GLOBE-EDU) 2014), 9–16, doi:10.1016/j.sbspro.2015.02.322.
Con15
Conole, G. (2015). Designing effective MOOCs. Educational Media International, 52(4), 239–252, doi:10.1080/09523987.2015.1125989.
Cos18
Costello, E.; Holland, J.; Kirwan, C. (2018). The future of online testing and assessment: question quality in MOOCs. International Journal of Educational Technology in Higher Education, 15, 42, doi:10.1186/s41239-018-0124-z.
Dan15
Daniel, S. J.; Cano, E. V.; Cervera, M. G. (2015). The Future of MOOCs: Adaptive Learning or Business Model? International Journal of Educational Technology in Higher Education, 12(1), 64–73, doi:10.7238/rusc.v12i1.2475.
Doo19
Doo, M. Y.; Zhu, M.; Bonk, C. J.; Tang, Y. (2019). The effects of openness, altruism and instructional self-efficacy on work engagement of MOOC instructors. British Journal of Educational Technology, 1–18, doi:10.1111/bjet.12882.
Dou14
Doucet, A.; Nawrot, I. (2014). Building Engagement for MOOC Students-Introducing Support for Time Management on Online Learning Platforms. Proceedings of the 23rd International Conference on World Wide Web (pp. 1077–1082), doi:10.1145/2567948.2580054.
Dua17
Duart, J. M.; Roig-Vila, R.; Mengual-Andrés, S.; Maseda Durán, M.-A. (2017). The pedagogical quality of MOOCs based on a systematic review of JCR and Scopus publications (2013–2015). Revista Española de Pedagogía, 75(266), 29–46, doi:10.22550/REP75-1-2017-02.
Ebn16
Ebner, N. (2016). Negotiation and Conflict Resolution Education in the Age of the MOOC. Negotiation Journal, 32(3), 231–260, doi:10.1111/nejo.12156.
Eli19
Elizondo-Garcia, J.; Schunn, C.; Gallardo, K. (2019). Quality of Peer Feedback in relation to Instructional Design: A Comparative Study in Energy and Sustainability MOOCs. International Journal of Instruction, 12(1), 1025–1040, doi:10.29333/iji.2019.12166a.
Fal16
Falkner, K.; Falkner, N.; Szabo, C.; Vivian, R. (2016). Applying validated pedagogy to MOOCs: An introductory programming course with media computation. Proceedings Conference on Innovation and Technology in Computer Science Education (ITiCSE) (pp. 326–331), doi:10.1145/2899415.2899429.
Faq18
Faqihi, B.; Daoudi, N.; Ajhoun, R. (2018). Design of an Intelligent Educational Resource Production System. International Journal of Emerging Technologies in Learning, 13(12), 4–18, doi:10.3991/ijet.v13i12.8914.
Fer17
Fernández-Díaz, E.; Rodríguez-Hoyos, C.; Calvo Salvador, A. (2017). The Pedagogic Architecture of MOOC: A Research Project on Educational Courses in Spanish. The International Review of Research in Open and Distributed Learning, 18(6), doi:10.19173/irrodl.v18i6.2964.
Fia18
Fianu, E.; Blewett, C.; Ampong, G. O. A.; Ofori, K. S. (2018). Factors Affecting MOOC Usage by Students in Selected Ghanaian Universities. Education Sciences, 8(2),70, doi:10.3390/educsci8020070.
Fid16
Fidalgo-Blanco, Á.; Sein-Echaluce, M. L.; García-Peñalvo, F. J. (2016). From massive access to cooperation: lessons learned and proven results of a hybrid xMOOC/cMOOC pedagogical approach to MOOCs. International Journal of Educational Technology in Higher Education, 13(1) [no page numbering], doi:10.1186/s41239-016-0024-z.
Fit14
Fitzgerald, R.; Anderson, M.; Thompson, R. (2014). MOOC’s mass marketing for a niche audience. Proceedings of the European Conference on e-Learning (ECEL) (pp. 164–170). [no doi] Retrieved from https://www.researchgate.net/publication/289353673_MOOC’s_mass_marketing_for_a_Niche_audience.
Fre18
Freitas, A.; Paredes, J. (2018). Understanding the faculty perspectives influencing their innovative practices in MOOCs/SPOCs: a case study. International Journal of Educational Technology in Higher Education, 15,5, doi:10.1186/s41239-017-0086-6.
Gam15
Gamage, D.; Perera, I.; Fernando, S. (2015). A framework to analyze effectiveness of eLearning in MOOC: Learners perspective. Proceedings 8th International Conference on Ubi-Media Computing (UMEDIA 2015) (pp. 224–229), doi:10.1109/UMEDIA.2015.7297461.
Gar17
Garreta-Domingo, M.; Sloep, P. B.; Hérnandez-Leo, D.; Mor, Y. (2017). Design for collective intelligence: pop-up communities in MOOCs. AI & SOCIETY, 1–10, doi:10.1007/s00146-017-0745-0.
Ghi16
Ghislandi, P. (2016). “The fun they had” or about the quality of MOOC. Journal of E-Learning and Knowledge Society, 12(3), 99–114, doi:10.20368/1971-8829/1178. Retrieved from http://www.je-lks.org/ojs/index.php/Je-LKS_EN/article/view/1178.
Goo18
Goosen, L. (2018). Students’ Access to an ICT4D MOOC. Proceedings of the 47th Annual Conference of the Southern African Computer Lecturers’ Association (pp. 183–199). [no doi] Retrieved from http://www.sacla.org.za/wp-content/uploads/2019/03/SACLA-2018-Proceedings.pdf#page=197.
Gre18
Gregori, E. B.; Zhang, J.; Galván-Fernández, C.; Fernández-Navarro, F. d. A. (2018). Learner support in MOOCs: Identifying variables linked to completion. Computers & Education, 122, 153–168, doi:10.1016/j.compedu.2018.03.014.
Haa19
Haba, H. F.; Dastane, O. (2019). Massive Open Online Courses (MOOCs)-Understanding Online Learners’ Preferences and Experiences. International Journal of Learning, Teaching and Educational Research, 18(8), 227–242, doi:10.26803/ijlter.18.8.14.
Hai19
Ab Jalil Habibah, A. J.; Jowati, J.; Arif, I. I.; Norasiken, B. (2019). Mooc’s Daunting Journey: Bridging the Gaps between Theoretical and Practical Demands. International Journal of Innovation, Creativity and Change, 9(12), 310–331. [no doi] Retrieved from https://www.ijicc.net/images/vol9iss12/91225_Habibah_2019_E_R.pdf.
Hem17
Hemavathy, R.; Harshini, S. (2017). Adaptive Learning in Computing for Non-English Speakers. Journal of Computer Science and Systems Biology, 10, 61–63, doi:10.4172/jcsb.100025. Retrieved from https://www.omicsonline.org/open-access/adaptive-learning-in-computing-for-nonenglish-speakers-jcsb-1000250.pdf.
Hic17
Hicks, N.; Zakharov, W.; Douglas, K.; Nixon, J.; Diefes-Dux, H.; Bermel, P.; Madhavan, K. (2017). Video-related pedagogical strategies in massive open online courses: A systematic literature review. Proceedings of the 7th Research in Engineering Education Symposium. [no doi] Retrieved from https://docs.lib.purdue.edu/cgi/viewcontent.cgi?article=1227&context=lib_fsdocs.
Hli16
Hlinak, M. (2016). Flipping and Moocing Your Class Or: How I Learned to Stop Worrying and Love the MOOC. Journal of Legal Studies Education, 33(1), 23–35, doi:10.1111/jlse.12033.
Hsu16
Hsu, L. (2016). Are You Ready to Use Technology in EFL Teaching? Examining Psychometric Properties of EFL Teachers’ Technological Pedagogical Content Knowledge (TPACK) Scale. International Research in Education, 4(1), 97–110, doi:10.5296/ire.v4i1.8740. Retrieved from http://www.macrothink.org/journal/index.php/ire/article/view/8740.
Ini19
Iniesto F.; Rodrigo C. (2019) YourMOOC4all: A Recommender System for MOOCs Based on Collaborative Filtering Implementing UDL. Proceedings EC-TEL 2019. Lecture Notes in Computer Science, 11722 (pp. 746–750), doi:10.1007/978-3-030-29736-7_80.
Ist15
Istrate, O.; Kestens, A. (2015). Developing and Monitoring a MOOC: The IFRC Experience. eLearning & Software for Education, 2, 576–583, doi:10.12753/2066-026X-15-179.
Kha15
Khalil, M.; Brunner, H.; Ebner, M. (2015). Evaluation Grid for xMOOCs. International Journal of Emerging Technologies in Learning, 10(4), doi:10.3991/ijet.v10i4.4653. Retrieved from http://online-journals.org/index.php/i-jet/article/view/4653.
Kim14
Kim, J.; Guo, P. J.; Seaton, D. T.; Mitros, P.; Gajos, K. Z.; Miller, R. C. (2014). Understanding in-video dropouts and interaction peaks in online lecture videos. Proceedings of the first ACM conference on Learning @ scale conference (pp. 31–40), doi:10.1145/2556325.2566237. Retrieved from https://dl.acm.org/citation.cfm?id=2566237.
Kin14
King, C.; Doherty, K.; Kelder, J.-A.; McInerney, F.; Walls, J.; Robinson, A.; Vickers, J. (2014). “Fit for Purpose”: A cohort-centric approach to MOOC design | «Adecuación al propósito»: Un enfoque centrado en el colectivo de estudiantes para el diseño de un curso en línea masivo y abierto (MOOC). RUSC Universities and Knowledge Society Journal, 11(3), 108–121, doi:10.7238/rusc.v11i3.2090.
Lau16
Laurillard, D. (2016). The educational problem that MOOCs could solve: Professional development for teachers of disadvantaged students. Research in Learning Technology, 24 [no page numbering], doi:10.3402/rlt.v24.29369.
Lee16
Lee, Y.; Rofe, J. S. (2016). Paragogy and flipped assessment: experience of designing and running a MOOC on research methods. Open Learning, 31(2), 116–129, doi:10.1080/02680513.2016.1188690.
Lee19
Lee, C.; de Vries, W. T. (2019). Sustaining a Culture of Excellence: Massive Open Online Course (MOOC) on Land Management. Sustainability, 11(12):3280, doi:10.3390/su11123280.
Lem19
Lemos-de-Carvalho-Júnior, G.; Raposo-Rivas, M.; Cebrián-de-la-Serna, M.; Sarmiento-Campos, J. A. (2017). Analysis of the pedagogical perspective of the MOOCs available in Portuguese. Revista Española de Pedagogía, 75(266), 101–119, doi:10.22550/REP75-1-2017-06.
Liu15
Liu, M., Kang, J.; McKelroy, E. (2015). Examining learners’ perspective of taking a MOOC: reasons, excitement, and perception of usefulness. Educational Media International, 52(2), 129–146, doi:10.1080/09523987.2015.1053289.
Liy19
Liyanagunawardena, T. R.; Lundqvist, K.; Mitchell, R.; Warburton, S.; Williams, S. A. (2019). A MOOC Taxonomy Based on Classification Schemes of MOOCs. European Journal of Open, Distance and e-Learning, 22(1), 85–103, doi:10.2478/eurodl-2019-0006.
Llo16
Lloyd, M.; Bahr, N. (2016). What Matters in Higher Education. A meta-analysis of a decade of learning design. Journal of Learning Design, 9(2), 1–13, doi:10.5204/jld.v9i2.280.
Lop15
Lopes, A. M. Z.; Pedro, L. Z.; Isotani, S.; Bittencourt, I. I. (2015). Quality evaluation of web-based educational software: A systematic mapping. Proceedings IEEE 15th International Conference on Advanced Learning Technologies: Advanced Technologies for Supporting Open Access to Formal and Informal Learning (ICALT 2015) (pp. 250–252), doi:10.1109/ICALT.2015.88.
Lop17
Lopes, A. P.; Soares, F. (2017). “Flipped Classroom with a MOOC”. An E-Learning Model into a Mathematics Course. In L. G. Chova, A. L. Martinez, I. C. Torres, Eds., INTED2017: 11th International Technology, Education and Development Conference (pp. 4643–4649), doi:10.21125/inted.2017.1092.
Man19
Manotas Salcedo, E.; Pérez-Rodríguez, A.; Contreras-Pulido, P. (2019). Proposal for design an instrument for video lectures analysis in MOOC. Alteridad, 14(1), 53–64, doi:10.17163/alt.v14n1.2019.04.
Mar15
Margaryan, A.; Bianco, M.; Littlejohn, A. (2015). Instructional quality of Massive Open Online Courses (MOOCs). Computers and Education, 80, 77–83, doi:10.1016/j.compedu.2014.08.005.
Mar14
Maringe, F.; Sing, N. (2014). Teaching large classes in an increasingly internationalising higher education environment: Pedagogical, quality and equity issues. Higher Education, 67(6), 761–782, doi:10.1007/s10734-013-9710-0.
Mar19
Marta-Lazo, C.; Osuna-Acedo, S.; Gil-Quintana, J. (2019). sMOOC: A pedagogical model for social inclusion. Heliyon, 5, e01326, doi:10.1016/j.heliyon.2019.e01326.
Mas15
Masterman, L. (2015). Does an Open world need new pedagogies or can existing pedagogies suffice? In Proceedings 14th European Conference on E-Learning (ECEL) (pp. 339–346). [no doi and no website].
McA14
McAleese, M. (2014). Realising the potential of quality in learning and teaching in higher education in Europe. Formazione & Insegnamento. Rivista Internazionale Di Scienze Dell’educazione E Della Formazione, 12(1), 19–24. [no doi: the published doi 10746/-fei-XII-01-14_02 is not working] Retrieved from https://ojs.pensamultimedia.it/index.php/siref/article/view/368.
Mil15
Miller, S. L. (2015). Teaching an Online Pedagogy MOOC. Journal of Online Learning & Teaching, 11(1), 104–119. [no doi] Retrieved from http://jolt.merlot.org/vol11no1/Miller_0315.pdf.
Mon15
Montgomery, A. P.; Hayward, D. V.; Dunn, W.; Carbonaro, M.; Amrhein, C. G. (2015). Blending for student engagement: Lessons learned for MOOCs and beyond. Australasian Journal of Educational Technology, 31(6), 657–670, doi:10.14742/ajet.1869. Retrieved from https://ajet.org.au/index.php/AJET/article/download/1869/1321.
Naj15
Najafi, H.; Rolheiser, C.; Harrison, L.; Håklev, S. (2015). University of Toronto instructors’ experiences with developing MOOCs. International Review of Research in Open and Distance Learning, 16(3), 233–255, doi:10.19173/irrodl.v16i3.2073.
Nie16
Niederman, F.; Butler, B. S.; Gallupe, R. B.; Tan, B. C. Y.; Urquhart, C. (2016). Electronic pedagogy and future university business models. Communications of the Association for Information Systems, 38(1), 157–170, doi:10.17705/1CAIS.03807.
Nyo13
Nyoni, J. (2013). The viral nature of massive open online courses (MOOCs) in open and distance learning: Discourses of quality, mediation and control. Mediterranean Journal of Social Sciences, 4(3), 665–672, doi:10.5901/mjss.2013.v4n3p665.
OhC19
Oh, E.G.; Chang, Y.; Park, S.W. (2019). Design review of MOOCs: application of e-learning design principles. Journal of Computing in Higher Education, doi:10.1007/s12528-019-09243-w.
Oss15
Ossiannilsson, E.; Altinay, F.; Altinay, Z. (2015). Analysis of MOOCs practices from the perspective of learner experiences and quality culture. Educational Media International, 52(4), 272–283, doi:10.1080/09523987.2015.1125985.
Pet16
Petronzi, D.; Hadi, M. (2016). Exploring the Factors Associated with MOOC Engagement, Retention and the Wider Benefits for Learners. European Journal of Open, Distance and E-Learning, 19(2), 112–129, doi:10.1515/eurodl-2016-0011.
Pil17
Pilli, O.; Admiraal, W. (2017). Students’ Learning Outcomes in Massive Open Online Courses (MOOCs): Some Suggestions for Course Design. Journal of Higher Education/Yüksekögretim Dergisi, 7(1), 46–71, doi:10.2399/yod.17.001.
Pil18
Pilli, O.; Admiraal, W.; Salli, A. (2018). MOOCs: Innovation or Stagnation? Turkish Online Journal of Distance Education, 19(3), 169–181, doi:10.17718/tojde.445121.
Ram17
Rambe, P.; Moeti, M. (2017). Disrupting and democratising higher education provision or entrenching academic elitism: towards a model of MOOCs adoption at African universities. Educational Technology Research and Development, 65, 631–651, doi:10.1007/s11423-016-9500-3.
Ram15
Ramírez Fernández, M. B.; Salmerón Silvera, J. L.; Meneses, E. L. (2015). Comparative between quality assessment tools for MOOCs: ADECUR vs Standard UNE 66181: 2012. Comparativa Entre Instrumentos de Evaluación de Calidad de Cursos MOOC: ADECUR vs Normas UNE 66181:2012, 12(1), 131–144, doi:10.7238/rusc.v12i1.2258.
Rol15
Rolfe, V. (2015). A Systematic Review of the Socio-Ethical Aspects of Massive Online Open Courses. European Journal of Open, Distance and E-Learning, 18(1), 52–71, doi:10.1515/eurodl-2015-0004.
Row19
Rowe, M.; Osadnik, C.R.; Pritchard, S.; Maloney, S. (2019). These may not be the courses you are seeking: a systematic review of open online courses in health professions education. BMC Medical Education, 19, 356, doi:10.1186/s12909-019-1774-9.
Ruh15
Ruhalahti, S.; Korhonen, A.-M. (2015). WANTED: MOOC PEDAGOGY. In L. Gomez-Chova, A. Lopez-Martinez, I. Candel-Torres, Eds., EDULEARN15: 7th International Conference on Education and New Learning Technologies (pp. 1791–1795). [no doi and no website].
Rui19
Ruiz-Palmero, J.; López-Álvarez, D.; Sánchez-Rivas, E.; Sánchez-Rodríguez, J. (2019). An Analysis of the Profiles and the Opinion of Students Enrolled on xMOOCs at the University of Málaga. Sustainability, 11, 6910, doi:10.3390/su11246910.
Saa16
Saalman, E. (2016). Active learning with the use of MOOCs at Chalmers University of technology – Experiences, challenges and future. Proceedings International Symposium on Project Approaches in Engineering Education, 6 (pp. 132–139). [no doi] Retrieved from http://paeeale.unb.br/_upload/PAEE_ALE_2016_proceedings.pdf.
Sac18
Sanchez-Gordon, S.; Luján-Mora, S. (2018). Technological Innovations in Large-Scale Teaching: Five Roots of MOOCS. Journal of Educational Computing Research, 56(5), 623–644, doi:10.1177/0735633117727597.
Saz18
Sanz-Martínez, L.; Er, E.; Dimitriadis, Y.; Martínez-Monés, A.; Bote-Lorenzo, M. L. (2018). Supporting Teachers in the Design and Implementation of Group Formation Policies in MOOCs: A Case Study. Journal of Universal Computer Science, 24(8), 1110–1130, doi:10.3217/jucs-024-08-1110.
Sin19
Singh, N. (2019). SWAYAM- Indian MOOCs: An Insider’s Perspective. Asian Journal of Distance Education, 14(1), 47–55. [no doi] Retrieved from http://asianjde.org/ojs/index.php/AsianJDE/article/view/301.
Sto16
Stoyanov, S.; de Vries, F. (2016). MOOCs pedagogical and didactical approaches. In D. Jansen, L. Konings, Eds. MOOCs in Europe (Proceedings of the conference WOW! Europe embraces MOOCS) (pp. 155–169). Maastricht: European Association of Distance Teaching Universities (EADTU). [no doi] Retrieved from http://dspace.ou.nl/handle/1820/7608.
Str17
Stracke, C. M. (2017). Open education and learning quality: The need for changing strategies and learning experiences. Proceedings of 2017 IEEE Global Engineering Education Conference (EDUCON) (pp. 1044–1048), doi:10.1109/EDUCON.2017.7942977.
Sun16
Sunar, A. S.; Abdullah, N. A.; White, S.; Davis, H. (2016). Personalisation in MOOCs: A critical literature review. Communications in Computer and Information Science, 583, 152–168, doi:10.1007/978-3-319-29585-5_9.
Tah15
Tahiri, J.; Bennani, S.; Khalidi Idrissi, M. (2015). Using an Analytical Formalism to Diagnostic and Evaluate Massive Open Online Courses. Proceedings 10th International Conference on Intelligent Systems: Theories and Applications (SITA) [no page numbering], doi:10.1109/SITA.2015.7358389.
Tah17
Tahiri, J. S.; Bennani, S.; Khalidi Idrissi, M. (2017). diffMOOC: Differentiated Learning Paths Through the Use of Differentiated Instruction within MOOC. International Journal of Emerging Technologies in Learning, 12(3), 197–218, doi:10.3991/ijet.v12i03.6527.
Tov15
Toven-Lindsey, B.; Rhoads, R. A.; Lozano, J. B. (2015). Virtually unlimited classrooms: Pedagogical practices in massive open online courses. Internet and Higher Education, 24, 1–12, doi:10.1016/j.iheduc.2014.07.001.
Ulr15
Ulrich, C.; Nedelcu, A. (2015). MOOCs in Our University: Hopes and Worries. Procedia-Social and Behavioral Sciences, 180 (The 6th International Conference Edu World 2014 “Education Facing Contemporary World Issues,” 7th–9th November 2014), (pp. 1541–1547), doi:10.1016/j.sbspro.2015.02.304.
VdP19
Van de Poël, J. F.; Verpoorten, D. (2019). Designing a MOOC – A New Channel for Teacher Professional Development? In: M. Calise, C. Delgado Kloos, J. Reich, J. Ruiperez-Valiente, M. Wirsing, Eds., Digital Education: At the MOOC Crossroads Where the Interests of Academia and Business Converge. EMOOCs 2019. Lecture Notes in Computer Science, 11475 (pp. 91–101), doi:10.1007/978-3-030-19875-6_11.
Vaz17
Vázquez-Cano, E.; López Meneses, E.; Sevillano García, M. L. (2017). The impact of the MOOC movement on social networks. A computational and statistical study on Twitter. Revista Española de Pedagogía, 75(266), 47–64, doi:10.22550/REP75-1-2017-03.
Wam18
Wambugu, P. W. (2018). Massive Open Online Courses (MOOCs) for Professional Teacher and Teacher Educator Development: A Case of TESSA MOOC in Kenya. Universal Journal of Educational Research, 6(6), 1153–1157, doi:10.13189/ujer.2018.060604.
Yal17
Yalid, A. T. A.; Bassiri, M.; Moussted, M.; Talbi, M. (2017). The instrumentalisation of the MOOCS vector of educational innovation and consecration of the academic training quality. Communication, Management and Information Technology-Proceedings of the International Conference on Communication, Management and Information Technology (ICCMIT 2016) (pp. 33–36). [no doi and no website].
You14
Yousef, A. M. F.; Chatti, M. A.; Schroeder, U.; Wosnitza, M. (2014). What drives a successful MOOC? An empirical examination of criteria to assure design quality of MOOCs. Proceedings-IEEE 14th International Conference on Advanced Learning Technologies, ICALT 2014 (pp. 44–48), doi:10.1109/ICALT.2014.23.
You15
Yousef, A.; Wahid, U.; Chatti, M.; Schroeder, U.; Wosnitza, M. (2015). The Effect of Peer Assessment Rubrics on Learners’ Satisfaction and Performance Within a Blended MOOC Environment. Proceedings of the 7th International Conference on Computer Supported Education (CSEDU-2015), (pp. 148–159), doi:10.5220/0005495501480159.

Appendix B. The Quality Framework for MOOCs with all Dimensions and Quality Criteria

The quality framework for MOOCs was developed in iterative cycles and validated through the assignment of all 103 selected studies from the systematic literature review (see Appendix C).
Figure A1. The complete quality framework for MOOCs with all dimensions and quality criteria.

Appendix C. The Quality Framework for MOOCs with all 103 Assigned Studies

The quality framework for MOOCs (see Appendix B) was developed in iterative cycles and validated through the assignment of all 103 selected studies from the systematic literature review.
Table A1. The quality framework for MOOCs with all 103 assigned studies.
1. ORGANISATIONAL: Lop15, McA14, Nie16, Sto16, Str17
  1.1 Institutional commitment: Fit14, Fre18, Ghi16, Kin14, Mas15
  1.2 Policies and licenses: AlI19, Hai19, Nyo13, Ram17
  1.3 Efficiency and cost: AlI19, Dan15, Row19, VdP19, Wam18
  1.4 Sustainability and society: AlI19, Ram17
2. TECHNICAL: Gam15, Lop15, Lau16, Llo16, Ram15, Yal17, You14
  2.1 Provider and platform: Bab16, Bae16, Dou14, Ebn16, Goo18, Hic17, Kin14, Pil17, Sin19
  2.2 User interface: Haa19, Kha15, Kim14, Sin19, You14
  2.3 Video development: Hic17, Kha15, Kim14, Man19, You14
  2.4 Technical interoperability: Bro14, Faq18, Kin14, Tah17
  2.5 Technology for social interaction: Amo14, Pil17, You14
3. SOCIAL: Gar17
  3.1 Communication: Bab16, Bar15, Bro14, Kha15, Kin14, Mil15, Naj15
  3.2 Collaboration: Fid16, Gam15, Mas15, Pet16, Saz18, Tah17
  3.3 Discussion: Lau16, Pet16
  3.4 Networking: Amo14, Bar15, Bas14, Sto16, Vaz17
  3.5 Sharing: Lau16, Pet16
4. PEDAGOGICAL: Alb18, Gam15, Lau16, Lop17, Mar14, Sto16
  4.1 Instructional design: Alo19, Con15, Dua17, Fer17, Fia18, Hli16, Fal16, Lem17, Liu15, Mar15, Naj15, OhC19, Oss15, Pet16, Pil17, Ram15, You14
    4.1.1 Learning objectives: Ist15, Kha15, Ram15
    4.1.2 Target group: Bas14, Hai19
    4.1.3 Design approaches: Amo14, Bor17
      4.1.3.1 Collaborative learning: Con15, Fid16, Oss15, Pet16, Ruh15, Saz18, Tah17, Tov15
      4.1.3.2 Learner-centred: Amo14, Oss15, Pil17, Ruh15
      4.1.3.3 Situated learning: Tov15
      4.1.3.4 Active learning: Saa16, Tov15
      4.1.3.5 Adaptive learning: Ard17, Amo14, Dan15, Hem17
      4.1.3.6 Learner’s role: Avs18, Bro14, Bou17, Mas15
      4.1.3.7 Teacher’s role: Avs16, Avs18, Bro14, Doo19, Mas15
      4.1.3.8 PBL: Mar15
      4.1.3.9 GBL: Amo14, Bai14, Bro14
      4.1.3.10 IBL: Bai14
      4.1.3.11 Content-oriented: Nie16
      4.1.3.12 Task-oriented: Pil17
      4.1.3.13 Skills-oriented: Amo14
    4.1.4 Course conditions: Kha15, Pil17
    4.1.5 Course description: Kha15
    4.1.6 Course content: Ada14, Bon17, Bor17, Con15, Ebn16, Gam15, Hic17, Ist15, Kim14, Kin14, Naj15, Oss15, Pet16, Ram15, Yal17
    4.1.7 Activities: Bon17, Con15, Ebn16, Kin14, Pet16, Ram15, Yal17
    4.1.8 Resource features: Dou14, Gam15, Kim14, Kin14, Liu15, Naj15, Pil17
      4.1.8.1 Accessibility: Ini19, Kim14
      4.1.8.2 Usability: Haa19, Kim14
    4.1.9 Assessment: Bon17, Cos18, Ist15, Pil17, Tah17, You14
      4.1.9.1 Peer assessment: Amo14, Eli19, Lau16, Kha15, You15
      4.1.9.2 Formative assessment: Con15
      4.1.9.3 Rewards: Con15, Dan15, Kha15, Naj15
  4.2 Learners’ perspective: Alu16, Che19, Dua17, Fia18, Goo18, Haa19, Rui19, Str17, Ulr15
    4.2.1 Relationship teacher and peer: Gre18, Mas15
    4.2.2 Self-assessment: Kha15, Naj15, Pil17
    4.2.3 Personal learning environment: Bar15, Coh15, Mon15
    4.2.4 Motivation: Cas15, Ebn16, Gam15, Kha15
    4.2.5 Learning style: Ard17, Bae16, Bal14, Bas14, Pil17, Rol15
      4.2.5.1 Individualized pathways: Ruh15, Sun16, Tah17
      4.2.5.2 Self-paced learning: Oss15, Sto16, Tah17
  4.3 Theoretical framework: Amo14, Oss15, Sac18, Tov15
    4.3.1 Connectivism: Amo14, Oss15
    4.3.2 (Socio-)constructivism: Tov15
  4.4 Learning processes: Gre18, Pil17
    4.4.1 Feedback: Eli19, Lee16, Mar15, Naj15
    4.4.2 Guidance to learner: Bai14, Gam15, Ruh15
    4.4.3 Tutoring: Ram15
    4.4.4 Interactivity: Bro14, Ebn16, Gam15, Liu15, Mil15
  4.5 MOOC classifications: Avs18, Bro14, Con15, Fid16, Gre18, Liy19, Mar19, Sac18, Tah17
  4.6 Context: Bra15, Hsu16, Mar15, Ram15
    4.6.1 Formal: Bor17, Fid16, Hai19
    4.6.2 Non-formal: Cin19, Fid16
    4.6.3 Informal: Fid16
  4.7 Evaluation: Amo14, Lee19, OhC19, Pil18, Row19, Tah15
    4.7.1 Learning analytics: Amo14, Cin19, You14
    4.7.2 Educational data mining: Amo14
    4.7.3 Drop-out rate: Che19, Dou14

References

  1. Gaskell, A.; Mills, R. The quality and reputation of open, distance and e-learning: What are the challenges? Open Learn. 2014, 29, 190–205.
  2. Stracke, C.M. Quality frameworks and learning design for open education. Int. Rev. Res. Open Distrib. Learn. 2019, 20, 180–203.
  3. UNESCO. Forum on the Impact of Open Courseware for Higher Education in Developing Countries: Final Report; UNESCO (CI-2002/CONF.803/CLD.1): Paris, France, 2002; Available online: http://unesdoc.unesco.org/images/0012/001285/128515e.pdf (accessed on 15 January 2021).
  4. UNESCO. 2012 Paris OER Declaration; UNESCO: Paris, France, 2012; Available online: www.unesco.org/new/fileadmin/MULTIMEDIA/HQ/CI/CI/pdf/Events/Paris%20OER%20Declaration_01.pdf (accessed on 15 January 2021).
  5. United Nations. Transforming Our World: The 2030 Agenda for Sustainable Development; United Nations: New York, NY, USA, 2015; Available online: http://www.un.org/ga/search/view_doc.asp?symbol=A/RES/70/1&Lang=E (accessed on 15 January 2021).
  6. Dillahunt, T.R.; Wang, B.Z.; Teasley, S. Democratizing higher education: Exploring MOOC use among those who cannot afford a formal education. Int. Rev. Res. Open Distrib. Learn. 2014, 15, 1–20.
  7. Shah, D. A Product at every Price: A Review of MOOC Stats and Trends in 2017. Class Central. 2018. Available online: https://www.class-central.com/report/moocs-stats-and-trends-2017 (accessed on 15 January 2021).
  8. Stracke, C.M.; Bozkurt, A. Evolution of MOOC designs, providers and learners and the related MOOC research and publications from 2008 to 2018. In Proceedings of the International Open & Distance Learning Conference (IODL19), Eskişehir, Turkey, 14–16 November 2019; pp. 13–20.
  9. Gasevic, D.; Kovanovic, V.; Joksimovic, S.; Siemens, G. Where is research on massive open online courses headed? A data analysis of the MOOC Research Initiative. Int. Rev. Res. Open Distrib. Learn. 2014, 15, 134–176.
  10. Stracke, C.M.; Downes, S.; Conole, G.; Burgos, D.; Nascimbeni, F. Are MOOCs Open Educational Resources? A literature review on history, definitions and typologies of OER and MOOCs. Open Prax. 2020, 11, 331–341.
  11. Veletsianos, G.; Shepherdson, P. A systematic analysis and synthesis of the empirical MOOC literature published in 2013–2015. Int. Rev. Res. Open Distrib. Learn. 2016, 17, 198–221.
  12. Christensen, C.M.; Raynor, M.; McDonald, R. What is disruptive innovation? Harv. Bus. Rev. 2015, 93, 44–53. Available online: https://hbr.org/2015/12/what-is-disruptive-innovation (accessed on 15 January 2021).
  13. De Moura, V.F.; De Souza, C.A. Características disruptivas dos Massive Open Online Courses (MOOCs): Uma análise exploratória no ensino superior Brasileiro [Disruptive features of Massive Open Online Courses (MOOCs): An exploratory analysis in Brazilian higher education]. Teoria E Prática Em Administração 2017, 7, 102–127.
  14. Rambe, P.; Moeti, M. Disrupting and democratising higher education provision or entrenching academic elitism: Towards a model of MOOCs adoption at African universities. Educ. Technol. Res. Dev. 2017, 65, 631–651.
  15. Yuan, L.; Powell, S. MOOCs and disruptive innovation: Implications for higher education. eLearning Pap. 2013, 33, 60–70.
  16. Stracke, C.M. The need to change education towards open learning. In The Need for Change in Education: Openness as Default? Stracke, C.M., Shamarina-Heidenreich, T., Eds.; Logos: Berlin, Germany, 2015; pp. 11–23. Available online: http://opening-up.education/publications/stracke-c-m-2015-the-need-to-change-education-towards-open-learning (accessed on 15 January 2021).
  17. Hayes, S. MOOCs and Quality: A Review of the Recent Literature; QAA: Gloucester, UK, 2015; Available online: http://publications.aston.ac.uk/26604/1/MOOCs_and_quality_a_review_of_the_recent_literature.pdf (accessed on 15 January 2021).
  18. Wang, Z.; Anderson, T.; Chen, L.; Barberà, E. Interaction pattern analysis in cMOOCs based on the connectivist interaction and engagement framework. Br. J. Educ. Technol. 2017, 48, 683–699.
  19. Amo, D.; Casany, M.J.; Alier, M. Approaches for quality in pedagogical and design fundamentals in MOOCs. TESI 2014, 15, 70–89. Available online: http://campus.usal.es/~revistas_trabajo/index.php/revistatesi/article/view/11653/12068 (accessed on 15 January 2021).
  20. Stracke, C.M. The quality of MOOCs: How to improve the design of open education and online courses for learners? In Learning and Collaboration Technologies. Novel Learning Ecosystems, LCT 2017, Part I, LNCS 10295; Zaphiris, P., Ioannou, A., Eds.; Springer: Berlin/Heidelberg, Germany, 2017; pp. 285–293.
  21. Zawacki-Richter, O.; Bozkurt, A.; Alturki, U.; Aldraiweesh, A. What research says about MOOCs—An explorative content analysis. Int. Rev. Res. Open Distrib. Learn. 2018, 19, 242–259.
  22. Deng, R.; Benckendorff, P.; Gannaway, D. Learner engagement in MOOCs: Scale development and validation. Br. J. Educ. Technol. 2020, 51, 245–262.
  23. Hansen, J.D.; Reich, J. Democratizing education? Examining access and usage patterns in massive open online courses. Science 2015, 350, 1245–1248.
  24. Margaryan, A.; Bianco, M.; Littlejohn, A. Instructional quality of massive open online courses (MOOCs). Comput. Educ. 2015, 80, 77–83.
  25. Reich, J. Rebooting MOOC research. Science 2015, 347, 34–35.
  26. Lowenthal, P.; Hodges, C. In search of quality: Using quality matters to analyze the quality of massive, open, online courses (MOOCs). Int. Rev. Res. Open Distrib. Learn. 2015, 16, 83–101.
  27. Stracke, C.M.; Tan, E. The quality of open online learning and education: Towards a quality reference framework for MOOCs. In Rethinking Learning in the Digital Age. Making the Learning Sciences Count: The International Conference of the Learning Sciences (ICLS) 2018; Kay, J., Luckin, R., Eds.; ISLS: London, UK, 2018; pp. 1029–1032.
  28. Stracke, C.M.; Tan, E.; Texeira, M.A.; Pinto, M.; Kameas, A.; Vassiliadis, B.; Sgouropoulou, C. Gap between MOOC designers’ and MOOC learners’ perspectives on interaction and experiences in MOOCs: Findings from the global MOOC quality survey. In Proceedings of the 18th IEEE International Conference on Advanced Learning Technologies (ICALT), Mumbai, India, 9–13 July 2018; Chang, M., Chen, N.-S., Huang, R., Kinshuk, Moudgalya, K., Murthy, S., Sampson, D.G., Eds.; IEEE Xplore: New York, NY, USA, 2018; pp. 1–5.
  29. Stracke, C.M.; Tan, E.; Texeira, A.; Pinto, M.; Vassiliadis, B.; Kameas, A.; Sgouropoulou, C.; Vidal, G. Quality Reference Framework (QRF) for the Quality of Massive Open Online Courses (MOOCs). 2018. Available online: http://www.mooc-quality.eu/QRF (accessed on 15 January 2021).
  30. Alario-Hoyos, C.; Estévez-Ayres, I.; Pérez-Sanagustín, M.; Kloos, C.D.; Fernández-Panadero, C. Understanding Learners’ Motivation and Learning Strategies in MOOCs. Int. Rev. Res. Open Distrib. Learn. 2017, 18, 119–137.
  31. Brooker, A.; Corrin, L.; De Barba, P.; Lodge, J.; Kennedy, G. A tale of two MOOCs: How student motivation and participation predict learning outcomes in different MOOCs. Australas. J. Educ. Technol. 2018, 34, 73–87.
  32. Garreta-Domingo, M.; Hernandez-Leo, D.; Sloep, P.B. Evaluation to support learning design: Lessons learned in a teacher training MOOC. Australas. J. Educ. Technol. 2018, 34, 56–77.
  33. Tawfik, A.A.; Reeves, T.D.; Stich, A.E.; Gill, A.; Hong, C.; McDade, J.; Pillutla, V.S.; Zhou, X.; Giabbanelli, P.J. The nature and level of learner–learner interaction in a chemistry massive open online course (MOOC). J. Comput. High. Educ. 2017, 29, 411–431.
  34. Zimmerman, T.D. Exploring learner to content interaction as a success factor in online courses. Int. Rev. Res. Open Distrib. Learn. 2012, 13, 152–165.
  35. Conole, G. MOOCs as disruptive technologies: Strategies for enhancing the learner experience and quality of MOOCs. Revista de Educación a Distancia (RED) 2013, 39, 1–17.
  36. Kizilcec, R.F.; Pérez-Sanagustín, M.; Maldonado, J.J. Self-regulated learning strategies predict learner behavior and goal attainment in massive open online courses. Comput. Educ. 2017, 104, 18–33.
  37. Gee, S. MITx, the Fallout Rate. 2012. Available online: http://www.i-programmer.info/news/150-training-a-education/4372-mitx-the-fallout-rate.html (accessed on 15 January 2021).
  38. Jordan, K. Initial trends in enrolment and completion of massive open online courses. Int. Rev. Res. Open Distrib. Learn. 2014, 15, 133–160.
  39. Stracke, C.M. Why we need high drop-out rates in MOOCs: New evaluation and personalization strategies for the quality of open education. In Proceedings of the 17th IEEE International Conference on Advanced Learning Technologies (ICALT 2017), Timisoara, Romania, 3–7 July 2017; Chang, M., Chen, N.-S., Huang, R., Kinshuk, Sampson, D.G., Vasiu, R., Eds.; IEEE Xplore: New York, NY, USA, 2017; pp. 13–15.
  40. Evans, B.J.; Baker, R.B.; Dee, T.S. Persistence patterns in massive open online courses (MOOCs). J. High. Educ. 2016, 87, 206–242.
  41. Glass, C.R.; Shiokawa-Baklan, M.S.; Saltarelli, A.J. Who takes MOOCs? New Dir. Inst. Res. 2016, 2015, 41–55.
  42. Terras, M.M.; Ramsay, J. Massive open online courses (MOOCs): Insights and challenges from a psychological perspective. Br. J. Educ. Technol. 2015, 46, 472–487.
  43. Guerra, L.; Ferrari, L. MOOC: Migliorare le opportunità dell’online collettivo [MOOC: Improving collective online opportunities]. In Atti Convegno Nazionale DIDAMATICA 2015. Studio Ergo Lavoro Dalla Società Della Conoscenza Alla Società Delle Competenze; AICA: Milano, Italy, 2015; pp. 43–50. Available online: http://www.didamatica2015.unige.it/wp-content/uploads/2016/01/Atti-Didamatica.pdf (accessed on 15 January 2021).
  44. Chiappe-Laverde, A.; Hine, N.; Martínez-Silva, J.A. Literature and practice: A critical review of MOOCs. Comunicar. Media Educ. Res. J. 2015, 44, 9–17.
  45. Liyanagunawardena, T.R.; Adams, A.A.; Williams, S.A. MOOCs: A systematic study of the published literature 2008–2012. Int. Rev. Res. Open Distrib. Learn. 2013, 14, 202–227.
  46. Calonge, D.S.; Shah, M.A. MOOCs, Graduate Skills Gaps, and Employability: A Qualitative Systematic Review of the Literature. Int. Rev. Res. Open Distrib. Learn. 2016, 17.
  47. Paton, R.M.; Fluck, A.E.; Scanlan, J.D. Engagement and retention in VET MOOCs and online courses: A systematic review of literature from 2013 to 2017. Comput. Educ. 2018, 125, 191–201.
  48. Sanchez-Gordon, S.; Luján-Mora, S. Research challenges in accessible MOOCs: A systematic literature review 2008–2016. Univers. Access Inf. Soc. 2018, 17, 775–789.
  49. Zhu, M.; Sari, A.; Lee, M.M. A systematic review of research methods and topics of the empirical MOOC literature (2014–2016). Internet High. Educ. 2018, 37, 31–39.
  50. Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G.; The PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Med. 2009, 6, e1000097.
  51. Hattie, J.A.C. Visible Learning. A Synthesis of over 800 Meta-Analyses Relating to Achievement; Routledge: London, UK, 2008.
  52. Brady, M.; Devitt, A.; Kiersey, R.A. Academic staff perspectives on technology for assessment (TfA) in higher education: A systematic literature review. Br. J. Educ. Technol. 2019, 50, 3080–3098.
  53. Gough, D.; Oliver, S.; Thomas, J. An Introduction to Systematic Reviews; Sage: London, UK, 2012.
  54. Lopes, A.M.Z.; Pedro, L.Z.; Isotani, S.; Bittencourt, I.I. Quality evaluation of web-based educational software: A systematic mapping. In Proceedings of the IEEE 15th International Conference on Advanced Learning Technologies: Advanced Technologies for Supporting Open Access to Formal and Informal Learning (ICALT 2015), Hualien, Taiwan, 6–9 July 2015; pp. 250–252.
  55. Nickerson, R.C.; Varshney, U.; Muntermann, J. A method for taxonomy development and its application in information systems. Eur. J. Inf. Syst. 2013, 22, 336–359.
  56. Bailey, K.D. A three-level measurement model. Qual. Quant. 1984, 18, 225–245.
  57. Stracke, C.M. The Quality Reference Framework for MOOC Design. In Proceedings of the 14th European Conference on Technology Enhanced Learning (EC-TEL 2019), LNCS 11722, Delft, The Netherlands, 16–19 September 2019; pp. 673–677.
  58. Wielenga-Meijer, E.G.A.; Taris, T.W.; Kompier, M.A.J.; Wigboldus, D.H.J. From task characteristics to learning: A systematic review. Scand. J. Psychol. 2010, 51, 363–375.
  59. Bernstrøm, V.H.; Houkes, I. A systematic literature review of the relationship between work hours and sickness absence. Work. Stress 2018, 32, 84–104.
  60. Naczenski, L.M.; De Vries, J.D.; Van Hooff, M.L.M.; Kompier, M.A.J. Systematic review of the association between physical activity and burnout. J. Occup. Health 2017, 59, 477–494.
  61. Nilsen, W.; Skipstein, A.; Østby, K.A.; Mykletun, A. Examination of the double burden hypothesis—A systematic review of work–family conflict and sickness absence. Eur. J. Public Health 2017, 27, 465–471.
Figure 1. Selection of studies using the PRISMA procedure.
Figure 2. Overview of the pedagogical dimension and its categories from the quality framework for MOOCs.
