Article

A Schema-Based Instructional Design Model for Self-Paced Learning Environments

1 Department of Medicine, Uniformed Services University of the Health Sciences, Bethesda, MD 20814, USA
2 School of Education, Johns Hopkins University, Baltimore, MD 21218, USA
3 College of Education, Sungkyunkwan University, Seoul 03063, Korea
* Author to whom correspondence should be addressed.
Educ. Sci. 2022, 12(4), 271; https://doi.org/10.3390/educsci12040271
Submission received: 28 December 2021 / Revised: 26 March 2022 / Accepted: 30 March 2022 / Published: 12 April 2022
(This article belongs to the Topic Advances in Online and Distance Learning)

Abstract

Although schema has been widely investigated over the past decades, little research has addressed the development of a systematic instructional design theory based on schema principles and processes. This study proposes a systematic schema-based instructional design model, comprising general and schema analysis, schema-based design and development processes, and techniques for evaluating a learner’s acquired schema. By synthesizing empirical studies, this study comprehensively reviews the literature on schema and foundational principles for learning. The goal of the study is to enrich the knowledge base of schema-based instructional design for different learning environments. The study therefore concludes with a discussion of how to utilize schema-based instructional design in self-paced learning environments, along with implications and further recommendations.

1. Introduction

Over recent decades, scholars and practitioners have enriched the knowledge base of schema and cognition in diverse fields. However, a systematic schema-based instructional design theory has not been fully developed. Large-scale self-paced online courses generally cannot guarantee that an instructor is always available. Within many learning contexts, especially self-paced online environments, learners struggle to acquire in-depth knowledge and skills due to limited methods for accurately assessing learning and the absence of personal communication [1,2]. Learners are likely to experience cognitive overload and perceive the resulting learning gaps as an overwhelming challenge. High cognitive load may cause learners to learn the content only to a shallow degree [3]. Thus, an online course designed without consideration of cognitive load and instructional design principles is likely to be inconsistent in its effectiveness [4].
Research on online learning retention strengthens the above argument. A review of the literature on retention in online courses reported that online courses have a 10–20% higher dropout rate than courses taught in a traditional classroom [5]. The situation is even more dire in large-scale self-paced online courses, where research shows the retention rate is approximately 5–10%, which calls for further work to improve students’ quality of learning in these contexts [6]. Furthermore, few studies have emphasized the availability of instructional design theories specifically for large-scale self-paced online learning. By synthesizing empirical studies on schema and foundational principles, we present a schema-based instructional design model.
This study is guided by the following research questions:
1. What principles of cognition can be utilized to design and develop both online and traditional classes?
2. What is a schema-based instructional design and development process?
3. What schema-based instructional techniques can be used for creating a quality learning environment?
4. How is a schema-based instructional design model applicable to different learning environments?

2. Focused Literature Review

We conducted a focused literature review in order to propose an instructional design model for developing a large-scale self-paced online learning environment. To address the four aforementioned research questions, we searched four databases: PsycINFO, EBSCOhost, Google Scholar, and SocINDEX. The search was not limited to the field of education, because schema theory has been discussed in several fields, such as psychology and social studies. The keywords for searching included “schema”, “schema-based approach”, “schema-based instruction”, “schema-based learning”, and “schema-based instructional design”. The initial search resulted in 5566 studies. We excluded duplicates so that only the most recent versions of the studies were included, rather than working papers or conference proceedings, resulting in 684 studies. We then excluded studies published in non-academic outlets such as magazines or newspapers.

3. Results

RQ1. What principles of cognition can be utilized to design and develop both online and traditional classes?

3.1. Principles of Schema-Based Instruction

Schema is a cognitive structure that helps to organize and process incoming information. Schema enables a human to discern the key features of an object from other irrelevant information; that is, prior knowledge and beliefs play a key role in the construction of more complex cognitive structures. Newly acquired information is paired with existing and readily available knowledge in the process of schema construction [7]. Sweller et al. highlighted that people elaborate their schema by incorporating elements from low-level schemas into higher-level schemas [3]. The cognitive schema theory posits that learners’ prior knowledge helps them to engage in in-depth cognitive processing [8].
The theoretical explanation of learners’ cognitive dissonance offers further insights into how to support schema elaboration in self-paced learning environments. The accumulation of new information does not necessarily result in schema elaboration [9]. Learners often struggle to assimilate new information into an existing schema when the new information is inconsistent with what they already know or believe [10]. Learning often entails such cognitive dissonance when learners are required to acquire new knowledge and skills [11]. When learners are perplexed by new information that conflicts with their prior knowledge, their working memory is overloaded, resulting in the failure of incorporating the new information. Instruction should be designed in a manner that allows learners to reduce this dissonance and achieve consonance with the external instructional guidance [9]. When learners are guided to devote their mental efforts to learning tasks, they are likely to accept the new information and elaborate their existing schema concerning the topic [12]. For example, Jermias found that learners willingly accepted new information that conflicted with their prior knowledge and beliefs when given opportunities to justify their choice for a new cost accounting system [12]. The study suggested that meaningful learning takes place when learners are asked to engage in reflection on their given learning tasks. For example, in-depth discussion following an instructor-led lecture could help learners elaborate their knowledge regarding a specific course topic.
Another approach designed to reduce the cognitive load imposed on learners’ working memory is to allow them to learn from multiple examples [13]. When learners develop a consistent and well-structured schema through learning with numerous examples, they can overcome cognitive dissonance in the face of new information [14]. A well-developed schema established by exposure to numerous examples, extensive practice, and instructional support allows a learner to use minimal working memory in the future. Such exposure enables learners to focus more on the critical features of an object rather than surface features, with an increased working memory capacity [15].
Instruction using visualized structures also facilitates a reduction in working memory load and schema elaboration regarding a concept. Visualized structures such as diagrams delineate semantic relations between the concepts and the procedural knowledge to be learned [16]. Visual structures have been recognized as helping learners attend to the key features of the objects they are learning about. For example, Jitendra et al. revealed that diagrams, compared with text-only materials, helped learners attend to a logical flow representing the process [17]. The results indicated that students who learned mathematical concepts with schema diagrams outperformed students who were taught without diagrams.
Furthermore, visual structures combined with cues or additional learner activities are even more effective in terms of knowledge acquisition. According to Scheiter and Eitel, when presented with diagrams combined with signals, students attended to key information more frequently than those learning only with diagrams [18]. Additional activities such as self-explanation contributed to the acquisition of transferable knowledge.
RQ2. What is a schema-based instructional design and development process?

3.2. Processes of Schema-Based Instructional Design

This section summarizes the literature on the instructional design process, emphasizing effective schema development in a large-scale online learning environment.

3.3. General Needs Analysis

Reigeluth stated that instructional design theories require two components: (1) instructional methods and (2) situations [19]. Situations are composed of: (a) desired learning outcomes and (b) instructional conditions, including: (i) learning, (ii) learner, (iii) learning environments, and (iv) development constraints.

3.4. Needs Analysis

Practically every instructional design theory should start by identifying the preconditions of the instructional context for optimal learning outcomes and actions. Therefore, the primary goal of the first stage of schema-based instruction is to facilitate a systematic process for establishing goals and identifying discrepancies between those goals and the present state in order to determine priorities for instructional action [20]. While the general needs analysis is similar to that of traditional instructional design theories, schema-based instruction adds an analysis process for explicating and representing experts’ know-how: Cognitive Task Analysis (CTA) for capturing an expert’s covert knowledge, schema hierarchization, and knowledge mapping.

3.5. Learner Analysis

Instructional methods are dependent on many factors, and the nature of the learners plays an important role in this regard. In some large-scale learning environments, the average age of a MOOC learner is 42, and most hold master’s degrees [21], while the National Center for Education Statistics (NCES) indicates that traditional college students are 18–24 years old [22]. It is therefore imperative to understand the nature of learners in order to create and provide effective instruction suited to the learners in a given context. Reigeluth identified three components: (1) prior knowledge, (2) learning strategies, and (3) motivations [19]. Considering the emergence of online education, it is also important to understand learners’ readiness for online learning. Hung, Chou, Chen, and Own proposed the Online Learning Readiness Scale (OLRS), composed of five dimensions, which we include as essential components requiring investigation before developing instruction [23]. Those dimensions are: (1) computer/Internet self-efficacy, (2) self-directed learning, (3) learner control, (4) online communication self-efficacy, and (5) motivation for learning.
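As an illustration of how such a learner analysis might be operationalized, the following Python sketch scores a hypothetical OLRS-style readiness survey. The dimension names follow Hung et al., but the item groupings, the 5-point scale, and all values are assumptions for illustration only:

```python
# Hypothetical responses: survey items grouped by OLRS dimension,
# each rated on an assumed 5-point scale.
responses = {
    "computer/Internet self-efficacy":    [4, 5, 4],
    "self-directed learning":             [3, 3, 4],
    "learner control":                    [2, 3, 3],
    "online communication self-efficacy": [5, 4, 4],
    "motivation for learning":            [5, 5, 4],
}

# Average the item ratings within each dimension to build a readiness profile.
profile = {dim: sum(items) / len(items) for dim, items in responses.items()}

# The lowest-scoring dimension suggests where instructional support is needed.
weakest = min(profile, key=profile.get)
print(weakest)  # prints "learner control" for the sample data above
```

A designer could run such a profile before course development to decide, for example, whether additional scaffolding for self-directed study is warranted.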

3.6. Learning Environment (Context) Analysis

Learning environments refer to the diverse physical locations, contexts, and cultures where students learn and interact with different people and content; thus, factors of the learning environment contribute to the learning task [24]. Students may further their learning through outside-of-classroom environments such as online education, open learning environments [25], or Massive Open Online Courses (MOOCs) [26]; researchers have categorized such distance/remote learning environments as Virtual Learning Environments (VLEs). In order to analyze a VLE context, Reigeluth proposed that the nature of a learning environment is composed of the following aspects: (1) the location, (2) the format (e.g., online vs. face-to-face), (3) the number of learning peers, and (4) the purpose of the learning environment (e.g., workplace, military, college, or government) [19]. In addition, VLEs commonly require additional functions for large-scale online learning; thus, a fifth aspect, an effective Learning Management System, is also required to facilitate and deliver content, manage students’ learning progress, and offer just-in-time training [27,28].

3.7. Performance Objective Analysis

Dick and Carey stated that performance objective analysis depends on analyses of the context, the learner, and the problems and their potential solutions [29]. Because schema-based instructional design focuses on higher-order thinking skills, understanding the cause of the problem allows this stage to offer more practical solutions and clearer performance objectives that serve the majority of the audience. The performance objective analysis should be conducted concurrently with instructional analysis and learner/context analysis.

3.8. Feasibility Analysis

Feasibility analysis, also referred to as feasibility study, is a process of ensuring that all of a project’s proposed outcomes are feasible to achieve [30]. In the field of management, feasibility analysis considers four primary aspects: (1) economic, (2) technical, (3) operational, and (4) schedule feasibility. By applying these concepts to an instructional design project, feasibility can represent a valuable tool for assessing the feasibility of the proposed timeline, technical infrastructure, and resources needed to accomplish the project’s outcomes [30].

3.9. Schema Analysis

Unlike traditional instructional design theories and models, schema-based instruction starts with a Cognitive Task Analysis (CTA) conducted by a learning designer in close collaboration with a domain expert. The CTA explicates the expert’s tacit knowledge and is effective in establishing a knowledge map of the domain [31]. Schema analysis is unique in comparison to traditional instructional design analysis in the sense that this process is used to explicate an expert’s behavioral and cognitive tasks.

3.10. Cognitive Task Analysis (CTA)

Cognitive Task Analysis (CTA) is a process of explicating what people (usually experts) know, the way they think, how they organize and structure their knowledge, and, most importantly, how they go about achieving their desired performance outcomes [31]. CTA should be performed before instructional design and development, especially when the student population is composed of adult learners. Considering that traditional instructional design theories primarily analyze behavioral tasks, conducting CTA can address tasks that are largely covert and nonprocedural in nature. There are four major analysis tools for conducting CTA: (1) behavioral analysis, (2) information-processing analysis, (3) goals, operators, methods, and selection rules (GOMS) [32], and (4) the critical decision method. The first two, behavioral and information-processing analysis, capture procedural knowledge, while the latter two focus on rule-based analysis.

3.11. Schema Hierarchical Analysis

Upon the completion of CTA, one may have a decent understanding of which, and how, covert and overt tasks/concepts relate to performing a certain task. With this, schema hierarchization is necessary in order to sequence how such organized forms of knowledge associate with each other. The hierarchical analysis is more of a bottom-up approach based on learning taxonomies [33]. According to Botvinick and Plaut, schema hierarchization starts by modeling naturalistic action with the goals of identifying the sequential subordinate actions involved in a task [34]. The architecture of the overall model includes: (1) perceptual input, (2) internal representation, and (3) actions; these three components thus interact with the environment.
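The hierarchy described above lends itself to a simple tree representation. The sketch below is a hypothetical illustration, not part of Botvinick and Plaut’s model: a schema hierarchy whose leaves, read left to right, are the ordered subordinate actions of a task.

```python
from dataclasses import dataclass, field

@dataclass
class SchemaNode:
    """One node in a schema hierarchy: a task or one of its subordinate actions."""
    name: str
    children: list["SchemaNode"] = field(default_factory=list)

    def sequence(self) -> list[str]:
        """Flatten the hierarchy into the ordered subordinate actions (the leaves)."""
        if not self.children:
            return [self.name]
        ordered = []
        for child in self.children:
            ordered.extend(child.sequence())
        return ordered

# Hypothetical example: a naturalistic task decomposed into subordinate actions.
make_coffee = SchemaNode("make coffee", [
    SchemaNode("add grounds", [SchemaNode("open pack"), SchemaNode("scoop grounds")]),
    SchemaNode("add water"),
    SchemaNode("brew"),
])
print(make_coffee.sequence())
```

Walking the tree in order recovers the sequential subordinate actions that the hierarchical analysis is meant to identify.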

3.12. Knowledge Mapping

The last schema analysis is knowledge mapping. Once the schema and its hierarchies—superordinate–subordinate relationships—are identified, learning designers should develop a knowledge map that provides a visual of the knowledge base (or schema). Knowledge mapping is a process whereby learning designers put pieces into place through visual references. It is an efficient and effective way to categorize and visualize “organizational knowledge” ([35], p. 83). Knowledge mapping has been widely used in the field of knowledge management [36]. It is a useful process for capturing both explicit (overt/behavioral) and implicit (covert/cognitive) knowledge [37]. Please see Table 1 for details.
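A knowledge map’s superordinate–subordinate relationships can be represented as a simple directed graph. The following sketch is purely illustrative; the concepts and their structure are assumed, not drawn from the cited studies:

```python
# Hypothetical knowledge map: each superordinate concept maps to its
# immediate subordinate concepts.
knowledge_map = {
    "fractions": ["equivalent fractions", "fraction addition"],
    "fraction addition": ["common denominators"],
}

def subordinates(concept: str, km: dict) -> list[str]:
    """All concepts reachable below `concept`, in depth-first order."""
    found = []
    for child in km.get(concept, []):
        found.append(child)
        found.extend(subordinates(child, km))
    return found

print(subordinates("fractions", knowledge_map))
```

Traversing the map in this way lets a designer check, for instance, that every subordinate concept is covered before its superordinate concept is assessed.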

4. Design and Development

Design and development focus on both learning experiences and materials [39]. Since the analyses yield a visualized knowledge map and schema hierarchies, as well as the nature of the learning contexts, learners, and desired outcomes, learning activities should foster: (1) activating, (2) constructing, and (3) automating schema, while also (4) updating/modifying existing schema through assimilation and accommodation, and (5) facilitating schema elaboration. If a learner has never encountered a lesson’s desired learning outcomes, creating a new schema is essential in order to mimic the knowledge system of an expert: the ways in which experts think, how they make decisions in particular situations, and how they achieve their desired goals. On the other hand, if learners already have similar experiences with, or knowledge of, the learning objectives, instruction should trigger and activate the existing schema for refresher learning [40]. In this section, we provide selected examples of instructional techniques and strategies for schema activation, construction, automation, modification, and elaboration.

4.1. Schema Activation

Schema activation is also referred to as knowledge activation. In this stage, instruction prompts learners to revisit or activate their individual knowledge or belief systems with the goal of identifying learning gaps or discrepancies. Researchers claim that one of the primary reasons for schema activation is to improve the comprehension of a text [41,42,43]. In a study conducted by Cho and Ma, schema activation had both short-term and long-term effects on increasing learners’ reading comprehension [44]. Educational psychologists have developed several approaches to schema activation, of which pre-reading tasks are among the most widely used [45]. Pre-reading activities help learners to build top-down semantic and structural information before reading a text. These pre-reading activities belong to a higher category called pre-organizers.

4.2. Schema Construction

When learners try to comprehend a text, they tend to associate it with prior knowledge and experiences for interpretation [45]. By relating the text with prior knowledge, learners can better understand an author’s message. As learners link new knowledge to existing knowledge, learning outcomes can be increased while also decreasing cognitive load [46]. While scholars have developed instructional strategies to help learners activate and employ prior knowledge and experiences for comprehension, schema theories rarely explain how learners modify and create a new schema for dealing with novel information or skills. Scholars have suggested four main methods for schema construction: (1) cases, (2) interactive video recordings, (3) hierarchical concept maps, and (4) Vee diagrams [39].

4.3. Schema Automation

Schema automation has been widely investigated by educational psychologists [3,47,48]. Schema automation is the opposite of conscious processing, in which individuals utilize a large amount of working memory to process presented concepts or information. An empirical study conducted by Kotovsky, Hayes, and Simon is a good example of the importance of schema automation [49]. The results of their study revealed that students learning a new concept had to spend their working memory—processing effort—on retrieving “rules” to solve the problem, while students who had already mastered the concept used their working memory to identify “patterns”. This mastery and familiarity with a concept comes from repeated practice and processes of strengthening that lead to automation [7]. Given this, researchers have proposed instructional techniques related to schema automation, among which we present methods that are reliably tested and empirically proven.
RQ3. What schema-based instructional techniques can be used for creating a quality learning environment?

4.4. Instructional Techniques for Schema Construction and Automation

Sweller and Levine introduced (1) goal-free problems in an effort to reduce the extraneous cognitive load generated by means–ends analysis [50]. An example of a goal-free mathematics problem replaces the specific goal statement with a non-specific one such as “calculate the value of as many variables as you can”. In this way, learners do not search for the “right answer”; rather, they practice reasoning toward “possible” answers. As no specific goal is identified in the problem, learners must find alternative problem-solving mechanisms that can be applied. (2) The worked example is another method that can promote schema automation and construction. Worked examples are also meant to remove means–ends searches, freeing working memory capacity by decreasing extraneous cognitive load [51]. Worked examples should facilitate active learning, focus on problems and solution steps that connect worked examples to existing knowledge, and minimize extraneous demands for the effective construction of knowledge [52]. However, worked examples may not force learners to study problems carefully; thus, worked examples should be mixed with conventional problems. (3) The completion-problem effect was introduced to fill the gaps of worked examples. Van Merrienboer and Krammer suggest that completion problems offer partial solutions, enabling learners to figure out the remaining steps [53]. With the clues provided, this method has been largely effective in design-oriented disciplines such as software development or architecture. (4) The split-attention effect is also important to consider when designing instruction. Chandler and Sweller conducted an empirical study on visual split attention [54]. The study found that requiring learners to split their attention is not conducive to learning.
When text and diagrams are presented simultaneously, learners’ cognitive load is evidently high, thus negatively impacting learning [55]. If possible, split attention should be avoided in practically all instructional situations [4]. Derived from the split-attention effect, scholars suggested (5) the modality effect when designing instruction. Rather than merely presenting split sources of information, integrating visuals such as diagrams and providing verbal sources of information in auditory form is conducive to learning [4]. Narration with instructional visuals is preferred over visuals with written text because it allows for independent visual and auditory processing, which lightens the cognitive load on working memory [56]. However, this effect has only been proven under split-attention conditions [57]. (6) The redundancy effect is also derived from the split-attention effect. In essence, the redundancy effect claims that working memory load can be reduced and schema automation/construction improved by eliminating redundant sources of information [58], especially when instruction is delivered through a screen [4]. Rather than reinforcing important information, redundant presentation of identical information can decrease performance because it interferes with processing key information [59]. Lastly, (7) implementing the variability effect is advisable. By experiencing the full spectrum of a concept, learners are able to identify implied problems, thus increasing the probability of transfer of learning [60].

4.5. Schema Modifications

Schema is difficult to change once constructed and automated [61]. However, as knowledge advances and skills develop, schema needs to be updated and maintained. Although schema modification is integral to the whole design and development process, reserving the last stage for a holistic evaluation of schema is advisable, because by then a learner is expected to possess a well-organized knowledge base of the task and thus a high level of meta-cognition [62]. Moreover, schema modification requires a large amount of mental energy, which can be counterproductive when learning a new task. Therefore, the schema modification process should occur during the last phase, facilitating activities that mainly focus on re-evaluating a learner’s existing schema in order to promote the efficient use of mental effort [63].
Schema modification can be explained by Jean Piaget’s theory of cognitive development, wherein his idea of accommodation is particularly applicable for schema modification. Piaget stated that an individual experiences a state of cognitive disequilibrium when a new concept or piece of information cannot be fitted into an existing schema [64]. A learner trying to balance her or his cognitive state through changing an existing schema is called accommodation, which allows the person to once again achieve a state of equilibrium [64].
While Piagetian theorists enriched the knowledge base of cognitive development, they did not articulate how to modify an established schema. In our review of the literature, we find that two learning strategies—unlearning/relearning and reflection—can be highly effective in modifying and updating schema. “Unlearning” means a planned, active, and deliberate process designed to generate critical rethinking [65], whereas “relearning” refers to the process of developing new understandings and skills regarding the same concepts [66]. An example is a task requiring learners to: (a) challenge what they think they know, (b) question everything, (c) step outside their circle of influence, (d) believe, and (e) accept and adapt. Reflection, meanwhile, has a diversity of meanings; in education, it is an intentional inquiry process regarding the motives, methods, materials, and outcomes of educational practices [67]. While many reflection models have been proposed, the common process follows three steps: (a) reflection, (b) planning, and (c) action.

4.6. Schema Elaboration

Once a schema is constructed and automated, instruction promoting schema elaboration should follow. The schema elaboration process is necessary for improving the resolution of the schema. Schema elaboration is a process of maturing, enriching, and refining the acquired schema through instruction in a different yet similar domain. This process is known as either an inter-schema interaction or a parallel schema interaction. A learner applies a trained schema to an immature schema in order to perform a self-directed or instructor-guided pattern recognition activity. The learner then identifies similarities and differences in a presented case or problem; in revisiting the acquired concepts and skills, the learner strengthens those understandings.
Schema elaboration is facilitated by an inter-schema or parallel schema interaction. Such interaction requires a partner schema that is immature, enabling a learner to compare and contrast an acquired schema with the immature one. We believe instructional methods for the partner schema should differ depending on the type of schema the learner compares with, or on what the instructor considers important: (1) a non-existing schema is a schema that a learner has never been exposed to; (2) an idling schema is a previously learned schema that is deeply tacit and has not been utilized frequently; (3) an incomplete schema is a mental model of which an individual has an overview understanding, yet for which schema automation has not been facilitated; (4) an inactive schema is one for which a learner has completed the automation and construction process but which is seldom used in work or real life; (5) an active schema is frequently used in real-life and work contexts and is therefore up to date and readily articulable. We believe there is an inversely proportional relationship between these five schema types and cognitive load, or element interactivity. Figure 1 portrays the proposed concept.
The inter-schema interaction should occur in a different domain of knowledge for optimal knowledge transfer while intra-schema interactions—upper–lower schema in the same domain of knowledge—should be facilitated for the schema automation phase. Figure 2 portrays the schema elaboration process.
Taken in total, Table 2 provides a summary of the instructional design process.

5. Evaluation

Given that mental effort invested in learning tasks is indicative of cognitive processing [70], it is possible to use cognitive load as a proxy for determining whether learners are engaged with a given learning task. Sweller, Van Merrienboer, and Paas suggest three major techniques for measuring students’ mental effort: (1) Subjective techniques can be used, with the assumption that students are able to introspect on their cognitive processes [3]. The major strengths of subjective techniques are their simplicity and applicability; however, the simplicity can also be a weakness, because results may provide a general rather than specific measure of cognitive load [60]. Self-report tools such as online questionnaires have been regarded as feasible methods for learners to report accurate numerical values representing the mental effort invested in learning tasks [71]. In self-paced online learning environments, survey tools designed to measure students’ cognitive load can easily be implemented; therefore, self-report tools have been the most frequently used for evaluation [72]. For example, Jung, Shin, and Zumbach carried out multiple surveys at three different phases of learning while students were participating in computer-supported collaboration in order to examine the effect of pre-training on the students’ cognitive load and achievement [21]. (2) Physiological techniques can be employed when students’ physiological reactions to learning tasks, such as heart rate or eye movement, can be collected. According to de Jong, physiological indicators can provide accurate data on learners’ responses to various mental efforts and cognitive loads [72].
Recent advances in wearable devices such as eye trackers, neuro-imaging techniques, and heart rate monitors equipped in mobile devices (e.g., smart watches) combined with the affordability of such technologies, have made it possible to collect students’ physiological data more easily than ever before in self-paced online learning contexts [72]. Lin et al. attempted to track students’ cognitive processes using an eye tracker while they were debugging in a computer programming class [73]. The sequential analysis of participants’ eye gazes revealed that high performers moved their eyes in a more logical manner than low performers. The study presented the potential advantages of such wearable devices for the implementation of physiological techniques to measure students’ mental efforts in self-paced online learning settings. (3) Task- and performance-based techniques are focused on analyzing student performance after learning tasks are completed. Objective task characteristics such as the number of attempts made for critical reasoning, time spent on a learning task, or test scores are considered traditional ways of measuring cognitive load without factoring subjective information [74]. Sweller et al. claimed that while these task- and performance-based techniques have been employed as reliable and easy-to-use methods, an increasing amount of attention is being paid to subjective and physiological techniques to measure learners’ on-task cognitive loads [3]. Given the fact that at-risk students need to be identified in the early phase of self-paced online courses in order to provide timely instructional support, the use of subjective or physiological techniques should be considered.
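To illustrate the subjective technique in a self-paced setting, the following sketch averages hypothetical self-report mental-effort ratings collected across course phases; a Paas-style 9-point scale and the flagging threshold are assumptions, not values from the cited studies:

```python
# Hypothetical self-report ratings on an assumed 9-point mental-effort scale,
# one rating per learning phase of a self-paced course.
ratings = {
    "alice": [3, 4, 3],
    "bob":   [8, 9, 8],
}

# Assumed cutoff: a mean at or above this suggests sustained cognitive overload.
THRESHOLD = 7.0

def mean_effort(scores: list[int]) -> float:
    """Average mental-effort rating across the recorded phases."""
    return sum(scores) / len(scores)

# Flag learners whose average reported effort suggests they may be at risk.
at_risk = [student for student, r in ratings.items() if mean_effort(r) >= THRESHOLD]
print(at_risk)  # prints ['bob'] for the sample data above
```

Because the ratings are self-reported, such a flag is only a coarse signal; as noted above, it could be triangulated with physiological or task-based measures before intervening.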

5.1. Schema Acquisition Test

Since a student’s schema is a complex structure, it is important to leverage multiple techniques to evaluate what schema they acquired during instruction. Ahn, Brewer, and Mooney stressed that a schema is supposed to apply to a large number of new instances beyond the one example from which learners acquire knowledge about a target concept; that is, the quality of a schema should be evaluated through the following strategic measures, which allow learners to demonstrate a comprehensive understanding of the concept [75]:
(1) Self-explanation of acquired knowledge enables students to articulate their understanding of a concept. Given prompts, learners self-explain and elaborate on their knowledge and the information provided [60]. While learners articulate the cognitive structures built into their schema, an evaluator can capture the degree of correspondence between the learners’ conceptual schema, its representation, and its interpretation [76]. Wong, Lawson, and Keeves reported that students who performed self-explanation using prompts achieved better results when studying a new geometry theorem than students who studied using a booklet [77]. These findings suggest that self-explanation increased students’ awareness of what they had learned and what they still did not understand.
(2) A checklist approach can also be considered when evaluating learners’ schemas. It differs from conventional test methods in that a checklist-based scoring method is designed to monitor which aspects of the knowledge taught during instruction were actually acquired. For example, Han et al. assessed students’ schemas by checking whether they correctly understood the difference between the variables and constraints of a given problem; this approach enables the comprehensive evaluation of a schema that would not be fully revealed through conventional tests [75].
(3) Concept mapping can also complement self-explanation methods, which often lead learners to omit a full account of their learning. Once learners draw a concept map representing the cognitive structure built into their schema, it is possible to evaluate how they understand the relationships between conceptual components and what needs to be taught to address knowledge gaps, which are often revealed by omitted components or disconnected linkages. Kim proposed a web-based tool designed to visualize how semantic components are linked in learners’ schemas [78]. The author demonstrated that the tool can convert learners’ writing into a visually interpretable structure, regardless of the language used. This digital visualization tool has great potential for large-scale online learning because individual learners can use it without an instructor. It thus becomes possible for instructors to see what their students typically do not understand at each phase of learning.
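The checklist approach above can be sketched computationally. In this minimal illustration (the item names are hypothetical, not the instrument used by Han et al. [75]), the score reports which aspects of the taught knowledge were acquired rather than a single total:

```python
# Illustrative sketch (hypothetical checklist items): checklist-based schema
# scoring that reports *which* aspects were acquired, not just a total.

CHECKLIST = [
    "identifies_variables",
    "identifies_constraints",
    "distinguishes_variables_from_constraints",
    "maps_problem_to_schema",
]

def score_schema(observed):
    """Split the checklist into satisfied and missing items for one
    learner; `observed` maps item names to booleans."""
    satisfied = [item for item in CHECKLIST if observed.get(item, False)]
    missing = [item for item in CHECKLIST if not observed.get(item, False)]
    return satisfied, missing

sat, mis = score_schema({"identifies_variables": True,
                         "identifies_constraints": True})
print(mis)  # the two aspects not yet acquired
```

The missing items directly indicate which aspects of the schema still need to be taught.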
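Concept-map evaluation can likewise be automated in a simple form. Assuming a concept map is encoded as a set of concept-to-concept links (an illustration under assumed data, not Kim’s tool [78]), omitted components and disconnected linkages appear as the difference between an expert map and a learner map:

```python
# Illustrative sketch (hypothetical concept maps): comparing a learner's
# concept map with an expert map to surface knowledge gaps.

def map_gaps(expert_links, learner_links):
    """Return (missing, extra) link sets: missing links mark knowledge
    gaps to address; extra links may signal misconceptions to probe."""
    expert = {frozenset(e) for e in expert_links}
    learner = {frozenset(e) for e in learner_links}
    return expert - learner, learner - expert

expert = [("schema", "construction"), ("schema", "automation"),
          ("automation", "practice")]
learner = [("schema", "construction")]
missing, extra = map_gaps(expert, learner)
print(len(missing))  # 2
```

Aggregating the missing links across many learners would show an instructor what students typically do not understand at each phase of learning.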

5.2. Measurement of Knowledge Transfer

One purpose of the proposed instructional design theory is to prepare learners for a new context in which they are required to apply what they learned to solve a novel problem. Near transfer is defined as transfer between similar contexts; for example, students learn to subtract two-digit numbers from one example and are then asked to solve another problem of the same kind involving different numbers. Far transfer, in contrast, refers to transfer between substantially different contexts [79,80]. For example, it may be considered far transfer when learners are asked to explain how to resolve an energy dispute between two countries after learning about game theory. While the distinction between “near” and “far” does not “imply any strictly defined metric of closeness” ([79,80], p. 4), facilitating the transfer of learning has been researchers’ main concern in the area of schema-based instruction [81]. This is because the construction of a schema implies that learners are equipped to link their learning to novel problems [82].
Surprisingly, despite a plethora of research on transfer, there is little to no research on how to measure when transfer occurs. Of the few studies that have been conducted, one way to determine whether a schema has been well constructed is to evaluate learner performance at (1) intervals. Learners who are intermittently evaluated during learning are more likely to dedicate cognitive effort to information processing, as well as to transfer what they learn into practice, than those who are evaluated only at the end of learning [83]. Research on transfer of learning has implemented multiple tests at multiple time points during and after learning to examine whether learners obtained transferable knowledge and skills [84,85].
Another method is (2) to expose learners to incrementally complex problems during instruction, in such a way that they obtain a sense of what they will actually do after completing the learning. Kaner and Padmanabhan reported that students who studied practice exercises containing numerous worked examples of software techniques of incrementally increasing difficulty were able to obtain the skills needed to resolve novel problems [86].
Some researchers have tested novel approaches for evaluating the transfer of learning. For example, Brown, McCracken, and O’Kane revealed that (3) reflective learning journals helped to facilitate and evaluate training transfer [87]. In their study, employees participating in a leadership development program were encouraged to keep a learning diary in order to reflect on their own learning from each module. A qualitative analysis of participants’ experiences showed that they not only engaged with the content in each module but also consciously implemented the strategies they had learned in their daily lives and in the workplace. Notably, the reflective learning journals were also a reliable source for evaluating transfer of training, as they clearly illustrated how participants changed in terms of applying what they learned from the training opportunity.
In order to properly investigate the transfer of learning in self-paced online learning environments, a combination of these methods is advisable. While multiple quizzes and/or practice exercises are provided at intervals as formative assessment, adding complexity to the practice exercises should also be considered to help learners acquire transferable knowledge and skills. Online learning platforms afford the ability to implement such strategies; for instance, in modern learning management systems, online tests can readily be set to be presented to learners after they complete each module, along with reflective questions.
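The combined strategy above can be sketched as a simple assessment plan: one quiz at the close of each module (the interval), with difficulty stepping up across modules (the incremental complexity) and a reflective prompt attached. Module names, difficulty levels, and field names here are hypothetical.

```python
# Illustrative sketch (hypothetical modules and levels): an end-of-module
# quiz per module, with difficulty increasing across modules and a
# reflective prompt attached to each quiz.

def build_assessment_plan(modules,
                          levels=("basic", "intermediate", "advanced")):
    """Assign each module a quiz whose difficulty steps up through
    `levels`, capping at the highest level."""
    plan = []
    for i, module in enumerate(modules):
        level = levels[min(i, len(levels) - 1)]
        plan.append({"module": module,
                     "quiz": f"{module}-quiz",
                     "difficulty": level,
                     "reflective_prompt": True})
    return plan

for item in build_assessment_plan(["M1", "M2", "M3", "M4"]):
    print(item["module"], item["difficulty"])
```

In an actual learning management system, the same schedule would be configured through the platform's release rules rather than code; the sketch only makes the combined design explicit.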

5.3. Heuristic Task Assessment (Task Expertise Assessment)

In order to evaluate the acquired schema, we believe that assessment of task expertise is beneficial. According to Reigeluth, task expertise differs from domain expertise: task expertise refers to “becoming an expert in a specific task, such as managing a project, selling a product, or writing an annual plan” ([19], p. 435), whereas domain expertise means “becoming an expert in a body of subject matter not tied to any specific task, such as economics, electronics, or physics (but often relevant to many tasks)” ([19], p. 435). We adopted Lee and Reigeluth’s concept of Heuristic Task Analysis (HTA), largely because of the abstract nature of a schema [88]. Utilizing HTA can be an effective method for instructors to assess a learner’s acquired schema [88].
Lee and Reigeluth provided a systematic guideline for performing an HTA [88]. HTA mainly addresses three aspects of heuristic knowledge, starting with (1) knowledge elicitation, which explicates domain-relevant knowledge directly from a student, followed by (2) knowledge analysis and (3) knowledge representation. Of the many knowledge elicitation methods, Cooke suggested three primary techniques: (a) observations and interviews, (b) process tracing, and (c) conceptual techniques [89]. Forsythe and Buchanan noted that a lack of guidance on how to select and apply elicitation methods has been a concern in the context of expertise performance assessment, but in an instructional environment an instructor can address this issue [90]. For knowledge analysis, we believe the non-human-based approaches suggested by Benbenishty are particularly beneficial for large-scale self-paced online learning situations [91].
Although rarely, researchers have used several knowledge representation methods, including semantic networks and augmented transition networks, to construct the system hierarchically. For the purpose of heuristic task assessment with instructional intent, we believe that facilitating learning activities prompts learners to represent their acquired schema in an organized manner that can be fully tutored and simulated [92]. Lee and Reigeluth also introduced different kinds of heuristic knowledge [88]. It is worth adopting their definitions so that our audience can develop instructional strategies for facilitating and creating heuristic task assessments. Table 3 summarizes the different kinds of heuristic knowledge.
RQ4. How is a schema-based instructional design model applicable to different learning environments?
In synthesizing our discussion on schema, we propose the following model: schema-based instructional design starts with (1) general and (2) schema analysis, followed by schema development and elaboration with relevant evaluation methods. Notably, the two analysis processes interact to initiate a schema activation process. We believe schema construction, automation, and modification occur iteratively by nature, followed by elaboration of the acquired schema through facilitating cross-context applications. To measure whether the acquired schema is well developed and elaborated, a learner should go through several schema-related evaluation processes. Figure 3 provides an overview of the process.

6. Implications

This model is intended to help learners generate major “takeaways” from the instruction process. With a concrete schema constructed, we believe that learners are able to retain acquired knowledge and skills longer than those receiving traditional teacher-guided instruction. For online learning in general, and large-scale online courses in particular, both retention and student engagement have been major concerns. The proposed instructional theory can offer a way to create a meaningful, engaging, and challenging learning environment, with the goal of improving retention rates. Most MOOCs are standardized with a large audience in mind; however, instructional design theory has not been highly emphasized [21]. Designing instruction based on schema theory can help learners reflect on their current schema of a knowledge domain. This can result in either the modification of schemas or the facilitation of relearning/unlearning processes. The fields of computer science and medicine refer to this phenomenon as “schema versioning” [93,94]. We believe the proposed theory is particularly effective for adult learners with prior knowledge and experiences; further empirical studies in this area can offer deeper insights.
Measuring schema has been difficult because schema is abstract in nature [3]. We thus call for further empirical studies that measure cognitive activities during the learning process. Schema-based instructional design can also be synergistic with learning analytics for real-time analysis and the development of automated feedback systems in large-scale self-paced online learning environments. With the results of learning analytics, researchers and practitioners can establish a sophisticated learning system, which can serve as a vehicle for elaborating research on personalized learning systems. Upon implementing schema-based instructional design models, researchers can identify the “common sticky points” of a knowledge domain by collecting datasets from online learners and analyzing the time spent on a specific problem and schema. This will feed into the revision and elaboration of the instruction process. This concept is similar to the “desirable difficulty” proposed in [95]. Large-scale e-learning (e.g., MOOCs) typically does not involve teachers, so content–learner interaction is especially important. Designing instruction with learners’ cognitive activities in mind may support learners’ knowledge acquisition processes.
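The “common sticky points” analysis suggested above could, for instance, be approximated from time-on-task logs by flagging problems whose median time far exceeds the course-wide median. This is a sketch under assumed data; the problem ids, timings, and threshold factor are hypothetical.

```python
# Illustrative sketch (hypothetical log data): flagging "common sticky
# points" as problems whose median time-on-task is at least `factor`
# times the median across all problems.

from statistics import median

def sticky_points(times_by_problem, factor=2.0):
    """Return problem ids whose median time-on-task meets or exceeds
    `factor` times the overall median across problems."""
    medians = {p: median(ts) for p, ts in times_by_problem.items()}
    overall = median(medians.values())
    return sorted(p for p, m in medians.items() if m >= factor * overall)

times = {
    "p1": [30, 40, 35],     # seconds per learner
    "p2": [200, 180, 240],  # most learners get stuck here
    "p3": [45, 50, 40],
}
print(sticky_points(times))  # ['p2']
```

The flagged problems would then be candidates for revised explanations, additional worked examples, or deliberately retained “desirable difficulty,” depending on the instructional intent.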

7. Conclusions

To date, much of the large-scale, self-paced online learning literature has focused on retention rates or content quality [6]. To address concerns about the absence of instructors, the design of a large-scale self-paced online course should focus on human cognition, engagement, flow, and motivation to promote learner-related benefits such as self-control or a sense of achievement [96]. To elaborate a schema-based instructional design model, we addressed seminal work on schema and cognition in relation to learning, such as Sweller’s Cognitive Load Theory [97], the measurement of mental effort [70], principles of principled (organized) knowledge [98], and schema and instructional design [3], along with schema development, automation, and elaboration processes. Interestingly, we found an abundance of literature on cognition and its application to learning; however, little to no research focused on methods of instruction, and on the systematic development of instructional design in particular. Thus, this article strove to provide instructional methods for facilitating schema-based instruction.
In reviewing the relevant literature, we found that both online and face-to-face courses should be designed to address the essential schema-related components, including schema hierarchization, schema construction, schema automation, schema activation, and/or schema interactions [3]. In theory, utilizing the principles of schema can help learners create, automate, elaborate, and modify schemas, thus establishing a desirable learning process that is applicable across domains and contexts. Such techniques are especially effective in large-scale self-paced online courses, largely due to the absence of an instructor and a learner’s high interactivity with, and heavy reliance on, content. For instance, current trends in MOOCs reflect the importance of self-directed learning and related learning activities for instructional efficiency. Schema-based instruction, with learning strategies that promote schema development, is effective not only for instruction but also for cultivating learners’ self-learning skills, such as meta-cognitive and problem-solving skills. Future studies should focus on refining the measurement of schema, preferably in an autonomous way and for a large audience. We believe that enriching the knowledge base of schema-based instructional design will contribute greatly to the development of effective personalized learning systems.

Author Contributions

Conceptualization, E.J. and D.K.; methodology, E.J. and D.K.; formal analysis, E.J. and R.L.; investigation, E.J. and D.K.; data curation, R.L.; writing—original draft preparation, E.J. and D.K.; writing—review and editing, R.L. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Ministry of Education of the Republic of Korea and the National Research Foundation of Korea (NRF-2021S1A5A8063131).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Disclaimer

The opinions and assertions expressed herein are those of the author(s) and do not necessarily reflect the official policy or position of the Uniformed Services University or the Department of Defense.

References

  1. Rovers, S.F.E.; Clarebout, G.; Savelberg, H.H.C.M.; De Bruin, A.B.H.; Van Merriënboer, J.J.G. Granularity matters: Comparing different ways of measuring self-regulated learning. Metacognit. Learn. 2019, 14, 1–19. [Google Scholar] [CrossRef] [Green Version]
  2. Wahyudi, R.; Malik, M. “Democratic” online courses as “glocalized communication” in English language teaching and applied linguistics: A proposal. J. Glob. Literacies Technol. Emerg. Pedagog. 2014, 2, 248–260. [Google Scholar]
  3. Sweller, J.; Van Merrienboer, J.J.G.; Paas, F. Cognitive Architecture and Instructional Design. Educ. Psychol. Rev. 1998, 10, 251–296. [Google Scholar] [CrossRef]
  4. Sweller, J. Cognitive load theory and educational technology. Educ. Technol. Res. Dev. 2020, 68, 1–16. [Google Scholar] [CrossRef]
  5. Bawa, P. Retention in online courses: Exploring issues and solutions—A literature review. Sage Open 2016, 6, 2158244015621777. [Google Scholar] [CrossRef] [Green Version]
  6. Hone, K.S.; El Said, G.R. Exploring the factors affecting MOOC retention: A survey study. Comput. Educ. 2016, 98, 157–168. [Google Scholar] [CrossRef] [Green Version]
  7. Van Merriënboer, J. How People Learn. The Wiley Handbook of Learning Technology; Wiley & Sons: Hoboken, NJ, USA, 2016; pp. 15–34. [Google Scholar]
  8. Britton, B.K.; Tesser, A. Effects of prior knowledge on use of cognitive capacity in three complex cognitive tasks. J. Verbal Learn. Verbal Behav. 1982, 21, 421–436. [Google Scholar] [CrossRef]
  9. Kalyuga, S. Knowledge elaboration: A cognitive load perspective. Learn. Instr. 2009, 19, 402–410. [Google Scholar] [CrossRef] [Green Version]
  10. Gorham, L.M.; Rumble, J.N.; Pounds, K.L.; Lindsey, A.B.; Irani, T. The Role of Dissonance and Schema: An Exploration of Florida Public Perception after the DWH Oil Spill. J. Appl. Commun. 2016, 100, 10. [Google Scholar] [CrossRef] [Green Version]
  11. Festinger, L. Cognitive dissonance. Sci. Am. 1962, 207, 93–106. [Google Scholar] [CrossRef]
  12. Jermias, J. Cognitive dissonance and resistance to change: The influence of commitment confirmation and feedback on judgment usefulness of accounting systems. Account. Organ. Soc. 2001, 26, 141–160. [Google Scholar] [CrossRef]
  13. Rittle-Johnson, B.; Star, J.R. Does comparing solution methods facilitate conceptual and procedural knowledge? An experimental study on learning to solve equations. J. Educ. Psychol. 2007, 99, 561–574. [Google Scholar] [CrossRef] [Green Version]
  14. Große, C.S.; Renkl, A. Effects of multiple solution methods in mathematics learning. Learn. Instr. 2006, 16, 122–138. [Google Scholar] [CrossRef]
  15. Hilbert, T.; Renkl, A.; Schworm, S.; Kessler, S.; Reiss, K. Learning to teach with worked-out examples: A computer-based learning environment for teachers. J. Comput. Assist. Learn. 2008, 24, 316–332. [Google Scholar] [CrossRef]
  16. Jitendra, A.K.; Hoff, K.; Beck, M.M. Teaching middle school students with learning disabilities to solve word problems using a schema-based approach. Remed. Spec. Educ. 1999, 20, 50–64. [Google Scholar] [CrossRef]
  17. Jitendra, A.K.; Star, J.R.; Starosta, K.; Leh, J.M.; Sood, S.; Caskie, G.; Hughes, C.L.; Mack, T.R. Improving seventh grade students’ learning of ratio and proportion: The role of schema-based instruction. Contemp. Educ. Psychol. 2009, 34, 250–264. [Google Scholar] [CrossRef]
  18. Scheiter, K.; Eitel, A. Signals foster multimedia learning by supporting integration of highlighted text and diagram elements. Learn. Instr. 2015, 36, 11–26. [Google Scholar] [CrossRef]
  19. Reigeluth, C.M. What is instructional design theory? In Instructional Design Theories and Models: A New Paradigm of Instructional Theory; Reigeluth, C.M., Ed.; Lawrence Erlbaum Associates: Mahwah, NJ, USA, 1999; Volume 2, pp. 5–29. [Google Scholar]
  20. Jonassen, D.H.; Tessmer, M.; Hannum, W.H. Task Analysis Methods for Instructional Design; Psychology Press: East Sussex, UK, 1998. [Google Scholar]
  21. Jung, E.; Kim, D.; Yoon, M.; Park, S.; Oakley, B. The influence of instructional design on learner control, sense of achievement, and perceived effectiveness in a supersize MOOC course. Comput. Educ. 2019, 128, 377–388. [Google Scholar] [CrossRef]
  22. Wolfberg, E. Free Application for Federal Student Aid (FAFSA); US Department of Education: Washington, DC, USA, 2012; pp. 305–306. Available online: http://www.fafsa.ed.gov/ (accessed on 1 March 2020).
  23. Hung, M.-L.; Chou, C.; Chen, C.-H.; Own, Z.-Y. Learner readiness for online learning: Scale development and student perceptions. Comput. Educ. 2010, 55, 1080–1090. [Google Scholar] [CrossRef]
  24. Choi, H.-H.; van Merriënboer, J.J.G.; Paas, F. Effects of the Physical Environment on Cognitive Load and Learning: Towards a New Model of Cognitive Load. Educ. Psychol. Rev. 2014, 26, 225–244. [Google Scholar] [CrossRef]
  25. Jegede, O.J.; Fraser, B.; Curtin, D.F. The development and validation of a distance and open learning environment scale. Educ. Technol. Res. Dev. 1995, 89–94. [Google Scholar]
  26. Jansen, D.; Schuwer, R.; Teixeira, A.; Aydin, C.H. Comparing MOOC Adoption Strategies in Europe: Results from the HOME Project Survey. Int. Rev. Res. Open Distrib. Learn. 2015, 16. [Google Scholar] [CrossRef]
  27. Reigeluth, C.M.; Aslan, S.; Chen, Z.; Dutta, P.; Huh, Y.; Lee, D.; Lin, C.Y.; Lu, Y.H.; Min, M.; Tan, V.; et al. Personalized integrated educational system: Technology functions for the learner-centered paradigm of education. J. Educ. Comput. Res. 2015, 53, 459–496. [Google Scholar] [CrossRef]
  28. Wang, F.; Hannafin, M.J. Design-based research and technology-enhanced learning environments. Educ. Technol. Res. Dev. 2005, 53, 5–23. [Google Scholar] [CrossRef]
  29. Dick, W.; Carey, L. The Systematic Design Of Instruction, 4th ed.; Harper Collins College Publisher: New York, NY, USA, 1996. [Google Scholar]
  30. Arain, M.; Campbell, M.J.; Cooper, C.L.; Lancaster, G.A. What is a pilot or feasibility study? A review of current practice and editorial policy. BMC Med. Res. Methodol. 2010, 10, 67. [Google Scholar] [CrossRef] [Green Version]
  31. Crandall, B.; Klein, G.; Klein, G.A.; Hoffman, R.R. Working Minds: A Practitioner’s Guide to Cognitive Task Analysis; MIT Press: Cambridge, MA, USA, 2006. [Google Scholar]
  32. Van Merriënboer, J.J. Training Complex Cognitive Skills: A Four-Component Instructional Design Model for Technical Training; Educational Technology: Englewood Cliffs, NJ, USA, 1997. [Google Scholar]
  33. Gagné, R.M.; Medsker, K.L. The Conditions of Learning: Training Applications; Harcourt Brace: Fort Worth, TX, USA, 1996. [Google Scholar]
  34. Botvinick, M.; Plaut, D.C. Doing Without Schema Hierarchies: A Recurrent Connectionist Approach to Normal and Impaired Routine Sequential Action. Psychol. Rev. 2004, 111, 395–429. [Google Scholar] [CrossRef] [Green Version]
  35. Lachner, A.; Pirnay-Dummer, P. Model-Based Knowledge Mapping. In Learning and Instruction in the Digital Age; Springer Science and Business Media LLC: New York, NY, USA, 2009; pp. 69–85. [Google Scholar]
  36. Vail, E.F. Knowledge Mapping: Getting Started with Knowledge Management. Inf. Syst. Manag. 1999, 16, 16–23. [Google Scholar] [CrossRef]
  37. Wexler, M.N. The who, what and why of knowledge mapping. J. Knowl. Manag. 2001, 3, 249–263. [Google Scholar] [CrossRef]
  38. Clark, R.E.; Estes, F. Cognitive task analysis for training. Int. J. Educ. Res. 1996, 25, 403–417. [Google Scholar] [CrossRef]
  39. Alvarez, M.; Risko, V. Schema Activation, Construction, and Application (ERIC Digest #46); ERIC Clearinghouse on Reading and Communication: Bloomington, IN, USA, 1989. [Google Scholar]
  40. Monsieurs, K.G.; De Regge, M.; Schelfout, S.; D’Hondt, F.; Mpotos, N.; Valcke, M.; Calle, P.A. Efficacy of a self-learning station for basic life support refresher training in a hospital: A randomized controlled trial. Eur. J. Emerg. Med. 2012, 19, 214–219. [Google Scholar] [CrossRef]
  41. Hudson, T. The effects of induced schemata on the “Short-Circuit” in second language reading: Non-decoding factors in second language reading performance. Lang. Learn. 1982, 32, 1–31. [Google Scholar] [CrossRef]
  42. Johnson, P. Effects on Reading Comprehension of Language Complexity and Cultural Background of a Text. TESOL Q. 1981, 15, 169. [Google Scholar] [CrossRef]
  43. Steffensen, M.S.; Joag-dev, C.; Anderson, R.C. A cross-cultural perspective on reading comprehension. Read. Res. Q. 1979, 15, 10–29. [Google Scholar] [CrossRef] [Green Version]
  44. Cho, Y.A.; Ma, J.H. The Effects of Schema Activation and Reading Strategy Use on L2 Reading Comprehension. Engl. Teach. 2020, 75, 49–68. [Google Scholar] [CrossRef]
  45. Maghsoudi, N. The impact of schema activation on reading comprehension of cultural texts among Iranian EFL learners. Can. Soc. Sci. 2012, 8, 196–201. [Google Scholar]
  46. Darejeh, A.; Marcus, N.; Sweller, J. The effect of narrative-based E-learning systems on novice users’ cognitive load while learning software applications. Educ. Technol. Res. Dev. 2021, 69, 2451–2473. [Google Scholar] [CrossRef]
  47. Schneider, W.; Shiffrin, R. Controlled and automatic human information processing: I. Detection, search and attention. Psychol. Rev. 1977, 84, 1–66. [Google Scholar] [CrossRef]
  48. Shiffrin, R.; Schneider, W. Controlled and automatic human information processing: II. Perceptual learning, automatic attending, and a general theory. Psychol. Rev. 1977, 84, 127–190. [Google Scholar] [CrossRef]
  49. Kotovsky, K.; Hayes, J.; Simon, H. Why are some problems hard? Evidence from Tower of Hanoi. Cogn. Psychol. 1985, 17, 248–294. [Google Scholar] [CrossRef]
  50. Sweller, J.; Levine, M. Effects of goal specificity on means–ends analysis and learning. J. Exp. Psychol. Learn. Mem. Cognit. 1982, 8, 463. [Google Scholar] [CrossRef]
  51. Sweller, J. Working memory, long-term memory, and instructional design. J. Appl. Res. Mem. Cognit. 2016, 5, 360–367. [Google Scholar] [CrossRef]
  52. Renkl, A. The Worked Example Principle in Multimedia Learning. In The Cambridge Handbook of Multimedia Learning; Cambridge University Press: Cambridge, UK, 2021; pp. 231–240. [Google Scholar]
  53. Van Merrienboer, J.J.G.; Krammer, H.P.M. Instructional strategies and tactics for the design of introductory computer programming courses in high school. Instr. Sci. 1987, 16, 251–285. [Google Scholar] [CrossRef] [Green Version]
  54. Chandler, P.; Sweller, J. The split-attention effect as a factor in the design of instruction. Br. J. Educ. Psychol. 1992, 62, 233–246. [Google Scholar] [CrossRef]
  55. Ayres, P.; Sweller, J. The Split-Attention Principle in Multimedia Learning. In The Cambridge Handbook of Multimedia Learning; Cambridge University Press: Cambridge, UK, 2021; pp. 199–211. [Google Scholar]
  56. Castro-Alonso, J.C.; Sweller, J. The Modality Principle in Multimedia Learning. In The Cambridge Handbook of Multimedia Learning; Cambridge University Press: Cambridge, UK, 2021; pp. 261–267. [Google Scholar]
  57. Mousavi, S.Y.; Low, R.; Sweller, J. Reducing cognitive load by mixing auditory and visual presentation modes. J. Educ. Psychol. 1995, 87, 319. [Google Scholar] [CrossRef]
  58. Chandler, P.; Sweller, J. Cognitive Load Theory and the Format of Instruction. Cognit. Instr. 1991, 8, 293–332. [Google Scholar] [CrossRef]
  59. Fiorella, L.; Mayer, R.E. Principles for Reducing Extraneous Processing in Multimedia Learning. In The Cambridge Handbook of Multimedia Learning; Cambridge University Press: Cambridge, UK, 2021; pp. 185–198. [Google Scholar]
  60. Sweller, J.; Van Merriënboer, J.J.G.; Paas, F. Cognitive Architecture and Instructional Design: 20 Years Later. Educ. Psychol. Rev. 2019, 31, 261–292. [Google Scholar] [CrossRef] [Green Version]
  61. Sandoval, J. Constructing conceptual change in consultee-centered consultation. J. Educ. Psychol. Consult. 2003, 14, 251–261. [Google Scholar] [CrossRef]
  62. Kitchner, K.S. Cognition, Metacognition, and Epistemic Cognition. Hum. Dev. 1983, 26, 222–232. [Google Scholar] [CrossRef]
  63. Osman, M.; Hannafin, M.J. Metacognition research and theory: Analysis and implications for instructional design. Educ. Technol. Res. Dev. 1992, 40, 83–99. [Google Scholar] [CrossRef]
  64. Piaget, J. Origins of Intelligence in the Child; Routledge & Kegan Paul: London, UK, 1936. [Google Scholar]
  65. Azmi, F.T. Mapping the learn-unlearn-relearn model. Eur. Bus. Rev. 2008, 20, 240–259. [Google Scholar] [CrossRef]
  66. Klein, E.J. Learning, unlearning, and relearning: Lessons from one school’s approach to creating and sustaining learning communities. Teach. Educ. Q. 2008, 35, 79–97. [Google Scholar]
  67. Norton, J.L. Creative thinking and the reflective practitioner. J. Instr. Psychol. 1994, 21, 139. [Google Scholar]
  68. Ausubel, D.P. The use of advance organizers in the learning and retention of meaningful verbal material. J. Educ. Psychol. 1960, 51, 267–272. [Google Scholar] [CrossRef]
  69. Graves, M.F.; Cooke, C.L.; Laberge, M.J. Effects of Previewing Difficult Short Stories on Low Ability Junior High School Students’ Comprehension, Recall, and Attitudes. Read. Res. Q. 1983, 18, 262. [Google Scholar] [CrossRef]
  70. Wierwille, W.W.; Eggemeier, F.T. Recommendations for Mental Workload Measurement in a Test and Evaluation Environment. Hum. Factors J. Hum. Factors Ergon. Soc. 1993, 35, 263–281. [Google Scholar] [CrossRef]
  71. Sewell, J.L.; Boscardin, C.K.; Young, J.Q.; Cate, O.T.; O’Sullivan, P.S. Measuring cognitive load during procedural skills training with colonoscopy as an exemplar. Med. Educ. 2016, 50, 682–692. [Google Scholar] [CrossRef]
  72. De Jong, T. Cognitive load theory, educational research, and instructional design: Some food for thought. Instr. Sci. 2010, 38, 105–134. [Google Scholar] [CrossRef] [Green Version]
  73. Lin, Y.T.; Wu, C.C.; Hou, T.Y.; Lin, Y.C.; Yang, F.Y.; Chang, C.H. Tracking students’ cognitive processes during program debugging—An eye-movement approach. IEEE Trans. Educ. 2015, 59, 175–186.
  74. Martin, S. Measuring cognitive load and cognition: Metrics for technology-enhanced learning. Educ. Res. Eval. 2014, 20, 592–621.
  75. Ahn, W.K.; Brewer, W.F.; Mooney, R.J. Schema acquisition from a single example. J. Exp. Psychol. Learn. Mem. Cognit. 1992, 18, 391–412.
  76. Mehmood, K.; Cherfi, S.S.-S. Evaluating the Functionality of Conceptual Models. In Advances in Conceptual Modeling—Challenging Perspectives. ER 2009; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2009; Volume 5833, pp. 222–231.
  77. Wong, R.M.; Lawson, M.J.; Keeves, J. The effects of self-explanation training on students’ problem solving in high-school mathematics. Learn. Instr. 2002, 12, 233–262.
  78. Kim, K. Graphical Interface of Knowledge Structure: A Web-Based Research Tool for Representing Knowledge Structure in Text. Technol. Knowl. Learn. 2019, 24, 89–95.
  79. Perkins, D.N.; Salomon, G. Transfer of learning. In International Encyclopedia of Education, 2nd ed.; Pergamon: Oxford, UK, 1992; pp. 6452–6457.
  80. Reed, S.K. Pattern recognition and categorization. Cogn. Psychol. 1972, 3, 382–407.
  81. Kalyuga, S. Enhancing transfer by learning generalized domain knowledge structures. Eur. J. Psychol. Educ. 2013, 28, 1477–1493.
  82. Fuchs, L.S.; Fuchs, D.; Finelli, R.; Courey, S.J.; Hamlett, C.L. Expanding Schema-Based Transfer Instruction to Help Third Graders Solve Real-Life Mathematical Problems. Am. Educ. Res. J. 2004, 41, 419–445.
  83. Espejo, J.; Day, E.A.; Scott, G. Performance evaluations, need for cognition, and the acquisition of a complex skill: An attribute–treatment interaction. Personal. Individ. Differ. 2005, 38, 1867–1877.
  84. Henry, L.A.; Messer, D.J.; Nash, G. Testing for Near and Far Transfer Effects with a Short, Face-to-Face Adaptive Working Memory Training Intervention in Typical Children. Infant Child Dev. 2014, 23, 84–103.
  85. Karbach, J.; Kray, J. How useful is executive control training? Age differences in near and far transfer of task-switching training. Dev. Sci. 2009, 12, 978–990.
  86. Kaner, C.; Padmanabhan, M. Practice and transfer of learning in the teaching of software testing. In Proceedings of the Annual Conference on Software Engineering Education and Training (CSEE&T), Dublin, Ireland, 3–5 July 2007; pp. 157–166.
  87. Brown, T.; McCracken, M.; O’Kane, P. Don’t forget to write: How reflective learning journals can help to facilitate, assess and evaluate training transfer. Hum. Resour. Dev. Int. 2011, 14, 465–481.
  88. Lee, J.-Y.; Reigeluth, C.M. Formative research on the Heuristic Task Analysis process. Educ. Technol. Res. Dev. 2003, 51, 5–17.
  89. Cooke, N.J. Varieties of knowledge elicitation techniques. Int. J. Hum. Comput. Stud. 1994, 41, 801–849.
  90. Forsythe, D.E.; Buchanan, B.G. Knowledge acquisition for expert systems: Some pitfalls and suggestions. IEEE Trans. Syst. Man Cybern. 1989, 19, 435–442.
  91. Benbenishty, R. An Overview of Methods to Elicit and Model Expert Clinical Judgment and Decision Making. Soc. Serv. Rev. 1992, 66, 598–616.
  92. Wenger, E. Artificial Intelligence and Tutoring Systems; Elsevier B.V.: Amsterdam, The Netherlands, 1987.
  93. Roddick, J.F. A survey of schema versioning issues for database systems. Inf. Softw. Technol. 1995, 37, 383–393.
  94. Nordahl, H.M.; Holthe, H.; Haugum, J.A. Early maladaptive schemas in patients with or without personality disorders: Does schema modification predict symptomatic relief? Clin. Psychol. Psychother. 2005, 12, 142–149.
  95. Yue, C.L.; Castel, A.D.; Bjork, R.A. When disfluency is—And is not—A desirable difficulty: The influence of typeface clarity on metacognitive judgments and memory. Mem. Cognit. 2013, 41, 229–241.
  96. Kop, R.; Fournier, H.; Mak, J.S.F. A pedagogy of abundance or a pedagogy to support human beings? Participant support on massive open online courses. Int. Rev. Res. Open Distrib. Learn. 2011, 12, 74–93.
  97. Sweller, J. Cognitive load theory, learning difficulty, and instructional design. Learn. Instr. 1994, 4, 295–312.
  98. Alexander, P.A. The Development of Expertise: The Journey From Acclimation to Proficiency. Educ. Res. 2003, 32, 10–14.
Figure 1. A relationship between the five types of schema and element interactivity.
Figure 2. Schema elaboration process.
Figure 3. Schema-based instructional design model.
Table 1. General and Schema Analysis for Designing Large-Scale Online Learning Environments.

| Phase | Stage | Key Objective | Reference |
| --- | --- | --- | --- |
| General Analysis | Needs Analysis | Identifying preconditions of the instructional context for optimal learning outcomes and actions; identifying discrepancies between the goals and the present state | [20] |
| | Learner Analysis | Identifying learners’ (1) prior knowledge, (2) learning strategies, (3) motivations, (4) computer/Internet self-efficacy, (5) self-directed learning, (6) learner control, and (7) online communication self-efficacy | [19,22,23,25] |
| | Learning Environment (Context) Analysis | Understanding/identifying the nature of the learning environment: (1) the location, (2) the format (e.g., online vs. face-to-face), (3) the number of learning peers, (4) the purpose of the learning environment, and (5) the Learning Management System (LMS) | [19,24] |
| | Performance Objective Analysis | Synthesizing three primary components, (1) contexts, (2) learners, and (3) problems with potential solutions, to create effective performance objectives | [29] |
| | Feasibility Analysis | Identifying the four primary aspects of feasibility: (1) economic, (2) technical, (3) operational, and (4) schedule (timeline) | [30] |
| Schema Analysis | Cognitive Task Analysis | Explicating what experts know, how they organize and structure their knowledge, and how they think to achieve the desired performance outcomes; conducting (1) behavioral task analysis to capture overt actions and (2) procedural task analysis to capture covert actions/concepts | [31,38] |
| | Schema Hierarchical Analysis | Performing schema hierarchization, starting with modeling naturalistic action to identify the sequential subordinate actions involved in a task; understanding the architecture of the overall model, including (1) perceptual input, (2) internal representation, and (3) actions | [33,34] |
| | Knowledge Mapping | Performing a knowledge map analysis to provide a visual representation of the knowledge base; creating a knowledge map of a schema following the nine-step process suggested by researchers | [35,36,37] |
Table 2. Schema-Based Instructional Design and Development Process, Goals, Objectives, and Strategies.

| Stage | Key Objective(s) | Instructional Strategies | Reference |
| --- | --- | --- | --- |
| Schema activation | Activating a learner’s prior knowledge and experiences to accelerate the learning process; helping novice learners quickly grasp an overview of a concept or topic | Pre-reading activities; pre-organizers; advance organizers; previews; thematic organizers | [39,41,42,43,44,45,68,69] |
| Schema construction | Assisting learners with self-initiated instructional activities to accelerate schema construction; exposing learners to a framework with themes for schema construction | Cases; interactive video recordings; hierarchical concept maps; Vee diagrams | [39,46] |
| Schema automation | Helping learners solve problems effortlessly; helping learners automate acquired schemas | Goal-free problems; worked examples; completion problems; the split-attention effect; the modality effect; the redundancy effect; the variability effect | [4,7,49,51,53,60] |
| Schema modification | Helping learners practice assimilation-accommodation processes for updating schemas; helping learners recognize mismatches between their existing schemas and reality; training learners to unlearn in order to learn and relearn | Unlearning/relearning; reflection | [63,64,66] |
| Schema elaboration | Enabling a learner to mature, saturate, or sophisticate the acquired schema through instruction in a different yet similar domain | Pattern recognition; parallel schema interaction | [22] |
Table 3. Kinds of Heuristic Knowledge.

| Kind of Knowledge | Definition |
| --- | --- |
| Guidelines | Prescriptive principles or “rules of thumb” that a task expert (student) uses to attain the goals for the specific performance of the task |
| Explanatory models | A set of related reasons constituting a causal model that explains why the guidelines work |
| Descriptive models | A set of related causal relationships characterizing the phenomena or objects with which the expert works (as opposed to the activities the expert performs) |
| Metacognitive decision rules | A set of rules the expert (student) uses to decide when to use which steps, guidelines, and descriptive models during the specific performance of the task being analyzed |
Reprinted with permission from ref. [53]. Copyright 2003 Association for Educational Communications and Technology.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Jung, E.; Lim, R.; Kim, D. A Schema-Based Instructional Design Model for Self-Paced Learning Environments. Educ. Sci. 2022, 12, 271. https://doi.org/10.3390/educsci12040271