Article

The Academic System Influence on Instructional Change: A Conceptual Systems Dynamics Model of Faculty Motivation to Adopt Research-Based Instructional Strategies (RBIS)

by Juan Manuel Cruz-Bohorquez 1,*, Stephanie G. Adams 2 and Flor Angela Bravo 3
1 Experiential Engineering Education Department, Rowan University, Glassboro, NJ 08028, USA
2 Erik Jonsson School of Engineering & Computer Science, University of Texas at Dallas, Richardson, TX 75080, USA
3 Universidad Nacional de Colombia—Sede de La Paz, La Paz—Cesar 202017, Colombia
* Author to whom correspondence should be addressed.
Educ. Sci. 2024, 14(5), 544; https://doi.org/10.3390/educsci14050544
Submission received: 24 April 2024 / Revised: 11 May 2024 / Accepted: 11 May 2024 / Published: 17 May 2024
(This article belongs to the Special Issue Challenges of Project Based Learning (PBL) in Engineering Education)

Abstract
Many universities have implemented initiatives to drive instructional change, yet their success has often been limited due to a lack of recognition of academia as a complex dynamic system. This paper explores how the interconnected and dynamic nature of academic systems influences faculty motivation to adopt instructional innovations, such as project-based learning (PBL) and small group collaborations (SGCs). We present a Conceptual Systems Dynamics Model (CSDM) that illustrates these interconnections, demonstrating how systemic factors create feedback loops that either reinforce or hinder faculty motivation and other related factors. These loops, represented as Causal Loop Diagrams (CLDs), were derived from literature reviews and qualitative data obtained from interviews and focus groups involving 17 faculty and administrators within an Engineering Department at a research university in South America. The paper identifies thirteen CLDs, comprising seven reinforcing dynamics that positively influence faculty motivation and six balancing dynamics that exert negative pressure. Using empirical evidence and analysis, we describe how the systemic factors influence faculty motivation, and how shifts in motivation reciprocally impact these interconnected factors. By elucidating the complex dynamics at play, this research contributes to a deeper understanding of how to promote sustainable instructional change within academic institutions.

1. Introduction

For decades we have been hearing calls for instructional change in engineering education. These calls aim to increase the pedagogical quality of our learning environments, particularly by increasing the adoption of Research-Based Instructional Strategies (RBIS) [1,2,3,4]. As their name indicates, RBIS are teaching practices that research shows are effective in improving students’ learning. However, the various strategies that universities have taken to respond to the calls for instructional change, that is, the calls for increasing the adoption of RBIS, have yielded low to moderate success [5,6]. For example, McKenna, Froyd [7] found that change initiatives focus on faculty but overlook other participants in the educational ecosystem. Henderson, Beach [5] found that the communities that study and lead change (i.e., colleges/schools of engineering administrators, higher education researchers, and universities’ centers for teaching and learning) are isolated from each other. Finelli, Daly [8] suggested that these communities should be integrated around the characteristics of the local context, such as a particular school of engineering. Other authors argued that the lack of success occurs because change efforts have been driven by implicit, tacit, simplistic, or not-grounded-in-research theories of change [6,9] or because the initiatives advocate for single interventions as the solution to the problem [10]. These different reasons suggest that success in change initiatives depends on the integration of several factors and change agents’ actions and that change does not occur in giant steps [11].
Nonetheless, there is one key variable that directly and ultimately determines the success of instructional change initiatives: the faculty motivation to adopt RBIS in their courses. Even if university or college-wide policies mandate such a change, it ultimately relies on individual professors being willing to change their practices [12]. Professors will change their practices only if they are motivated to do so. Motivation, as its name indicates, represents the motion or engine that drives and sustains individual physical and mental activity, behaviors, and verbalizations [13]. In short, instructional change will be unsuccessful if faculty decide not to adopt and sustain such strategies in their classrooms. Nonetheless, we are not arguing that faculty motivation occurs in isolation, or that it depends solely on the individual; rather, we understand that faculty motivation is affected by external and internal factors of the academic system that dynamically interact to reinforce or balance the decision to adopt RBIS in their courses [14]. In summary, faculty motivation is dynamic and affected by the complexity of the academic system. Therefore, understanding how the system influences faculty motivation is key to understanding instructional change [15]. The reader can note that we use the term faculty motivation as an umbrella term when referring to faculty motivation to adopt two specific RBIS: Project-based learning (PBL) and small group collaborations (SGCs). We will explain later why we chose these two strategies.
The sheer number of factors and their intricacies hint at another perspective that could explain this low success in change initiatives: that academia is a complex system [16]. As such, it does not have isolated drivers or root causes that are individually capable of generating change. Therefore, facilitating instructional change in engineering education requires an approach that acknowledges the complex nature of the academic system and explains how everything is connected to everything else [17,18]. This research uses one such approach: creating a Conceptual System Dynamics Model (CSDM) [17] to explore how these multiple factors interact dynamically to reinforce or control instructional change. We followed a process suggested by Sterman [17] with emphasis on its qualitative approach to data collection and analysis [19,20]. In essence, this method aims to understand the behavior of complex systems and to recognize a range of feedback loops (i.e., dynamics) operating within them.
The purpose of this paper is to introduce a Conceptual Systems Dynamics Model (CSDM) that elucidates and examines the interactions among multiple factors influencing faculty motivation, and vice versa. Our contribution lies in the thorough analysis of the academic system wherein we identify these factors, elucidate their causal relationships, and, importantly, uncover how these relationships intertwine to form feedback loops. In essence, we have identified and interconnected various factors, discerned the dynamics that emerge from these interconnections, and explored their impact on faculty motivation, as well as how faculty motivation reciprocally influences these factors. Furthermore, we show how these dynamics are interconnected with one another, resulting in a non-linear effect on faculty motivation. These dynamics are depicted as Causal Loop Diagrams (CLDs) which, when integrated, constitute the CSDM.
In this paper, we explored how the dynamics of the factors of the academic system influence faculty motivation to adopt two specific RBIS: Project-based learning (PBL) and small group collaborations (SGCs) (from now on, we will call it faculty motivation). We selected these two RBIS because they provide a rich case to illustrate the dynamics of an instructional practice that is highly beneficial but difficult to implement. On one hand, the literature suggests PBL and SGCs are beneficial to develop engineering and professional skills [21,22,23], are highly tied to skills used in engineering practice [24], and can be applied in a variety of subjects [25]; on the other hand, they present several barriers to their adoption at the individual and institutional level, such as the wide variation in their implementations and the time required for their training, implementation, and assessment [26,27].
The proposed model was built through an iterative process of systematically reviewing the literature and gathering data from semi-structured interviews and a focus group with purposely selected faculty members at an engineering department. The analysis of these data led to a model consisting of thirteen CLDs, comprising seven reinforcing dynamics that positively influence faculty motivation and six balancing dynamics that exert negative pressure.
We start this article with an introduction to the theory we used to understand faculty motivation and a brief review of the factors that affect it. Later, we present the methods to elicit, analyze, and describe the model. Then we introduce the model by describing each feedback loop, supported by the literature and qualitative data, and how each loop integrates with the previous ones until they form the CSDM.

1.1. Theoretical Framework

To identify how faculty motivation is influenced by the academic system, we used complex systems theory as a framework to determine and explore how and why certain factors affect faculty motivation to enact change in their instruction. In this theory, complexity refers to the intrinsic interconnectedness and interdependence of elements within a system [18,28] and it encompasses three basic tenets:
(1) Reality is composed of multiple intertwined and dynamic systems [28];
(2) These systems display emerging characteristics that cannot be fully understood by examining each part separately or in isolation [28]. That is, the behavior of the whole cannot be fully explained by analyzing its components in isolation [18] or by breaking it down into simpler parts [28];
(3) These systems involve inherent contradictions, uncertainties, and non-linearities [28]:
  • Contradictions, because, when facing change, factors in a complex system can be drivers and barriers simultaneously;
  • Uncertainties, because complex systems are policy-resistant by nature [16,17], which means that policies intended for positive outcomes could lead to unintended consequences [17,18];
  • Non-linearities, because actions in one part of the system generate reactions in another part of the system that are not directly tied to those initial actions [17,18]; hence, small changes can have significant and unexpected effects on the overall system [18], and increases in outcomes are not proportional to (or have a low correlation with) increases in what causes those outcomes [29].
In this work, describing and modeling the feedback loops is a way to model the non-linearities resulting from the interactions between elements of the academic system. The loops can also describe the emergent contradictions of such interactions by showing how driving factors can become barriers in the long run under certain conditions, and by exposing the unintended consequences of instructional change policies.
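To make the distinction between reinforcing and balancing dynamics concrete before they appear throughout the model, the short sketch below (in Python) contrasts the two behaviors over time. It is a minimal illustration under our own assumptions; the 0–1 motivation scale, the rates, and the function names are illustrative and are not part of the authors’ model or data.

```python
# Minimal sketch (not the authors' model): a reinforcing loop compounds change
# over time, while a balancing loop pulls a variable toward a goal set by a
# constraint. All numbers here are illustrative assumptions.

def reinforcing_loop(motivation=0.1, gain=0.15, steps=10):
    """Each period, motivation grows in proportion to its current level
    (e.g., adoption -> positive student response -> more motivation)."""
    history = [round(motivation, 3)]
    for _ in range(steps):
        motivation = min(1.0, motivation * (1 + gain))
        history.append(round(motivation, 3))
    return history

def balancing_loop(motivation=0.9, goal=0.3, adjustment=0.25, steps=10):
    """Each period, motivation closes part of the gap toward a lower 'goal'
    imposed by a constraint (e.g., workload pressure)."""
    history = [round(motivation, 3)]
    for _ in range(steps):
        motivation += adjustment * (goal - motivation)
        history.append(round(motivation, 3))
    return history

if __name__ == "__main__":
    print("Reinforcing (compounding growth):", reinforcing_loop())
    print("Balancing (goal-seeking decline):", balancing_loop())
```

The non-linearity discussed above shows up directly in this contrast: the reinforcing trajectory accelerates as it grows, while the balancing trajectory flattens as it approaches its goal.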
We also used the literature on motivation to explain how the feedback loops influence faculty motivation. As a rich field of inquiry in the education and psychology literature, motivation consists of many theories and models, each meant to address motivation in different contexts and from different perspectives [30]. For example, we used Eccles’ expectancy-value theory [31,32] to elicit individual and social factors that motivate faculty to change their instruction. At its core, this theory suggests that faculty engage in tasks or activities in which they believe they can succeed and that they value [31,32]. This theory allowed us to ask questions and code our data to identify whether and why faculty value the adoption of PBL and SGCs. For example, adopting PBL and SGCs could be interesting and enjoyable (i.e., interest value); be perceived as important to faculty (i.e., attainment value); could be useful to their current and future plans and goals (i.e., utility value); or the benefits of adopting these strategies could outweigh their costs (i.e., cost value). Other theories, like the MUSIC model of motivation [13], allowed us to ask questions and code our data to identify whether and why faculty feel empowered to make the change (i.e., empowerment) or whether adopting these strategies shows care for their students (i.e., caring).

1.2. Review of the System Factors Affecting Faculty Motivation

Through reviewing the literature, we identified several factors that potentially impact faculty motivation to adopt RBIS, which could be barriers or drivers to instructional change in engineering education. Details of this review process, analysis, and categorization of all these factors are published elsewhere [14]. We acknowledge, however, that there are more factors whose dynamics merit more research.

1.2.1. Institutional Support Factors

These factors are part of the structures and procedures in the system and include institutional and departmental policies, the availability of resources and infrastructure, the implementation of faculty development programs, and conditions like the time allotted for the change initiative. These factors depend on the leaders of the academic institution and the administrators, but some of them could also emerge from faculty committed to change within the Department [33]. Specifics of these factors are summarized in the following four subcategories:
  • Institutional policies about tenure, promotion, service, and teaching. They influence the adoption of PBL and SGCs predominantly by the weight the policies put on both teaching evaluations and teaching performance as a condition for decisions of advancement and continuation in the academy [8,34,35,36]. They vary depending on the institution type and the emphasis put on research [8,9];
  • Available resources, infrastructure, and instructional training. Insufficient institutional resources and inadequate facilities diminish the possibility of instructional change [26,37,38] mostly because they impact the expectancy of success of implementing these strategies [12]. For example, the layout of classrooms either encourages or discourages the adoption of teaching innovations [39,40]. A classroom with flexible seating arrangements for group work invites SGCs, whereas a static auditorium designed for large classes tends to promote lectures [41]. Also, instructional change is supported by providing teaching assistants or technical and logistical aid to instructors [42]. These factors of institutional support can either hinder or enhance teaching quality, contingent upon whether the focus is on productivity rather than teaching excellence [33];
  • Flexibility of curriculum. It promotes or hinders instructional change because professors are expected to cover all the content [40,42] that was originally defined for lecture-based instruction. Professors who want to adopt PBL and SGCs have found it highly difficult to cover all the content using these strategies [39,42,43,44]. Also, professors who perceive they are expected to follow the defined content sequence with a specific timing designed for direct instruction [8,34] find it difficult to adopt PBL because these strategies usually require different timing than lectures [42,45]. Although the content of the course is predefined and static, the flexibility allowed for its coverage could be a driver to change [43,44];
  • Time allotted to adopt PBL and SGCs can be a barrier to change [44]. Adopting new instructional strategies is a process that requires time to learn its pedagogical principles [37,39], preparation of class activities [38,43], and class time for its implementation [12,33,38,39,40,43].

1.2.2. Levels of Pedagogical Knowledge and Skills about PBL and SGCs

These factors explain that successful instructional change occurs alongside the pedagogical development of faculty and that this cognitive and practical knowledge is developed through a learning process resulting from conscious thought and reflection about their teaching practices. Research has shown that as a learning process, pedagogical knowledge has at least three levels (awareness, familiarity, and expertise) that must be fulfilled to sustain the adoption of PBL and SGCs [8,46].
  • Awareness. This first level of knowledge embodies the consciousness that faculty have about the existence and characteristics of PBL and SGCs [8,39]. The huge variation in their implementations represents a barrier to adoption [26,27] because it makes it more difficult to access and discern the research that validates such strategies [39,46];
  • Familiarity. This second level represents the understanding of the educational concepts behind PBL and SGCs [46,47], their effect on students’ learning [8,46], and an adaptation of these strategies to the faculty’s particular context [37]. Such adaptation can be a barrier to change because sometimes the adaptations do not follow all of the details that make them effective [45,48] or are altered in ways that err on the side of direct instruction [39];
  • Expertise. The third level implies the development of practical knowledge of these strategies that effectively improves their teaching methods [33]. Effective implementation requires awareness, familiarity, and, above all, a deeper understanding of the pedagogical tenets that explain why these strategies work [37,46]. This level of expertise is a strong driver of the sustainability of the adoption of these strategies because at this level faculty’s self-reflection and continuous improvement are an important part of their daily practices.

1.2.3. Institutional Culture Factors

These factors are closely tied to the cultural elements present within an organization. These elements encompass symbols, artifacts, attitudes, beliefs, assumptions, and collective values. They play a crucial role in shaping the interpretations individuals give to various situations. Consequently, these cultural elements establish the accepted modes of thinking and form the foundation for actions undertaken within the organization [49]. Change theorist Kezar [6] asserts that altering cultural values represents a fundamental component of second-order or deep change.
  • Symbols and artifacts. They encompass rituals, traditions, events, or historical representations that reflect the organizational culture [50]. Examples include traditions like the expectation that faculty carry a persistently heavy workload and should be highly occupied most of the time [8,38];
  • Attitudes. They involve both the perceived institutional attitudes toward faculty and the attitudes of faculty toward change. For example, administrators and some faculty members perceive faculty as inherently resistant to almost any change initiative [8,36];
  • Beliefs. They represent the shared mental models among faculty members. For instance, a common concern hindering the adoption of RBIS is the belief that favoring these strategies automatically implies opposition to any form of lecture [42]. This belief acts as a barrier to motivation for change because introducing teaching innovations challenges traditional practices and may evoke feelings of incompetence among faculty accustomed to lectures in their classes [37];
  • Assumptions. They refer to preconceived interpretations or meanings assigned to academic activities. Negative assumptions about RBIS can impede change. For example, many engineering faculty exhibit skepticism toward educational data indicating the higher impact of RBIS on learning [8]. This skepticism arises from the belief held by several faculty members that traditional teaching methods are already achieving their goals [6,39,42]. Such assumptions persist due to the prevailing notion that current educational systems consistently produce successful new scientists [51];
  • Collective values. They denote the collective importance or recognition that faculty and administrators assign to academic activities. These include the collective value placed on traditional teaching methods by faculty [6], the value attributed to innovations by administrators [12], the significance attached by both faculty and administrators to students’ deep approach to learning [37], and the balance between the value accorded to teaching and scholarship by faculty [35].

1.2.4. Student Experience Factors

These factors are associated with how students perceive their academic experience in classrooms where RBIS are implemented. Although the primary objective of enacting instructional change in academic institutions is to enhance the students’ experience, improve learning, and provide better evaluation processes [6,9,46], barriers and drivers to the adoption of PBL and SGCs can also arise from students.
  • Students’ motivation and learning. They represent the students’ willingness to study and learn [13] and serve as a driving force behind the adoption of RBIS [8,9] mostly because these strategies align with our understanding of how learning works [46];
  • Students’ evaluation of teaching. Having better scores on evaluations of teaching supports instructional change. RBIS are related to increases in these scores [42] because they provide clear means to assess students’ performance, and can alter faculty perception of teaching effectiveness [52];
  • Students’ resistance. It acts as a barrier to the adoption of RBIS [9,39,40]. This resistance often stems from the unfamiliarity of students with these practices [53]. Nevertheless, the prevalence of RBIS among faculty members within a school can diminish students’ resistance, as they perceive these practices to be more commonplace [53].
Our review found that research has explained multiple factors that affect instructional change, defining them as either drivers or barriers and providing suggestions for generating the desired outcomes. However, the reviewed literature mostly overlooked the implications that the complexity of academia has for change initiatives; that is, these factors do not act in isolation, their effects are not linear, they involve delays, and they can create unexpected consequences [17]. From a complex system perspective, such non-linearity explains that there are factors that could be both barriers and drivers depending on the context and timing. For example, as we will explain later, one factor that seems to be a barrier, the increase in teaching workload, could become a driver. Indeed, there is evidence of potential time savings when faculty adopt these strategies [8,38] but, at first, their adoption will likely increase the faculty’s time commitments [42].
In summary, to understand the full extent of the complexity of academia (i.e., dynamic complexity [17]), it is necessary to understand the interrelations between the factors of the academic system. In this paper, we seek to answer the following research question: How do the dynamics of the academic system affect faculty motivation to adopt PBL and SGCs?

2. Methods

2.1. Overview

To answer the research question, we created a CSDM following the general process suggested by Sterman [17]. More specifically, we followed the qualitative data collection procedures of CSDM as suggested by Luna-Reyes and Andersen [19] and Vennix [20], and the analysis procedure suggested by Luna-Reyes, Martinez-Moyano [54], and Kim and Andersen [55]. The goal of the CSDM is to illustrate the internal dynamics and the causal structure in a system that reinforces or hinders the variable of interest (i.e., faculty motivation). These dynamics are the feedback or causal loops between the factors or elements of the system that ultimately promote the growth or decline of faculty motivation. These loops are represented in a causal loop diagram (CLD). Reinforcing CLDs illustrate dynamics that promote faculty motivation whereas balancing CLDs illustrate dynamics that hinder it. The combination of all the CLDs constitutes the CSDM.
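As a reading aid for the CLDs that follow, the sketch below (ours, not the authors’ tooling) shows one common way to encode a CLD as signed causal links and to classify a loop with the standard system dynamics polarity rule: a loop with an even number of negative links is reinforcing, and one with an odd number is balancing. The example loop paraphrases a workload dynamic discussed later in the Results; the labels are our shorthand.

```python
# Illustrative encoding of a causal loop diagram (CLD) as signed links.
# Polarity rule (standard in system dynamics): the loop is reinforcing if the
# product of link polarities is positive, balancing if it is negative.
from math import prod

# Each link: (cause, effect, polarity); +1 = change in the same direction,
# -1 = change in the opposite direction.
example_loop = [
    ("faculty motivation", "adoption of RBIS", +1),
    ("adoption of RBIS", "teaching workload", +1),
    ("teaching workload", "faculty motivation", -1),
]

def loop_polarity(links):
    """Classify a closed loop from the product of its link polarities."""
    return "reinforcing" if prod(p for _, _, p in links) > 0 else "balancing"

print(loop_polarity(example_loop))  # -> 'balancing'
```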
To obtain this model, we followed three phases: (1) an analysis of the literature to identify potential dynamics of the factors influencing faculty motivation, (2) a single case study to add depth to these dynamics and identify new dynamics, and (3) an integration of the literature review and data collected to describe and illustrate those dynamics.
To reduce researcher bias, we followed the process of reflexive bracketing [56]. Such a process included clarifying the suppositions that we brought to the study [57], stating our worldview from where we made the interpretations and analysis [58], and pilot-testing the interview protocols with people who share similarities with the participants [59,60]. Finally, we employed multiple strategies of validity to increase confidence in the accuracy of the findings. Such strategies include: (1) revisiting the data to look for contradictions and alternative hypotheses that explain the information in the diagrams [17,61], (2) trying to identify the implicit assumptions of our mental models by considering how the outcomes of the model might change if different assumptions were used [17], (3) documenting thoroughly every step of the process and results [62], and (4) soliciting the judgments, reviews, and opinions of a diverse group of people [17,63], including member checks with four participants of the study and feedback from researchers savvy in theories and research on motivation.

2.2. Phase 1: Analysis of the Literature

As mentioned in the introduction, we explored the literature to identify factors in the academic system affecting faculty motivation [14]. With these factors, we searched and analyzed the literature to establish potential dynamics. As suggested by Vennix [20], Luna-Reyes, Martinez-Moyano [54], and Kim and Andersen [55], this qualitative analysis was a highly iterative process of five steps. Step A, Literature search, consisted of finding supportive literature for the factors and the potential causal links between them. Step B, Hypothesis linking, consisted of hypothesizing causal links between the factors (i.e., causal coding [64]) and drafting causal loops based on the analysis and interpretation of the literature found in Step A. Step C, CLD team meeting, consisted of regular team meetings to discuss the rationale of the causal links, define how such links influenced the causal diagram, and establish a narrative or story that explained the causal loops. These meetings included a system dynamics expert, who is also an engineering faculty member. In these meetings, we checked and revisited the interpretations of factors, loops, and narratives. After conducting the literature searches, hypothesizing links, and holding team meetings, Step D, Validation, was conducted to recheck any additional loops, causal links, and factors that were added in each iteration of the process, and to ensure that the narratives remained accurate with changes made to the overall CLD. Step E, System Boundary, was focused on determining the boundary of the model. In this step, we checked whether the model included all relevant endogenous factors that affect faculty motivation and the consistency between the loops and the elicited narratives. This process is illustrated in Figure 1.
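As a concrete illustration of the kind of record Step B produces, the sketch below shows one hypothetical way to keep track of a hypothesized causal link alongside its supporting literature and validation status. The field names and the example entry are our assumptions for illustration; the authors describe the process narratively and do not prescribe a data format.

```python
# Hypothetical bookkeeping for Step B (hypothesis linking); illustrative only.
from dataclasses import dataclass, field

@dataclass
class CausalLinkHypothesis:
    cause: str                                            # factor assumed to drive the change
    effect: str                                           # factor assumed to respond
    polarity: int                                         # +1 same direction, -1 opposite
    supporting_refs: list = field(default_factory=list)   # literature found in Step A
    rationale: str = ""                                   # narrative agreed on in Step C
    validated: bool = False                               # set in Step D after re-checking

# Example entry paraphrasing a relationship reported in the paper's review.
link = CausalLinkHypothesis(
    cause="class size",
    effect="ease of implementing RBIS",
    polarity=-1,
    supporting_refs=["[41]", "[71]"],
    rationale="Larger classes make feedback and group facilitation harder.",
)
```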

2.3. Phase 2: A Single-Case Research Study

Based on the knowledge obtained in phase 1, we expanded and added depth to such knowledge by creating a CSDM from a single-case research study. This exploratory inquiry was based entirely on the experiences and beliefs of the study participants. To minimize researcher bias in phase 2, we followed a reflexive bracketing procedure [56]. That is, we temporarily set aside the causal structure found in phase 1, approached phase 2 starting only with the factors found in the literature review, and proceeded to elicit a causal structure from the participants. We included this phase because it is common practice to build a CSDM at a single site to understand and solve a particular problem [17].

2.3.1. Site Selection and Description

We selected an electronics engineering department in a South American country as the site to develop this research. The rationale for this selection was consistent with the requirements of single-site selection for an SDM research process [63]. The site provides opportunities for unusual research access [19,20,65], including the availability of multiple supportive gatekeepers, and exhibits characteristics of dynamic complexity [17]. The latter was warranted by the richness and variety of the ongoing circumstances within the curriculum and instructional change initiatives at the site, which suggested both numerous interdependencies of the system’s elements and a high probability of finding feedback loops that reinforce or resist policies.
The selected engineering department combined several events and unique characteristics, such as being involved in a curriculum change within an ongoing international accreditation process, and it had professors with different levels of willingness to adopt PBL or SGCs and to change their instruction. Some of these initiatives at the Department include ABET accreditation, changes in university policies, administrative reorganization, and the Conceiving, Designing, Implementing, and Operating (CDIO) initiative [66], which is rich in student design-build-test projects, integrates the learning of professional skills, and features active and experiential learning. There are more than 130 institutions around the world implementing this initiative, whose competencies and guidelines come from engineers in industry and academia [67,68]. Also, its underlying values are similar to PBL [21] and, ultimately, its purpose is to drive change towards the inclusion and adoption of PBL, SGCs, and active learning [66].
Another feature is that the site combines events and unique characteristics [69] suitable for studying instructional change. There are four ongoing change processes at different levels that have implications for the adoption of RBIS in the Department. The selected site is immersed in an ongoing curriculum change process at the Department level that started three years before the data collection of this study. In addition, it had recently (one year before) obtained ABET accreditation, which also entails a curriculum change process. As the Department and school made these changes, the university was changing its policies about teaching requirements, promotion, and teaching evaluation. Furthermore, the university was constructing a new engineering building with new laboratories and classrooms. The latter involved changes and reorganization of the administrative staff, logistics, and office spaces. In addition, at the time of data collection the site also had the following unique characteristics that made it interesting for studying an academic system: (1) it was coordinating the implementation of curriculum change initiatives with ten other universities across the Latin-American region, (2) it is one of the oldest electronics departments in its country, and (3) it was part of one of the oldest and largest private universities in its country. This history reflects the long tradition of its internal systems and allowed us to find faculty who had been part of the Department for more than two decades and had participated in several previous change processes. The aforementioned characteristics speak to the influence the Department had in the electronics engineering academic community in its country.

2.3.2. Data Collection

This process consisted of ten individual interviews and one focus group, all facilitated by the first author. Both aimed to elicit opinions, values, and knowledge about the participants’ willingness to adopt PBL and SGCs, and elements of their academic system that affect such adoption. Both were video- and audio-recorded with the participants’ consent. Each interview was informed by a preliminary analysis of previous interviews in which we identified potential feedback loops. This analysis was important to determine the emphasis of the next interview, that is, whether it was necessary to explore similar dynamics, connections of similar factors, or try to elicit different dynamics or other factors. After all the interviews, we conducted a focus group following the methodology of Group Model Building [20,54]. Its purpose was to discuss, critique, and add depth to the feedback loops identified during the interviews. Figure 2 illustrates the overall process of data collection and analysis to create the CSDM.
The study followed purposeful sampling [59], drawing participants from the population of 57 faculty members in the selected Department. Seventeen participants were selected for 10 in-depth interviews and one focus group (see Appendix A). This sample represented approximately 30% of the population with varied teaching experience, roles, gender, and workloads. The interview participants were different from the focus group participants. The interviews included faculty who had held administrative positions in the past and drew on a varied sample of the population. The focus group also included decision-makers for an ongoing international accreditation process in the Department. All data collection was conducted in Spanish, the native language of the participants.
The 1-h interviews consisted of two sections. The first section consisted of open-ended questions related to identifying the factors and reasons that positively and negatively affected the participants’ motivation. The questions were framed into the categories of factors found in the literature review. The result of this section was a list of factors that affected their motivation. The second section of the interview focused on synthesizing the stories about how the dynamics of the system worked (i.e., the dynamic hypothesis [17]) in the form of feedback loops. That is, we aimed to build a CLD with the interviewee. The process is described in the instruments protocol (see Appendix B, Table A2 and Table A3). We conducted a preliminary integration of all the CLDs elicited in the interviews, along with a narrative describing them. We extracted 15 CLDs to analyze, complement, and validate in the focus group.
The 4-h focus group also had two sections; the first involved asking open-ended questions to identify factors that affect faculty motivation, and the second focused on integrating these variables into the CLDs created from the interviews. This focus group involved discussion, complementation, and consensus-building on the description and rationale of the causal loops. The process is described in the instruments protocol (see Appendix B, Table A4 and Table A5).
In both the interviews and the focus group, after explaining the purpose of the second section and the basic notions and notation of a causal loop, we asked participants, for each factor identified in the first section, their perceptions of what caused that factor and of the consequences of its increase or decrease; that is, how other factors were affected by changes in it. To facilitate the discussion, we stated these factors as variables that could be quantified (e.g., class time using PBL, workload, class size, or motivation) and drew them in a diagram. The elaboration of these diagrams was video-recorded with the consent of the participants. Lastly, we asked for the possible links (and their rationale) between the causes and consequences and the other variables in the diagram. The CLDs created in the second sections were a way to synthesize the stories about how the dynamics of the system worked (i.e., the dynamic hypothesis [17]). In short, we elicited factors that affected their motivation, faculty’s perceptions of what caused these factors, the consequences of variations in such factors, and possible links with other factors.
To ensure the quality of the elicited data, we followed the recommendations for a good interview and focus group [20,59]. This included gaining rapport, conducting the interviews in the native language of the participants, maintaining the flow of the conversation in an open communication climate, stating clearly the purpose of the interview, making sure the respondents understood what was expected from them, maintaining awareness of the perceived relevance of the questions, not judging or criticizing their responses, being genuinely interested in the participants’ ideas and opinions, and maintaining reflective listening [20,59]. We conducted three pilot interviews with engineering faculty members in Latin America whose native language was Spanish.

2.4. Phase 3: Integration of the Literature and Data Collected

Phase 3 occurred in two steps. First, we analyzed data to add depth, provide evidence, and integrate the CLDs constructed in phase 2. We acknowledged that during data collection, we likely did not capture all the nuances of the discussion in the preliminary CLDs; hence, we conducted a content analysis [70] of the interviews and focus group data. Its purpose was to extract all the factors, causal links, and narratives elicited during the interviews and the discussions within the focus group. To classify the information, we used open and a priori coding. The factors identified in the literature review [14], the five constructs of Expectancy Value Theory [31,32], and the constructs of empowerment and caring of the MUSIC model of motivation [13] constituted the a priori codes. We associated all the codes with the CLDs created during the data collection by checking whether such codes were included in the loops, represented new factors, or explained links between the factors. That is, we decided if a new factor arose, if data substantiated a causal relationship already defined, or if data suggested a new one. The results of this step were refined versions of the CLDs.
Second, we integrated the CLDs created in the analysis of the literature (phase 1) with those created in the single case research (phase 2). Such integration was a comparison between the CLDs plus an interpretation of their similarities and differences. On one hand, similarities between the CLDs indicated that the dynamics were supported by the literature and data. Sometimes causal links between the factors were explicit in both CLDs, and sometimes they added nuances to a causal link identified in a CLD. For example, we found in the literature that class size impacted motivation [41,71], and in the data, we found reasons explaining that such impact was caused mostly by the increased difficulty or ease of implementing the strategies in the classroom. On the other hand, differences between the CLDs of each phase indicated that they represented distinct dynamics. Nonetheless, by combining causal links found in both phases, our iterative process allowed us to identify more robust CLDs. Finally, we simplified those CLDs and created narratives explaining them, supported by both qualitative data and the literature. We describe thirteen final CLDs in the results section of this paper. All the CLDs combined constitute the conceptual system dynamics model (CSDM).
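The following sketch illustrates, under our own assumptions, how this phase-3 comparison could be bookkept: links found in both the literature-based CLDs and the case-study CLDs count as supported by both sources, while links found in only one set flag distinct dynamics to revisit. The link labels are shorthand paraphrases of examples in this section, not the authors’ actual diagram labels or tooling.

```python
# Illustrative comparison of phase 1 (literature) and phase 2 (case study)
# causal links, each encoded as a (cause, effect, polarity) tuple.
phase1_links = {
    ("class size", "faculty motivation", -1),
    ("institutional resources", "faculty motivation", +1),
}
phase2_links = {
    ("class size", "ease of implementing RBIS", -1),
    ("ease of implementing RBIS", "faculty motivation", +1),
    ("institutional resources", "faculty motivation", +1),
}

supported_by_both = phase1_links & phase2_links   # backed by literature and data
literature_only = phase1_links - phase2_links     # candidate nuances to re-examine
case_study_only = phase2_links - phase1_links     # potentially new dynamics

print("Supported by both:", supported_by_both)
print("Literature only:", literature_only)
print("Case study only:", case_study_only)
```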

3. Results

In this section, we focus on analyzing how factors of the academic system influence faculty motivation. To understand the influence of the interactions of these factors on faculty motivation, we developed a CSDM. The developed model allowed us to identify dynamics that can reinforce or hinder motivation. We present the model in Figure 3, consisting of thirteen dynamics directly influencing faculty motivation. These dynamics are unpacked, detailed, and explained in this section. The narratives explaining the dynamics include validity evidence for each causal link in the form of either quotations from the participants or references to the literature.
The quotations were translated from Spanish and adjusted for clarity. The accuracy of the translation is warranted because the authors of this paper are fluent in both Spanish and English (two of them are native Spanish speakers). Also, because women faculty in the Department are a very small minority, to reduce the risk of presenting identifiable data, we use either a gender-neutral pronoun or a male pronoun when interpreting their quotes.
Lastly, in our results and discussion, we use the term Research-Based Instructional Strategies (RBIS) as an umbrella term referring to both PBL and SGCs. We will use the specific terms SGCs or PBL for claims or insights unique to each strategy.

3.1. Time Invested in Covering Content

When professors believe that implementing RBIS does not allow them to cover the required content, they feel less motivated to adopt such strategies. An associate professor illustrates this point: “[I don’t use PBL] because of reduced resources and time. It’s possible to guide a few students,… but with 30 it would be very difficult to apply this methodology. I have used it in courses of 20 students or even 24 and I observed they produced good results, but yet, it took me 3 h of the class to work on a problem [that should have taken me] one hour because I had to give advice to each person…”. In short, this professor argues that they do not use PBL in certain courses because it increases the difficulty of covering all the content.
This belief creates a balancing dynamic (described in Figure 4) that hinders faculty motivation: the more class time dedicated to RBIS, the less content can be covered and the lower the motivation to adopt RBIS. This dynamic is consistent with previous research which suggests that the perceived difficulty in covering content is a barrier to change [26,39,42,43,44].
This balancing dynamic is reinforced by two other factors. First, if the content, sequence, and timing of the course syllabi were originally designed for lecture-based instruction, it is more difficult to cover the content using RBIS. One reason for this difficulty is that professors are expected to cover all the content [40,42] but RBIS usually require different timing than lectures (e.g., it is possible that small group collaborations require more class time than lectures [42]). A second reason is that faculty are usually expected to follow the defined content sequence with a specific timing [8,34] but RBIS might require rearranging the sequence of content (e.g., PBL requires students’ self-directed learning and gathering of content that allows the problem’s solution [45]). A program administrator illustrates how difficult it is for professors to follow these requirements: “They have to work to try that [the course’s content] is covered. Hopefully, in the way it was designed. Do it as you want, use the strategy and didactic that you want, but follow the syllabus! That is very difficult, very difficult!”.
Second, this dynamic is reinforced when there is an increase in the expected content to cover in class, either by curriculum design (e.g., a change in the learning objectives that adds more content to the class) or as a consequence of students not achieving the learning objectives of pre-required classes. Research suggests that when professors cover the content more superficially (likely because they are required to cover more content in less time), it reinforces a surface approach to learning in their students [71], which in turn reduces the likelihood of students achieving the learning objectives in a class. Our analysis suggests that this would likely hinder motivation to adopt RBIS in the subsequent courses. When professors of the subsequent course perceive that students did not learn key concepts they were expecting, they would need to teach such concepts for students to succeed, consequently adding more content to cover, which requires more class time. An assistant professor exemplifies this reinforcing factor: “when you discover that there are more [topics] that we expect [students] to learn, you have to reduce the quality of the course somehow. In some cases, even adding more topics [to the course]; more topics imply increasing the content to cover”.
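For readers who wish to map this narrative back to the CLD notation, the short sketch below encodes the core loop of this subsection with the same signed-link convention and polarity helper sketched in the Methods overview. The link labels are our paraphrase of the narrative, not the exact labels in Figure 4.

```python
# Hedged illustration: the content-coverage dynamic of Section 3.1 encoded as
# signed links; the polarity product confirms it is a balancing loop.
from math import prod

def loop_polarity(links):
    return "reinforcing" if prod(p for _, _, p in links) > 0 else "balancing"

content_coverage_loop = [
    ("class time using RBIS", "content covered", -1),          # RBIS activities take longer
    ("content covered", "motivation to adopt RBIS", +1),       # covering less content demotivates
    ("motivation to adopt RBIS", "class time using RBIS", +1),
]

print(loop_polarity(content_coverage_loop))  # -> 'balancing', as described above
```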

3.2. Time on Activities Required for Promotion

In general, professors manage their available time by balancing teaching, research, and service activities that increase their recognition and their positive evaluations. Data suggest that the time spent on teaching, research, and service activities produced two different dynamics that reinforce or hinder faculty motivation depending on the institutional policies (see Figure 5). First, if the institutional policies recognize the adoption of RBIS as part of the professors’ promotion, professors will be more motivated to spend the available time on activities that lead to the adoption of RBIS. Conversely, if policies for promotion and continuity do not favor teaching, professors will not likely invest more time to innovate in their teaching, therefore reducing their motivation. Second, research faculty will likely increase the time invested in research or service activities because an increase in funding and publications would increase their recognition and their positive evaluations to satisfy the promotion and contract-renewal requirements. Data suggest that promotion policies that are heavily inclined toward research productivity hinder faculty motivation. An associate professor summarizes the effect of such a policy on faculty motivation by stating that faculty who want to advance up the career ladder prefer to invest their time in publishing rather than teaching: “There is a very clear faculty career ladder: you have to publish, or you will not advance in your career. Even if you teach in some focused and effective way, in the end, it is worth much less than publishing an article, so you say ‘I cannot keep doing this because I also need to advance in my professional career’”.
Our analysis suggests that even professors who adopted RBIS previously decided to invest less time in their adoption because it did not help them achieve promotion. Although professors recognized that not adopting RBIS in their courses harmed students’ motivation and learning, they considered that the time required to adopt RBIS was better spent on activities that helped them with their promotion. A professor supported this idea by suggesting that reducing the quality of his teaching does not necessarily hurt his promotion: “To be recognized as an excellent professor I had to make decisions to reduce interaction with students. I believe that my evaluations [of teaching] have gradually decreased… I accepted that it was going to happen… I know I can teach my classes better, but I do just enough to be ok with what they demand from me, and I dedicate more time to what gives me more benefits and more recognition”. In his comments, the professor illustrates that this policy reduces the likelihood of changing teaching practices because professors see the time invested in teaching as less effective for their promotion. Even professors whose teaching evaluation scores have declined do not perceive the effects of this decline on their overall evaluation and promotion. They tend, then, to be more pragmatic and do just enough to keep student evaluations at a level that does not impact their promotion.
These findings are consistent with previous research showing that in predominantly research institutions the emphasis put on research in the institutional policies related to tenure and promotion influences the adoption of RBIS [8,9,35] predominantly by the weight the policies put on both teaching evaluations and teaching performance as a condition for decisions of advancement and continuation in the academy.
In summary, university policies for promotion that are heavily inclined to research productivity instead of the quality of teaching reinforce the idea that time invested in adopting RBIS shortens the time on activities required for promotion. These policies create a balancing dynamic that hinders faculty motivation.

3.3. Faculty Workload

Faculty motivation can be hindered because the adoption of RBIS would likely increase the teaching workload due to the increase in preparation of class activities [26,38,43], time for grading and feedback activities [4,8,72], and time to learn their pedagogical principles [37,39]. If faculty perceive an increase in this workload, the usefulness of adopting new practices and activities of RBIS decreases because either the available time to implement them is scarce [8,53,73,74] or investing time in such practices reduces the time devoted to other important activities [12,37,44,75]. This increase creates a balancing dynamic that hinders faculty motivation (shown in Figure 5): the more class time dedicated to RBIS, the higher the workload and the lower the motivation to adopt RBIS.
An assistant professor who used to adopt RBIS in his courses commented on how the workload increase has reduced his use of RBIS in class: “[for many reasons] faculty have to increase their teaching hours, but it is more difficult for a professor and more demotivating. We are forced to return to our [lectures], what we have been doing historically…”. In his comments, the professor suggests that increasing the teaching workload hinders motivation because of a combination of two reasons: professors have less time available, thus they will not likely invest the additional time that RBIS would require, and even if they do, such additional time will not increase their positive evaluations. Similarly, another assistant professor suggested that by having more time available, he was more willing to integrate PBL in his classes: “I feel that with more time I will always be able to give more, I will always be able to explain better what I like, I will be able to look for more examples, I will be able to find other books, I will be able to do more simulations, I will be able to propose better teaching strategies. Because that takes time. I know I did it at some point, when I [had lesser workload], I prepared very well my courses; that allowed me to propose courses based on projects”. Having more time available also helped the professor learn more about the content that he is teaching and find ways to make it simpler and easier for students to learn. Conversely, he considers that not having enough time to add depth to the content is limiting the quality of his classes: “I am no longer teaching my classes as before. Before, I taught them more in-depth… the different ways that I can deal with abstract concepts help the students a lot… the way you add depth to your knowledge is year after year, semester after semester. To teach the best I can implies that I can build the content in the most structured way. It seems to me that I have succeeded to some extent in simplifying topics that have traditionally been supremely complex for students. That takes time, and I would like to have a little more time to reach an even greater depth. I think that’s what professors of the best universities in the world do”.
Nonetheless, as will be detailed further, the potential increase in teaching workload caused by the adoption of RBIS can be reduced over time as a consequence of more time invested in teaching and the practical knowledge it entails. That is, the teaching workload decreases when the instructor has more and better ideas about which practices are appropriate and feasible, how to be more efficient in providing feedback, and how to be more effective in the logistics necessary to implement RBIS [53,73]. In this same area, an adjunct professor highlighted another effect that institutional policies have on faculty willingness to adopt RBIS. For context, as an adjunct professor, he earns his main income outside the university. Therefore, he is more careful with the time invested in teaching. Acknowledging that class innovations require additional time, he suggested that knowing that he will teach the same courses in the future motivates him to invest time applying innovations in his current courses because he could use those innovations in the future: “Another factor that motivates me is the continuity in the classes they assigned me. For example, I only have one class that I know I will teach every year. In that class, I put the effort to apply innovations. Not so much in the other classes, because of my time and salary… because I don’t know if I will be able to use those innovations later. If they assure me continuity, I will invest time, because I invest in what benefits me in the long term. Otherwise, I will use that time for other things that I need for my income”.
Another factor that increases the faculty workload over time, hence reducing motivation, is their career progression in research. It is expected that the workload will gradually increase when the funding-seeking activities (e.g., proposal writing) are fruitful, leading to more projects to implement and manage while continuously seeking more funding and publications. This means that at the beginning of their career professors would have more time available for teaching, but such availability would be reduced over time due to the gradual increase in their research workload. It is likely, then, that faculty motivation will decrease gradually as the junior professors’ careers advance.
Data suggest that an institutional policy that uses faculty publications as the most important factor in reducing the teaching load hinders faculty motivation. An assistant professor who has a high teaching workload and adopted RBIS in the past highlighted that there is a delay between the proposed projects and the publications, which are the primary promotion requirements: “Bringing in a project does not reduce my teaching workload. University policies are clear: reductions in workload only occur when I have more publications. However, to publish, I need time to work on projects, to propose them, to develop them, and then, I have to wait for the time it takes for the publication to go out…”. Consequently, professors are more inclined to invest their time in activities that allow more publications. Particularly, professors with higher teaching workloads had more difficulty in fulfilling the requirements for promotion. This leads to exhaustion and frustration, and ultimately less motivation.
In summary, the combination of the increases in teaching workload that adopting RBIS entails, the gradual increase in the workload of conducting research, the long delays between initiating research and producing publications, and the promotion policies heavily focused on publications create a feedback loop that gradually reduces the available time and makes career advancement very difficult for professors with higher teaching workloads. Professors with higher workloads, then, are less inclined to adopt RBIS in their courses because, ultimately, additional time invested in teaching will not help them with their promotion. This finding is consistent with the literature, where faculty claim that time is one of their biggest restrictions to engaging in instructional activities [44], therefore the perceived effect of adopting RBIS on the faculty’s workload is one strong barrier to instructional change [53,73].

3.4. Class Size

A manageable class size is one of the most important motivational factors for adopting RBIS [41,71]. Smaller classes increase faculty motivation because they allow professors to provide better and more timely feedback, reduce the teaching workload, and make RBIS easier to implement [40,41,76,77].
Our data suggest that smaller class sizes facilitate the adaptation of the strategies and lead to a more positive experience when professors adopt RBIS. An adjunct professor who uses RBIS regularly has found that a smaller class size facilitates the implementation of RBIS because it allows more time to work with students in class: “… one feels the difference when working with smaller classes: the class flows, the designs flow, the activities flow, we can do activities like sharing, so everyone is engaged and learns from everybody”. Conversely, an increase in class size is a balancing factor affecting faculty motivation because it makes the adoption of RBIS more costly to professors. Participants illustrated that in large classes the difficulty of using RBIS is greater than with traditional instruction, which in turn increases their apprehension about innovating in their classes: “That is one of my biggest fears; it is not possible to teach a class of 36 people using projects. This class size only allows working on small projects, not the design projects we want students to develop”. They believe that using RBIS in large classes will increase their workload to the point that strategies where students design big projects are not feasible, hindering their motivation to adopt RBIS.
Our system analysis suggests the existence of another dynamic that puts pressure on increasing class size as a consequence of the successful adoption of RBIS (see Figure 6). In the long term, higher adoption of RBIS leads to higher student success [78,79,80] and thus higher retention. As Zaini, Pavlov [76] suggested, increased retention tied to a higher quality of teaching improves institutional reputation, thereby increasing enrollment yield and the number of students and, consequently, adding pressure to increase class sizes. In the short term, successful adoption of RBIS leads to more students motivated in an instructor’s class and a better instructor reputation among students, which encourages other students to register for the class (e.g., by word of mouth), again adding pressure to increase class size. As an assistant professor puts it: “The reward for being good at teaching is more work, and the reward for being bad at teaching is less work!”.
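Whether a loop like this reinforces or balances can be read directly from its structure: a causal loop with an even number of negative links is reinforcing, whereas a loop with an odd number is balancing. The sketch below encodes a simplified, illustrative reading of the long-term class-size loop in Figure 6 as a signed edge list and classifies its polarity; the link list is an abstraction for illustration, not the complete CLD.

```python
# A minimal sketch: classify a causal loop as reinforcing (R) or balancing (B)
# by the parity of its negative links. The edge list below is a simplified,
# illustrative reading of the long-term class-size loop, not the full CLD.

from math import prod

# Each link: (cause, effect, polarity); +1 = change in the same direction, -1 = opposite.
class_size_loop = [
    ("RBIS adoption", "student success", +1),
    ("student success", "retention", +1),
    ("retention", "institutional reputation", +1),
    ("institutional reputation", "enrollment", +1),
    ("enrollment", "class size", +1),
    ("class size", "faculty motivation", -1),   # larger classes hinder motivation
    ("faculty motivation", "RBIS adoption", +1),
]

def loop_polarity(links):
    """Product of link polarities: positive -> reinforcing, negative -> balancing."""
    return "reinforcing (R)" if prod(sign for _, _, sign in links) > 0 else "balancing (B)"

if __name__ == "__main__":
    print(loop_polarity(class_size_loop))   # prints: balancing (B)
```

The single negative link, from class size to faculty motivation, is what turns an otherwise reinforcing chain into a balancing pressure on RBIS adoption; the same check applies to any of the thirteen CLDs in the model.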

3.5. Difficulty in Implementing RBIS

In general, when professors believe that it is easier to implement RBIS, they will be more motivated to adopt these strategies. Data suggest that this perceived difficulty is associated with the number and extent of possible RBIS-related learning activities that students can successfully complete in a course. As acknowledged by an adjunct professor who is willing to use PBL in his second-semester class, having clear ideas for activities or projects that are adjusted to the students’ current level of knowledge increases the ease of using PBL: “What projects can we do? I don’t know; it is not easy… How do I integrate [the course] with PBL?… We need the idea of which project can be proposed, or with which specifications [the students] can develop it”. Conversely, if professors believe that the PBL activities are not feasible for students, they will be less motivated to adopt them, as illustrated by a professor who does not use PBL regularly: “If they have such flaws and I make experiments [using PBL]… and they don’t have the basic concepts… I fear to start doing those experiments, maybe when the course is consolidated, but I see it difficult right now”. Also, professors were more inclined to use traditional instruction in courses with highly abstract or theoretical content because they perceived that implementing PBL in those courses was more difficult, whereas courses designed around projects instead of content (e.g., capstone design) were perceived as better suited to PBL. As an associate professor puts it: “It is not that PBL does not produce learning, but it is harder to use PBL in these types of courses which have more theoretical or abstract concepts”.
On the other hand, the literature suggests that when faculty members are motivated to use RBIS, they devote more class time to such strategies [4,8,72]. Therefore, there are more opportunities to practice and build knowledge of what is effective for them and their class [80,81]. This leads to an increase in students’ academic motivation [8] and improves the learning environment [13]. The loop closes because when students’ learning of theory and concepts increases, the activities that students can accomplish also increase, which in turn raises the number of possible activities to implement in a course.
In summary, the ease of applying RBIS creates a reinforcing dynamic affecting faculty motivation: the more ideas for feasible learning activities professors have, the easier it becomes to implement RBIS in the classroom, and thus the more motivated professors are to adopt these strategies. Increased faculty motivation leads to greater RBIS adoption and, therefore, enhanced student learning of foundations and concepts, which in turn increases the number of learning activities that students can accomplish and that can be implemented.
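For contrast with the balancing workload loop sketched earlier, the following sketch simulates this reinforcing dynamic in an equally simplified form. Again, every variable name and parameter value is an illustrative assumption rather than a measured quantity.

```python
# A minimal sketch of the reinforcing "ease of implementation" loop above.
# Saturation levels and parameter values are illustrative assumptions only.

def simulate_ease_loop(semesters=16, dt=1.0):
    feasible_ideas = 0.2   # stock of feasible learning-activity ideas (0..1)
    motivation = 0.3       # faculty motivation to adopt RBIS (0..1)
    trajectory = []

    for _ in range(semesters):
        ease_of_implementation = feasible_ideas        # more ideas -> easier implementation
        adoption = motivation                          # adoption follows motivation
        student_learning = 0.5 + 0.5 * adoption        # adoption improves learning of foundations

        # Reinforcing links: better learning enables more accomplishable activities,
        # which add feasible ideas; greater ease lifts motivation. Both stocks saturate at 1.0.
        # The 0.2 offset is an arbitrary baseline below which ease does not yet lift motivation.
        feasible_ideas = min(1.0, feasible_ideas + dt * 0.1 * student_learning)
        motivation = min(1.0, motivation + dt * 0.15 * (ease_of_implementation - 0.2))

        trajectory.append((round(feasible_ideas, 2), round(motivation, 2)))
    return trajectory

if __name__ == "__main__":
    for sem, (ideas, mot) in enumerate(simulate_ease_loop(), start=1):
        print(f"semester {sem:2d}: feasible_ideas={ideas:.2f}  motivation={mot:.2f}")
```

Unlike the balancing case, both stocks grow together until they saturate, which is the signature behavior of a reinforcing dynamic.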

3.6. Pedagogical Training

The difficulty of applying RBIS also depends on the faculty’s knowledge or skills about these strategies [8,15,26,82]. Such knowledge shapes faculty beliefs about new teaching methods [15], how students learn [46], and their self-efficacy with RBIS [83]. An associate professor who does not use PBL regularly supported the idea that with more pedagogical knowledge about PBL, he would be more inclined to use it in his course: “If I would know [PBL] well, I would be interested in using it, but only at moments where it is needed… to introduce the concept of what is voltage, what is current, a loop, etc… I don’t know if it is definitively better [to use] PBL at every moment, for example, to explain [those concepts]”.
Commonly, professors motivated to adopt RBIS seek opportunities to increase their knowledge about such strategies [8,26,36,72,84,85,86]. An assistant professor who regularly applied RBIS in his courses and participated in formal and informal training commented on the effect that such training has on providing more feasible ideas and learning activities to implement in the classroom: “I read about pedagogy and found ideas… It is listening to people who have had interesting experiences, to having conversations”.
Conversely, professors who are less motivated to innovate in their teaching are less interested in seeking out or participating in the pedagogical training activities offered by the university. As this assistant professor suggested: “I have perceived in other professors a disinterest in changing… There are people that, semester after semester, with bad evaluations, don’t make any changes in their instructional practices… some are good teachers, they really are very good in the scenario, talking, explaining their topics, they know their content very well, but their classes are essentially the same… they don’t participate in the [training offered by the university]”.
This creates another reinforcing dynamic, illustrated in Figure 7: faculty motivated to adopt RBIS are more willing to participate in formal or informal pedagogical training. Such training increases their knowledge about how to apply RBIS in their courses and gives them more ideas about how to use these strategies. The more feasible ideas faculty have for applying RBIS, the easier implementation becomes; hence, faculty are more motivated to adopt them.

3.7. Students’ Motivation and Learning

In general, professors will be more inclined to adopt RBIS if they perceive such strategies lead to more motivated and engaged students who achieve the learning objectives of their courses. In other words, the improvement of students’ motivation and learning is a driver of the adoption of RBIS [8,9] if professors perceive that spending more time implementing RBIS in the classroom is more effective than spending time on lectures [52,72,87]. This creates a reinforcing dynamic for students’ learning, students’ academic motivation, and faculty motivation illustrated in Figure 8.
Students’ academic motivation increases when faculty adopt RBIS, either because these strategies enhance students’ success [8] or because they improve the general learning environment [13]. When students are more motivated, they are more engaged in deep learning activities [80,88], which subsequently leads to better learning [78,80], greater educational gains, higher grades, and greater satisfaction with college [89]. The gains in these outcomes raise both student motivation and faculty motivation. On the one hand, students are more motivated because they perceive that they are more successful and feel more interested [13,90]; on the other hand, faculty are more motivated because they perceive that they are more effective in their teaching [52,87]. Motivation theory suggests that this perception increases their success beliefs and the utility value [91] of adopting RBIS.
Our data support the existence of this dynamic, as a professor who uses RBIS regularly puts it: “I believe that what motivates me the most is that students learn and develop skills”. However, data also suggest this dynamic has more nuances. These nuances are factors or additional dynamics that mediate or condition either the impact of RBIS on students’ learning or how such learning influences faculty motivation. In short, these dynamics reinforce faculty motivation if faculty perceive that students are learning better and, more importantly, if they attribute this better learning to their implementation of RBIS. The following paragraphs describe these other six dynamics.

3.8. Students’ Engagement in Class

This dynamic represents the positive effect that the adoption of RBIS has on students’ engagement in class. As noted in Figure 8, the more students are engaged, the more likely professors are to adopt RBIS, because they recognize and value that students who are engaged in class enjoy it and are more motivated.
Professors suggested several reasons why students’ engagement increased when they implemented RBIS. First, RBIS help professors expose students to real-world problems and contexts [33,45,46], which in turn increases students’ motivation. As an associate professor noted when using one of these strategies in his courses, showing students how things work has a positive effect on their motivation: “In my personal experience, it always has been a cause of great joy to see that things work, that is, that [the class content] is not only theory… To me, when things work, it is really good. I believe that to others it can be good too. Maybe it will not be the same to everybody, but I have seen that many of the students have the same response…”. He also acknowledged that noticing this motivation in his students motivated him as well: “The fact that [students] could see [things working] and could experiment… the awe in their faces was really gratifying!”.
Second, professors acknowledged that their students’ engagement increased when they implemented RBIS that involved projects related to daily life experiences: “I proposed students to substitute the final exam for their participation in a national contest. It was a nice experience because the class transformed entirely. That is, I was not the master imparting the knowledge, we started to lay out the project, to read about it, to understand different alternatives to solve it… That was super motivational! Students began to participate, to make their designs, they did spectacular things, the simulations, the model, everything!…”.
Third, professors were motivated when they perceived that their students’ engagement with the class content persists in future courses, as an associate professor of an introductory course in power electronics commented: “I like [PBL] very much, especially when I see that the students have liked it so much and many have followed the specialization of power electronics in the master’s degree”. Expressing a similar opinion, another professor was motivated because alumni have continued in a line of work related to the content taught in the class: “When I see that after graduation at the professional level, they [students] are working in that area and they tell you ‘I am working on this’, and I see they keep working on that area… somehow that makes me happy”.

3.9. Quality and Timely Feedback

This dynamic, which increases faculty motivation, occurs when professors perceive that RBIS are useful for increasing the quality and timeliness of the feedback provided to students (see Figure 8). Quality feedback provides students with information about their current state of knowledge and can guide them toward achieving their learning goals [46]. Adopting RBIS leads to better learning because students receive quality and timely feedback from the professor to correct methodological mistakes, misconceptions, or misunderstandings. As a result, faculty perceive RBIS as more effective [52], thereby increasing their motivation to adopt them.
An experienced associate professor who has used RBIS in his classes explained how the timely feedback he provided to students in class helps them avoid misunderstandings or myths while solving a problem, which helps students learn better: “[Using RBIS] is more effective and efficient… When they develop the problems in class, we reduce the possibility of errors, false ideas, and their “hunting and guessing”. [i.e., trial and error]… it is better for their learning”. This professor also suggested that using RBIS allows him to provide quality feedback in class, which in turn reduces the additional time spent grading or reviewing students’ work after class: “It is better when I can advise each group in class… there are risks when they are home alone [without someone who supervises them], myths are created, myths that [problems] are solved uniquely. It would take me more time reviewing their work at my office and writing or telling them what to fix”. Both the increase in student learning and the potential reduction in after-class workload make RBIS more useful and reinforce motivation.

3.10. Students’ Evaluation of Teaching

This dynamic occurs when professors acknowledge that an increase in students’ academic motivation and learning is reflected in more positive students’ evaluations of teaching (SETs) (see Figure 9). As mentioned, students learn more because they are more motivated and engaged in the learning activities. Such engagement, complemented by the deep explanations of the theory behind the activities provided by the professors, makes it more likely that students will show appreciation for their professor by providing better SETs [79,92,93]. Such appreciation has a positive effect on faculty motivation because, if professors attribute the better SET scores to their adoption of RBIS, they perceive RBIS as more effective [8,73]. Other informal demonstrations of gratitude for the use of RBIS shown by students reinforce this perception. An associate professor who has used PBL regularly provided an example of how former students of his course recognize his teaching positively, beyond formal SETs, and how this recognition motivates him: “My students finish the course very happy; they appreciate it, they keep writing to me in the following semesters… I see they liked my teaching… that motivates me too because recognition comes from students…”.

3.11. Permissiveness

Although good grades and good SETs are commonly attributed to better student motivation and learning [94], we found data indicating a collective belief that such grades and SETs could instead be attributed to how permissive the professors were with their students. This collective belief creates a balancing dynamic that hinders faculty motivation (see Figure 9). If faculty attribute the good grades and SETs that accompany RBIS adoption to permissiveness rather than to an increase in students’ learning, they are less motivated to adopt these strategies. Because faculty consider it very important to be recognized as highly demanding teachers, adopting RBIS would, in this case, go against their identity as academic professors [95]. To many professors, being perceived as permissive calls their overall quality of teaching into question.
Two circumstances reinforce the belief that associates good grades and SETs with permissiveness. The first is the perception that students’ overall academic quality has declined, partly due to decreased selectivity in admissions. An administrator who is also an assistant professor shared his perceptions about the first-year students’ academic quality: “Many senior professors believe that it is the students who don’t ‘function’. That is the difference… Everybody perceives it! They say that students come ill-prepared from high school, that every year they’re getting worse… Everybody recognizes this, but everybody knows that we need to teach these students, who are not the same as ten years ago”. This perception of reduced quality, combined with a sub-par applicant pool from which to select prospective students, has increased the pressure professors feel to improve students’ results, as an associate professor described: “My responsibility as an institution is to elevate [the students] to the point of excellence that I want, at the expense of whatever is necessary, because it is the university’s pledge… if we have to push them [students], then we have to push them… our responsibility is to graduate them with excellence, not only graduate the best we receive… We should admit only the best [students], but we are not in a time of high demand… [the university] is telling us to admit 40 or 80 students with a profile that is not as superior as we were used to 20 years ago… here, the selection is reduced, and we can’t make a long face over it”.
Second, depending on professors’ perceptions of job security, such pressure increases the likelihood of being permissive with students, which in turn increases students’ grades [96,97]. When the incentives in the system are strongly tied to positive SET scores, professors’ concerns about job security pressure them to elicit high SET scores from students. Consistent with [97], our data suggest this pressure could be higher for adjunct faculty, whose contract renewal is partially tied to the SETs. A senior associate professor, who does not use RBIS regularly, illustrated this notion: “When he [a tenured senior professor] teaches a class, he could change the strategy, but he makes sure students are learning what they need to learn, and he is not going to lose his job. If that happens to a professor like [an adjunct professor]… his students complained that the professor was very tough, or something like that. What was the solution? That he will not teach that course anymore. Next semester, another professor is teaching the class… his students told me that he does not have any clue [about the topics]. Students have such power that they could even get a professor fired”. He recognized that such pressure also affects full-time faculty, and he shared how he discounted the SET scores because he believed they are highly related to students’ passing or failing the course: “Those professors have good SETs because the students pass their courses, whereas other professors have bad SET because students fail the course… we must reduce attrition, so what is the solution? To facilitate students passing the course”.
Another assistant professor who sometimes uses RBIS underscored the subjective nature of the SETs: “Those SETs have to be cautiously understood because students don’t really know what the professor has to teach. Students only have a perception from their viewpoint that is not the whole reality”. In his remarks, the professor acknowledged that a professor’s decision to teach a demanding lesson is not necessarily welcomed by students, which could affect the SETs. As a senior associate professor who does not use RBIS regularly agreed: “I believe certain professors have the mindset of doing whatever is possible to make students learn. I believe that not everybody is on that mindset because students evaluate you, then if students have a good performance they say ‘that professor is wonderful’ if not…”.
In summary, if professors believe that RBIS are associated with more permissiveness, they will be less inclined to adopt such strategies in their classroom. This belief is reinforced by the pressure professors feel to reduce attrition, combined with the perception that their incoming students are of lower academic quality and that other professors who apply RBIS receive positive SET scores. Conversely, professors who believe that good grades and good SETs are the result of an increase in students’ academic quality are less likely to associate positive SETs with permissive grading and will be more motivated to adopt RBIS.

3.12. Students’ Ability to Succeed with Learning Activities

The dynamic illustrated in Figure 10 occurs when professors assess whether students have the minimum level of skills needed to succeed in the proposed learning activities. If students have this minimum level, professors are more motivated to adopt RBIS in their courses because they expect their students to be successful in the activities and therefore more motivated to learn. Conversely, if professors consider that their students do not have the expected minimum level, they are less motivated to adopt RBIS because they believe students would become frustrated by not being able to solve the problems. Such frustration would decrease the students’ motivation, thus reducing their learning and skill acquisition and ultimately hindering faculty motivation. As described by one professor who was willing to use RBIS in his courses, students’ motivation could be reduced when the academic demands of RBIS increase to a point where students cannot succeed: “If we ask them something outside of their ability, students wouldn’t reach a solution, and they will feel they didn’t learn anything”.
This frustration can occur not only in students but in professors as well, which in turn decreases both students’ and faculty motivation. A senior professor observed this reaction in another professor who was trying to use PBL in his course: “For [him], there is a huge difference [between the expected and actual problem-solving ability of his students], and sometimes he yells because he is frustrated and notices that the students seem like they are not responding to him, and he reacts. Sometimes, that leads to problems with the students. Hence they don’t learn, they are less engaged with the class and the professor, and that will become something negative”.
The fear of such a negative effect on students’ learning is a powerful demotivator. To some faculty, using RBIS with students who do not have the necessary knowledge or skills to successfully complete the proposed learning activities is a risk they are not willing to take. A professor who uses traditional instruction regularly highlighted this fear: “I have seen in [students] a tremendous number of gaps in knowledge; I fear to start doing such [RBIS] trials … If they have such failures and don’t have the basic concepts, and I start doing experiments… then, there is no guarantee [they learn]”. Consequently, some professors preferred to use direct instruction (i.e., lectures), aiming to raise students’ learning and problem-solving abilities to the point where students can solve the proposed learning activities by themselves.

3.13. Sense of Urgency to Change

This dynamic, illustrated in Figure 10, occurs when professors believe that direct instruction is an effective strategy for students to achieve the learning objectives of their courses. When professors hold this belief, there is no apparent, compelling, or urgent reason to change [98,99]. As some change theorists suggest [100,101], an increased sense of urgency is an igniter for change, whereas a reduced sense of urgency is a powerful inhibitor of change. An assistant professor illustrated how this belief affects the sense of urgency; he acknowledged that RBIS work better to increase students’ learning, but believes that direct instruction also works: “Sometimes I go back a little towards traditional instruction because at the end we have to acknowledge that direct instruction works… we are not going to disregard it, it is only that I have been reducing such component [traditional instruction] because I also know that we learn more when we find our own patterns, when we have an ‘aha’ moment”. The literature suggests that this belief is reinforced because faculty’s own professors used lectures as the main teaching strategy, and they consider that it worked effectively for them [74].
This belief creates a dynamic that hinders faculty motivation for two reasons. First, research on learning suggests that the continued use of lectures is less effective in helping students gain the required problem-solving, teamwork, or other non-technical skills [53,102]. Thus, it is less likely that, through lectures alone, students will obtain the learning and skills needed to solve the proposed learning activities or reach the expected learning outcomes, which in turn increases the frustration of both students and faculty and ultimately reduces faculty motivation (see Figure 10). Second, if students’ learning does increase, professors attribute such learning to the use of direct instruction, reinforcing the idea that lectures work sufficiently well to increase students’ skills even if students have not reached the expected level. Therefore, professors feel less motivated to change their instructional strategies in class.
However, if professors recognize that lectures are not effective for students to achieve the learning objectives of their courses, the professors’ sense of urgency to change could increase, thereby driving motivation to adopt RBIS. Other mediating variables could also reinforce this sense of urgency, such as market position compared to competitors [74], acknowledgment of pedagogical evidence suggesting that traditional teaching does not work as successfully [53,86], results of performance evaluations or feedback from peers and administrators [33,35], or other contexts where faculty acknowledge the need to change (e.g., special reports, participation in education conferences and workshops, stakeholders’ assessment of students’ learning, or the need for constant improvement) [98].
An assistant professor who believes that RBIS are more effective for learning acknowledged that he was more inclined to adopt RBIS when he needed to make changes in his courses in response to his students’ progress: “Courses like mine are constantly under construction; it is not a predetermined course where I can say I came with a topic to explain something. Instead, those are courses that have to change over time according to what [students] are doing. That implies changes. I look for alternatives to my teaching because by default my personal policy is not lecturing, unless there are specific topics where I made presentations of no more than 20 min”. Other professors sought a balance between RBIS and lectures. An assistant professor who uses RBIS regularly described how the adjustments in his courses sometimes led him to use direct instruction. He believed that direct instruction works to increase students’ learning but acknowledged that RBIS work better; therefore, he tried to reduce the use of direct instruction without avoiding it entirely: “Sometimes my adjustments are additional classes, if a class was more of discovery, I give a class with more theory”.

4. Discussion

The previous dynamics model how factors within the academic system can either reinforce or hinder faculty motivation, highlighting a relationship that is neither linear nor directly causal. For example, the size of a class does not consistently correlate with faculty motivation—increasing class size does not always decrease motivation, nor does decreasing class size always increase it. As demonstrated in this paper, smaller class sizes tend to positively reinforce motivation, but increased motivation can also exert pressure to increase class sizes. However, factors such as increased faculty workload may diminish motivation, thus mediating the positive impact of small class sizes. In essence, these dynamics clarify how certain factors exert negative pressure while others exert positive pressure on motivation, and how shifts in motivation can reciprocally influence these factors.
Systems theory suggests the concept of leverage—identifying which relatively small focused actions or changes in structures can lead to significant, enduring improvements [18]. The CSDM can help hypothesize those levers for instructional change by illustrating the dynamic structure of a system.
We acknowledge, however, that without a fully formulated quantitative computer model, it would be very difficult to identify the unintended consequences of decisions and policies or other hidden sources of resistance to instructional change, and ultimately to validate and test the levers [18]. Nevertheless, we can suggest policies that potentially strengthen the reinforcing dynamics of faculty motivation or weaken the balancing dynamics that hinder it. These policies target factors that are central to a dynamic, can potentially change its direction (e.g., by creating negative links), affect several dynamics at once, and were emphasized by faculty participants, who also suggested practical ways to modify them. Figure 11 shows the model, highlighting the factors that lead to the levers.
In this section, we present nine potential levers that could significantly improve the adoption of PBL and SGCs. Table 1 summarizes these levers. We believe the proposed levers are within the agency of faculty, departments, and colleges. They are not intended to be comprehensive, and their validation merits future investigation.
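To illustrate the kind of what-if analysis that a fully quantified model would support, the sketch below varies a single hypothetical lever, the extent to which RBIS-related teaching effort is credited toward promotion, within a toy version of the workload loop and compares the resulting motivation levels. The loop structure, the lever, and all numbers are illustrative assumptions; testing the actual levers in Table 1 would require the quantitative SDM discussed in the Limitations section.

```python
# A minimal sketch of the kind of "what if" comparison a fully quantified SDM
# would support: vary one hypothetical lever (how much RBIS-related teaching
# effort is credited toward promotion) and compare motivation outcomes.
# The loop structure and every number below are illustrative assumptions.

def final_motivation(promotion_credit, semesters=20, dt=1.0):
    """Motivation after `semesters`, given a hypothetical promotion-credit lever in [0, 1]."""
    motivation, research_workload = 0.5, 0.2
    for _ in range(semesters):
        rbis_effort = 0.3 * motivation                      # extra work from adopting RBIS
        workload = 0.3 + research_workload + rbis_effort    # teaching + research + RBIS effort
        available_time = max(0.0, 1.0 - workload)
        # Scarce time erodes motivation (balancing link); credited RBIS effort restores it.
        motivation += dt * (0.2 * (available_time - motivation)
                            + 0.3 * promotion_credit * rbis_effort)
        motivation = min(1.0, max(0.0, motivation))
        research_workload = min(0.5, research_workload + dt * 0.02)
    return round(motivation, 2)

if __name__ == "__main__":
    for credit in (0.0, 0.5, 1.0):   # 0.0 = status quo policy, 1.0 = full credit
        print(f"promotion_credit={credit:.1f} -> motivation after 20 semesters: "
              f"{final_motivation(credit)}")
```

Even in this toy setting, the comparison shows how a relatively small structural change shifts where the system settles, which is precisely the property that makes a factor a candidate lever.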

5. Limitations

The results presented in this document were obtained from the interpretation of the developed CSDM. However, this model has some limitations. First, although the site selected in this study had special features and characteristics that made it suitable for developing a CSDM in a single site, this choice can also affect the transferability of the model [63]. A CSDM built in a single site is not expected to be representative of a broader population [103], but its findings could be transferred or replicated to other similar systems [17,65,103] by looking for similar cases and trying to replicate the findings of the original study [104], or by looking for a family of social systems to which the particular case belongs and enhancing and testing the findings there [63]. Building a general model would require a method similar to that used to build broader theories from multiple case studies [65,69,105], but this goes beyond the scope of this study.
However, the potential transferability of the model to other academic systems is supported by the notion of isomorphism described in the institutional theories of change [6]. Isomorphism theory suggests that “universities with even distinctive missions, have shifted over time to become more similar in character in terms of their student bodies, mission statements, focus on research over teaching, curriculum, and other components that make up the organizations” [6] (p. 38). That is, academic institutions tend to be very similar even in different countries and educational systems. This is not to say that the CSDM will fit perfectly into other academic systems but that it increases the likelihood that some elements of the model, particularly the relationships found in the CLDs, could be illustrative of other systems due to this isomorphism.
Nevertheless, to increase transferability, this study provides thorough documentation of the qualitative procedures [104], thick descriptions [106], and a strong connection to the literature [59].
Second, to predict how the academic factors that increase or decrease faculty motivation to adopt RBIS behave over time, the model would need to be adapted to specific academic contexts using quantitative descriptions. The levers we suggest are still hypothetical and are the product of an analysis of the system. Including quantitative descriptions to formulate the SDM [17] would allow a better understanding of the impact and strength of the suggested levers on instructional change. The CSDM in this article provides a first important step toward a complete SDM by describing how all the variables interconnect.
Third, the proposed CSDM is not exhaustive; it is highly possible that additional dynamics exist and require further investigation. Finally, the levers proposed in this document are not exhaustive either and are suggested as starting points for developing policies that reduce resistance to instructional change and strengthen faculty motivation.

6. Concluding Remarks

Faculty motivation is arguably the most important factor for sustainable instructional change. We argue that the best way to systemically enact change is to increase faculty motivation to adopt and maintain RBIS in universities (in this paper, we used the term RBIS as an umbrella term for PBL and SGCs). In general, instructional change is more likely to occur and be sustained when professors are willing to make that change and academic units consistently support it. Motivating faculty is not a problem that can be solved only with development programs that externally encourage faculty to experiment with new techniques and take risks in their instruction. Instead, such programs must consider the dynamic complexity of academia. This paper uses a systems perspective that models how internal factors of the system interact dynamically and ultimately act as barriers to, or drivers of, faculty motivation to enact and sustain instructional change.
We found evidence of thirteen dynamics affecting faculty motivation and presented them in the form of CLDs; their integration constitutes the CSDM. In summary, professors are less inclined to adopt RBIS in their classrooms:
(1) When professors believe that adopting RBIS will reduce the time available to cover content;
(2) If implementing such strategies does not contribute to their promotion;
(3) If their workload increases due to increases in research workload, or due to preparation, assessment, and feedback activities involved in RBIS;
(4) When classes have a higher number of students, because it is more difficult to implement RBIS;
(5) If they believe that adopting RBIS implies being more permissive with students;
(6) When they believe that direct instruction is more effective than RBIS in improving students’ learning.
Conversely, professors are more inclined to adopt these strategies:
(7) If they have more feasible ideas about how to use RBIS in their courses;
(8) When faculty increase their knowledge about how to implement RBIS;
(9) When students exhibit higher levels of motivation and learning;
(10) When they witness students actively engaging in class;
(11) When they recognize that these strategies effectively facilitate their ability to provide students with timely and high-quality feedback;
(12) When they observe an increase in positive evaluation scores for their teaching performance;
(13) If students possess the minimum level of skills to succeed in the learning activities.
Finally, we suggest six areas of future work: (1) complementing the model with other dynamics of the academic system, (2) modeling other possible sources of resistance to change, (3) simulating the effect of instructional change policies in the short and long term, (4) applying the results of such models to faculty development programs, (5) formulating a computational systems model to test the suggested levers, and (6) analyzing the dynamics of instructional change from the perspective of other stakeholders in the academic system.

Author Contributions

Conceptualization, J.M.C.-B. and S.G.A.; methodology, J.M.C.-B.; validation, S.G.A. and F.A.B.; formal analysis, J.M.C.-B.; investigation, J.M.C.-B.; resources, S.G.A.; data curation, J.M.C.-B.; writing—original draft preparation, J.M.C.-B.; writing—review and editing, J.M.C.-B., F.A.B. and S.G.A.; visualization, J.M.C.-B. and F.A.B.; supervision, S.G.A.; project administration, J.M.C.-B.; funding acquisition, S.G.A. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Western Institutional Review Board (18-574 Approval Date: 2 July 2018).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The raw data supporting the conclusions of this article can be made available by the authors upon request.

Acknowledgments

We appreciate the work of Niyousha Hosseinichimeh, assistant professor in the Industrial and Systems Engineering Department, and Cynthia Hampton, postdoctoral associate, both from Virginia Tech. Hosseinichimeh was the system dynamics expert who helped validate the initial CLDs from the literature in phase 1. Hampton’s input and invaluable insight helped to complete phase 1 of this study.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A. Population and Sample for Data Collection

Table A1. Population and sample.
Categories | Population (57) | Interview (10) | Focus Group (5)
Full-time (27): Full | 4 | 0 | 1
Full-time (27): Associate | 5 | 4 | 1
Full-time (27): Assistant professors | 15 | 4 | 2
Full-time (27): Instructors | 2 (non-tenured) | 0 | 0
Full-time (27): Emeritus Professor | 1 | 0 | 0
Adjunct (30): Non-tenured | 30 | 2 | 1
Experience and formal training with RBIS: Introductory | 23 | 8 | 3
Experience and formal training with RBIS: Advanced | 4 | 2 | 2
Other demographics (for full-time only): Female | 4 | 2 | 0
Other demographics (for full-time only): Male | 23 | 6 | 4
Other demographics (for full-time only): Senior | 14 ¹ | 3 | 3
Other demographics (for full-time only): Junior | 13 ² | 5 | 1
¹ Professors with more than 10 years in the Department. ² Professors with less than 10 years in the Department.

Appendix B. Instrument Protocols

Appendix B.1. Interview Protocol

Appendix B.1.1. First Section

This section will consist of open-ended questions related to identifying the factors and reasons that positively and negatively affect their motivation to adopt RBIS.
The interviewer will introduce the objective of the project, the interview process, and the expected outcome of the interview. The interviewer will emphasize that several questions might prove difficult and that it is expected that he or she may not have all the answers. This includes a discussion about PBL and SGCs and what they represent to the interviewee and defines RBIS as an umbrella term to represent these strategies.
The introduction, faculty motivation, and only one or two of a subset of factors are selected for each interview.
Table A2. First Section Interview Questions.
Factors | Open-Ended Interview Questions
Introduction | From your own experience or perspective, we are going to list and discuss the academic system’s factors that are promoting or hindering your willingness to implement RBIS in your courses.
Typically, RBIS are not commonly used in engineering, or the most common teaching practice is traditional lecturing.
  • Why do you think this situation persists?
Faculty motivation | [Expectancy of Success]
  • Do you believe the expectations of adopting these strategies are clear?
  • Is the adoption of these strategies challenging to faculty? Why?
  • Do you believe that faculty can succeed in implementing these strategies?
[Utility Value]
  • Do you believe that adopting these strategies is useful? To whom?
  • Does the institution care about whether you implement these strategies?
[Attainment Value]
  • Do you feel interested in adopting these strategies?
  • Do you feel faculty is interested in adopting these strategies?
[Cost Value]
  • Does the institution care about faculty success at implementing these strategies?
  • What are the costs for faculty to adopt these strategies?
[Empowerment]
  • Do you have choices about what you can do?
  • Do faculty have choices about what they can do?
  • Do you believe that faculty are empowered to adopt these strategies? Are they obligated or persuaded to do so?
[Caring]
  • Does the institution care about whether you implement RBIS? Why?
  • Does the institution care about whether faculty is successful in implementing RBIS? Why?
Category of Factors 1: Institutional support | Please describe the support provided by your institution to implement or adopt RBIS.
In terms of structures:
  • What are the resources or infrastructure to support RBIS implementation?
  • What are the policies for training related to RBIS implementation?
  • How does the class content favor the RBIS implementation?
  • How does the timing and sequence of the instruction favor the RBIS implementation?
  • Does the time allotted in class favor the RBIS implementation? Why?
In terms of networking and community:
  • Are there activities intended to create community of practices (working collaboratively around teaching)? Are these activities coordinated? Please describe;
  • How consistent is this support? Please explain;
  • Is it effective, accessible, or enough? Please explain.
Category of Factors 2: Faculty pedagogical knowledge
  • Do you implement RBIS? Never, sometimes, often, always.
  • Do you think faculty implement RBIS? Never, sometimes, often, always.
  • Which percentage of your class time is used in RBIS?
  • What is your level of expertise about RBIS? Aware, familiar, expert? Please explain.
  • What is the faculty level of expertise about RBIS? Aware, familiar, expert? Please explain.
  • Have you participated in activities intended to increase your knowledge or skills about RBIS? Please describe them.
  • Do you think they favor RBIS implementation in your case? Why?
  • Have faculty participated in these activities? Never, sometimes, often, regularly?
  • Do you think the activities favor RBIS implementation for all faculty? Why?
Category of Factors 3: Institutional culture
  • What are the faculty assumptions about RBIS?
  • What are the institutional assumptions about RBIS?
  • Are they favorable to RBIS adoption? Why?
  • Do faculty value their adoption? Why?
  • What are the collective beliefs, symbols, or attitudes of your academic institution regarding these strategies? Are they favoring the adoption of these strategies? Why?
Category of Factors 4: Students’ experience
  • How do students perceive their academic experience in classrooms where these strategies are implemented?
  • Do you think students favor them? Resist them? Why?
  • Do you think their implementation improves students’ learning? Please explain.

Appendix B.1.2. Second Section

Table A3. Second Section Interview Protocol.
Context Description | Open-Ended Interview Questions
This section focuses on building a CLD with the interviewee. The interviewer briefly explains the process of constructing a causal loop and the basic notation. This section is repeated to elicit two or more causal loops.
From the first section, the interviewer selects a factor. The factor selected must be defined in a way that can be explained as a variable that increases or decreases. Then they ask these questions:
  • Assuming that all other variables in the system are static:
  • Which do you think are the direct causes for the increase in such variables? Why?
  • Or which other variables will increase such variables?
  • What do you think are the direct consequences for the increase in such variables? Why?
For each new variable, the questions will be repeated until a loop is found. Then they ask these questions:
  • Does this loop make sense to you? Why, or what needs to change in this loop to make sense to you?
  • Does it explain a clear story that relates to a personal experience? Please elaborate.
  • What aspects can be distinguished about the increase or decrease in motivation to adopt these strategies?

Appendix B.2. Focus Group (GMB) Protocol

Date: _______________________ Hour: ________________ Location: ______________
Number of participants: __________
Roles: Facilitator and modeler: ______________________
Process and content coach: _________________________________________
Recorder: _______________________________________
Gatekeeper: _____________________________________
Notes: This protocol is based on Luna-Reyes, Martinez-Moyano [54].
The modeler should be knowledgeable of the subject matter and will focus on drawing the model and aid in preventing the group from developing a one-sided view of the problem.
The facilitator will ask the questions, guide the group discussion, maintain the flow of the conversations, and take notes on the white board to keep track of the so-called group memory.
The content and process coach will focus on the group process and dynamics; he or she will observe the group and help the facilitator to identify strategies to keep the group effective.
The recorder should take notes that can be used to keep discussion on track and make the final report.
The gatekeeper will be one of the Department’s administrators interested in solving the problem, and will focus on motivating the participants to engage in the group activity.

Appendix B.2.1. First Section

Table A4. Focus group protocol—First section.
Steps
1. The facilitator introduces the concepts of system dynamics and causal loops with a simple and known example.
2. The facilitator presents the problem to model (to increase the faculty motivation to adopt RBIS).
3. The facilitator asks the group to identify (individually and then collectively) as many problem-related variables (factors) as possible. The facilitator will show the categories of factors that affect instructional change to help them generate ideas and to focus the conversation. The question to motivate the activity is: What are the key variables affecting faculty motivation to adopt RBIS?
4. The facilitator will ask the group to discuss why these variables are important.
5. The facilitator will prioritize and organize the variables according to group consensus. Then, the variables will be organized around the categories of factors. The facilitator will make sure that the definition of each variable is clear to the group.

Appendix B.2.2. Second Section

Table A5. Focus group protocol—Second section.
Steps | Open-Ended Questions
For each category of factors, the facilitator presented the different CLDs created in the interviews. Then, they asked the group to discuss and critique the causal loops.
  • Do they make sense to the group?
  • Do they explain a clear story that relates to a personal experience?
  • Are there other variables that they consider better explain these loops or extend their narrative?
For each new variable, the questions were repeated until a loop was found.
Then, they asked these questions:
  • Does this loop make sense to you? Why, or what needs to change in this loop to make sense to you?
  • Does it explain a clear story that relates to a personal experience? Please elaborate.
  • What aspects can be distinguished about the increase or decrease in motivation to adopt these strategies?
Each member was asked to connect the key variables of each category within the presented CLDs.
  • From the list of variables, which one do they think causes any of the variables in the loop?
  • Which one do they think is caused by any of the variables in the loop?
  • What are the possible explanations for each link?
The facilitator combined each response into one diagram and presented it to the group. Then, they asked the group to discuss, explain, critique, and agree with each new connection. After the group reached a consensus, the facilitator presented another CLD found in the interviews and repeated the last two steps.
This second section ended when the group agreed that the model told a complete story.

References

  1. Henderson, C.; Dancy, M. Physics faculty and educational researchers: Divergent expectations as barriers to the diffusion of innovations. Am. J. Phys. (Phys. Educ. Res. Sect.) 2008, 76, 79–91. [Google Scholar] [CrossRef]
  2. Jamieson, L.H.; Lohmann, J.R. Creating a Culture for Scholarly and Systematic Innovation in Engineering Education: Ensuring U.S. Engineering Has the Right People with the Right Talent for a Global Society; American Society of Engineering Education: Washington, DC, USA, 2009. [Google Scholar]
  3. National Academy of Engineering. Educating the Engineer of 2020: Adapting Engineering Education to the New Century; National Academies Press: Washington, DC, USA, 2005. [Google Scholar]
  4. Abrami, P.C.; Poulsen, C.; Chambers, B. Teacher motivation to implement an educational innovation: Factors differentiating users and non-users of cooperative learning. Educ. Psychol. 2004, 24, 201–216. [Google Scholar] [CrossRef]
  5. Henderson, C.; Beach, A.; Finkelstein, N. Facilitating change in undergraduate STEM instructional practices: An analytic review of the literature. J. Res. Sci. Teach. 2011, 48, 952–984. [Google Scholar] [CrossRef]
  6. Kezar, A. How Colleges Change: Understanding, Leading, and Enacting Change; Routledge: Oxfordshire, UK, 2014. [Google Scholar]
  7. McKenna, A.F.; Froyd, J.; Litzinger, T. The Complexities of Transforming Engineering Higher Education: Preparing for Next Steps. J. Eng. Educ. 2014, 103, 188–192. [Google Scholar] [CrossRef]
  8. Finelli, C.J.; Daly, S.R.; Richardson, K.M. Bridging the Research-to-Practice Gap: Designing an Institutional Change Plan Using Local Evidence. J. Eng. Educ. 2014, 103, 331–361. [Google Scholar] [CrossRef]
  9. Kezar, A.; Gehrke, S.; Elrod, S. Implicit theories of change as a barrier to change on college campuses: An examination of STEM reform. Rev. High. Educ. 2015, 38, 479–506. [Google Scholar] [CrossRef]
  10. Dearing, J.W. Applying diffusion of innovation theory to intervention development. Res. Soc. Work Pract. 2009, 19, 503–518. [Google Scholar] [CrossRef] [PubMed]
  11. Froyd, J.E. Developing a dissemination plan. In Proceedings of the ASEE/IEEE Frontiers in Education Conference, Reno, NV, USA, 10–13 October 2001; IEEE: Reno, NV, USA, 2001; p. F2G-18. [Google Scholar]
  12. Matusovich, H.M.; Paretti, M.C.; McNair, L.D.; Hixson, C. Faculty Motivation: A Gateway to Transforming Engineering Education. J. Eng. Educ. 2014, 103, 302–330. [Google Scholar] [CrossRef]
  13. Jones, B.D. Motivating Students to Engage in Learning: The MUSIC Model of Academic Motivation. Int. J. Teach. Learn. High. Educ. 2009, 21, 272–285. [Google Scholar]
  14. Cruz, J.M.; Hampton, C.; Adams, S.G.; Hosseinichimeh, N. A System Approach to Instructional Change in Academia. In Proceedings of the American Society of Engineering Education (ASEE), Tampa Bay, FL, USA, 15–19 June 2019. [Google Scholar]
  15. Lattuca, L.R. Influences on engineering faculty members’ decisions about educational innovations: A systems view of curricular and instructional change. In Proceedings of the Forum Impact Diffusion Transform, Engineering Education Innovation, Washington, DC, USA, February 2011. [Google Scholar]
  16. Ghaffarzadegan, N.; Larson, R.; Hawley, J. Education as a Complex System. Syst. Res. Behav. Sci. 2017, 34, 211. [Google Scholar] [CrossRef]
  17. Sterman, J.D. Business Dynamics: Systems Thinking and Modeling for a Complex World; McGraw-Hill Education: New York, NY, USA, 2000. [Google Scholar]
  18. Senge, P.M. The Fifth Discipline: The Art and Practice of the Learning Organization; Crown Pub: London, UK, 1990. [Google Scholar]
  19. Luna-Reyes, L.F.; Andersen, D.L. Collecting and analyzing qualitative data for system dynamics: Methods and models. Syst. Dyn. Rev. 2003, 19, 271–296. [Google Scholar] [CrossRef]
  20. Vennix, J.A.M. Group Model Building Facilitating Team Learning Using System Dynamics; John Wiley & Sons: New York, NY, USA; Chichester, UK, 1996. [Google Scholar]
  21. Edström, K.; Kolmos, A. PBL and CDIO: Complementary models for engineering education development. Eur. J. Eng. Educ. 2014, 39, 539–555. [Google Scholar] [CrossRef]
  22. Dochy, F.; Segers, M.; Van den Bossche, P.; Gijbels, D. Effects of problem-based learning: A meta-analysis. Learn. Instr. 2003, 13, 533–568. [Google Scholar] [CrossRef]
Figure 1. CLD Construction Methods.
Figure 2. Construction of CLDs.
Figure 3. Conceptual System Dynamics Model of Faculty Motivation to Adopt PBL and SGCs. Blue arrows represent positive causal links (an increase in the cause variable leads to an increase in the effect variable), whereas red dotted arrows represent negative causal links (an increase in the cause variable leads to a decrease in the effect variable).
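The reinforcing and balancing labels used in the captions that follow reflect the standard causal-loop-diagram convention: a feedback loop is reinforcing when the product of its link polarities is positive and balancing when it is negative. The sketch below is a minimal, hypothetical illustration of that convention in Python; the variable names and link signs are simplified stand-ins for a few of the factors shown in Figure 3 (faculty motivation, RBIS adoption, student engagement, and time available to cover content), not the authors' full model.

```python
# Minimal sketch (not the paper's model): classify a causal loop as reinforcing
# or balancing from the signs of its links. Variable names are illustrative.
from math import prod

# Signed links: (cause, effect) -> +1 for a positive link, -1 for a negative link.
links = {
    ("faculty motivation", "adoption of RBIS"): +1,
    ("adoption of RBIS", "student engagement"): +1,
    ("student engagement", "faculty motivation"): +1,               # closes a reinforcing loop
    ("adoption of RBIS", "time available to cover content"): -1,    # RBIS reduces time for content
    ("time available to cover content", "faculty motivation"): +1,  # closes a balancing loop
}

def loop_polarity(loop):
    """Classify a closed chain of variables as reinforcing or balancing."""
    signs = [links[(loop[i], loop[(i + 1) % len(loop)])] for i in range(len(loop))]
    return "reinforcing" if prod(signs) > 0 else "balancing"

print(loop_polarity(["faculty motivation", "adoption of RBIS",
                     "student engagement"]))                 # -> reinforcing
print(loop_polarity(["faculty motivation", "adoption of RBIS",
                     "time available to cover content"]))    # -> balancing
```

An even number of negative links therefore yields a reinforcing loop, and an odd number yields a balancing loop, which is how the dynamics in Figures 4 through 10 are labeled.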
Figure 4. Time invested in covering content balancing dynamic. When professors believe that adopting RBIS will reduce the time available to cover content, they will be less inclined (note the red dotted arrow) to adopt such strategies in their classroom.
Figure 5. Time on activities required for promotion balancing dynamics: Professors are less inclined to spend their available time adopting RBIS if implementing such strategies does not contribute to their promotion. Faculty workload balancing dynamics: Faculty are less inclined to adopt RBIS if their workload increases due to increases in research workload and/or the preparation, assessment, and feedback activities involved in RBIS. For clarity, previous loops are drawn in light gray.
Figure 6. Class size balancing dynamic: Professors will feel less motivated to adopt RBIS in classes with a higher number of students because these strategies can be more difficult to implement.
Figure 7. Difficulty in implementing RBIS reinforcing dynamics: A higher number of feasible activities decreases the perceived difficulty of using RBIS in class and thus increases faculty motivation. Pedagogical training reinforcing dynamics: When faculty increase their knowledge about how to implement RBIS, it becomes easier to adopt them.
Figure 8. Students’ motivation and learning reinforcing dynamic: Professors are more inclined to use RBIS when students exhibit higher levels of motivation and learning. Students’ engagement in class reinforcing dynamic: Faculty experience heightened motivation and a sense of gratification when they witness students actively engaging in class. Quality and timely feedback reinforcing dynamic: Faculty motivation is enhanced when they recognize that these strategies help them provide students with timely, high-quality feedback.
Figure 9. Students’ evaluation of teaching reinforcing dynamics: Professors can be motivated to adopt RBIS when they observe an increase in positive evaluation scores for their teaching performance. Permissiveness balancing dynamics: Professors will be less inclined to adopt RBIS if they believe that doing so implies being more permissive with students.
Figure 10. Students’ ability to succeed with learning activities reinforcing dynamic: Professors are motivated to adopt RBIS in their courses if students possess the minimum level of skills, because this leads to higher student success and increased motivation to learn. Sense of urgency to change balancing dynamic: Faculty motivation decreases when they believe that direct instruction is more effective than RBIS in improving students’ learning.
Figure 11. Levers for Instructional Change.
Table 1. Summary of the nine levers.
Reducing the content to cover in classes: When professors believe that adopting RBIS will reduce the time available to cover content, they will be less inclined to adopt them. If the class content is high, professors prefer lectures because they are perceived as more efficient for covering content.
Increasing the value of teaching in the criteria for tenure and promotion: Placing a higher value on the implementation of teaching innovations in the criteria for tenure and promotion would create a strong incentive for faculty to invest time in instructional change.
Controlling the faculty workload: A higher teaching workload reduces faculty motivation. The practical experience gained by faculty as they use RBIS can reduce their teaching workload over time, especially with policies that give novice adopters enough time to implement RBIS effectively and allow them to teach future iterations of the same course.
Controlling the class size: A manageable class size increases faculty motivation. Larger class sizes are highly correlated with greater difficulty implementing PBL, lower student motivation and engagement, more faculty workload, lower student evaluation of teaching (SET) scores, a reduction in the timely and quality feedback that can be provided in class, and a less positive experience adopting RBIS.
Implementing formal pedagogical training on how to use RBIS in the classroom: Faculty motivation can be increased through formal pedagogical training focused on the pedagogical principles that explain why RBIS work and how they can be easily implemented in the classroom. This, in turn, supports a more positive experience adopting RBIS.
Reducing the association between RBIS and permissiveness: Faculty will be more motivated if they attribute good student evaluation of teaching scores to an increase in student learning as a consequence of adopting RBIS, rather than to greater permissiveness or leniency with students in an effort to reduce attrition.
Recognizing faculty who adopt RBIS: Faculty will be more motivated if they believe that adopting RBIS increases their recognition as better teachers.
Demonstrating the effectiveness of adopting RBIS: Faculty motivation can be enhanced when they are convinced that RBIS effectively improve learning and engagement. This conviction can be reinforced by gathering evidence, from both their current and future classes, that student learning, engagement, and success in class activities improve with the adoption of RBIS.
Increasing the sense of urgency: A powerful initiator of instructional change is a conscious urgency about the need for change. Faculty motivation could increase if professors believe that lectures are not effective enough for students to achieve the learning objectives of their courses.
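As a complement to Table 1, the following minimal sketch illustrates how a single lever, reducing the content to cover in classes, can tip the system from a dominant balancing loop (content pressure) to a dominant reinforcing loop (student engagement). The coefficients, variable names, and functional forms are illustrative assumptions only; they are not estimated from the study's data and do not reproduce the authors' conceptual model.

```python
# Hypothetical illustration (not a calibrated model from the paper) of one lever
# from Table 1: reducing content load shifts loop dominance and the motivation trend.
def simulate(content_load, steps=40, dt=0.25):
    motivation = 0.5                                         # faculty motivation, normalized 0..1
    for _ in range(steps):
        adoption = motivation                                # more motivation -> more RBIS use
        engagement_gain = 0.30 * adoption                    # reinforcing loop: engaged students feed motivation
        content_pressure = 0.40 * content_load * adoption    # balancing loop: less time to cover content
        motivation += dt * (engagement_gain - content_pressure)
        motivation = min(max(motivation, 0.0), 1.0)          # keep within bounds
    return motivation

print(f"High content load:    {simulate(content_load=1.0):.2f}")
print(f"Reduced content load: {simulate(content_load=0.5):.2f}")
```

Under these assumed parameters, a high content load lets the balancing loop dominate and motivation erodes over time, whereas halving the load lets the engagement loop dominate and motivation grows toward its ceiling, which is the kind of loop-dominance shift the levers in Table 1 are intended to produce.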
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
