Article

Development and Evaluation of Interactive Flipped e-Learning (iFEEL) for Pharmacy Students during the COVID-19 Pandemic

1 Department of Pharmaceutics, College of Pharmacy, King Saud University, Riyadh 11451, Saudi Arabia
2 Department of Clinical Pharmacy, College of Pharmacy, King Saud University, Riyadh 11451, Saudi Arabia
3 Kayyali Chair for Pharmaceutical Industries, College of Pharmacy, King Saud University, Riyadh 11451, Saudi Arabia
* Author to whom correspondence should be addressed.
Int. J. Environ. Res. Public Health 2022, 19(7), 3902; https://doi.org/10.3390/ijerph19073902
Submission received: 30 December 2021 / Revised: 28 February 2022 / Accepted: 10 March 2022 / Published: 25 March 2022

Abstract

Background: Distance learning has come to the forefront of educational delivery throughout the world due to the COVID-19 pandemic. Presently, there is a paucity of studies that have utilized interactive e-lectures as a model for remote flipped learning. Objectives: To compare educational outcomes of the remote interactive flipped e-learning (iFEEL) activity versus paper-based in-class group learning (PICkLE). Methods: During the spring 2021 semester, tutorials in pharmaceutical quality control and good manufacturing practice were delivered remotely to students by two different approaches: PICkLE and iFEEL. In the latter activity, interactive e-lectures were software-designed and included several audiovisually enhanced illustrations to encourage students to interact with the lecture material prior to attending the virtual class. Class time was reserved for in-class quizzes and discussion. Mean exam scores were compared, and voluntary questionnaires were distributed among the participating students as well as healthcare faculty members in 29 Saudi universities. Data from the remotely delivered course were compared with data from previous course offerings (2018–2020) that used the live PICkLE method. Results: Mean post-lecture test scores increased significantly (p < 0.05) compared with pre-lecture tests in both remote PICkLE and iFEEL. The iFEEL activity showed a higher mean post-test score (95.2%) compared to live PICkLE (90.2%, p = 0.08) and remote PICkLE (93.5%, p = 0.658). Mean comprehensive exam scores increased from 83.8% for remote PICkLE to 89.2% for iFEEL (p = 0.449). On average, 92% of students and 85% of faculty members reported positive feedback on the five quality attributes of the e-lectures. Over 75% of students preferred the iFEEL over the PICkLE activity for future course offerings, and 84% of faculty members recommended the integration of interactive e-lectures in their future courses. Conclusion: iFEEL represents a novel model of remote flipped learning and shows promising potential to be incorporated into live blended-learning classroom activities.

Graphical Abstract

1. Introduction

The COVID-19 pandemic led to a sudden suspension of face-to-face teaching activities across the world [1]. Widespread school closures in more than 200 countries have displaced approximately 1.6 billion learners, equivalent to more than 94% of the student population across the globe [2,3]. The pandemic forced academic institutions to immediately shift to virtual education or online learning as an emergency solution [1,2,4]. In response to such an urgent transition, faculty and educators adopted several strategies to facilitate online learning, such as providing educational resources for self-study [2], pre-recorded lecture videos [1], video conferencing, interactive online simulation programs [2], flipped and flipped-jigsaw learning [5,6,7], online problem-based learning [8,9], and synchronous and asynchronous online discussions [2]. Alternative strategies for courses with clinical/practical aspects involved posting clinical case scenarios and/or uploading ready-made videos covering the required topic [2].
In particular, the implementation of the flipped classroom shifts the educator's role towards that of a facilitator who promotes problem solving, rather than delivering didactic knowledge. However, the online format of flipped classrooms should be carefully implemented to avoid negative learning outcomes. Cho and Kim reported inferior self-directed learning readiness and professor–student interaction when comparing online to face-to-face flipped learning [10]; therefore, these parameters should be carefully considered when implementing online flipped classrooms. On the other hand, Smith and Boscak utilized the flipped classroom mode of instruction in restructuring a course to an online format [5]. They reported successful course delivery with good learning outcomes, highlighting the merits of self-directed learning and flipped-classroom techniques. Duszenko et al. implemented a flipped classroom model in a digital training course [11]. In that study, most students favored the flipped classroom format and 52% acknowledged that the flipped classroom was very helpful. The students preferred their peers' simpler explanations over instructor-led learning and perceived that this method enabled them to identify incomplete or misunderstood information [11]. Ketterer et al. reported improved learning outcomes after implementing an online case-based curriculum in a flipped classroom design [7]. Although flipped learning requires considerable preparation on the instructor's part, it results in a high-yield educational impact with less instructional time [12].
Several efforts have been made to determine the best means of actively engaging healthcare learners remotely [13,14]. Pilkington and Hanif preferred short pre-recorded lecture videos rather than live streaming of lectures [1]. After reviewing the lectures, students were then allowed to attend live tutorial sessions if they had any questions that they wanted to ask the lecturer directly. Suppan et al. described the development of a storyline e-learning module for healthcare professionals to provide education about the proper use of personal protective equipment during the pandemic [15]. However, these online education platforms have completely different requirements, structures, and contexts compared to traditional face-to-face education. Most importantly, success in these online learning activities requires a high degree of student self-direction, motivation, and interaction [16].
During the COVID-19 pandemic, some preliminary studies reported that the sudden transition to online learning was well accepted [17,18,19]. Some students reported positively rated aspects of online learning, such as flexibility, affordability [20], higher motivation, and lower time demands, and even found online learning easier than "face-to-face" courses [21].
However, many other studies [22,23,24] have emphasized the challenges that many students face, such as low concentration in online learning [22], lack of engagement in online classes [25], lack of preparedness [20,21], procrastination and poor self-regulation [20,26,27], poor time management [28], feelings of isolation, low motivation, high stress and anxiety [23], the digital divide, and technical difficulties [20,29,30]. On the other hand, teachers reported that the transition to online learning was time constraining and demanded more preparation and better classroom management [24,31], along with the difficulty of working from home [21] and limited supervision of online assessments [29]. Students and faculty with low digital competence found it difficult to make optimal use of online learning [29].
Another challenge that faced both students and educators was the unprecedented load on the Internet network [30]. Live classrooms placed a large workload on teaching servers, with significant peaks at scheduled lecture times, whereas access to on-demand classrooms and teaching materials was spread over the day. Some students and/or educators suffered occasional internet impairments, which are particularly critical for synchronous online communication. On the other hand, asynchronous learning environments, such as on-demand classrooms and pre-recorded videos, appear to be less susceptible to internet impairments but lack real-time interaction between educators and learners [20].
Therefore, innovative solutions were needed to circumvent such challenges and enhance online learning quality during the COVID-19 pandemic. Dhawan emphasized that online courses should be dynamic, interactive, student-centered, and interesting [20]. Adedoyin and Soykan highlighted the importance of developing equitable and unbiased online assessment systems to minimize the risks of cheating and plagiarism [29]. Khtere and Yousef concluded that the instructor plays a principal role in achieving professional online learning outcomes by acting as a guide and motivator for students, an instructional designer who utilizes technology to enhance the quality of learning, and a sincere advisor who provides effective feedback, follow-up, and necessary recommendations to students [3].
Given the previously discussed limitations of online learning, there is an urgent need to develop e-learning techniques that are interactive, user-friendly, and compatible with various digital devices, that foster student engagement in the learning process, and that avoid technical issues.
As a response to the global lockdown situation, the Center for Excellence in Learning and Teaching at King Saud University (KSU) announced the launch of the fifth cycle of the Excellence in Teaching and Learning Grants Program. This cycle targeted the most current topics in distance learning involving e-learning activities, online/remote assessments, and reusable electronic learning resources.
Therefore, we introduced an interactive e-learning activity for pharmacy students to enable them to continue their education remotely with a high level of interaction and enhanced learning outcomes. To the best of our knowledge, interactive flipped e-learning (iFEEL) is an active learning model that was applied for the first time in our college. It was implemented to evaluate the potential of this method to keep students focused during distance learning and encourage them to review the lecture information electronically prior to attending the virtual lecture [32]. The objective of this study was to introduce the iFEEL activity into the Pharmaceutical Quality Control and Good Manufacturing Practice (GMP) course and compare its educational outcomes against paper-based in-class group learning (PICkLE). Student scores, perceptions, and preferences for teaching modality were compared to determine the most effective strategy.
Important questions that can be addressed by the current study include: “how to motivate/engage students in online learning” and “how to ensure proper knowledge transfer”. Moreover, it is important to explore which novel teaching modalities should be permanently anchored in online learning. The overall goal of this study was to obtain a qualitative and quantitative assessment of the newly developed iFEEL activity by students and teachers in terms of effectiveness, quality assurance, and in comparison with the previously delivered live courses [11].

2. Methods

2.1. Study Design

This was a pilot study that utilized a mixed-methods design [33] and predominantly followed the ADDIE (Analysis, Design, Development, Implementation, Evaluation) theoretical framework [34]. The study was conducted in the College of Pharmacy at KSU (Level 10, Bachelor's degree (B. Pharm), male campus). At the time the study was conducted, the KSU College of Pharmacy was in the final stages of transitioning from offering both the B. Pharm and the Doctor of Pharmacy (PharmD) degrees to a single terminal degree, the PharmD. The Pharmaceutical Quality Control and GMP course (tutorials) was selected to pilot the iFEEL activity in the spring 2021 semester. In addition, data from the past five live offerings of this course (2018–2020) were retrospectively collected and compared with the data from the current remote delivery of the course. This course covers important pharmaceutical quality control tests with an overview of the various pharmacopoeias. By the end of this course, students should have gained the basic knowledge and skills needed for employment in the quality control of pharmaceutical products. Examples of topics covered by the course include friability, disintegration, and dissolution tests, uniformity of dosage units, quality control of glass containers, and pyrogen and particulate matter tests.

2.2. Software and Applications

A variety of software and applications were utilized to facilitate student learning, comprehension, and online assessment. Blackboard® is the standard learning management system provided by KSU. Storyline 360®, an e-learning authoring application designed by Articulate Inc. (New York, NY, USA), and Camtasia® 2020 Education, screen-recording and video-editing software developed by TechSmith Corporation (Okemos, MI, USA), were utilized to design the e-lectures. Google Forms® was used as part of the free, web-based Google Docs Editors suite offered by Google.

2.3. Procedure

Due to COVID-19 pandemic precautions and in accordance with the Saudi Arabian Ministry of Education's directions, the course was delivered remotely. In the KSU College of Pharmacy, cloud meeting software (such as Zoom and Blackboard Collaborate) was the standard method of delivering didactic lectures during the COVID-19 pandemic. Alternative methods also included posting recorded videos or PowerPoint presentations. These methods were primarily instructor-focused, and students' attention during the lecture could not be ensured or monitored. In contrast, the current study utilized two self-learning interactive approaches, namely (A) paper-based in-class group learning (PICkLE) and (B) interactive flipped e-learning (iFEEL), as shown in Figure 1.
In the PICkLE activity, a pre-class online test was distributed to students before each lecture to assess their baseline knowledge of the lecture information (Figure 1, Step A1). The lecture material for PICkLE was provided in the form of printable PDF files distributed among the students. The students were informed that this material was intended to be studied during the virtual classroom (Step A2). During the class, students were instructed to read the lecture material and solve an open-book online exam in groups (Step A3). The instructor's role involved answering the students' inquiries on difficult/unclear lecture information, assessing the students' answers immediately to evaluate their understanding of the lecture material, and emphasizing common mistakes rather than delivering the lecture information in the traditional didactic manner. After the class ended, the students took a post-class online test, which was prepared in a manner similar to the pre-test (Step A4).
In the iFEEL activity, the students took an online pre-test (step B1), and then the students were instructed to review the e-lecture and answer the questions embedded in the lecture prior to the class (Step B2). During the class, students were instructed to solve an open-book online exam, individually (Step B3). Similar to PICkLE, the instructor’s role involved resolving the students’ inquiries and emphasizing common mistakes. The students took the post-class test after the class ended (B4).
The course consisted of ten lectures (tutorials). The first five lectures were delivered using PICkLE (as a control), while the remaining five lectures were delivered using iFEEL. The lectures assigned to iFEEL were closely related to one another and were considered more suitable for delivery by the audiovisually enhanced iFEEL method. The students were provided with the course lecture materials in printable PDF format at the beginning of the semester (i.e., the 1st lecture).
In previous course offerings before COVID-19, the PICkLE method had successfully been used with high levels of student acceptance. In the current study, students were divided into groups of three members to read the lecture material and take the open-book online exam during each virtual lecture [35]. The open-book exam was designed using Google Forms® to give prompt feedback to the students regarding their scores [35]. Subsequently, the instructor was able to view the students' results immediately to assess their understanding of the lecture material, emphasize the common mistakes, and summarize the important information.
iFEEL is a novel application of flipped learning (FL) using software-designed interactive e-lectures that were made accessible to students at least 24 h before the virtual lecture. The e-lectures were designed in Storyline 360® in the form of interactive slides that contained a written explanation of the lecture information, computerized audio narrations, animated photos/illustrations, and/or educational videos. In addition, the e-lectures were enriched with hyperlinks to provide extra supportive audio/visual explanations of difficult/important information. Most importantly, the lecture slides included numerous interactive questions distributed throughout the lecture. These interactive questions provided the students with prompt feedback/scoring to assess their comprehension and increase their interaction with the scientific material. If a student missed the correct answer, they were usually given one to three further chances until they answered the question correctly. In certain situations, hints were provided to help the student discover the correct answer if they missed it on the first attempt. Besides the standard multiple-choice and fill-in-the-blank questions, interactive question types such as drag-and-drop, clicking on the correct spot in a picture, watching a video and discovering the mistake, and arranging steps in the correct order, along with audio/visual-enhanced supporting tools, were embedded to encourage critical thinking. The educational videos of the lectures were voiced over and edited with Camtasia®, whenever needed. The e-lecture included a course progress icon to encourage students to complete the lecture. At the end of the lecture, the students were immediately provided with their cumulative score.
The e-lectures were designed with a restriction feature to prevent students from skipping any slide and/or question. A student was able to move to the next slide only after spending the pre-determined time on the slide and/or interacting with the required slide items.
The iFEEL activity was the first interactive e-lecture applied in this course. Therefore, a training video (in Arabic) was designed by the instructors and sent to the students prior to starting the iFEEL model to illustrate the e-lecture features and explain the proper way to interact with them. Captured screenshots of the e-lecture slides are presented in Figure 2. In addition, a model example of the e-lecture can be accessed through the following link: https://360.articulate.com/review/content/5f0c0097-d4ce-4393-a9e5-1793c3140d19/review (accessed on 17 March 2022).
In iFEEL, prior to attending the virtual lecture, the students had to complete the e-lecture and answer all the embedded questions with a minimum score of 80%. Each student was given unlimited attempts to obtain the required score. During the lecture, each student took an in-class open-book exam individually. Subsequently, the instructor discussed the results, emphasized the common mistakes, and summarized the important information. Student participation was determined by whether or not the student completed the e-lecture and achieved a minimum score of 80% on the embedded lecture questions by the specified deadline.
The results of the PICkLE and iFEEL in-class quizzes were not subjected to score comparison due to the major differences between the testing methods (individual vs. group) in each technique; however, student comprehension was assessed by using other methods.
In the current course offering, each lecture, in both PICkLE and iFEEL, involved a short pre-class test (pre-test) (Figure 1, Steps A1 and B1) and a post-class test (post-test) (Figure 1, Steps A4 and B4) to evaluate the effect of each learning activity on students' comprehension of the lecture material. The pre- and post-tests were designed and made available to the students through Blackboard®. The post-test was predominantly designed using the "question bank" feature to randomly select three different questions for each student. This step was undertaken to decrease the possibility of any increase in scores due to memorization or screen capturing of the pre-test questions [36]. Only the post-test scores served as a reflection of the students' overall performance and comprehension.
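The randomization itself was handled by Blackboard's question-bank feature; purely as an illustration of the idea, a hypothetical Python sketch of drawing three distinct questions per student (with invented question identifiers) might look like this:

import random

# Hypothetical question bank for one lecture (question identifiers only)
question_bank = [f"Q{i:02d}" for i in range(1, 16)]

def draw_post_test(bank, n_questions=3, seed=None):
    # Sample n distinct questions without replacement for one student
    rng = random.Random(seed)
    return rng.sample(bank, n_questions)

for student in ["Student 1", "Student 2", "Student 3"]:
    print(student, draw_post_test(question_bank))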
After completing the lectures for each learning method, the students took a paper-based open-book comprehensive exam on campus. Each of the two exams covered only the related lectures for the learning method.
The pre- and post-tests related to the lectures conducted with the PICkLE and iFEEL activities were grouped and analyzed separately. In addition, the students' marks in the midterm exam (PICkLE activity) and the final exam (iFEEL activity) were compared to evaluate learning outcomes and student performance for each technique. Furthermore, the mean post-test scores of the past five live offerings of this course (2018–2020) were retrospectively collected and compared with the corresponding remote PICkLE and iFEEL data. The exam results were statistically analyzed to determine whether there was a significant difference between these activities in terms of student scores.

2.4. Questionnaires to Evaluate Students' and Faculty Members' Perceptions

Each e-lecture contained an embedded questionnaire that solicited students’ opinions about the important lecture elements: the quality/clarity of lecture information, audio narration, educational videos, animations, and embedded questions. This step was necessary to assist the instructors in identifying any technical problems/student dissatisfaction periodically and making immediate modifications.
At the end of the current course, all students were encouraged to participate in a voluntary survey to solicit their feedback about both PICkLE and iFEEL [35]. The survey was in the Arabic language and designed to evaluate student feedback anonymously. The survey was created using Google Forms® and the survey link was sent by the instructor to the students. The survey consisted of three sections related to the PICkLE model, the iFEEL learning model, and general points/suggestions. In addition, data collected from the surveys distributed in the past live courses were analyzed and compared with the current course data.
Another anonymous, voluntary survey was sent to pharmacy college faculty members at various Saudi universities. The survey sample covered faculty members of both governmental and private colleges and was intended to solicit their perceptions about iFEEL. The survey was created using Google Forms® with the option of selecting an Arabic or English version, and the survey link was sent to the faculty members' university email addresses. The survey consisted of three sections: general respondent information, evaluation of an example of the designed e-lectures, and perceptions and opinions about using e-lectures as a model for flipped learning.

2.5. Validity and Reliability of the Surveys

Each questionnaire was reviewed linguistically and analytically to confirm that the questionnaire items were relevant to the study scope, clear, and unambiguous to understand and respond to [37]. The Likert-scale items were rated from 1 (I strongly object/not completely helpful, etc.) to 5 (I strongly support/very helpful, etc.) and subsequently assessed by descriptive statistics, namely the mean and standard deviation [38]. The homogeneity of the faculty members' responses across different general characteristics (gender, academic rank, degree, and discipline) was assessed by ANOVA and Levene's tests [39,40]. The students' and faculty members' written comments were subjected to qualitative content analysis to find common themes among the perceptions [41]. The reliability (internal consistency) of the surveys was assessed through the Cronbach's alpha (α) value. An α value above 0.7 was considered acceptable [36,37,42].
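The reliability analysis itself was run in SPSS; as a hedged, purely illustrative sketch, Cronbach's alpha for a block of Likert items could be computed as follows in Python (the respondents and item scores below are invented, not the study data):

import numpy as np

def cronbach_alpha(item_scores):
    # item_scores: (n_respondents x n_items) array of Likert ratings
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]
    sum_item_var = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - sum_item_var / total_var)

# Hypothetical 1-5 ratings from six respondents on four survey items
responses = [
    [5, 4, 5, 4],
    [4, 4, 4, 3],
    [3, 3, 4, 3],
    [5, 5, 5, 5],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
]
print(f"Cronbach's alpha = {cronbach_alpha(responses):.3f}")  # values above 0.7 were considered acceptable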

2.6. Ethical Considerations

The protocol for this research was reviewed and approved by the KSU Medical College Institutional Review Board (#E-21-5718). Student scores and feedback were collected retrospectively as part of the normal assessment during the course. Students were informed in advance that the e-lectures contained an embedded survey to assess the quality of each lecture and identify any technical issues. The student course-completion and faculty member surveys included a pre-statement that the survey was voluntary and anonymous.

2.7. Statistical Analysis

The analysis was conducted using the Statistical Package for the Social Sciences (SPSS) version 26 (IBM, Armonk, NY, USA). ANOVA followed by the least significant difference (LSD) post hoc test, as well as the independent t-test, were used to compare students' scores and attendance. A p-value of ≤0.05 was considered statistically significant.
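For readers without SPSS, an equivalent analysis can be approximated in Python; the sketch below is illustrative only (the score vectors are invented, not the study data) and approximates Fisher's LSD as unadjusted pairwise t-tests interpreted after a significant ANOVA:

from itertools import combinations
from scipy import stats

# Hypothetical post-test scores (%) for the three delivery modes
scores = {
    "live PICkLE":   [88, 91, 90, 92, 89, 93],
    "remote PICkLE": [92, 94, 93, 95, 91, 96],
    "iFEEL":         [94, 96, 95, 97, 93, 96],
}

# One-way ANOVA across the three groups
f_stat, p_anova = stats.f_oneway(*scores.values())
print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.3f}")

# LSD-style follow-up: unadjusted pairwise independent t-tests
if p_anova <= 0.05:
    for (name_a, a), (name_b, b) in combinations(scores.items(), 2):
        t, p = stats.ttest_ind(a, b)
        print(f"{name_a} vs. {name_b}: t = {t:.2f}, p = {p:.3f}")

# Independent t-test for a simple two-group comparison (e.g., two activities)
t, p = stats.ttest_ind(scores["remote PICkLE"], scores["iFEEL"])
print(f"remote PICkLE vs. iFEEL: t = {t:.2f}, p = {p:.3f}")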

3. Results

A total of 158 students were involved in the current study. One hundred and forty-nine studied the course live between 2018 and 2020, and nine students took the course remotely in the spring 2021 semester. In the current remote course offering, 88.9% of students were able to complete the e-lectures on time with a score above 80%. Furthermore, the average attendance was 97.8% for PICkLE and 95.6% for the iFEEL activity (Table 1). Student mean scores significantly increased from 16.4% (pre-test) to 93.5% (post-test) in the PICkLE activity and from 22.2% (pre-test) to 95.2% (post-test) in the iFEEL activity (p < 0.05). Interestingly, mean comprehensive exam scores increased from 83.8% (for PICkLE) to 89.2% (for iFEEL). Specifically, mean post-test scores were higher in the iFEEL activity (95.2%) compared to both remote PICkLE (93.5%, p = 0.658) and live PICkLE (90.2%, p = 0.08). When comparing attendance and student scores on the pre-tests, post-tests, and comprehensive exams, there was no significant difference between the two activities (Table 1).
A total of 93 students completed the surveys, representing an 85.3% response rate among the students surveyed (n = 109). In general, both models showed high student acceptance. Compared to remote PICkLE, student responses were overwhelmingly more positive for the iFEEL activity, as shown in Table 2 and Table 3.
The ANOVA and Levene's tests showed no significant differences (p > 0.05) in faculty members' perceptions across gender, academic rank, degree, and discipline. These findings confirm the homogeneity of the responses across these general characteristics. The Cronbach's alpha (α) values for the different student and faculty survey sections ranged from 0.783 to 0.912, indicating high internal consistency and reliability of the utilized tools.
Regarding the quality of the interactive e-lectures, 90–95% of students agreed/strongly agreed with the clarity, quality, and benefits obtained from the audio explanation, educational videos, animated illustrations, and embedded questions (Table 2).
Interestingly, 94.1%, 62.5%, and 87.5% of the participants recommended that future offerings of the course incorporate the live PICkLE, remote PICkLE, and iFEEL activities, respectively. Regarding the live PICkLE activity, 88% and 70% confirmed its usefulness and their ability to remember the basic information studied in this activity, respectively. Similarly, 100% and 87.5% of the students felt that the remote PICkLE and iFEEL activities, respectively, helped them to understand the lecture material (Table 3). Furthermore, 87.5% reported that the iFEEL activity was useful/highly useful in motivating them to review the lecture materials before the class and to focus during the lecture (Table 3), while 87.5% responded that the remote PICkLE activity was useful/highly useful in improving their interpersonal skills.
Students were also asked to compare the two models according to their preferences. When asked which model they preferred in terms of understanding the information, 75% preferred the iFEEL activity, 12.5% preferred the PICkLE activity, and 12.5% felt that both activities were equivalent. Interestingly, student preference shifted further towards iFEEL when students were asked about their preferences in terms of visualizing the equipment and its methods of operation: 87.5% preferred iFEEL and 12.5% felt both were the same (Table 3).
Regarding the post-tests, which were conducted in both learning models, 100% and 75% of students reported that the post-tests were useful/very useful in helping them understand key information and encouraging them to stay focused during the lecture, respectively (Table 3).
With respect to the faculty survey, thirty-two faculty members completed the survey; 62.5% were male, 69% were assistant professors, and 87.5% were PhD holders (Table 4). Although the response rate for the faculty survey was very low, the responses represented nearly 50% of the Saudi universities with pharmacy/healthcare colleges.
Similar to the student survey, the faculty survey also showed high acceptance of the interactive e-lectures model. Regarding the quality of the interactive e-lectures, 78–94% of faculty members agreed/strongly agreed with the clarity, quality, and benefits obtained from the audio explanation, educational videos, animated illustrations, and embedded questions (Table 2).
Regarding using e-lectures as a model for flipped learning, 90% and 87% of faculty members reported that this model is helpful/very helpful in motivating students to preview the lecture material before attending the lecture and focus during the lecture, respectively (Table 5). In addition, 94% of faculty members perceived the “educational videos and animated illustrations” to be useful/very useful in understanding the lecture material (especially the material that requires practical skills or visual illustrations) (Table 5).
Interestingly, only 14% of faculty members regularly used e-lectures before the COVID-19 pandemic, 57% used them during the pandemic, and 29% never used them. However, 84% supported/strongly supported the integration of this type of interactive e-lecture into the courses they teach (Table 5). To explore students' and faculty members' perceptions, a qualitative content analysis was employed to analyze the open-ended responses. After revision and refinement, the process yielded the most important themes captured from students' and faculty members' responses. The results are described in Table 6, Table 7 and Table 8.

4. Discussion

In the current study, the PICkLE and iFEEL activities were evaluated for their potential to enhance student engagement prior to and within the lecture. iFEEL embodied the qualities of a high-impact e-learning activity during the COVID-19 pandemic [22]. Students and faculty perceived that the content was presented in an effective and user-friendly format that engaged the students. Student learning was supported by synchronous in-class activities, and the PICkLE activity could always be used as a contingency plan if technical difficulties arose.
In live lectures, it is easy to facilitate group learning activities with no significant interruption from other groups. In online learning, however, implementation of the PICkLE activity faced some difficulties because each group of students had to enter a separate, unsupervised virtual meeting room to avoid interruption from their colleagues in the other groups. After completing the in-class group test, students had to return to the instructor's main virtual room to complete the discussion. This challenge, along with occasional intermittent internet connection problems, hindered the smooth flow of the synchronous online discussions. Wong and Kan reported that maintaining effective interaction among students and with the instructors in small online group learning can be challenging [9]. Duszenko et al. reported that some students faced technical difficulties and poor interaction between group members during online group work. Accordingly, 54% of students opposed the notion that online teamwork helped them understand the topic [11]. Furthermore, Kalmar et al. found that students reported lowered productivity in group work and difficulty in receiving feedback from peers after switching to online education [43]. This could explain the relatively lower student preference for the remote PICkLE activity versus live PICkLE in future course offerings (Table 3). On the other hand, the iFEEL activity involved interactive questions embedded in the e-lecture, which strengthened the students' comprehension of the lecture material and helped them to take the in-class lecture test individually. This is supported by the instructor's observation that the students required less time to take the in-class test and/or discuss their results. However, both techniques had a positive influence on student attendance, participation, and exam scores. These encouraging results led to the incorporation of the developed e-lectures in the subsequent offering of the Pharmaceutical Quality Control and GMP course.
The high attendance rate in both activities indirectly reflects students' acceptance of both methods in the current emergency distance-learning situation. Moreover, the high rate of student participation in the iFEEL activity reflects the high student acceptance and the lack of serious technical problems faced by students while reviewing the e-lectures. Rodrigues et al. also noted that e-learning was associated with higher student attendance in a systematic review conducted prior to COVID-19 [44]; however, Abbasi et al. reported that healthcare students expressed difficulties with course schedules during the pandemic [45]. These obstacles may have been more profound among distance-learning-naïve students than among those who were more technologically literate [46,47].
An improvement in students' marks is generally an indication that the students achieved the course-related learning outcomes. In this respect, post-test student scores increased 5.7-fold and 4.2-fold (p < 0.05) compared with pre-tests in PICkLE and iFEEL, respectively. These findings are in agreement with a recent study that evaluated an e-learning course with computer-assisted simulation materials, which showed a significant increase in the mean post-test score compared to that of the pre-test [36]. These findings imply that both iFEEL and PICkLE were successful in improving students' comprehension and learning outcomes.
The post-tests were very useful in helping students to understand the most important information and motivating them to focus during the lecture (Table 3). Similarly, several previous studies showed positive educational outcomes of "end-of-class" (post-lecture) tests, including significantly increased average exam scores [48], greater student introspection related to their personal comprehension [49,50], and improved student course satisfaction [51].
In our study, iFEEL showed a non-significant increase in mean comprehensive exam and post-test scores compared to PICkLE. These results parallel the findings of Wilson et al., who showed a non-significant increase in student performance when comparing a flipped classroom activity with traditional teaching methods [32]. The high percentages and similarities in student attendance and scores indicate that the PICkLE and iFEEL activities are non-inferior to each other and can be an alternative to standard didactic lectures as part of either live or distance learning courses. A recent study also concluded that a non-significant increase in student exam scores when comparing two different active learning activities is a positive indication that both strategies are equally effective in enhancing student learning and understanding as an alternative to traditional lectures [50]. Moreover, mean post-test scores were non-significantly higher in the iFEEL activity compared to pre-pandemic live PICkLE. Although no significant difference was found, these findings confirm that the adopted e-course was at least as effective for content knowledge as our pre-COVID face-to-face course. Duszenko et al. reported that none of the e-learning instructed students performed worse than their face-to-face instructed colleagues [11].
A recent study that implemented similar interactive e-lectures reported some technical issues such as tone quality, difficulties playing lectures (including using the full-screen mode on some equipment and incompatibility with PC operating system), or technical abilities of students (e.g., installing relevant software) [11]. This highlights the importance of evaluating the quality attributes of e-lectures periodically. In our study, the majority of students agreed/strongly agreed that the audio, videos, animated illustrations, and embedded questions were clear and of good quality. No student had negative comments about the quality of any element of the lecture and none reported any technical concerns.
Student perceptions of the iFEEL activity were more favorable than those of the PICkLE activity. Most students (78.5%) preferred iFEEL over PICkLE for future course offerings, particularly in terms of visualizing the equipment and its operation. These data represent a significant improvement in students' acceptance of flipped learning compared to previous studies [52,53]. Wilson et al. found that only 5.7% of students preferred the flipped method alone [32]. Cho and Kim reported that students showed inferior self-directed learning readiness and instructor–student interaction in online flipped learning compared to their counterparts who received face-to-face flipped learning [10].
Most students affirmed the benefit of PICkLE for understanding the lecture and developing their personal skills. Similarly, Bouw et al. reported that students had positive perceptions toward student-led team-based learning activities and considered them as an effective method of peer-to-peer teaching [54]. Valler-Jones concluded that students stated that peer-led activities enhance many essential skills such as communication, teamwork, and leadership [55].
Similar to the students’ feedback, most faculty members considered the iFEEL activity to be of good quality and believed that it had a high potential to encourage student engagement with the lecture material both before and within the class. The increased percentage of faculty members that utilized interactive e-lectures during the COVID-19 pandemic may indicate the possible future acceptance of such a model in their regular teaching practices.
During the COVID-19 pandemic and the resulting emergency distance learning, virtual cloud lectures were plagued by technical issues such as the intermittent internet connections faced by some students [45,56,57]. Alternatively, some faculty chose to send their recorded videos or PowerPoint slides to the students [1,58]. A common drawback of these methods is that they are non-interactive, primarily instructor-focused, and lack the instructor's supervision and/or control of how much time students spend reviewing the lecture materials. Even with the Blackboard feedback feature to monitor student access to the posted content, some students might simply play the video without paying attention to the lecture explanation. These frustrations were echoed in many published studies conducted during the pandemic. Healthcare students preferred traditional lectures, citing the lack of application of clinical skills, minimal opportunities for interaction, poor delivery methods of technologically naïve faculty members, vague or inadequate policies for online classes, exams, and grade distribution, limitations on exam times, and difficulty concentrating [45,46,57,59,60,61,62,63,64,65]. Recommendations for improvement included combining traditional teaching methods and blended learning, more interaction during lectures, and better-designed e-courses [45,62,66].
In our study, an attempt was made to overcome these obstacles through the iFEEL activity. The interactive e-lectures were restricted so that students could not skip a slide without spending the pre-determined time on it and/or interacting with the required slide items. In addition, several embedded questions were distributed throughout the e-lecture slides. Student participation was counted only if a minimum score of 80% was achieved by the specified deadline. Together, these factors could have enhanced students' focus during lecture review, their step-by-step comprehension, and their interaction with the lecture materials. During the lecture period, students had to complete an individual open-book exam followed by a debriefing session with the instructor. iFEEL has several additional advantages, including the use of technology to help students visualize the lecture material prior to the lecture and the ability to partially overcome internet limitations by decreasing the time spent on live streaming of the lecture explanation and focusing more on discussion and activities. Moreover, this method allows students to study the lecture material at a suitable time based on their daily schedule. Pilkington and Hanif reported that asynchronous sessions were more equitable than synchronous ones because students with difficult and challenging home/learning environments (such as disruptions at home, limited access to devices, poor internet, etc.) were minimally disadvantaged [1]. Duszenko et al. reported that students valued the greater flexibility, time efficiency, and student-centered format (particularly for flipped classrooms) offered by an online learning environment [11]. These findings indicate that iFEEL has great potential for enhancing student learning outcomes and promoting an increased level of faculty involvement. Förster et al. reported many advantages of e-learning mentioned by lecturers, and, interestingly, all were in agreement with the views of the post-graduate medical trainees.
The advantages included flexibility and compatibility with each individual's lifestyle, as well as cost savings and the elimination of hardships associated with commuting [63]. However, it should be noted that previous research also suggested that a supportive learning environment plays a key role in students' successful adaptation to online education [4].
Several limitations should be discussed. Due to the limited number of current course participants, the results of the study may not be generalizable. Future studies should include larger samples, multiple class cohorts, or students from multiple universities. In this study, the designed e-lectures required a significant amount of effort and time to prepare. This might hinder faculty members' acceptance of applying such interactive e-lectures in their courses. Future steps should involve integration with the information technology unit in order to facilitate faster design of the e-lectures.
We recommend that future studies be designed to assess the long-term effects of iFEEL on knowledge retention and to define the course types that are best suited for this activity. Furthermore, studying the benefits and burdens for faculty is necessary in order to facilitate the implementation of such e-learning activities. This study provides another active learning resource that can be incorporated into distance as well as blended learning, during and after the COVID-19 pandemic.

5. Conclusions

Educators must continue to strive to ensure that student comprehension and retention remain paramount. iFEEL represents a novel model of remote flipped learning and shows promising potential to be incorporated into online learning activities. The current model can be applied in healthcare courses and other disciplines that require visual explanation and/or practical skills. Both faculty and students held positive opinions about the iFEEL activity. Furthermore, iFEEL post-test student scores significantly improved when compared to pre-test results, and students preferred this activity over the PICkLE activity. In addition, the significant improvement from pre-test to post-test scores and the non-significant differences between the iFEEL and PICkLE activities indicate that both activities can potentially be incorporated into distance learning courses. Developing engaging distance-based learning techniques, such as iFEEL, is essential for current and future learning environments.

Author Contributions

Conceptualization, A.A.S.; formal analysis, A.A.S.; investigation, A.A.S., Z.A., A.Y.S. and O.Y.; methodology, Z.A.; supervision, A.A.S.; visualization, A.A.S. and A.Y.S.; writing—original draft, A.A.S., A.Y.S. and O.Y.; writing—review and editing, I.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Deanship of Scientific Research at King Saud University, research group no. RG-1441-414. The APC was funded by the Deanship of Scientific Research at King Saud University, research group no. RG-1441-414.

Institutional Review Board Statement

The protocol for this research was approved by the Institutional Review Board of the King Saud University Medical College (Research Project No. E-21-5718, reviewed and approved on 10 June 2021).

Data Availability Statement

The data that support the findings of this study are available from the corresponding author, [AAS], upon reasonable request.

Acknowledgments

The authors extend their appreciation to the Deanship of Scientific Research at King Saud University for funding this work through research group no. RG-1441-414.

Conflicts of Interest

The authors report no conflicts of interest.

References

1. Pilkington, L.I.; Hanif, M. An account of strategies and innovations for teaching chemistry during the COVID-19 pandemic. Biochem. Mol. Biol. Educ. 2021, 49, 320–322.
2. Ballad, C.A.C.; Labrague, L.J.; Cayaban, A.R.R.; Turingan, O.M.; Al Balushi, S.M. Self-directed learning readiness and learning styles among Omani nursing students: Implications for online learning during the COVID-19 pandemic. Nurs. Forum 2021, 57, 94–103.
3. Khtere, A.R.; Yousef, A.M.F. The Professionalism of Online Teaching in Arab Universities. Educ. Technol. Soc. 2021, 24, 1–12.
4. Attarabeen, O.F.; Gresham-Dolby, C.; Broedel-Zaugg, K. Pharmacy student stress with transition to online education during the COVID-19 pandemic. Curr. Pharm. Teach. Learn. 2021, 13, 928–934.
5. Smith, E.; Boscak, A. A virtual emergency: Learning lessons from remote medical student education during the COVID-19 pandemic. Emerg. Radiol. 2021, 28, 445–452.
6. Haftador, A.M.; Shirazi, F.; Mohebbi, Z. Online class or flipped-jigsaw learning? Which one promotes academic motivation during the COVID-19 pandemic? BMC Med. Educ. 2021, 21.
7. Ketterer, B.; Childers, J.W.; Arnold, R.M. An Innovative Application of Online Learning for Hospice Education in Medicine Trainees. J. Palliat. Med. 2021, 24, 919–923.
8. Morgado, M.; Mendes, J.J.; Proença, L. Online Problem-Based Learning in Clinical Dental Education: Students' Self-Perception and Motivation. Healthcare 2021, 9, 420.
9. Wong, F.M.F.; Kan, C.W.Y. Online Problem-Based Learning Intervention on Self-Directed Learning and Problem-Solving through Group Work: A Waitlist Controlled Trial. Int. J. Environ. Res. Public Health 2022, 19, 720.
10. Cho, M.K.; Kim, M.Y. Factors Influencing SDL Readiness and Self-Esteem in a Clinical Adult Nursing Practicum after Flipped Learning Education: Comparison of the Contact and Untact Models. Int. J. Environ. Res. Public Health 2021, 18, 1–12.
11. Duszenko, M.; Fröhlich, N.; Kaupp, A.; Garaschuk, O. All-digital training course in neurophysiology: Lessons learned from the COVID-19 pandemic. BMC Med. Educ. 2022, 22.
12. Belfi, L.M.; Bartolotta, R.J.; Giambrone, A.E.; Davi, C.; Min, R.J. "Flipping" the Introductory Clerkship in Radiology: Impact on Medical Student Performance and Perceptions. Acad. Radiol. 2015, 22, 794–801.
13. Co, M.; Chung, P.H.Y.; Chu, K.M. Online teaching of basic surgical skills to medical students during the COVID-19 pandemic: A case-control study. Surg. Today 2021, 51, 1404–1409.
14. Co, M.; Chu, K.M. Distant surgical teaching during COVID-19—A pilot study on final year medical students. Surg. Pract. 2020, 24, 105–109.
15. Suppan, M.; Gartner, B.; Golay, E.; Stuby, L.; White, M.; Cottet, P.; Abbas, M.; Iten, A.; Harbarth, S.; Suppan, L. Teaching Adequate Prehospital Use of Personal Protective Equipment During the COVID-19 Pandemic: Development of a Gamified e-Learning Module. JMIR Serious Games 2020, 8, e20173.
16. Chou, P.N. Effect of students' self-directed learning abilities on online learning outcomes: Two exploratory experiments in electronic engineering. Int. J. Humanit. Soc. Sci. 2012, 2, 172–179.
17. Alassaf, P.; Szalay, Z.G. Transformation toward e-learning: Experience from the sudden shift to e-courses at COVID-19 time in Central European countries; students' satisfaction perspective. Stud. Mundi Econ. 2020, 7.
18. Gonzalez, T.; De la Rubia, M.A.; Hincz, K.P.; Comas-Lopez, M.; Subirats, L.; Fort, S.; Sacha, G.M. Influence of COVID-19 confinement on students' performance in higher education. PLoS ONE 2020, 15, e0239490.
19. Khalil, R.; Mansour, A.E.; Fadda, W.A.; Almisnid, K.; Aldamegh, M.; Al-Nafeesah, A.; Alkhalifah, A.; Al-Wutayd, O. The sudden transition to synchronized online learning during the COVID-19 pandemic in Saudi Arabia: A qualitative study exploring medical students' perspectives. BMC Med. Educ. 2020, 20.
20. Dhawan, S. Online Learning: A Panacea in the Time of COVID-19 Crisis. J. Educ. Technol. Syst. 2020, 49, 5–22.
21. Schlenz, M.A.; Schmidt, A.; Wöstmann, B.; Krämer, N.; Schulz-Weidner, N. Students' and lecturers' perspective on the implementation of online learning in dental education due to SARS-CoV-2 (COVID-19): A cross-sectional study. BMC Med. Educ. 2020, 20, 1–7.
22. Bao, W. COVID-19 and online teaching in higher education: A case study of Peking University. Hum. Behav. Emerg. Technol. 2020, 2, 113.
23. Rogowska, A.M.; Kuśnierz, C.; Bokszczanin, A. Examining Anxiety, Life Satisfaction, General Health, Stress and Coping Styles During COVID-19 Pandemic in Polish Sample of University Students. Psychol. Res. Behav. Manag. 2020, 13, 797–811.
24. Melgaard, J.; Monir, R.; Lasrado, L.A.; Fagerstrøm, A. Academic Procrastination and Online Learning during the COVID-19 Pandemic. Procedia Comput. Sci. 2022, 196, 117–124.
25. Patricia Aguilera-Hermida, A. College students' use and acceptance of emergency online learning due to COVID-19. Int. J. Educ. Res. Open 2020, 1, 100011.
26. Pelikan, E.R.; Lüftenegger, M.; Holzer, J.; Korlat, S.; Spiel, C.; Schober, B. Learning during COVID-19: The role of self-regulated learning, motivation, and procrastination for perceived competence. Zeitschrift Fur Erziehungswiss. 2021, 24, 1.
27. Hong, J.C.; Lee, Y.F.; Ye, J.H. Procrastination predicts online self-regulated learning and online learning ineffectiveness during the coronavirus lockdown. Pers. Individ. Dif. 2021, 174, 110673.
28. Rasheed, R.A.; Kamsin, A.; Abdullah, N.A. Challenges in the online component of blended learning: A systematic review. Comput. Educ. 2020, 144, 103701.
29. Adedoyin, O.B.; Soykan, E. COVID-19 pandemic and online learning: The challenges and opportunities. Interact. Learn. Environ. 2020, 1–13.
30. Favale, T.; Soro, F.; Trevisan, M.; Drago, I.; Mellia, M. Campus traffic and e-Learning during COVID-19 pandemic. Comput. Networks 2020, 176.
31. Rapanta, C.; Botturi, L.; Goodyear, P.; Guàrdia, L.; Koole, M. Online University Teaching During and After the COVID-19 Crisis: Refocusing Teacher Presence and Learning Activity. Postdigital Sci. Educ. 2020, 2, 923–945.
32. Wilson, J.A.; Waghel, R.C.; Dinkins, M.M. Flipped classroom versus a didactic method with active learning in a modified team-based learning self-care pharmacotherapy course. Curr. Pharm. Teach. Learn. 2019, 11, 1287–1295.
33. Schoonenboom, J.; Johnson, R.B. How to Construct a Mixed Methods Research Design. Kolner Z. Soz. Sozpsychol. 2017, 69, 107.
34. Branch, R.M. Instructional design: The ADDIE approach. Instr. Des. ADDIE Approach 2010, 1–203.
35. Lull, M.E.; Mathews, J.L. Online Self-testing Resources Prepared by Peer Tutors as a Formative Assessment Tool in Pharmacology Courses. Am. J. Pharm. Educ. 2016, 80, 124.
36. Ohsato, A.; Seki, N.; Nguyen, T.T.T.; Moross, J.; Sunaga, M.; Kabasawa, Y.; Kinoshita, A.; Morio, I. Evaluating e-learning on an international scale: An audit of computer simulation learning materials in the field of dentistry. J. Dent. Sci. 2022, 17, 535–544.
37. Aithal, A.; Aithal, P.S. Development and Validation of Survey Questionnaire & Experimental Data—A Systematical Review-based Statistical Approach. SSRN Electron. J. 2020.
38. Hurtado-Parrado, C.; Gantiva, C.; Gómez-A, A.; Cuenya, L.; Ortega, L.; Rico, J.L. Editorial: Research on Emotion and Learning: Contributions from Latin America. Front. Psychol. 2020, 11, 11.
39. Devkaran, S.; O'Farrell, P.N.; Ellahham, S.; Arcangel, R. Impact of repeated hospital accreditation surveys on quality and reliability, an 8-year interrupted time series analysis. BMJ Open 2019, 9, 1V.
40. O'Neill, M.E.; Mathews, K.L. Levene tests of homogeneity of variance for general block and treatment designs. Biometrics 2002, 58, 216–224.
41. Cho, H.J.; Zhao, K.; Lee, C.R.; Runshe, D.; Krousgrill, C. Active learning through flipped classroom in mechanical engineering: Improving students' perception of learning and performance. Int. J. STEM Educ. 2021, 8.
42. Mahmoud, M.A.; Islam, M.A.; Ahmed, M.; Bashir, R.; Ibrahim, R.; Al-Nemiri, S.; Babiker, E.; Mutasim, N.; Alolayan, S.O.; Al Thagfan, S.; et al. Validation of the Arabic version of the general medication adherence scale (GMAS) in Sudanese patients with diabetes mellitus. Risk Manag. Healthc. Policy 2021, 14, 4235–4241.
43. Kalmar, E.; Aarts, T.; Bosman, E.; Ford, C.; de Kluijver, L.; Beets, J.; Veldkamp, L.; Timmers, P.; Besseling, D.; Koopman, J.; et al. The COVID-19 paradox of online collaborative education: When you cannot physically meet, you need more social interactions. Heliyon 2022, 8, e08823.
44. Rodrigues, H.; Almeida, F.; Figueiredo, V.; Lopes, S.L. Tracking e-learning through published papers: A systematic review. Comput. Educ. 2019, 136, 87–98.
45. Abbasi, M.S.; Ahmed, N.; Sajjad, B.; Alshahrani, A.; Saeed, S.; Sarfaraz, S.; Alhamdan, R.S.; Vohra, F.; Abduljabbar, T. E-Learning perception and satisfaction among health sciences students amid the COVID-19 pandemic. Work 2020, 67, 549–556.
46. Sindiani, A.M.; Obeidat, N.; Alshdaifat, E.; Elsalem, L.; Alwani, M.M.; Rawashdeh, H.; Fares, A.S.; Alalawne, T.; Tawalbeh, L.I. Distance education during the COVID-19 outbreak: A cross-sectional study among medical students in North of Jordan. Ann. Med. Surg. 2020, 59, 186–194.
47. Freeze, R.D.; Alshare, K.A.; Lane, P.L.; Wen, H.J. IS success model in e-learning context based on students' perceptions. J. Inf. Syst. Educ. 2010, 21, 173–184.
48. Chan, P.; Kim, S.; Garavalia, L.; Wang, J. Implementing a strategy for promoting long-term meaningful learning in a pharmacokinetics course. Curr. Pharm. Teach. Learn. 2018, 10, 1048–1054.
49. Vinall, R.; Kreys, E. Use of End-of-Class Quizzes to Promote Pharmacy Student Self-Reflection, Motivate Students to Improve Study Habits, and to Improve Performance on Summative Examinations. Pharmacy 2020, 8, 167.
50. Shahba, A.A.; Sales, I. Design Your Exam (DYE): A Novel Active Learning Technique to Increase Pharmacy Student Engagement in the Learning Process. Saudi Pharm. J. 2021, 29, 1323–1328.
51. Hennig, S.; Staatz, C.E.; Bond, J.A.; Leung, D.; Singleton, J. Quizzing for success: Evaluation of the impact of feedback quizzes on the experiences and academic performance of undergraduate students in two clinical pharmacokinetics courses. Curr. Pharm. Teach. Learn. 2019, 11, 742–749.
52. Khanova, J.; McLaughlin, J.E.; Rhoney, D.H.; Roth, M.T.; Harris, S. Student Perceptions of a Flipped Pharmacotherapy Course. Am. J. Pharm. Educ. 2015, 79, 140.
53. Giuliano, C.A.; Moser, L.R. Evaluation of a Flipped Drug Literature Evaluation Course. Am. J. Pharm. Educ. 2016, 80, 66.
54. Bouw, J.W.; Gupta, V.; Hincapie, A.L. Assessment of students' satisfaction with a student-led team-based learning course. J. Educ. Eval. Health Prof. 2015, 12, 23.
55. Valler-Jones, T. The impact of peer-led simulations on student nurses. Br. J. Nurs. 2014, 23, 321–326.
56. Camargo, C.P.; Tempski, P.Z.; Busnardo, F.F.; Martins, M.d.A.; Gemperli, R. Online learning and COVID-19: A meta-synthesis analysis. Clinics 2020, 75, e2286.
57. Gismalla, M.D.-A.; Mohamed, M.S.; Ibrahim, O.S.O.; Elhassan, M.M.A.; Mohamed, M.N. Medical students' perception towards E-learning during COVID 19 pandemic in a high burden developing country. BMC Med. Educ. 2021, 21, 377.
58. Gewin, V. Five tips for moving teaching online as COVID-19 takes hold. Nature 2020, 580, 295–296.
59. Alsoufi, A.; Alsuyihili, A.; Msherghi, A.; Elhadi, A.; Atiyah, H.; Ashini, A.; Ashwieb, A.; Ghula, M.; Ben Hasan, H.; Abudabuos, S.; et al. Impact of the COVID-19 pandemic on medical education: Medical students' knowledge, attitudes, and practices regarding electronic learning. PLoS ONE 2020, 15, e0242905.
60. Ramos-Morcillo, A.J.; Leal-Costa, C.; Moral-García, J.E.; Ruzafa-Martínez, M. Experiences of Nursing Students during the Abrupt Change from Face-to-Face to e-Learning Education during the First Month of Confinement Due to COVID-19 in Spain. Int. J. Environ. Res. Public Health 2020, 17, 5519.
  61. Sarwar, H.; Akhtar, H.; Naeem, M.M.; Khan, J.A.; Waraich, K.; Shabbir, S.; Hasan, A.; Khurshid, Z. Self-Reported Effectiveness of e-Learning Classes during COVID-19 Pandemic: A Nation-Wide Survey of Pakistani Undergraduate Dentistry Students. Eur. J. Dent. 2020, 14, S34–S43. [Google Scholar] [CrossRef] [PubMed]
  62. Al Zahrani, E.M.; Al Naam, Y.A.; AlRabeeah, S.M.; Aldossary, D.N.; Al-Jamea, L.H.; Woodman, A.; Shawaheen, M.; Altiti, O.; Quiambao, J.V.; Arulanantham, Z.J.; et al. E- Learning experience of the medical profession’s college students during COVID-19 pandemic in Saudi Arabia. BMC Med. Educ. 2021, 21, 443. [Google Scholar] [CrossRef]
  63. Förster, C.; Eismann-Schweimler, J.; Stengel, S.; Bischoff, M.; Fuchs, M.; Graf von Luckner, A.; Ledig, T.; Barzel, A.; Maun, A.; Joos, S.; et al. Opportunities and challenges of e-learning in vocational training in General Practice—A project report about implementing digital formats in the KWBW-Verbundweiterbildung(plus). GMS J. Med. Educ. 2020, 37, Doc97. [Google Scholar] [CrossRef] [PubMed]
  64. AlQhtani, A.; AlSwedan, N.; Almulhim, A.; Aladwan, R.; Alessa, Y.; AlQhtani, K.; Albogami, M.; Altwairqi, K.; Alotaibi, F.; AlHadlaq, A.; et al. Online versus classroom teaching for medical students during COVID-19: Measuring effectiveness and satisfaction. BMC Med. Educ. 2021, 21, 452. [Google Scholar] [CrossRef]
  65. Bhattarai, B.; Gupta, S.; Dahal, S.; Thapa, A.; Bhandari, P. Perception of Online Lectures among Students of a Medical College in Kathmandu: A Descriptive Cross-sectional Study. JNMA J. Nepal Med. Assoc. 2021, 59, 234–238. [Google Scholar] [CrossRef]
  66. Ibrahim, N.K.; Al Raddadi, R.; AlDarmasi, M.; Al Ghamdi, A.; Gaddoury, M.; AlBar, H.M.; Ramadan, I.K. Medical students’ acceptance and perceptions of e-learning during the COVID-19 closure time in King Abdulaziz University, Jeddah. J. Infect. Public Health 2021, 14, 17–23. [Google Scholar] [CrossRef]
Figure 1. Graphical representation of the execution steps of the (A) PICkLE and (B) iFEEL activities. * The students were informed that the lecture material was intended to be studied during the virtual class session.
Figure 2. Screenshots of the interactive e-lecture (iFEEL) slides.
Table 1. Students’ average attendance, participation, and exam score.
Table 1. Students’ average attendance, participation, and exam score.
Teaching TechniqueAverage Attendance
(%) *
Mean Pre-Test Score
(%)
Mean Post-Test Score
(%)
Mean Comprehensive Exam Score (%)
Live Paper-based learning method (PICkLE), n = 149NANA90.2 ± 8.26NA
Remote Paper-based learning method (PICKLE), n = 997.8 ± 5.016.4 ± 17.193.5 ± 8.483.8 ± 18.7
Remote Interactive-electronic learning method (iFEEL), n = 995.6 ± 9.922.2 ± 16.395.2 ± 7.789.2 ± 9.2
Statistical testIndependent t-testIndependent t-testANOVA followed by LSDIndependent t-test
p-value0.6670.474iFEEl vs. live PICkLE, (0.08), iFEEl vs. remote PICkLE (0.658) 0.449
* Data are presented as percentage of total class students (mean ± SD).
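For readers who wish to run the same kind of comparison on their own cohorts, the sketch below illustrates the two procedures named in Table 1: an independent t-test for two-group comparisons and a one-way ANOVA followed by pairwise comparisons for the three delivery modes. The score arrays are hypothetical placeholders generated from the table's means and standard deviations (not the study's raw per-student data), and uncorrected pairwise t-tests are used only as a simple stand-in for Fisher's LSD post hoc test.

```python
# Minimal sketch, assuming illustrative score arrays drawn from the Table 1 means/SDs.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
live_picle   = rng.normal(90.2, 8.26, 149)  # hypothetical post-test scores, live PICkLE
remote_picle = rng.normal(93.5, 8.4, 9)     # hypothetical post-test scores, remote PICkLE
ifeel        = rng.normal(95.2, 7.7, 9)     # hypothetical post-test scores, iFEEL

# Two-group comparison (e.g., comprehensive exam scores): independent t-test
t_stat, p_two_group = stats.ttest_ind(remote_picle, ifeel)

# Three-group post-test comparison: one-way ANOVA, then uncorrected pairwise
# t-tests as an approximation of the LSD post hoc comparisons
f_stat, p_anova = stats.f_oneway(live_picle, remote_picle, ifeel)
_, p_ifeel_vs_live   = stats.ttest_ind(ifeel, live_picle)
_, p_ifeel_vs_remote = stats.ttest_ind(ifeel, remote_picle)

print(p_two_group, p_anova, p_ifeel_vs_live, p_ifeel_vs_remote)
```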
Table 2. Students’ and faculty members’ perceptions on the five quality attributes of the interactive e-lecture.
Table 2. Students’ and faculty members’ perceptions on the five quality attributes of the interactive e-lecture.
Question **Students (Responses = 40),
Faculty (Responses = 32)
I Strongly Support (%) *I Support (%) *Neutral (%) *I object (%) *I Strongly Object (%) *MeanStandard Deviation
The lecture material was clear and easy to understandStudents72.5%22.5%5%004.680.57
Faculty31.3%56.3%12.5%004.190.64
The audio quality and illustrations were clearStudents65.0%27.5%7.5%004.580.64
Faculty37.5%56.3%6.3%004.310.59
The lecture animation helped me to understand the lecture material Students65.0%25.0%10.0%004.550.68
Faculty25.0%53.1%18.8%004.000.76
The video quality was goodStudents72.5%22.5%5%004.680.57
Faculty34.4%50.0%12.5%004.160.77
The embedded questions were interactive and helped me to understand the lectureStudents72.5%17.5%10%004.630.67
Faculty31.3%50.0%15.6%004.090.78
* Data are presented as the percentage of total class students’ and faculty responses. ** Questions were rephrased to combine both students and faculty members’ responses.
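The means and standard deviations in Tables 2, 3, and 5 follow from scoring the five response options from 5 (strongly support) down to 1 (strongly object). As a check on the first Table 2 item, the short sketch below back-calculates hypothetical response counts from the reported percentages (29, 9, and 2 of 40 students) and recovers the reported mean of 4.68 and SD of 0.57; these counts are inferred for illustration only, not taken from the authors' raw response data.

```python
# Minimal sketch: recovering the reported Likert mean/SD for the first Table 2 item.
# Counts (29, 9, 2 of 40) are back-calculated from 72.5%, 22.5%, and 5% — an
# illustrative assumption, not the authors' raw data.
import statistics

responses = [5] * 29 + [4] * 9 + [3] * 2   # 5 = strongly support, 4 = support, 3 = neutral
mean = statistics.mean(responses)          # 4.675 -> reported as 4.68
sd = statistics.stdev(responses)           # ~0.57 (sample SD), matching the table
print(f"mean = {mean:.2f}, SD = {sd:.2f}")
```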
Table 3. Students’ perceptions about the live PICkLE, remote PICkLE, and iFEEL activity.
Table 3. Students’ perceptions about the live PICkLE, remote PICkLE, and iFEEL activity.
QuestionLearning MethodI Strongly Support
(%) *
I Support
(%) *
Neutral
(%) *
I Object
(%) *
I Strongly Object
(%) *
MeanStandard Deviation
Do you support incorporating these learning models in future course offerings?Live PICkLE
(n = 84)
77.4%16.7%4.8%1.2%04.700.62
Remote PICkLE
(n = 8)
62.5%012.5%25%04.001.41
iFEEL
(n = 8)
75%12.5%012.5%04.501.07
QuestionLearning MethodExcellent
(%) *
Good
(%) *
Acceptable
(%) *
Weak
(%) *
Very Weak
(%) *
MeanStandard Deviation
How well do you remember the basic information you studied in the course? Live PICkLE
(n = 85)
12.9%57.6%24.7%2.4%2.4%3.760.80
How would you rate your understanding of the lecture material? Remote PICkLE
(n = 8)
62.5%37.5%0004.630.52
iFEEL
(n = 8)
75%12.5%12.5%004.630.74
QuestionLearning MethodVery
Useful
(%) *
Useful
(%) *
Neutral
(%) *
Not Useful
(%) *
Absolutely Useless
(%) *
MeanStandard Deviation
How useful was studying this course in groups? Live PICkLE
(n = 85)
57.6%30.6%8.2%1.2%2.4%4.400.88
How useful was this model in developing your personal skills (teamwork - the ability to negotiate and persuade–decision-making)? Remote
PICkLE
(n = 8)
50%37.5%12.5%004.380.74
How useful was the “e-lectures” in motivating you to review the scientific material before attending the lecture? iFEEL
(n = 8)
75%12.5%12.5%004.630.74
How useful was the “e-lectures” in motivating you to focus during the lecture?iFEEL
(n = 8)
75%12.5%12.5%004.630.74
QuestionRemote PICkLE *iFEEL *Both are Same *None of Them *
Which method do you prefer to use in teaching this course in terms of clarification and retention the information? (n = 8)12.5%75%12.5%0
Which method do you prefer to use in teaching this course in terms of visualizing the instruments and how they work? (n = 8)087.5%12.5%0
* Data are presented as the percentage of total class students’ responses (n = 8).
Table 4. Faculty demographics.

Characteristic | Distribution *
Gender | Male: 62.5%; Female: 37.5%
Academic rank | Professor: 3.1%; Associate Professor: 15.6%; Assistant Professor: 68.8%; Lecturer: 12.5%; Teaching Assistant: 0
Academic degree | Ph.D.: 87.5%; M.Sc.: 12.5%; Pharm.D.: 0; Bachelor: 0; M.D.: 0
* Data are presented as the percentage of total faculty responses (n = 32).
Table 5. Evaluation of using e-lectures as a model for flipped learning by faculty members.

Question | Very Helpful (%) * | Helpful (%) * | Neutral (%) * | Not Helpful (%) * | Not Completely Helpful (%) * | Mean | Standard Deviation
What is the expected effect of using this model in motivating the student to peruse the scientific material before attending the lecture? (n = 32) | 25.0 | 65.1 | 3.1 | 6.3 | 0 | 4.09 | 0.73
What is the expected effect of using this model in motivating the student to focus during the lecture? (n = 32) | 40.6 | 46.9 | 9.4 | 3.1 | 0 | 4.25 | 0.76

Question | Very Useful (%) * | Useful (%) * | Neutral (%) * | Not Useful (%) * | Not Completely Useful (%) * | Mean | Standard Deviation
How useful do you consider the "visual instructional videos and animated illustrations" in understanding the material (especially material that requires practical skills or visual illustration)? (n = 32) | 53.1 | 40.6 | 6.3 | 0 | 0 | 4.47 | 0.62

Question | I Strongly Support (%) * | I Support (%) * | Neutral (%) * | I Object (%) * | I Strongly Object (%) * | Mean | Standard Deviation
Do you support the integration of this type of interactive e-lecture in the courses you teach? (n = 32) | 37.5 | 46.9 | 9.4 | 3.1 | 3.1 | 4.13 | 0.94
* Data are presented as the percentage of total faculty responses.
Table 6. Advantages and disadvantages of paper-based in-class group learning (PICkLE) from the students' point of view.

Item | Initial Codes | Number of References | Main Theme
Advantages | Learning by extracting information and answering questions; learning ways to search for information | 2 | Self-directed learning
 | Deep understanding; easy access to the information; simplified in a way that does not exclude crucial information | 2 | Effective comprehension
 | Teamwork; decision-making; responsibility; peer learning | 2 | Enhancing interpersonal skills
Disadvantages | Time consuming; inappropriate class schedule | 2 | Time-related issues
 | Language difficulty; a lot of reading | 1 | Difficulty in comprehension
 | Not (physically) dealing with devices, considering that we have dealt with them in previous courses | 1 | Lack of hands-on training for practical aspects
Table 7. Advantages and disadvantages of interactive flipped e-learning activity (iFEEL) from students' points of view.

Item | Initial Codes | Number of References | Main Theme
Advantages | Concise, useful, and clear; not boring; explanation with examples, questions, and quick testing | 3 | Appropriate format/structure
 | Time flexibility; ability to re-access the lecture anytime | 1 | Flexible ways of learning
Disadvantages | Not (physically) dealing with devices, considering that we have dealt with them in previous courses | 1 | Lack of hands-on training for practical aspects
 | Not taking it seriously | 1 | Lack of dedication
Table 8. Advantages and disadvantages of interactive flipped e-learning activity (iFEEL) from the faculty members' point of view.

Item | Initial Codes | Number of References | Main Theme
Advantages | Current generation suitability; providing ways for different learning styles | 2 | Modern teaching/learning strategy
 | Clarity and simplicity; information flow; higher level of comprehension; visualization; good for the practical aspect of the course | 12 | Appropriate format/structure
 | Draws the student's attention to the lecture information; motivates students to interact with the lecture questions; increases fun during learning; motivates students to prepare for the lecture, focus during the lecture, and interact with the instructor and colleagues; allows more time for activities and exercises during the class; engaging | 9 | Engaging and interactive
 | Students can study/revise the lecture at their convenience | 3 | Flexibility
 | Student-centered and self-directed learning | 3 | Student-centered
 | Better time management for the students and the instructors | 2 | Time effective
Disadvantages | Preparation and execution are time consuming for the instructors; time consuming for the student | 5 | Time-related issues
 | Software costs; Internet issues; requires technical skills | 4 | Technical issues
 | Not applicable to all topics, particularly theory-extensive courses; the design is suitable for self-education and workshops rather than academic activities | 4 | Subject/topic issues
 | A monitoring system is required to check compliance | 4 | Compliance issues
 | Does not take into account individual differences among students; limits direct contact and live interaction with the instructor; completing the e-lecture does not mean that the student benefited from the lecture material | 5 | Poor comprehension and interaction
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
