Quality Assurance Framework for the Design and Delivery of Virtual, Real-Time Courses
Abstract
1. Introduction
1.1. Educational Pedagogies
1.1.1. Outcome-Based Education
1.1.2. Active Learning
1.2. Quality Assurance
2. Method Used to Design the Quality Assurance Framework
- Outcome-based education
- Active learning practices
- Virtual, real-time teaching and learning
3. Conceptual QA Framework
- An institutional “Information and Communication Technology Support” (ICTS) constituent that manages the Learning Management System (LMS) and provides support to educators during the design and delivery of courses.
- An institutional “Teaching and Learning Support” (TLS) constituent that trains educators in outcome-based teaching and active learning methods. Training of educators and students in the institution’s LMS is ideally managed jointly by ICTS and TLS, with TLS focusing on the active learning capabilities of the LMS. In addition, TLS trains educators in developing effective communication skills in an online environment.
- A “Course Management System” (CMS) constituent that guides the sequential actions of educators in the design, delivery, and direct assessment of outcome-based, virtual, real-time classes that incorporate active learning practices; this constituent also provides a system for the indirect assessment of the performance of educators (a data sketch of the CMS module/phase sequence appears after this list).
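The CMS constituent is elaborated in Section 4 as three modules, each with a fixed sequence of phases. Purely as an illustrative aid, and not as part of the published framework, the sketch below encodes that module/phase sequence as plain data so that the “sequential actions” the CMS guides could, for example, be tracked in an institutional tool; the representation and the helper function are hypothetical.

```python
from typing import Optional

# Module and phase titles follow Section 4; the data representation itself is illustrative.
CMS_WORKFLOW = {
    "Module 1 - Outcome-Based Lesson Preparation and Rehearsal": [
        "Design class material, learning methods, and direct assessments",
        "Rehearse script and timing",
        "Finalize files and upload to course LMS",
        "Prepare a contingency plan",
    ],
    "Module 2 - Online Lesson Delivery": [
        "Ensure delivery tools are operational",
        "Record class session",
        "Welcome students and conduct activities",
    ],
    "Module 3 - Indirect Assessment of Educators": [
        "Design rubrics for indirect assessments",
        "Deploy indirect assessments",
        "Generate action plans",
    ],
}

def next_phase(module: str, phases_completed: int) -> Optional[str]:
    """Return the next phase an educator should complete in a module, or None if the module is done."""
    phases = CMS_WORKFLOW[module]
    return phases[phases_completed] if phases_completed < len(phases) else None

print(next_phase("Module 2 - Online Lesson Delivery", 1))  # "Record class session"
```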
4. The Course Management System
4.1. Module 1—Outcome-Based Lesson Preparation and Rehearsal
4.1.1. Module 1/Phase 1: Design Class Material, Learning Methods, and Direct Assessments
Design Structure
Educator Preparation
- Learn about the capabilities of the institution’s LMS; this training is typically provided by the institution’s ICTS and familiarizes educators with the tools available for, and the limitations of, the virtual, real-time delivery mode of teaching.
- Learn about different active learning methods; typically, this is provided by the institution’s TLS. For the virtual, real-time mode of delivery, the focus should be on the active learning methods that can be implemented through the institution’s LMS; educators can be trained on these LMS features by TLS in collaboration with ICTS. Two active learning tools, and some ways they could translate from face-to-face to online modes of delivery, are shown as examples in Table 1.
- Learn about communication techniques that are effective in an online environment; this training may be provided by TLS.
Course Structure (Concurrently with Educator Preparation)
- Formulate the major topics that the course will cover. These need to be congruent with other courses in a curriculum/program.
- Formulate the Course Learning Outcomes (CLOs) for the course that are congruent with the course topics and align with student objectives (or student outcomes) of an educational program. TLS can be charged with training educators in formulating informative CLOs.
Course Material
- Design the course material, that is, the content of each class session/module. Active-learning activities are designed concurrently with, and are incorporated within, the material that is being taught.
- Design the “direct assessments” (for example, exams, quizzes, projects, presentations) concurrently with the course material; these direct assessments are linked to CLOs. There should be reciprocal links between ‘what and how’ material is delivered and ‘what and how’ the material is assessed: just as the taught material dictates the assessments, the assessments should inform the material that is taught. Such a system links taught material to CLOs, since the attainment of CLOs is primarily measured from direct assessments (a computational sketch of CLO attainment from graded items appears below).
- As the course material is being prepared, educators may need to refine their “Course Structure” and/or undergo further “Educator Preparation”.
Advantages of administering direct assessments online include the following:
- Educators can more easily design more frequent assessments, which provide more recurring feedback on student performance; frequent assessment also imposes some regulation on students’ study habits [60].
- Students receive immediate feedback on automatically graded direct assessments, for example, those that rely on choice formats (multiple choice or true/false).
- Students may gain some flexibility in the timeframe for completing some assessments.
- Students may experience reduced anxiety due to the change of venue from a traditional classroom [61].
Online direct assessments also carry risks:
- Educators may rely more heavily on choice formats than on other forms, such as short answers and active drawing, that could be more reliable indicators of CLO acquisition.
- Academic dishonesty may become more prevalent [66].
- Mechanisms designed to lessen the likelihood of academic dishonesty may infringe on privacy.
- There may be a disconnect between test scores and cognitive engagement [67].
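Since CLO attainment is measured primarily from direct assessments, the links between assessment items and CLOs lend themselves to simple automated aggregation. The sketch below is a minimal illustration, with an invented set of graded items, of how per-CLO attainment could be computed from items that declare which CLOs they measure; it is not prescribed by the framework.

```python
from collections import defaultdict

# Hypothetical graded items: (item name, CLOs measured, points earned, points possible).
graded_items = [
    ("Exam 1, Question 1", [1, 3], 8.0, 10.0),
    ("Quiz 1, Question 1", [2], 4.0, 5.0),
    ("Project 1", [2, 3], 18.0, 20.0),
]

def clo_attainment(items):
    """Aggregate earned/possible points per CLO across all items linked to that CLO."""
    earned, possible = defaultdict(float), defaultdict(float)
    for _name, clos, pts, max_pts in items:
        for clo in clos:
            earned[clo] += pts
            possible[clo] += max_pts
    return {clo: round(earned[clo] / possible[clo], 2) for clo in sorted(possible)}

print(clo_attainment(graded_items))  # {1: 0.8, 2: 0.88, 3: 0.87}
```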
Lesson Planner
- Each course topic is divided into a series of sessions.
- Each session has activities that incorporate active learning practices.
- Each activity is designated a delivery method in the context of a virtual, real-time class.
- Each delivery method is associated with an estimated time for completion.
- Direct assessment components are linked to CLOs and aligned with activities.
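The planner described above can be captured in a simple structured format. The sketch below is a minimal, hypothetical Python data model (the class and field names are illustrative, not part of the framework) that mirrors the columns of Table 2: topics contain sessions, sessions contain activities, and each activity carries a delivery method, an estimated time, and optional assessment components linked to CLOs.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AssessmentComponent:
    """A direct-assessment item (for example, an exam or quiz question) linked to CLOs."""
    name: str                 # e.g., "Exam 1, Question 1"
    clos: List[int]           # CLOs measured by this item, e.g., [1, 3]

@dataclass
class Activity:
    """One activity in a session, as laid out in Table 2."""
    description: str          # e.g., "Demonstrating code"
    delivery_method: str      # e.g., "Online compiler (through shared screen)"
    minutes: int              # estimated time for completion
    assessments: List[AssessmentComponent] = field(default_factory=list)

@dataclass
class Session:
    """A single virtual, real-time class session covering part of a topic."""
    title: str
    clos: List[int]           # CLOs addressed in the session
    activities: List[Activity] = field(default_factory=list)

@dataclass
class Topic:
    """A course topic divided into a series of sessions."""
    title: str
    sessions: List[Session] = field(default_factory=list)

# Example mirroring the first row of Table 2.
topic1 = Topic("Topic 1", sessions=[
    Session("Session 1", clos=[1, 2, 3], activities=[
        Activity("Presenting", "PowerPoint slides", 15,
                 [AssessmentComponent("Exam 1, Question 1", [1, 3])]),
        Activity("Demonstrating code", "Online compiler (through shared screen)", 15),
        Activity("Demonstrating concept + group discussion",
                 "Online video + breakout rooms", 5,
                 [AssessmentComponent("Quiz 1, Question 1", [2])]),
    ]),
])
```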
General Considerations during the Design of Class Sessions
- How much setup is needed by the educator?
- How much instruction is needed for students to engage in the tasks? This time should be included in the planning (Table 2; a timing sketch appears after this list).
- Are there possible accessibility issues for some students?
- Can some activities be accomplished asynchronously, for example, as an assignment (such as viewing a video) that students can complete at their own pace before a real-time session?
- Can some material, especially material that may be impacted by student accessibility issues, be flipped so that class sessions are devoted more to interactions between educators and students?
- Presentations: Weigh uploading the presentation file to the platform against sharing the application on screen.
- Animations: Weigh using PowerPoint slide animations against building animations across multiple slides.
- Other Factors: Additional factors may affect the quality of the content delivery and students’ accessibility. These include: web browser vs. desktop application vs. mobile application; network throughput for videos; access to outside sources; firewall settings and Virtual Private Networks (VPNs), which can severely constrain bandwidth; and the audio quality of students’ connections.
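One practical check that follows from the first two questions above is whether the planned activities still fit the scheduled slot once educator setup time and student instruction time are added. The sketch below is a hypothetical illustration (the per-activity overhead and the slot length are invented values); it simply totals the activity estimates from a plan such as Table 2 and flags overruns.

```python
def session_fits(activity_minutes, slot_minutes, overhead_per_activity=3):
    """Check whether planned activity times, plus a per-activity overhead for educator
    setup and student instructions, fit within the scheduled session slot.
    The 3-minute default overhead is purely illustrative."""
    planned = sum(m + overhead_per_activity for m in activity_minutes)
    return planned, planned <= slot_minutes

# Example using the upper time estimates from Table 2, Session 1 (15 + 15 + 5 min),
# against a hypothetical 50-minute slot.
print(session_fits([15, 15, 5], 50))  # (44, True)
```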
4.1.2. Rehearse Script and Timing
4.1.3. Finalize Files and Upload to Course LMS
4.1.4. Prepare a Contingency Plan
4.2. Module 2—Online Lesson Delivery
4.2.1. Module 2/Phase 1: Ensure Delivery Tools Are Operational
- Audio: The sound of a speaker’s voice matters as much as the content of the message, especially in a virtual learning environment. Therefore, ensure crystal-clear audio and make the most of the voice’s volume, pitch, tone, breath, and rate of speech.
- Video: Use live video to help create a sense of community whenever bandwidth allows. For optimal effect:
- Put light in front of you.
- Place camera at eye level.
- Keep an appropriate distance from the device camera (use palm technique for measurement).
- Be aware of your background.
4.2.2. Module 2/Phase 2: Record Class Session
4.2.3. Module 2/Phase 3: Welcome Students and Conduct Activities
4.3. Module 3—Indirect Assessment of Educators
- A. Peer Assessment: This assessment of the performance of educators is usually carried out by colleagues of educators and/or personnel involved in the TLS.
- B. Student Course Feedback: This feedback gauges students’ perception of issues including course design, course delivery, and educator performance.
- C. Student Self-Assessment of CLOs: This feedback is designed to assess students’ self-perception of their performance on CLOs.
4.3.1. Module 3/Phase 1: Design Rubrics for Indirect Assessments
- A. Peer Assessment: A generic peer assessment rubric that evaluates different aspects of course design and delivery can be generated by TLS. Educators and their colleagues may discuss which elements of the rubric the colleague will assess in a given class session, depending on the activities planned for that session.
- B. Student Course Feedback: A generic student course feedback rubric can be generated by TLS. Educators, together with their academic units, may choose from, or add to, elements of this rubric for the evaluation of their courses.
- C. Student Self-Assessment of CLOs: These surveys, known as indirect assessment instruments in outcome-based assessment, may be designed by educators with the assistance of TLS. They gauge students’ assessment of their own attainment of the CLOs. It would be ideal, though hard to implement, to collect this feedback after every virtual session, as it tells educators what to reinforce in subsequent sessions, assignments, and activities. It is therefore recommended that such surveys be deployed regularly, perhaps at the conclusion of each course topic.
4.3.2. Module 3/Phase 2: Deploy Indirect Assessments
- A. Peer Assessment: These are carried out at least once in a course, typically around the middle of the semester. The assessors prepare their reports, which are discussed with educators immediately, giving educators time to adjust the design and/or delivery of their current courses.
- B. Student Course Feedback: These are typically deployed at the end of the course and, when combined with results from direct assessments, can help improve the performance of the educator and/or the design and delivery of the course in the future.
- C. Student Self-Assessment of CLOs: This survey is ideally deployed several times in a course to acquire continuous feedback. If deployed frequently during the course, educators may consider mechanisms for ensuring the timely completion of the surveys by students. For example, educators may restrict students’ access to future topics until they have submitted the survey; alternatively, educators may assign a grade weight to the surveys.
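The two completion mechanisms mentioned for item C, gating access to future topics on survey submission and attaching a small grade weight, can typically be configured in an LMS as simple release and grading rules. The sketch below is a hypothetical, LMS-agnostic illustration of both; the function names and the 2% weight are invented for illustration.

```python
def can_access_topic(submitted_surveys, required_survey):
    """Release rule: a student may open the next topic only after submitting the
    CLO self-assessment survey for the previous topic."""
    return required_survey in submitted_surveys

def survey_grade_points(submitted, weight_percent=2.0):
    """Alternative rule: award a small, completion-based grade weight for the survey."""
    return weight_percent if submitted else 0.0

# Example: a student who submitted the Topic 1 survey may open Topic 2 and earns the completion credit.
submitted = {"CLO self-assessment: Topic 1"}
print(can_access_topic(submitted, "CLO self-assessment: Topic 1"))  # True
print(survey_grade_points(True))                                    # 2.0
```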
4.3.3. Module 3/Phase 3: Generate Action Plans
5. Discussion
5.1. Educational Institution Macro Level
5.1.1. Module—Evaluate and (Re-)Design QA Framework
5.1.2. Module—Design Policies and Procedures Based on QA Framework
5.2. Educational Institution Micro Level
5.2.1. Module—Implement Procedures by Academic/Non-Academic Units
5.2.2. Module—Assess Effectiveness of Implementation
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Acknowledgments
Conflicts of Interest
References
1. Wang, L. Designing and implementing outcome-based learning in a linguistics course: A case study in Hong Kong. Procedia Soc. Behav. Sci. 2011, 12, 9–18.
2. Nicol, A.A.M.; Owens, S.M.; Le Coze, S.S.C.L.; MacIntyre, A.; Eastwood, C. Comparison of high-technology active learning and low-technology active learning classrooms. Act. Learn. High. Educ. 2018, 19, 253–265.
3. Ross, V. Offline to online curriculum. J. Distance Learn. Adm. State Univ. West. Georg. 2001, 4, 1–4.
4. Allen, I.E.; Seaman, J.; Garrett, R. Blending in: The Extent and Promise of Blended Education in the United States; The Sloan Consortium: Boston, MA, USA, 2007.
5. Wolfensberger, A.; Anagnostopoulos, A.; Clack, L.; Meier, M.T.; Kuster, S.P.; Sax, H. Effectiveness of an edutainment video teaching standard precautions—A randomized controlled evaluation study. Antimicrob. Resist. Infect. Control 2019, 8, 1–11.
6. Santally, M.I.; Rajabalee, Y.B.; Sungkur, R.K.; Maudarbocus, M.I.; Greller, W. Enabling continuous improvement in online teaching and learning through e-learning capability and maturity assessment. Bus. Process. Manag. J. 2020, 26, 1687–1707.
7. Expósito, A.; Sánchez-Rivas, J.; Gómez-Calero, M.P.; Pablo-Romero, M.P. Examining the use of instructional video clips for teaching macroeconomics. Comput. Educ. 2020, 144.
8. Cholifah, P.S.; Nuraini, N.L.S.; Meidina, A.M. Training on development of edutainment-based innovative learning media for teacher professional development. In Proceedings of the 6th International Conference on Education and Technology (ICET 2020), Malang, Indonesia, 17 October 2020; pp. 467–471.
9. Hussain, I.; Shahzad, A.H.; Ali, R. A qualitative study on practices and issues of blended learning in higher education. Pak. J. Distance Online Learn. 2019, 5, 189–208.
10. Chua, A.; Lam, W. Quality assurance in online education: The universitas 21 global approach. Br. J. Educ. Technol. 2007, 38, 133–152.
11. Beckford, J. Quality: A Critical Introduction, 4th ed.; Routledge: London, UK, 2016; Volume 36.
12. Hazelkorn, E. Rankings and the Reshaping of Higher Education: The Battle for World-Class Excellence; Palgrave Macmillan: London, UK, 2011.
13. Information Resources Management Association. Research Anthology on Developing Effective Online Learning Courses; IGI Global: Hershey, PA, USA, 2021.
14. Richey, R.C.; Klein, J.D.; Tracey, M.W. The Instructional Design Knowledge Base: Theory, Research, and Practice; Routledge: New York, NY, USA, 2011.
15. Goodyear, P. Teaching as design. HERDSA Rev. High. Educ. 2015, 2, 27–50.
16. Carr-Chellman, A.A. Instructional Design for Teachers: Improving Classroom Practice, 2nd ed.; Routledge: London, UK, 2015.
17. Tennyson, R.D.; Breuer, K. Psychological foundations for instructional design theory. Instr. Des. Int. Perspect. Theory Res. Model. 2013, 1, 113–134.
18. Bates, A.W. Teaching in a Digital Age, 2nd ed.; Tony Bates Associates: Vancouver, BC, Canada, 2019.
19. Bates, T. Advice to Those about to Teach Online Because of the Corona-Virus. 2020. Available online: https://www.tonybates.ca/2020/03/09/advice-to-those-about-to-teach-online-because-of-the-corona-virus/ (accessed on 19 February 2021).
20. Anderson, T.; Rourke, L.; Garrison, D.R.; Archer, W. Assessing teaching presence in a computer conferencing context. J. Asynchronous Learn. Netw. 2001, 5, 27–42.
21. Goodyear, P.; Dimitriadis, Y. In medias res: Reframing design for learning. Res. Learn. Technol. 2013, 21.
22. Bates, A.W.; Poole, G. Effective Teaching with Technology in Higher Education: Foundations for Success; Jossey-Bass: San Francisco, CA, USA, 2005.
23. Bullen, M.; Janes, D.P. Making the Transition to E-Learning: Strategies and Issues; Information Science Publishing: Hershey, PA, USA, 2007; ISBN 1-59140-950-0.
24. International Engineering Alliance. Washington Accord. Available online: https://www.ieagreements.org/accords/washington (accessed on 10 February 2021).
25. ABET Accreditation Board for Engineering and Technology. 2000. Available online: https://www.abet.org/ (accessed on 10 February 2021).
26. Biggs, J.; Medland, E.; Vardi, I. Aligning teaching and assessing to course objectives. Assess. Eval. High. Educ. 2013, 38, 1–16.
27. Spady, W.G. Outcome-Based Education: Critical Issues and Answers; American Association of School Administrators: Arlington, VA, USA, 1994.
28. Killen, R. Standards-Referenced Assessment: Linking Outcomes, Assessment and Reporting. In Proceedings of the Annual Conference of the Association for the Study of Evaluation in Education in Southern Africa, Port Elizabeth, South Africa, 26–29 September 2000.
29. Lingard, M.; Ladwig, R.L.; Mills, J.; Bahr, M.D.; Chant, M.P.; Warry, D.C. The Queensland School Reform Longitudinal Study; Education Queensland: Brisbane, Australia, 2001; Volume 1.
30. Driscoll, S.; Wood, A. Developing Outcomes-Based Assessment for Learner-Centered Education: A Faculty Introduction; Stylus: Sterling, VA, USA, 2007.
31. Bonwell, C.C.; Eison, J.A. Active Learning: Creating Excitement in the Classroom. 1991 ASHE-ERIC Higher Education Reports; Jossey-Bass: San Francisco, CA, USA, 1991.
32. Baldwin, L. Editorial. Act. Learn. High. Educ. 2018, 19, 189–195.
33. Prince, M. Does active learning work? A review of the research. J. Eng. Educ. 2004, 93, 223–231.
34. Freeman, S.; Eddy, S.L.; McDonough, M.; Smith, M.K.; Okoroafor, N.; Jordt, H.; Wenderoth, M.P. Active learning increases student performance in science, engineering, and mathematics. Proc. Natl. Acad. Sci. USA 2014, 111, 8410–8415.
35. Hamouda, A.M.S.; Tarlochan, F. Engaging engineering students in active learning and critical thinking through class debates. Procedia Soc. Behav. Sci. 2015, 191, 990–995.
36. Hartikainen, S.; Rintala, H.; Pylväs, L.; Nokelainen, P. The concept of active learning and the measurement of learning outcomes: A review of research in engineering higher education. Educ. Sci. 2019, 9, 276.
37. Bolliger, D.U.; Des Armier, D. Active learning in the online environment: The integration of student-generated audio files. Act. Learn. High. Educ. 2013, 14, 201–211.
38. Dorodchi, M.; Powell, L.; Dehbozorgi, N.; Benedict, A. Strategies to incorporate active learning practice in introductory courses. In Faculty Experiences in Active Learning; J. Murrey Atkins Library: Charlotte, NC, USA, 2014.
39. Srimathi, H.; Krishnamoorthy, A. Faculty development on active learning. Int. J. Recent Technol. Eng. 2019, 8, 958–962.
40. Harvey, L.; Green, D. Defining quality. Assess. Eval. High. Educ. 1993, 18, 9–34.
41. Brockerhoff, L.; Huisman, J.; Laufer, M. Quality in Higher Education: A Literature Review; Alexander von Humboldt Institute for Internet and Society: Berlin, Germany, 2015; pp. 1–50.
42. Tezcan-Unal, B.; Winston, K.; Qualter, A. Learning-oriented quality assurance in higher education institutions. Qual. High. Educ. 2018, 24, 221–237.
43. Marciniak, R. Quality assurance for online higher education programmes: Design and validation of an integrative assessment model applicable to Spanish universities. Int. Rev. Res. Open Distance Learn. 2018, 19, 126–154.
44. Asiyai, R.I. Best practices for quality assurance in higher education: Implications for educational administration. Int. J. Leadersh. Educ. 2020, 1–12.
45. Kazimi, A.B.; Shaikh, M.A.; John, S. Issues of syllabus designing practices and quality assurance at higher education level. Glob. Soc. Sci. Rev. 2019, 4, 135–145.
46. Andrade, M.S.; Miller, R.M.; Kunz, M.B.; Ratliff, J.M. Online learning in schools of business: The impact of quality assurance measures. J. Educ. Bus. 2019, 95, 37–44.
47. Lucander, H.; Christersson, C. Engagement for quality development in higher education: A process for quality assurance of assessment. Qual. High. Educ. 2020, 26, 135–155.
48. Hauptman Komotar, M. Discourses on quality and quality assurance in higher education from the perspective of global university rankings. Qual. Assur. Educ. 2020, 28, 78–88.
49. Alzafari, K.; Kratzer, J. Challenges of implementing quality in European higher education: An expert perspective. Qual. High. Educ. 2019, 25, 261–288.
50. Ryan, T. Quality assurance in higher education: A review of literature. High. Learn. Res. Commun. 2015, 5.
51. Schindler, L.; Puls-Elvidge, S.; Welzant, H.; Crawford, L. Definitions of quality in higher education: A synthesis of the literature. High. Learn. Res. Commun. 2015, 5, 3.
52. Britto, M.; Ford, C.; Wise, J.M. Three institutions, three approaches, one goal: Addressing quality assurance in online learning. J. Asynchronous Learn. Netw. 2014, 17, 11–24.
53. Hénard, F.; Roseveare, D. Fostering Quality Teaching in Higher Education: Policies and Practices; OECD: Paris, France, 2012; p. 54.
54. Inglis, A. Quality improvement, quality assurance, and benchmarking: Comparing two frameworks for managing quality processes in open and distance learning. Int. Rev. Res. Open Distance Learn. 2005, 6.
55. Zuhairi, A. Implementing quality assurance system for open and distance learning in three Asian open universities: Philippines, Indonesia, and Pakistan. Asian Assoc. Open Univ. J. 2020, 15.
56. APEC. Quality Assurance of Online Learning Toolkit; APEC: Singapore, 2019.
57. Abdous, M. E-learning quality assurance: A process-oriented lifecycle model. Qual. Assur. Educ. 2009, 17, 281–295.
58. Kerns, W.A. Quality assurance within synchronous sessions of online instruction. In Educational Technology and Resources for Synchronous Learning in Higher Education; IGI Global: Hershey, PA, USA, 2019; pp. 211–228.
59. Giesbers, B.; Rienties, B.; Tempelaar, D.T.; Gijselaers, W. Why increased social presence through web videoconferencing does not automatically lead to improved learning. E Learn. Digit. Media 2014, 11, 31–45.
60. Butler, A.C.; Roediger, H.L. Testing improves long-term retention in a simulated classroom setting. Eur. J. Cogn. Psychol. 2007, 19, 514–527.
61. Stowell, J.; Bennett, D. Effects of online testing on student exam performance and test anxiety. J. Educ. Comput. Res. 2010, 42, 161–171.
62. Charman, D.; Elmes, A. Formative assessment in a basic geographical statistics module. Comput. Based Assess. 1998, 2, 17–20.
63. DeSouza, E.; Fleming, M. A comparison of in-class and online quizzes on student exam performance. J. Comput. High. Educ. 2003, 14, 121–134.
64. Pennebaker, J.W.; Gosling, S.D.; Ferrell, J.D. Daily online testing in large classes: Boosting college performance while reducing achievement gaps. PLoS ONE 2013, 8, e79774.
65. Rane, V.; MacKenzie, C.A. Evaluating students with online testing modules in engineering economics: A comparison of student performance with online testing and with traditional assessments. Eng. Econ. 2020, 65, 213–235.
66. Kennedy, K.; Nowak, S.; Raghuraman, R.; Thomas, J.; Davis, S.F. Academic dishonesty and distance learning: Student and faculty views. Coll. Stud. J. 2000, 2, 309–314.
67. Shaw, L.; Macisaac, J.; Singleton-Jackson, J. The efficacy of an online cognitive assessment tool for enhancing and improving student academic outcomes. Online Learn. J. 2019, 23, 124–144.
68. Akimov, A.; Malin, M. When old becomes new: A case study of oral examination as an online assessment tool. Assess. Eval. High. Educ. 2020, 45, 1205–1221.
69. Abdul Rahim, A.F. Guidelines for online assessment in emergency remote teaching during the COVID-19 pandemic. Educ. Med. J. 2020, 12, 59–68.
70. SUNY. The SUNY Online Course Quality Review Rubric OSCQR. Available online: https://oscqr.suny.edu/ (accessed on 8 February 2021).
71. Kahoot. 2021. Available online: https://kahoot.com/ (accessed on 10 February 2021).
72. Plump, C.M.; LaRosa, J. Using Kahoot! in the classroom to create engagement and active learning: A game-based technology solution for elearning novices. Manag. Teach. Rev. 2017, 2, 151–158.
Table 1. Two active learning tools and some ways they could translate from face-to-face to online modes of delivery.

| In-Class Activity | Online Activity | Description of Online Tool | Good for |
|---|---|---|---|
| Whiteboard | Whiteboard | Offers space for brainstorming and group activities. | |
| Group Discussion | Breakout Rooms | Allows for multiple, simultaneous, small group interactions that are separate from the main group. | |
Table 2. Lesson planner example for one course topic.

| Topics | Sessions | Activities: Type of Demonstration, Instruction, Discussion, or Other Activity | Delivery Method: How the Activity/Content Will Be Delivered | Time | Assessment Components a |
|---|---|---|---|---|---|
| Topic 1 | Session 1, CLOs 1, 2, and 3 | a. Presenting | a. PowerPoint slides | a. 10–15 min | Exam 1, Question 1 (CLOs 1, 3) b |
| | | b. Demonstrating code | b. Online compiler (through shared screen) | b. 10–15 min | |
| | | c. Demonstrating concept + group discussion | c. Online video + breakout rooms (see Table 1) | c. 5 min | Quiz 1, Question 1 (CLO 2) b |
| | Session 2, CLOs x, y, and z c | TBD c | TBD c | TBD c | TBD c |
Contingency plans for common disruptions during a virtual, real-time session:

| Event | Contingency Plan |
|---|---|
| Students cannot log in. | Ensure ICTS is available to provide immediate support. |
| Educator audio is not working. | Log onto a second machine, and/or have another smart device on standby. |
| Electricity goes out for the educator. | Ensure that another smart device is fully charged and that the internet router has an alternative power source. |
| Video will not play. | Have videos available outside of the platform on a separate server (for example, YouTube). Log onto a second machine, and/or have another smart device on standby. |
| The student presenter disappears. | Have a copy of the student presenter’s slides open on your machine. Have one or more backup presenters. |
| Educator is late. | Provide students with the educator’s contact information. |
| Participants cannot use chat. | Have an alternative chat channel, such as WhatsApp. |
| Application sharing suddenly stops. | Know the tools and be ready to adjust spontaneously during the live session. Plan alternative activities. |
| Session did not record. | Use a screen recording application with external audio as a backup. |
| Slides will not load. | Have a copy of the slides in PDF format. |
| Demonstration site is down. | Have a static copy of the demonstration site stored on a public drive that is accessible to students. |
| Pre-work was not sent. | Preload all documents onto a shared public drive and provide a link to access them. |
Examples of tasks an educator performs during a virtual, real-time session:

| Task | Task |
|---|---|
| Conduct audio checks. | Monitor chat. |
| Upload slides to present. | Point and click. |
| Organize participants into breakout groups. | Locate and paste URL. |
| Turn on/off enhanced participant rights. | Set up activities. |
| Provide instructions. | Facilitate discussion. |
| Transfer handout file. | Identify open microphones and mute them. |
| Clear status indicators. | Respond to technical questions. |
Communication tips for virtual, real-time delivery:
- Do not assume the audience is comfortable learning online; provide resources such as Quick Tip Cards before the session. This will make participants feel more at ease and able to focus on content.
- Do not be afraid of silence. Silence does not necessarily mean participants are disengaged; they might just be processing. Allow time for them to answer and react (tip: silently count to 10 before advancing), and be careful not to proceed before they have had enough time.
- Do not read from the screen; participants can read for themselves. Your voice track should supplement the content on the screen.
- Do not apologize for the tool. The virtual experience is not the same as in-person, but you can accomplish the same objectives and do almost all of the same activities.
- Avoid filler words such as “uh,” “um,” “like,” “because,” or “you know?”: record yourself during a dry run, determine what your fillers are, and then practice eliminating them.
- Be careful of rambling, which is often caused by nerves. If you start going too fast or get off track, just stop, mute your audio, take a deep breath, unmute the audio, and begin again.
- Set expectations at the beginning about how participants are to engage, ask questions, and provide feedback.
- Keep your tone conversational, but still sound professional through word choice.
- Modulate and project your voice; avoid being monotone or mumbling.
- Do not wait until the end to ask for questions: engage participants early and often.
- Give clear and succinct instructions when directing activities.
Examples of translating physical-classroom prompts into virtual-classroom prompts, with the collaboration feature to use:

| If You Say in the Physical Classroom | Then You Would Say in the Virtual Classroom | Virtual Class Collaboration Feature to Use |
|---|---|---|
| “Let me demonstrate…” | “Please look at my screen as I start my application sharing.” | Application Sharing |
| “Select this or that…” | “Please respond by selecting the green check or red X found under your Status Indicators.” | Status Indicator Icons |
| “Explore this website on your own.” | “Please click the link provided in the Chat area. This will open a new browser window.” | Hyperlink text in Chat or Application; Share Web Content |
| “Share an example of your own.” | “Please click the Raise Hand icon to indicate if you would like to share.” “Please post your example in the Chat area.” | VOIP—Pass microphone privileges; Status Indicators; Chat |
| “Draw on the flip chart to…” | “Using your annotation tools, found in the top left corner of your screen, click your Ellipse tool and then click on the screen to…” | Annotation Tools |
| “Please fill out the evaluation form.” | “Please complete the assessment now being opened on your screen.” | Testing or Polling |
| “Take a 10-min break.” | “We will now take a 10 min break. Let me know you have returned by giving me a green check.” | Status Indicator |
| “I see that you are confused.” | “How did that work for you? Please raise your hand or post in the Chat area.” | Status Indicator; Chat |
| “Turn to the practice unit that begins on page…” | “Download the assignment.” | Materials Panel; Share Pod; Push material to participants |
| “We will now watch a video.” | “I will now play a video. The video should now be displaying.” | Video File; Hyperlink text to a video |
| “Let’s brainstorm.” | “Type your responses in Chat…” “Click your Raise Hand icon if you would like to share…” | Chat Text; Status Indicators; Whiteboard |