The Impact of Student Response Systems (SRS) on Student Achievements: A University-Scale Study with Deep Exploratory Data Analysis (EDA)
Abstract
1. Introduction
2. Methods
2.1. Procedure
- To utilize the SRS in the educational process of bachelor's-level course sections, where at least one section must include ≥20 students;
- To submit a final report and a faculty survey, and to encourage their students to participate in the student survey on utilizing SRS.
2.2. Characteristics of the Participants
- Five sections (202 students) had technical errors in score calculation and/or missing or incomplete values;
- Male campus #3 involved only six students (in one section), and none of them utilized SRS;
- Twelve sections (342 students) were registered in the I-Cal course; some of them utilized SRS in the theoretical labs (not in the lectures), and hence their final-exam achievement could not be distinguished from that of students who did not use SRS;
- 449 students withdrew from their courses in the early weeks of the semester.
Course Name | Abbreviation |
---|---|
English Language (1st level) | EL-1 |
Pharmaceutical Calculations and Liquid Dosage Forms | PC-LDF |
Arabic Language-Writing Skills | AL-WS |
Medical Terminology | MT |
Professional Ethics in the Health Sector | PE-HS |
Health Management | HM |
Organization of Healthcare Services | OHS |
Biostatistics | BIO-STAT |
Introduction to Plant Production | Intro-PP |
Enzymology | Enz |
Molecular Biology | M-BIO |
Physiology | Phys |
Integral Calculus | I-Cal |
Linear Algebra in Business | LA-B |
2.3. Assessment of Student Achievements
2.4. Questionnaires to Evaluate Students’ Perceptions of SRS
2.5. Ethical Considerations
2.6. Data Analysis
2.6.1. Sampling
2.6.2. Statistical Analysis
3. Results
3.1. General Study Findings
3.2. Pass Rate (%)
3.3. Student Scores
3.4. The Correlation between Average Section Score and the Number of SRS Sessions/Questions
3.5. Students’ Survey Results
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A. Questionnaires to Evaluate Students’ Perceptions of SRS
- First section: General information
- Dear Student: Thank you for agreeing to participate in this electronic survey entitled: Measuring the Impact of Students’ Use of Student Response Systems (SRS).
- The purpose of this questionnaire is to assess the effect of using some active learning applications on students’ academic achievement. It takes about three minutes to complete.
- We assure you that all your answers will be used solely for scientific research purposes and that no private data revealing the participant’s identity will be published.
- If you agree to participate in this online survey, please click below to get started.
- 1. Participant gender: (Please select one)
- Male
- Female
- 2. College: ………………………………………………………………….
- 3. The academic level: (Please select one)
- Level 1
- Level 2
- Level 3
- Level 4
- Level 5
- Level 6
- Level 7
- Level 8
- Level 9
- Level 10
- 4. Did you use the personal response system during the first semester of the 2022 academic year? (Please select one)
- Yes
- No: (Finish and skip to send)
The second section: For students who used Student Response Systems (SRS)
- 5. Mention the names of the courses in which you used SRS during the first semester of 2022:
- 6. What is your most preferred learning style? (Please select one)
- Visual learning: (pictures, drawings, and charts)
- Auditory learning: (listening to recorded lectures repeatedly)
- Kinesthetic learning: (touching three-dimensional models or conducting laboratory experiments yourself, if possible)
- Learning through reading and writing: (writing down lectures, taking notes on them, and then reading them)
- 7. What is your favorite means of interaction with the lecturer inside the classroom? (Please select one)
- Discussions and oral conversations to answer the lecturer’s questions.
- Personal response devices (SRS) or mobile apps used to collect student responses.
- Answering paper-based quizzes before or after the lecture.
- 8. In general, are you satisfied with the use of personal response systems (SRS) in lectures? (Please select one)
- Highly unsatisfied
- Unsatisfied
- Neutral
- Satisfied
- Highly satisfied
- 9. What is your score in the courses in which SRS was used during the first semester of 2022? Please be careful in answering, knowing that the questionnaire does not reveal your identity. (For example, your score for each course: A+, A, etc.)
- 10. What is your cumulative GPA up to the end of the first semester of 2022? Please be careful in answering, knowing that the questionnaire does not reveal your identity. (Ex: 3.56).
Scientific Discipline | Faculty | Course Code | SRS Use | Raw Data (n) | Sampled Data (n) | p-Value * |
---|---|---|---|---|---|---|
Community | Applied Studies and Community Service | BIO-STAT | NO | 53 | 25 | 0.48 |
Community | Applied Studies and Community Service | BIO-STAT | YES | 25 | 25 | 1.00 # |
Community | Applied Studies and Community Service | EL-1 | NO | 1199 | 28 | 0.19 |
Community | Applied Studies and Community Service | EL-1 | YES | 28 | 28 | 1.00 # |
Community | Applied Studies and Community Service | MT | NO | 152 | 21 | 0.73 |
Community | Applied Studies and Community Service | MT | YES | 21 | 21 | 1.00 # |
Community | Applied Studies and Community Service | PE-HS | NO | 147 | 21 | 0.36 |
Community | Applied Studies and Community Service | PE-HS | YES | 21 | 21 | 1.00 # |
Community | Applied Studies and Community Service | HM | NO | 55 | 36 | 0.84 |
Community | Applied Studies and Community Service | HM | YES | 36 | 36 | 1.00 # |
Community | Applied Studies and Community Service | OHS | NO | 104 | 24 | 0.56 |
Community | Applied Studies and Community Service | OHS | YES | 24 | 24 | 1.00 # |
Health | Medicine | Phys | NO | 146 | 146 | 1.00 # |
Health | Medicine | Phys | YES | 188 | 146 | 0.72 |
Health | Pharmacy | PC-LDF | NO | 54 | 54 | 1.00 # |
Health | Pharmacy | PC-LDF | YES | 69 | 54 | 0.76 |
Humanities | Humanities and Social Sciences | AL-WS | NO | 3328 | 21 | 0.09 |
Humanities | Humanities and Social Sciences | AL-WS | YES | 21 | 21 | 1.00 # |
Science | Business Administration | LA-B | NO | 96 | 87 | 0.91 |
Science | Business Administration | LA-B | YES | 87 | 87 | 1.00 # |
Science | Food and Agriculture Sciences | Intro-PP | NO | 118 | 23 | 0.61 |
Science | Food and Agriculture Sciences | Intro-PP | YES | 23 | 23 | 1.00 # |
Science | Sciences | Enz | NO | 10 | 10 | 1.00 # |
Science | Sciences | Enz | YES | 10 | 10 | 1.00 # |
Science | Sciences | M-BIO | NO | 10 | 10 | 1.00 # |
Science | Sciences | M-BIO | YES | 22 | 10 | 1.00 |
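The down-sampling scheme summarized in the table above (randomly reducing the larger group in each course to the size of its counterpart, then confirming that the sample remains representative of the raw data) can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not the authors' code: the DataFrame layout, the `score` column name, and the use of a Mann-Whitney U test for the representativeness check are all assumed.

```python
import pandas as pd
from scipy import stats

def downsample_and_check(raw_group: pd.DataFrame, target_n: int, seed: int = 42):
    """Randomly down-sample one course/SRS-use group to `target_n` students and
    check that the sample's scores still match the raw group's distribution.

    Hypothetical sketch: the `score` column and the Mann-Whitney U test used
    for the representativeness check are assumptions, not the study's
    documented procedure.
    """
    sampled = raw_group.sample(n=target_n, random_state=seed)
    # A large p-value (e.g., > 0.05) suggests the sample is representative of
    # the raw data, analogous to the p-Value column in the table above.
    _, p_value = stats.mannwhitneyu(
        raw_group["score"], sampled["score"], alternative="two-sided"
    )
    return sampled, p_value

# Usage sketch (df assumed to hold one row per student with columns
# `course`, `srs_use`, and `score`):
# no_srs = df[(df.course == "BIO-STAT") & (df.srs_use == "NO")]
# srs = df[(df.course == "BIO-STAT") & (df.srs_use == "YES")]
# sampled_no_srs, p = downsample_and_check(no_srs, target_n=len(srs))
```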
Statistic | Total Registered Students | Studied Students | Passing Rate (%) | Prohibition Rate (%) | Failure Rate (%) | Average Section Score (out of 5) | A+ (%) | A (%) | B+ (%) | B (%) | C+ (%) | C (%) | D+ (%) | D (%) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
mean | 35.1 | 32.7 | 95.1 | 0.4 | 4.5 | 4.1 | 32.9 | 18.0 | 13.4 | 9.4 | 6.4 | 5.2 | 4.1 | 5.7 |
std | 17.1 | 15.9 | 7.2 | 1.5 | 6.8 | 0.7 | 28.2 | 13.0 | 11.1 | 9.1 | 8.3 | 7.0 | 5.9 | 9.8 |
min | 4.0 | 3.0 | 61.5 | 0.0 | 0.0 | 1.9 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
25% | 21.0 | 20.0 | 93.1 | 0.0 | 0.0 | 3.7 | 10.4 | 8.3 | 4.4 | 2.2 | 0.0 | 0.0 | 0.0 | 0.0 |
50% | 37.0 | 36.0 | 97.9 | 0.0 | 1.9 | 4.4 | 25.0 | 15.2 | 10.9 | 7.7 | 4.7 | 2.3 | 0.0 | 0.0 |
75% | 47.0 | 44.0 | 100.0 | 0.0 | 6.5 | 4.7 | 46.2 | 26.2 | 20.0 | 13.8 | 10.0 | 7.7 | 7.6 | 7.1 |
max | 118.0 | 108.0 | 100.0 | 8.3 | 30.8 | 5.0 | 98.1 | 66.7 | 60.0 | 50.0 | 66.7 | 33.3 | 30.0 | 50.0 |
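The summary rows above (mean, standard deviation, minimum, quartiles, and maximum for each section-level parameter) correspond to a standard pandas descriptive summary. The snippet below is an illustrative sketch only: the DataFrame and its column names are hypothetical stand-ins for the section-level data, not the study's actual dataset.

```python
import pandas as pd

# Hypothetical section-level data: one row per course section, with columns
# mirroring a few of the study parameters in the table above (names assumed).
sections = pd.DataFrame({
    "total_registered_students": [43, 21, 37, 47],
    "studied_students": [40, 20, 36, 44],
    "passing_rate_pct": [97.5, 100.0, 91.7, 95.5],
    "average_section_score": [4.4, 4.7, 3.9, 4.6],
})

# describe() reports count, mean, std, min, 25%/50%/75%, and max for each
# column, i.e., the same summary statistics tabulated above.
print(sections.describe().round(1))
```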
Statistic | Study Level | Overall SRS Satisfaction (OSS) | SRS Course Score | GPA |
---|---|---|---|---|
n | 67 | 67 | 72 | 72 |
mean | 4.76 | 4.36 | 4.52 | 4.26 |
std | 2.14 | 1.18 | 0.58 | 0.57 |
min | 1 | 1 | 2 | 2.8 |
25% | 3 | 4 | 4.5 | 3.87 |
50% | 5 | 5 | 4.75 | 4.35 |
75% | 6.5 | 5 | 5 | 4.78 |
max | 9 | 5 | 5 | 5 |
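Because survey items such as overall SRS satisfaction (OSS) are ordinal (a 1–5 Likert scale), a rank-based measure such as Spearman's correlation is a natural choice when exploring their relationship with outcomes like GPA. The sketch below is a hedged illustration only: the variable names and values are hypothetical, and the OSS–GPA pairing is shown merely as an example of how such ordinal survey data could be analyzed, not as the study's reported analysis.

```python
import pandas as pd
from scipy import stats

# Hypothetical survey responses: overall SRS satisfaction (OSS, 1-5 Likert)
# and self-reported GPA (out of 5). Values are illustrative, not study data.
survey = pd.DataFrame({
    "oss": [5, 4, 5, 3, 5, 4, 2, 5],
    "gpa": [4.8, 4.2, 4.9, 3.6, 4.5, 4.1, 3.2, 4.7],
})

# Spearman's rank correlation is appropriate for ordinal Likert-type data.
rho, p_value = stats.spearmanr(survey["oss"], survey["gpa"])
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```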