Article

Quantitative Analysis of the Usage of a Pedagogical Tool Combining Questions Listed as Learning Objectives and Answers Provided as Online Videos

1 Department of Biology, Faculty of Science, University of Ottawa, 30 Marie Curie Private, Ottawa, ON K1N 6N5, Canada
2 Department of Biochemistry, Microbiology and Immunology, Faculty of Medicine, University of Ottawa, 451 Smyth Road, Ottawa, ON K1H 8M5, Canada
* Author to whom correspondence should be addressed.
Future Internet 2015, 7(2), 140-151; https://doi.org/10.3390/fi7020140
Submission received: 27 February 2015 / Revised: 29 April 2015 / Accepted: 8 May 2015 / Published: 15 May 2015
(This article belongs to the Special Issue eLearning)

Abstract

To improve the learning of basic concepts in molecular biology in an undergraduate science class, a pedagogical tool was developed, consisting of learning objectives listed at the end of each lecture and answers to those objectives made available as videos online. The aim of this study was to determine if the pedagogical tool was used by students as instructed, and to explore students’ perception of its usefulness. A combination of quantitative survey data and measures of online viewing was used to evaluate the usage of the pedagogical practice. A total of 77 short videos linked to 11 lectures were made available to 71 students, and 64 completed the survey. Using online tracking tools, a total of 7046 views were recorded. Survey data indicated that most students (73.4%) accessed all videos, and the majority (98.4%) found the videos to be useful in assisting their learning. Interestingly, approximately half of the students (53.1%) always or most of the time used the pedagogical tool as recommended, and consistently answered the learning objectives before watching the videos. While the proposed pedagogical tool was used by the majority of students outside the classroom, only half used it as recommended, limiting its impact on students’ involvement in the learning of the material presented in class.

1. Introduction

The transformation of undergraduate science education started over 20 years ago in the USA with the objectives of improving competitiveness in technology development and addressing the shortage of qualified candidates for high-tech jobs. Less than 40% of students enter universities and colleges with an interest in science, technology, engineering and mathematics (STEM); this, combined with the low percentage who graduate with a degree in those fields, motivated decision makers to target research in higher education in science [1,2,3,4]. The matter was reviewed extensively by the National Science Foundation, and a major criticism was that STEM subjects are often taught in isolation instead of as an integrated curriculum [5]. Further analysis of the teaching methods practiced in science courses pointed to the observation that traditional lectures often promote the acquisition of facts rather than an understanding of scientific concepts [6,7,8,9].
To address the challenge of teaching with the objective of promoting understanding and application of concepts in science, government-sponsored programs as well as institutions funded research-based initiatives aimed at developing and implementing new teaching interventions. Popular interventions are designed to increase students’ active participation in their learning, and are offered either as a complement to, or as a replacement for, traditional lectures [10,11]. Interventions characterized as active learning are defined as “anything course-related that all students in a class session are called upon to do other than simply watching, listening and taking notes” [12]. The impact of active learning on examination performance and failure rates in undergraduate STEM education was recently quantified by applying statistical analysis to the findings of combined studies [13]. This meta-analysis found that average examination scores improved by 6% in active learning sections, and that students taught with traditional lectures were 1.5 times more likely to fail, providing strong support for the application of pedagogical interventions as a teaching practice in undergraduate science classes.
The design of pedagogical interventions intended to promote students’ participation in learning concepts in science remains a challenge [14]. Due to the rapid development of specialized fields such as molecular biology, academics teaching undergraduate science courses at the more advanced level face pressure to cover recently discovered facts in addition to the main established concepts, with no additional class time allocated. Furthermore, the need to reiterate the importance of understanding and applying previously acquired scientific concepts in the context of the specialized information presented adds to the challenge of integrating student-centered activities to promote learning. The design and implementation of interventions aimed at increasing students’ participation in their learning is further compounded by the reality of class sizes, with often large groups of students making the application of traditional “active learning” approaches very challenging. For researchers in the field of education, a universal definition of “active learning” is lacking, a gap often attributed to differing interpretations of the terms used by various authors. Core elements of active learning, including student activity and engagement in the learning process, perhaps provide a more widely accepted definition.
Taking advantage of internet distribution of learning material, another approach, referred to as “learner control”, was developed to increase students’ participation in their learning [15,16,17]. This latter approach gives trainees control over when and where their training occurs, and to some extent allows them to make decisions about the content. While “active learning” takes place in the classroom, “learner control” is designed for learning outside the classroom, and provides students with more freedom “in becoming, or not, mindfully engaged” [18]. Clearly, traditional lectures and learner control approaches are at opposite ends of the spectrum of students’ involvement in learning, from passive to active, with neither benefiting all students [19].
Taking into consideration the benefits of active learning and the reality of large class sizes, a pedagogical tool was previously developed to increase the learning of scientific concepts relevant to a Molecular Biology course offered to a large class of undergraduate students registered in the third year of a science program [20]. This tool has two components: a list of learning objectives presented at the end of each lecture, and a series of videos posted online providing answers to each learning objective. Students were advised to first answer the learning objectives and then to watch the videos to evaluate the content of their answers. In our analysis of the impact of the tool on students’ learning, we measured students’ exam scores and reported a 2% increase in students’ average scores when the videos were made available [20]. The next step in the evaluation of this intervention is to measure students’ use of the pedagogical tool and to explore their perception of it. The aims of the current study were to measure the use of the pedagogical tool; to determine if the tool was used as recommended; and to explore the students’ perception of its usefulness.

2. Results

In the current study, a pedagogical tool was used to complement the lecture component of a 3rd year undergraduate Molecular Biology course. The pedagogical tool consisted of learning objectives (questions) listed at the end of each lecture, and answers to those objectives were made available to students online in video format [20]. Using online tracking tools, a total of 7046 views were recorded for all 77 videos. The number of daily views of the videos during the complete academic term is illustrated in Figure 1. The two peaks in the number of views correspond exactly to the dates of the examinations: the first exam was held on 1 October 2013 (831 views), and the second on 5 November 2013 (459 views). As expected, more views were recorded at the time of the first examination, since more videos were posted before this examination (47 vs. 30; Table 1). The pedagogical tool was not used by the faculty member teaching the third section of the course; hence, no viewing activity was detected during that period.
A survey was designed to assess students’ usage and perceptions of the learning tool; 64 students (90.1%) completed it. Based on the survey, the majority of the students (98.4%) found the videos to be useful and to promote understanding of the material presented in class. Students’ usage patterns of the online videos are summarized in Table 2. Most students (73.4%) accessed all the videos, and about 80% watched more than half of them. Very few students (3.1%) admitted to having watched the videos instead of attending class on at least one occasion. The majority of students watched several videos at a time, with 67.7% watching six or more videos in one sitting. All participants agreed that they liked having the option to control the pace and/or repeat sections of the videos.
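The survey percentages above follow directly from respondent counts. The short sketch below shows the arithmetic; the underlying counts (64 of 71 students responding, 47 of 64 respondents watching all videos) are back-calculated from the reported figures and should be treated as assumptions.

```python
# Back-calculated counts (assumptions): 64 of 71 students responded,
# and 47 of the 64 respondents reported watching all videos.
def pct(count: int, total: int) -> float:
    """Percentage rounded to one decimal place, as reported in the text."""
    return round(100 * count / total, 1)

print(pct(64, 71))  # 90.1 -> survey response rate
print(pct(47, 64))  # 73.4 -> respondents who watched all videos
```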
Figure 1. Daily numbers of views of the videos during the complete academic term.
Table 1. Structure of the molecular biology course.
Lecture title (number of lectures) | Faculty member | Number of videos
DNA structure and chemistry | A | 8
Genome organization | A | 9
DNA replication | A | 11
DNA repair | A | 8
DNA recombination (2) | A | 10
First examination
Control gene expression in prokaryotes | A | 11
Control gene expression in phage lambda | A | 9
Protein synthesis and the genetic code (2) | A | 10
Transcription in prokaryotes | B | 0
Transcription in eukaryotes | B | 0
Control of gene expression in eukaryotes: regulatory proteins | B | 0
Control of gene expression in eukaryotes: combinatorial control | B | 0
Second examination
Control of gene expression in eukaryotes: DNA methylation | B | 0
Control of gene expression in eukaryotes: translational control | B | 0
Maturation and transport of RNA | C | 0
Regulatory RNA | C | 0
Mobile genetic elements | C | 0
Genetic control of the immunoglobulin locus (2) | C | 0
Methods to measure gene expression | C | 0
Final examination
Table 2. Students’ usage patterns.
Usage pattern | Percentage of students (n = 64)
Watched all videos | 73.4%
Watched more than half | 79.7%
Watched instead of attending class | 3.1%
Watched at least two videos per sitting | 96.7%
Watched six or more in one sitting | 67.7%
Liked having the option to repeat/control pace | 100%
An important question we sought to address was whether the students were using the pedagogical tool as it was intended. Students were advised to first answer the learning objectives and then watch the videos to evaluate the content of their answers. Interestingly, approximately half of the students (53.1%) always or most of the time used the pedagogical tool as recommended, and consistently answered the learning objectives before watching the videos (Figure 2). A substantial proportion of students (37%) followed the instructions only sometimes, while about 10% never attempted to solve the learning objectives before accessing the videos.
Figure 2. Percentage of students who attempted to solve the learning objectives before accessing the videos.
Next, we wanted to assess whether or not using the professor’s voice in the videos was important to students. The majority of participants (93.8%) found it important that the narrator of the videos was the professor (Figure 3). Since the frame of each video was limited to a white page on which the narrator was drawing a diagram to illustrate the answer to a learning objective, we asked whether seeing the narrator’s face would improve the student’s learning experience. The majority of students (68.7%) indicated that seeing the face of the narrator rather than just the hands was not important, while 28% were unsure whether or not seeing the speaker’s face would improve their learning experience (Figure 4).
Figure 3. Importance of using the professor’s voice in the videos.
Figure 4. Percentage of students who agreed that seeing the speaker’s face in the video would improve their learning experience.

3. Discussion

This study quantitatively evaluated a new pedagogical intervention designed to increase students’ participation in the learning of concepts outside the classroom. We measured the use of the pedagogical tool and students’ perception of its usefulness. Data collected using both online tracking tools and a survey indicated that the students used the pedagogical tool. The total number of views of the online videos was 7046 for the 77 videos available to 71 students. Viewing activity was recorded regardless of user identity rather than tracked per student, a limitation of the quantification method used. Multiple viewings by the same student would lead to an overestimation of the number of views. Similarly, potential viewing by the 10 students who dropped the course within the first few weeks could have led to an overestimation of the number of views. Conversely, downloading the videos for later viewing without accessing the institution’s Blackboard Learn, although requiring more advanced computer skills, would have led to an underestimation of the number of views. The measure of daily views was also indicative of video usage by students, and two sharp peaks of activity were recorded on the days of the first and second examinations. Peak heights were proportional to the number of videos made available to students: 831 views of 47 videos for the first examination, 459 views of 30 videos for the second examination, and no views on the date of the final examination, consistent with the lack of videos made available for the final examination.
The high viewing activity measured with online tools was consistent with the survey data, which indicated that 73.4% of the students who answered the survey watched all the videos. The survey provided additional information regarding the usefulness of the pedagogical tool, particularly about the recommended use of the videos and the learning objectives. The instructions provided aimed to stimulate engagement in the learning of the concepts by having students address each learning objective independently, knowing that they could verify the content of their answers by watching the videos and obtain immediate feedback on their performance. While 73.4% of the students watched all the videos, only 53.1% used the tool as instructed, answering the learning objectives before watching the videos. For the other half of the students, limiting the use of the tool to viewing the videos reduced their participation in the learning of the material. Clearly, the video component of the tool was more popular than the learning objectives. For those students, the benefits of the tool were limited to the viewing of videos, which is a passive activity.
The recording of daily views also indicated that use of the tool was often limited to viewing the videos. The two sharp peaks of activity on the days of the examinations suggest last-minute viewing, which left little time for answering the learning objectives and using the tool as instructed. In addition, 67.7% of students reported having watched six or more videos in one sitting, restricting the time for answering the learning objectives and reducing the benefits of engagement in this activity. The reasons why approximately half the students did not use the tool as instructed, limiting their participation, were not explored. The lack of engagement and reluctance of some students to engage in active learning activities has been reported previously [19,21,22,23]. Reported reasons include a lack of understanding of the benefits students would draw, leading some not even to try; resistance to participating in class activities in front of classmates; and simply the lack of evidence of benefits, including scores on tests [23]. The most important factor in promoting students’ participation in new learning activities is the demonstration of improved academic achievement as a result of using the proposed pedagogical tool. Similar to previous studies using standard measures of academic achievement, we measured a small increase in scores of 2% in association with the tool [20,24]. Increasing students’ use of the full tool, rather than limiting it to video viewing, remains a challenge.
The survey included questions designed to measure students’ perception of the usefulness of the intervention. Previous studies have identified students’ perception of interventions designed to increase learning as a significant contributor to their decision to engage in the suggested activity [23]. Similarly, the technical advantages offered by online videos, previously reported to increase students’ use of such tools, were reported in the current study and included the option to repeat and control the pace of the videos [25,26,27]. The characteristics of the videos students might find important were also investigated. Interestingly, the majority (93.8%) found it important that the narrator of the videos was the professor, while only a small percentage (3.1%) would have preferred to see the face of the narrator rather than just the hands. Clearly, students attributed high importance to having the professor involved in the learning activity, albeit outside the classroom. This observation is significant for the development of videos used as a complement to traditional lectures, and provides evidence of how students relate to professors both in the classroom and when accessing videos featuring the professor outside the classroom. Further research is required to identify the elements of an effective pedagogical intervention aimed at increasing the participation of students in learning.
The interpretation of the survey results has limitations, including the previously described Hawthorne effect [28], whereby students react positively to any novel intervention, regardless of its merit, simply because they know it is being studied. This is particularly relevant for the question enquiring about the usefulness of the videos, to which 98.4% responded that the videos were useful in assisting their learning. Although the contribution of the Hawthorne effect was not measured in the current survey, the positive impact of the pedagogical tool on students could be attributable in part to this effect.
The current study provides an initial evaluation of the usefulness of the pedagogical tool, and the high rate of usefulness reported by students motivates additional studies. Assessing how students benefited from the intervention, and whether learning increased, requires looking at a broad range of learning outcomes, including but not limited to: identifying what constitutes an improvement in learning, quantifying the magnitude of the reported improvements, assessing the impact of the tool on learning outcomes, and interpreting the data. Our previous study, which measured a modest 2% increase in exam scores in association with the introduction of the tool, combined with the measured use of the tool and students’ perception of its usefulness in the current study, justifies the use of the pedagogical tool in the future.

4. Methods

4.1. Course Format and Evaluation

The Molecular Biology course is offered once a year during the fall term and taught in either English or French. The course is a requirement for the Bachelor of Science (B.Sc.) degrees in Biology, Biochemistry, Biomedical Sciences, and Biopharmaceutical Sciences, and is worth 3 of the 90 credits required for the completion of the B.Sc. degree. For the current study, data were collected from the course taught in French; lectures were presented twice a week as 80-min sessions, for a total of 33 h of lectures during the academic term. Attendance at lectures was not recorded and was not mandatory. The material was presented by three faculty members, all using the traditional lecture format consisting of PowerPoint-guided lectures delivered in a classroom equipped with multimedia technology. All PowerPoint presentations were made available to students between 24 and 48 h before each lecture through the Blackboard Learn system managed by the institution. The pedagogical tool was used only for the first half of the course, and by only one of the faculty members, who designed both the learning objectives and the videos (Table 1).
Students’ evaluation was conducted through three written examinations, all taken in the classroom. The three exams had the same format and consisted of three sections. Marks for each section were attributed as follows: 15 multiple-choice questions for 30 marks, 10 short-answer questions for 50 marks, and 2 long-answer questions for 20 marks. The students were given 80 min to complete each of the first two exams and 160 min to complete the final exam. None of the exams were cumulative. The final mark for the course was calculated from all three written exams, with the following contribution from each exam: 30% from the first mid-term, 30% from the second mid-term, and 40% from the final exam.
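As a worked example of the grading scheme above, the sketch below combines the three exam sections (30 + 50 + 20 = 100 marks per exam) and weights the exams 30/30/40. The function names are illustrative, not from the source.

```python
def exam_score(mcq: float, short_ans: float, long_ans: float) -> float:
    """One exam: multiple choice /30, short answer /50, long answer /20."""
    assert 0 <= mcq <= 30 and 0 <= short_ans <= 50 and 0 <= long_ans <= 20
    return mcq + short_ans + long_ans

def final_mark(midterm1: float, midterm2: float, final_exam: float) -> float:
    """Course mark: 30% + 30% + 40% weighting of the three exams,
    rounded to one decimal place."""
    return round(0.30 * midterm1 + 0.30 * midterm2 + 0.40 * final_exam, 1)

# A student scoring 24/30, 40/50 and 15/20 on every exam:
per_exam = exam_score(24, 40, 15)                # 79 out of 100
print(final_mark(per_exam, per_exam, per_exam))  # 79.0
```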

4.2. Student Population

The Molecular Biology course was offered to undergraduate students who had completed the first two years of a program offered in the Faculty of Science, with the majority of students registered in the Biochemistry and Biology programs. Courses completed by students during the first two years were at the introductory level on the topics of biology, chemistry, mathematics and physics, and were completed at the same institution. The prerequisite for the Molecular Biology course was an introductory genetics course taken during the second year of the undergraduate degree. The number of students who completed the course and the three written examinations was 71; the initial number of registered students was 81.

4.3. Pedagogical Tool—Learning Objectives and Online Videos

The pedagogical tool consisted of two components: a series of 8 to 15 learning objectives, formatted as questions and listed at the end of each lecture; and online videos presenting answers to each learning objective. Details of the content of the learning objectives and of the videos were described in our previous publication [20]. Table 1 includes a description of the subjects covered in class and the number of learning objectives associated with each lecture. Further details, including the subject of each learning objective and the duration of each video, are listed in Table S1. Essentially, the pedagogical tool was designed to promote the learning and application of concepts presented in class. Concepts were either previously learned or presented in class, and their application to the field of molecular biology was emphasized during the lectures. The learning objectives were posted online at the end of the lecture material for each individual lecture, in the format of a PowerPoint presentation provided 24 to 48 h prior to each lecture. The videos were made available to students online through the institution’s Blackboard Learn immediately after class, and all material remained available until the date of the final examination. At the end of each lecture, students were invited to address each of the learning objectives before watching the videos, and it was reiterated that the videos could be used to verify the completeness and accuracy of their answers. With the intent to promote students’ engagement in the learning of the concepts, the professor strongly recommended using the pedagogical tool early in the preparation for the examinations rather than only watching the videos the night before exams. The content and preparation of the pedagogical tool, including the filming, was done by faculty member A, who also posted all the material online.
The online delivery tool, Blackboard Learn, was managed by the educational institution, and access was limited at all times to students registered in the course. Videos addressing the learning objectives for each lecture were deposited on the online management platform Bits on the Run (LongTail Video, New York, NY, USA), accessible to students via a link posted on Blackboard Learn. Built-in tracking tools were used to measure the number of views on a daily basis. At the end of the term, viewing activity, including the number of views and the duration of views for individual videos, was extracted from the Bits on the Run platform as “.csv” files and analyzed using Microsoft Excel software (Microsoft Corporation, Redmond, WA, USA). Ethics approval for the two components of the pedagogical tool, namely the learning objectives and the videos, was not required.
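The daily-view aggregation described above can also be reproduced outside a spreadsheet. The sketch below assumes a per-view CSV export with a `date` column in YYYY-MM-DD format; the actual Bits on the Run export schema may differ.

```python
# Aggregate per-view rows from a ".csv" export into daily view counts.
# The "date" column name is an assumption about the export schema.
import csv
from collections import Counter

def daily_views(csv_path: str) -> Counter:
    """Count views per day from a per-view CSV export."""
    counts: Counter = Counter()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["date"]] += 1
    return counts

def peak_days(counts: Counter, n: int = 2) -> list:
    """Return the n busiest days; in this study, the two peaks
    coincided with the examination dates."""
    return counts.most_common(n)
```

Running `peak_days(daily_views("views.csv"))` on the term’s export would surface the two examination-day peaks (1 October and 5 November 2013) reported in the Results.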

4.4. Survey Design and Implementation

The survey consisted of 10 questions (Figure S1) printed on a single page, and was distributed to all students at the end of the last lecture presented by faculty member A, who developed the learning objectives. All survey questions were multiple choice and did not require students to provide any written comments. Students were invited to answer all the questions directly on the survey, anonymously, and to return the questionnaire during the next lecture to faculty member B, who did not use the pedagogical tool. To maximize the response rate, students were informed that approximately 5 to 10 min would be required to answer all the questions, and that the outcomes would provide valuable information for improving all aspects of the tool, including video quality and content. Answers from completed surveys were compiled and analyzed by the authors. A request for ethics approval for the survey was submitted to the University of Ottawa Office of Research Ethics and Integrity, and the study was classified under the category “Research Based on Secondary Use of Data”.

Acknowledgments

This research project was financed by a grant to Odette Laneuville from the Centre for University Teaching at the University of Ottawa.

Author Contributions

Study design and data collection: Laneuville. Survey design: Laneuville and Sikora. Data analysis and interpretation: Laneuville and Sikora. Manuscript writing: Laneuville and Sikora.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Alberts, B.; Mayo, M. Video: Education and technology. Science 2009, 323, 53. [Google Scholar] [PubMed]
  2. The Coalition for Reform of Undergraduate STEM Education. Achieving Systemic Change: A Sourcebook for Advancing and Funding Undergraduate STEM Education; Association of American Colleges and Universities: Washington, DC, USA, 2014. [Google Scholar]
  3. American Association for the Advancement of Science. Vision and Change in Undergraduate Biology Education: A Call to Action. 2011. Available online: http://visionandchange.org/files/2011/03/Revised-Vision-and-Change-Final-Report.pdf (accessed on 27 February 2015).
  4. National Science Board. Science and Engineering Indicators 2014; National Science Foundation: Arlington, VA, USA, 2014. [Google Scholar]
  5. Committee on Integrated STEM Education. National Academy of Engineering and National Research Council of the National Academies. In STEM Integration in K–12 Education: Status, Prospects, and an Agenda for Research; National Academies Press: Washington, DC, USA, 2014. [Google Scholar]
  6. PULSE Community. Available online: http://pulsecommunity.org (accessed on 27 February 2015).
  7. Smith, M.K.; Jones, F.H.M.; Gilbert, S.L.; Wieman, C.E. The classroom observation protocol for undergraduate STEM (COPUS): A new instrument to characterize university STEM classroom practices. CBE Life Sci. Educ. 2013, 12, 618–627. [Google Scholar] [CrossRef] [PubMed]
  8. Hora, M.T.; Ferrare, J.J. Remeasuring postsecondary teaching: How singular categories of instruction obscure the multiple dimensions of classroom practice. J. Coll. Sci. Teach. 2014, 43, 36–41. [Google Scholar]
  9. Wieman, C.; Gilbert, S. The Teaching Practices Inventory: A New Tool for Characterizing College and University Teaching in Mathematics and Science. CBE Life Sci. Educ. 2014, 13, 552–569. [Google Scholar] [PubMed]
  10. Beetham, H.; Sharpe, R. (Eds.) Rethinking Pedagogy for a Digital Age: Designing for 21st Century Learning, 2nd ed.; Routledge: New York, NY, USA, 2013. [Google Scholar]
  11. Anderson, W.A.; Banerjee, U.; Drennan, C.L.; Elgin, C.S.; Epstein, I.R.; Handelsman, J.; Hatfull, G.F.; Losick, R.; O’Dowd, D.K.; Olivera, B.M.; et al. Changing the culture of science education at research universities. Science 2011, 331, 152–153. [Google Scholar] [CrossRef] [PubMed]
  12. Felder, R.M.; Brent, R. Active learning: An introduction. ASQ High. Educ. Brief 2009, 2, 1–5. [Google Scholar]
  13. Freeman, S.; Eddy, S.L.; McDonough, M.; Smith, M.K.; Okoroafor, N.; Jordt, H.; Wenderoth, M.P. Active learning increases student performance in science, engineering, and mathematics. Proc. Natl. Acad. Sci. USA 2014, 111, 8410–8415. [Google Scholar] [CrossRef]
  14. Andrews, T.M.; Leonard, M.J.; Colgrove, C.A.; Kalinowski, S.T. Active learning not associated with student learning in a random sample of college biology courses. CBE Life Sci. Educ. 2011, 10, 394–405. [Google Scholar] [CrossRef] [PubMed]
  15. Merchant, Z.; Goetz, E.T.; Cifuentes, L.; Keeney-Kennicutt, W.; Davis, T.J. Effectiveness of virtual reality-based instruction on students’ learning outcomes in K-12 and higher education: A meta-analysis. Comput. Educ. 2014, 70, 20–40. [Google Scholar] [CrossRef]
  16. Dabbagh, N.; Kitsantas, A. Personal Learning Environments, social media, and self-regulated learning: A natural formula for connecting formal and informal learning. Internet High Educ. 2012, 15, 3–8. [Google Scholar] [CrossRef]
  17. Akbulut, Y.; Cardak, C.S. Adaptive educational hypermedia accommodating learning styles: A content analysis of publications from 2000 to 2011. Comput. Educ. 2012, 58, 835–842. [Google Scholar] [CrossRef]
  18. Salomon, G.; Perkins, D.N.; Globerson, T. Partners in cognition: Extending human intelligence with intelligent technologies. Educ. Res. 1991, 20, 2–9. [Google Scholar] [CrossRef]
  19. Bonwell, C.C.; Eison, J.A. Active Learning: Creating Excitement in the Classroom; ASHE-ERIC Higher Education Report No. 1; George Washington University: Washington, DC, USA, 1991. [Google Scholar]
  20. Dupuis, J.; Coutu, J.; Laneuville, O. Application of linear mixed-effect models for the analysis of exam scores: Online video associated with higher scores for undergraduate students with lower grades. Comput. Educ. 2013, 66, 64–73. [Google Scholar] [CrossRef]
  21. Erickson, F. Qualitative Research Methods for Science Education. In Second International Handbook of Science Education; Fraser, B.J., Tobin, K., McRobbie, C.J., Eds.; Springer Media B.V.: Houten, The Netherlands, 2012; pp. 1451–1469. [Google Scholar]
  22. Gasiewski, J.A.; Eagan, M.K.; Garcia, G.A.; Hurtado, S.; Chang, M.J. From gatekeeping to engagement: A multicontextual, mixed method study of student academic engagement in introductory STEM courses. Res. High Educ. 2012, 53, 229–261. [Google Scholar] [CrossRef] [PubMed]
  23. Brophy, J.E. Motivating Students to Learn, 3rd ed.; Routledge: Abingdon, UK, 2010. [Google Scholar]
  24. Dubin, R.; Taveggia, T. The Teaching-Learning Paradox. A Comparative Analysis of College Teaching Methods; Center for the Advanced Study of Educational Administration, University of Oregon: Eugene, OR, USA, 1968. [Google Scholar]
  25. Cheon, J.; Lee, S.; Crooks, S.M.; Song, J. An investigation of mobile learning readiness in higher education based on the theory of planned behavior. Comput. Educ. 2012, 59, 1054–1064. [Google Scholar] [CrossRef]
  26. Bowen, W.G. Higher Education in the Digital Age; Princeton University Press: Princeton, NJ, USA, 2013. [Google Scholar]
  27. Kay, R.H. Exploring the use of video podcasts in education: A comprehensive review of the literature. Comput. Hum. Behav. 2012, 28, 820–831. [Google Scholar] [CrossRef]
  28. Adair, J.G. The Hawthorne effect: A reconsideration of the methodological artifact. J. Appl. Psychol. 1984, 69, 334–345. [Google Scholar] [CrossRef]

Share and Cite

MDPI and ACS Style

Laneuville, O.; Sikora, D. Quantitative Analysis of the Usage of a Pedagogical Tool Combining Questions Listed as Learning Objectives and Answers Provided as Online Videos. Future Internet 2015, 7, 140-151. https://doi.org/10.3390/fi7020140

