Embedded Information Problem-Solving Instruction to Foster Learning from Digital Sources: Longitudinal Effects on Task Performance
Abstract
1. Introduction
1.1. Information Problem Solving
1.2. Information Problem-Solving Instruction
- Learning tasks are understood as authentic real-life tasks and their solution requires the integration and coordination of skills, knowledge and attitudes.
- Both supportive information and guidance are needed to develop cognitive models and strategies in order to complete the learning task.
- Procedural information has to be carefully designed, providing step-by-step instruction that makes skills and procedures explicit.
- Part-task practice should be included to provide enough training for recurrent skills.
1.2.1. Whole-Task IPS Instruction
- Driving questions. Driving questions are open questions given in an initial phase. They guide students through the key phases of the whole task, help learners activate prior knowledge and point to relevant resources for solving tasks efficiently. IPS research shows that driving questions are a useful tool for supporting students through the phases and skills required to solve IPS tasks [35,36]. Along these lines, the authors of [23] found that driving questions had positive effects on regulating the IPS process in higher education. The authors of [36] also used driving questions to aid law students in the proper use of electronic resources and to guide them through their learning process.
- Prompting. The authors of [3] define prompts as a simple yet effective method of providing instructional feedback in an online environment. These researchers summarise the effectiveness of three prompt timings: anticipative prompts delivered before execution of the targeted skill, instructional prompts provided during skill performance and reflection prompts given after execution of the targeted skill. Similarly, the authors of [12] made effective use of computer-embedded prompts in secondary education to help students assess and select relevant information, as well as to reflect on and regulate the IPS process. In addition, the authors of [35] found that metacognitive prompts in collaborative settings provide effective guidance in information-searching skills. The authors of [37] also showed that prompting laypersons when consulting the internet was a positive tool for acquiring knowledge on health issues.
- Content representation tools. Content representation tools provide learners with the category elements of an underlying ontology [38] and can help students structure a task domain. Different studies have used content representation tools to guide students’ search and navigation across multiple information sources [39] and to organise ill-structured web information [40], and have employed concept maps [41] or data tables [42] to support the understanding and structuring of web information retrieved from multiple digital resources.
- Process worksheets. Activities such as breaking the task up into parts or steps and completing process worksheets that focus on the key processes for solving the whole task are used effectively in IPS instruction research [3,43], with two main benefits: first, completing process worksheets stimulates the active processing of the essential information needed to solve the IPS task; second, worksheets ensure that students follow a correct, systematic approach and generate schemas of correct solution strategies themselves [44]. Along these lines, the authors of [23] included process worksheets as scaffolds in IPS instruction in higher education and found that intervention students regulated the problem-solving process and judged the information they found more often.
- Writing and communication support. Most IPS tasks ask students to articulate a written response constructed from multiple digital resources. Writing from multiple documents entails complex cognitive processes and is a challenging activity in itself [12]. Despite its difficulty and importance, this aspect of IPS skill has been only marginally considered in most IPS studies [39]; supporting the productive process of writing is therefore necessary [35,37]. Our study tackles this issue by supporting students’ writing processes when solving an IPS task and by providing students with templates showing how to organise information in a leaflet, flyer, brochure, letter or argumentative essay.
1.2.2. Embedded Instruction
1.2.3. Long-Term Instruction
1.3. The Study
2. Materials and Methods
2.1. Participants
2.2. Study Design and Procedure
- Action 1. Initial evaluation of control and experimental students at the beginning of the research project, namely, Test 1.
- Action 2. Implementation of the IPS instruction: only the experimental group followed the three-year intervention.
- Action 3. Follow-up evaluation: at the end of every academic year, control and experimental students were tested, namely, Test 2, Test 3 and Test 4.
2.3. Materials and Characteristics of the IPS Instruction
- Long-term instruction. Students were exposed to a wide range of learning tasks and domain-specific instruction [60], and had ample opportunities to practise the different skills and subskills across an extensive curriculum over a period of three academic years, which facilitates the transfer of IPS skills [13].
- Embedded instruction. This principle aims to design authentic curricular tasks and to teach the current subjects more meaningfully, because the learning tasks must be fully integrated into the regular curriculum. Since domain knowledge is an important factor in IPS [61], and instruction can be more effective when it spans more subjects and settings, thereby facilitating the transfer of IPS skills [3,10], our long-term instruction was embedded in the contents of four curricular areas: science, technology, maths (STEM) and social science.
- Whole-task instruction. Students solved problems and challenges that covered the entire IPS process, using all the constituent skills and subskills from beginning to end and practising IPS as a whole process in which each skill was coordinated and integrated with the rest [10]. The instruction provided students with the five-step structure, in which each skill and its respective subskills functioned in an ordered and iterative way [62]. In addition, the learning tasks designed during the longitudinal project were optimised by means of technological support displayed on screen during the learning process. All these measures guaranteed students proper guidance and support in learning specific IPS skills and subskills [11,12,33], and the following five types of educational support were included: driving questions, prompting, content representation tools, process worksheets, and writing and communication support. Figure 3 illustrates the five types of educational support designed to promote IPS development.
2.4. Data Collection
2.5. Measurements
2.6. Data Analysis
- Descriptive statistics were computed for each dependent variable and each test. For quantitative variables, the following were shown: median, mean, minimum and maximum values. For qualitative variables, the following were shown: counts and percentages.
- In order to compare the results shown by the control and intervention groups, each test was subjected to a bivariate analysis. A non-parametric Wilcoxon test for comparing medians was used for quantitative dependent variables, and a chi-squared test for qualitative dependent variables.
- To analyse the longitudinal effect of the long-term IPS instruction on the different dependent variables, a general linear model with repeated measures, e.g., [65], was fitted for the quantitative dependent variables; the results are given in terms of least square mean (LSM) differences. For the qualitative nominal dependent variable, which had three categories (no answer, facts, explanation), three logistic regression models, e.g., [66], were fitted; for these models, the results are reported as odds ratios (OR) with their corresponding 95% confidence intervals.
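The two bivariate analyses described above can be sketched briefly in Python. The scores and counts below are hypothetical placeholders (not the study's raw data), and `scipy` is assumed to be available:

```python
# Sketch of the bivariate analyses: a Wilcoxon two-sample (rank-sum) test for
# quantitative scores and a chi-squared test for the nominal answer categories.
# All data below are hypothetical placeholders, not the study's data.
from scipy import stats

# Hypothetical information-gathering scores (0-7 points) for one test
control_scores = [2.0, 2.5, 1.0, 3.0, 2.0, 4.0, 1.5, 3.5]
experimental_scores = [3.0, 4.5, 5.0, 3.5, 6.0, 4.0, 2.5, 5.5]

# Non-parametric comparison of the two independent groups
w_stat, w_p = stats.ranksums(control_scores, experimental_scores)

# Hypothetical contingency table:
# rows = no answer / facts / explanation, columns = control / experimental
table = [[5, 3],
         [8, 12],
         [2, 9]]
chi2, chi_p, dof, expected = stats.chi2_contingency(table)

print(f"Wilcoxon: statistic = {w_stat:.3f}, p = {w_p:.4f}")
print(f"Chi-squared: chi2 = {chi2:.3f}, df = {dof}, p = {chi_p:.4f}")
```

With a 3 × 2 table, the chi-squared test has (3 − 1) × (2 − 1) = 2 degrees of freedom, matching the models with three nominal categories used in the longitudinal analyses.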
3. Results
3.1. Descriptive Results and Bivariate Analyses
3.2. Longitudinal Analyses of IPS Instruction Effects on Task Performance
4. Discussion and Conclusions
Limitations and Implications for Future Research
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- Brand-Gruwel, S.; Wopereis, I.; Vermetten, Y. Information problem solving by experts and novices: Analysis of a complex cognitive skill. Comput. Hum. Behav. 2005, 21, 487–508. [Google Scholar] [CrossRef]
- Brand-Gruwel, S.; Wopereis, I.; Walraven, A. A descriptive model of information problem solving while using internet. Comput. Educ. 2009, 53, 1207–1217. [Google Scholar] [CrossRef]
- Frerejean, J.; van Strien, J.L.H.; Kirschner, P.A.; Brand-Gruwel, S. Completion strategy or emphasis manipulation? Task support for teaching information problem solving. Comput. Hum. Behav. 2016, 62, 90–104. [Google Scholar] [CrossRef] [Green Version]
- Kirschner, P.A.; van Merriënboer, J.J. Do learners really know best? Urban legends in education. Educ. Psychol. 2013, 48, 169–183. [Google Scholar] [CrossRef]
- Van Deursen, A.J.A.M.; van Diepen, S. Information and strategic Internet skills of secondary students: A performance test. Comput. Educ. 2013, 63, 218–226. [Google Scholar] [CrossRef]
- Rosman, T.; Mayer, A.-K.; Krampen, G. A longitudinal study on information-seeking knowledge in psychology undergraduates: Exploring the role of information literacy instruction and working memory capacity. Comput. Educ. 2016, 96, 94–108. [Google Scholar] [CrossRef]
- Spisak, J. Secondary Student Information Literacy Self-efficacy vs. Performance. Ph.D. Dissertation, University of Richmond, Richmond, VA, USA, November 2018. [Google Scholar]
- Hinostroza, J.E.; Ibieta, A.; Labbé, C.; Soto, M.T. Browsing the internet to solve information problems: A study of students’ search actions and behaviours using a ‘think aloud’ protocol. Educ. Inf. Technol. 2018, 23, 1933–1953. [Google Scholar] [CrossRef]
- Dinet, J.; Chevalier, A.; Tricot, A. Information search activity: An overview. Eur. Rev. Appl. Psychol. 2012, 62, 49–62. [Google Scholar] [CrossRef]
- Frerejean, J.; Velthorst, G.J.; van Strien, J.L.; Kirschner, P.A.; Brand-Gruwel, S. Embedded instruction to learn information problem solving: Effects of a whole task approach. Comput. Hum. Behav. 2019, 90, 117–130. [Google Scholar] [CrossRef]
- Mason, L.; Junyent, A.A.; Tornatora, M.C. Epistemic evaluation and comprehension of web-source information on controversial science-related topics: Effects of a short-term instructional intervention. Comput. Educ. 2014, 76, 143–157. [Google Scholar] [CrossRef]
- Raes, A.; Schellens, T.; De Wever, B.; Vanderhoven, E. Scaffolding information problem solving in web-based collaborative inquiry learning. Comput. Educ. 2012, 59, 82–94. [Google Scholar] [CrossRef]
- Wopereis, I.; Brand-Gruwel, S.; Vermetten, Y. The effect of embedded instruction on solving information problems. Comput. Hum. Behav. 2008, 24, 738–752. [Google Scholar] [CrossRef]
- Van Merriënboer, J.J.G.; Kirschner, P.A. Ten Steps to Complex Learning: A Systematic Approach to Four-Component Instructional Design, 3rd ed.; Routledge: New York, NY, USA, 2008. [Google Scholar]
- Donnelly, A.; Leva, M.C.; Tobail, A.; Valantasis Kanellos, N. A Generic Integrated and Interactive Framework (GIIF) for Developing Information Literacy Skills in Higher Education. PG Diploma in Practitioner Research Projects, DIT. 2018. Available online: https://api.semanticscholar.org/CorpusID:69560549 (accessed on 8 August 2020).
- Walraven, A.; Brand-Gruwel, S.; Boshuizen, H.P. Information-problem solving: A review of problems students encounter and instructional solutions. Comput. Hum. Behav. 2008, 24, 623–648. [Google Scholar] [CrossRef]
- Winne, P. Enhancing self-regulated learning and information problem solving with ambient big data. In Contemporary Technologies in Education: Maximizing Student Engagement, Motivation, and Learning; Adesope, O.O., Rud, A.G., Eds.; Palgrave Macmillan: New York, NY, USA, 2019; pp. 145–162. [Google Scholar]
- Wallace, M.R.; Kupperman, J.; Krajcik, J.; Soloway, E. Science on the Web: Students online in a sixth-grade classroom. J. Learn. Sci. 2000, 9, 75–104. [Google Scholar] [CrossRef]
- Argelagós, E.; Pifarré, M. Unravelling Secondary Students’ Challenges in Digital Literacy: A Gender Perspective. J. Educ. Train. Stud. 2017, 5, 42–55. [Google Scholar] [CrossRef] [Green Version]
- Ladbrook, J.; Probert, E. Information skills and critical literacy: Where are our digikids at with online searching and are their teachers helping? Australas. J. Educ. Technol. 2011, 27, 105–121. [Google Scholar] [CrossRef]
- Van Deursen, A.J.A.M.; Görzig, A.; van Delzen, M.; Perik, H.T.M.; Stegeman, A.G. Primary school children’s internet skills: A report on performance tests of operational, formal, information, and strategic internet skills. Int. J. Commun. Syst. 2014, 8, 1343–1365. [Google Scholar]
- Argelagós, E.; Pifarré, M. Key Information-Problem Solving Skills to Learn in Secondary Education: A Qualitative, Multi-Case Study. Int. J. Educ. Learn. 2016, 5, 1–14. [Google Scholar] [CrossRef]
- Gerjets, P.; Kammerer, Y.; Werner, B. Measuring spontaneous and instructed evaluation processes during Web search: Integrating concurrent thinking-aloud protocols and eye-tracking data. Learn. Instr. 2011, 21, 220–231. [Google Scholar] [CrossRef]
- Walraven, A.; Brand-Gruwel, S.; Boshuizen, H.P. How students evaluate information and sources when searching the World Wide Web for information. Comput. Educ. 2009, 52, 234–246. [Google Scholar] [CrossRef] [Green Version]
- Brand-Gruwel, S.; Kammerer, Y.; van Meeuwen, L.; van Gog, T. Source evaluation of domain experts and novices during Web search. J. Comput. Assist. Learn. 2017, 33, 234–251. [Google Scholar]
- Brante, E.W.; Strømsø, H.I. Sourcing in text comprehension: A review of interventions targeting sourcing skills. Educ. Psychol. Rev. 2018, 30, 773–799. [Google Scholar] [CrossRef]
- Barzilai, S.; Zohar, A.R.; Mor-Hagani, S. Promoting integration of multiple texts: A review of instructional approaches and practices. Educ. Psychol. Rev. 2018, 30, 973–999. [Google Scholar] [CrossRef]
- Kroustallaki, D.; Kokkinaki, T.; Sideridis, G.D.; Simos, P.G. Exploring students’ affect and achievement goals in the context of an intervention to improve web searching skills. Comput. Hum. Behav. 2015, 49, 156–170. [Google Scholar] [CrossRef]
- Probert, E. Information literacy skills: Teacher understandings and practice. Comput. Educ. 2009, 53, 24–33. [Google Scholar] [CrossRef]
- Frerejean, J.; van Strien, J.L.; Kirschner, P.A.; Brand-Gruwel, S. Effects of a modelling example for teaching information problem solving skills. J. Comput. Assist. Learn. 2018, 34, 688–700. [Google Scholar]
- Walraven, A.; Brand-Gruwel, S.; Boshuizen, H.P. Fostering students’ evaluation behaviour while searching the internet. Instr. Sci. 2013, 41, 125–146. [Google Scholar]
- Badia, A.; Becerril, L. Collaborative solving of information problems and group learning outcomes in secondary education. Int. J. Educ. Dev. 2015, 38, 67–101. [Google Scholar]
- Walhout, J.; Brand-Gruwel, S.; Jarodzka, H.; van Dijk, M.; de Groot, R.; Kirschner, P.A. Learning and navigating in hypertext: Navigational support by hierarchical menu or tag cloud? Comput. Hum. Behav. 2015, 46, 218–227. [Google Scholar]
- Wedderhoff, O.; Chasiotis, A.; Mayer, A.K. Information Preferences when Facing a Health Threat-The Role of Subjective Versus Objective Health Information Literacy. In Proceedings of the Sixth European Conference on Information Literacy (ECIL), Oulu, Finland, 24–27 September 2018. [Google Scholar]
- Gagnière, L.; Betrancourt, M.; Détienne, F. When metacognitive prompts help information search in collaborative setting. Eur. Rev. Appl. Psychol. 2012, 62, 73–81. [Google Scholar] [CrossRef]
- Nadolski, R.J.; Kirschner, P.A.; van Merriënboer, J.J. Process support in learning tasks for acquiring complex cognitive skills in the domain of law. Learn. Instr. 2006, 16, 266–278. [Google Scholar] [CrossRef] [Green Version]
- Stadtler, M.; Bromme, R. Effects of the metacognitive computer-tool met.a.ware on the web search of laypersons. Comput. Hum. Behav. 2008, 24, 716–737. [Google Scholar] [CrossRef]
- Suthers, D.D.; Hundhausen, C.D. An experimental study of the effects of representational guidance on collaborative learning processes. J. Learn. Sci. 2003, 12, 183–218. [Google Scholar] [CrossRef] [Green Version]
- Lazonder, A.W.; Rouet, J.F. Information problem solving instruction: Some cognitive and metacognitive issues. Comput. Hum. Behav. 2008, 24, 753–765. [Google Scholar] [CrossRef]
- Salmerón, L.; García, A.; Vidal-Abarca, E. The development of adolescents’ comprehension-based Internet reading skills. Learn. Individ. Differ. 2018, 61, 31–39. [Google Scholar] [CrossRef]
- Alpert, S.R.; Grueneberg, K. Concept mapping with multimedia on the web. J. Educ. Multimed. Hypermedia 2000, 9, 313–330. [Google Scholar]
- Zentner, A.; Covit, R.; Guevarra, D. Exploring effective data visualization strategies in higher education. 2020. Available online: https://ssrn.com/abstract=3322856 (accessed on 8 August 2020).
- Hancock-Niemic, M.A.; Lin, L.; Atkinson, R.K.; Renkl, A.; Wittwer, J. Example-based learning: Exploring the use of matrices and problem variability. Educ. Technol. Res. Dev. 2016, 64, 115–136. [Google Scholar] [CrossRef]
- Van Merriënboer, J.J.G. Training Complex Cognitive Skills: A Four-Component Instructional Design Model for Technical Training; Educational Technology Publications: Englewood Cliffs, NJ, USA, 2008. [Google Scholar]
- Farrell, R.; Badke, W. Situating information literacy in the disciplines: A practical and systematic approach for academic librarians. Ref. Serv. Rev. 2015, 43, 319–340. [Google Scholar] [CrossRef]
- Tricot, A.; Sweller, J. Domain-specific knowledge and why teaching generic skills does not work. Educ. Psychol. Rev. 2014, 26, 265–283. [Google Scholar] [CrossRef]
- Perin, D. Facilitating student learning through contextualization: A review of evidence. Community Coll. Rev. 2011, 39, 268–295. [Google Scholar] [CrossRef]
- Spink, A.; Danby, S.; Mallan, K.; Butler, C. Exploring young children’s web searching and technoliteracy. J. Doc. 2010, 66, 191–206. [Google Scholar] [CrossRef] [Green Version]
- Wang, C.H.; Ke, Y.T.; Wu, J.T.; Hsu, W.H. Collaborative action research on technology integration for science learning. J. Sci. Educ. Tech. 2012, 21, 125–132. [Google Scholar] [CrossRef]
- Argelagós, E.; Pifarré, M. Improving information problem solving skills in secondary education through embedded instruction. Comput. Hum. Behav. 2012, 28, 515–526. [Google Scholar] [CrossRef] [Green Version]
- Squibb, S.D.; Mikkelsen, S. Assessing the value of course-embedded information literacy on student learning and achievement. Coll. Res. Libr. 2016, 77, 164–183. [Google Scholar] [CrossRef]
- Frerejean, J.; van Merriënboer, J.J.; Kirschner, P.A.; Roex, A.; Aertgeerts, B.; Marcellis, M. Designing instruction for complex learning: 4C/ID in higher education. Eur. J. Educ. 2019, 54, 513–524. [Google Scholar] [CrossRef] [Green Version]
- Koni, I. The perception of issues related to instructional planning among novice and experienced teachers. Ph.D. Dissertation, University of Tartu, Tartu, Estonia, 2017. [Google Scholar]
- Rohrer, D. Student instruction should be distributed over long time periods. Educ. Psychol. Rev. 2015, 27, 635–643. [Google Scholar] [CrossRef]
- Van Merrienboer, J.J.; Kester, L.; Paas, F. Teaching complex rather than simple tasks: Balancing intrinsic and germane load to enhance transfer of learning. Appl. Cogn. Psychol. 2006, 20, 343–352. [Google Scholar] [CrossRef]
- Tran, T.; Ho, M.-T.; Pham, T.-H.; Nguyen, M.-H.; Nguyen, K.-L.P.; Vuong, T.-T.; Nguyen, T.-H.T.; Nguyen, T.-D.; Nguyen, T.-L.; Khuc, Q.; et al. How Digital Natives Learn and Thrive in the Digital Age: Evidence from an Emerging Economy. Sustainability 2020, 12, 3819. [Google Scholar] [CrossRef]
- Salmerón, L.; Kammerer, Y.; García-Carrión, P. Searching the Web for conflicting topics: Page and user factors. Comput. Hum. Behav. 2014, 29, 2161–2171. [Google Scholar] [CrossRef]
- Crary, S. Secondary Teacher Perceptions and Openness to Change Regarding Instruction in Information Literacy Skills. Sch. Libr. Res. 2019, 22, 1–26. [Google Scholar]
- Stubeck, C.J. Enabling Inquiry Learning in Fixed-Schedule Libraries: An Evidence-Based Approach. Knowl. Quest 2015, 43, 28–34. [Google Scholar]
- Gerjets, P.; Hellenthal-Schorr, T. Competent information search in the World Wide Web: Development and evaluation of a web training for pupils. Comput. Hum. Behav. 2008, 24, 693–715. [Google Scholar] [CrossRef]
- Brand-Gruwel, S.; Stadtler, M. Solving information-based problems: Evaluating sources and information. Learn. Instr. 2011, 21, 175–179. [Google Scholar] [CrossRef]
- Garcia, C.; Badia, A. Information problem-solving skills in small virtual groups and learning outcomes. J. Comput. Assist. Learn. 2017, 33, 382–392. [Google Scholar] [CrossRef]
- Hakkarainen, K. Emergence of progressive-inquiry culture in computer-supported collaborative learning. Learn. Environ. Res. 2003, 6, 199–220. [Google Scholar] [CrossRef]
- Toma, J.D. Approaching rigor in applied qualitative research. In The Sage Handbook for Research in Education: Engaging Ideas and Enriching Inquiry; Conrad, C.F., Serlin, R.C., Eds.; Sage Publications, Inc.: Thousand Oaks, CA, USA, 2006; pp. 405–423. [Google Scholar]
- McCullagh, P.; Nelder, J.A. Generalized Linear Models; Chapman & Hall: London, UK, 1991. [Google Scholar]
- Hosmer, D.W.; Lemeshow, S. Applied Logistic Regression; John Wiley & Sons: New York, NY, USA, 2000. [Google Scholar]
- Scardamalia, M.; Bereiter, C. Knowledge building and knowledge creation: Theory, pedagogy, and technology. In The Cambridge Handbook of the Learning Sciences, 2nd ed.; Sawyer, R.K., Ed.; Cambridge University Press: New York, NY, USA, 2014; pp. 397–417. [Google Scholar]
- Hoffman, J.L.; Wu, H.K.; Krajcik, J.S.; Soloway, E. The nature of middle school learners’ science content understandings with the use of on-line resources. J. Res. Sci. Teach. 2003, 40, 323–346. [Google Scholar] [CrossRef] [Green Version]
- Henkel, M.; Grafmüller, S.; Gros, D. Comparing information literacy levels of Canadian and German university students. In International Conference on Information; Springer: Cham, Switzerland, 2018; pp. 464–475. [Google Scholar]
- Hämäläinen, E.K.; Kiili, C.; Marttunen, M.; Räikkönen, E.; González-Ibáñez, R.; Leppänen, P.H. Promoting sixth graders’ credibility Evaluation of Web pages: An intervention study. Comput. Hum. Behav. 2020, 110, 106372. [Google Scholar]
- Jarodzka, H.; Balslev, T.; Holmqvist, K.; Nyström, M.; Scheiter, K.; Gerjets, P.; Eika, B. Conveying clinical reasoning based on visual observation via eye-movement modelling examples. Instr. Sci. 2012, 40, 813–827. [Google Scholar] [CrossRef] [Green Version]
- Salmerón, L.; Delgado, P.; Mason, L. Using eye-movement modeling examples to improve critical reading of multiple webpages on a conflicting topic. J. Comput. Assist. Learn. 2020, 1, 1–14. [Google Scholar]
- Argelagós, E.; Brand-Gruwel, S.; Jarodzka, H.M.; Pifarré, M. Unpacking cognitive skills engaged in web-search: How can log files, eye movements, and cued-retrospective reports help? An in-depth qualitative case study. Int. J. Innov. Learn. 2018, 24, 152–175. [Google Scholar] [CrossRef]
- Kammerer, Y.; Brand-Gruwel, S.; Jarodzka, H. The future of learning by searching the Web: Mobile, social, and multimodal. Frontline Learn. Res. 2018, 6, 81–91. [Google Scholar] [CrossRef]
| Dependent Variable | Test | Control Mean | Control Median | Control Min. | Control Max. | Exp. Mean | Exp. Median | Exp. Min. | Exp. Max. | Wilcoxon Two-Sample Test (p Value) |
|---|---|---|---|---|---|---|---|---|---|---|
| 1. Fact-finding task performance (total value: 14 points) | 1 | 11 | 11 | 7 | 14 | 10.3 | 11 | 5 | 14 | n.s. |
| | 2 | 9 | 10 | 1 | 12 | 10.5 | 11 | 2 | 13 | 400.5 (0.0028) |
| | 3 | 13 | 13 | 10 | 14 | 12.9 | 13 | 8 | 14 | n.s. |
| | 4 | 11.4 | 12 | 8 | 14 | 12.2 | 12.5 | 8 | 14 | n.s. |
| 2. Information-gathering task performance (total value: 7 points) | 1 | 1.5 | 1 | 0 | 3 | 1.7 | 1.7 | 0 | 5 | n.s. |
| | 2 | 2.4 | 2 | 0 | 5.5 | 3.7 | 3.7 | 0 | 7 | 418 (0.0074) |
| | 3 | 2.5 | 2.5 | 0 | 5 | 3.6 | 3 | 0 | 7 | n.s. |
| | 4 | 3.3 | 3 | 0 | 7 | 4.5 | 5 | 1 | 7 | 421 (0.0085) |
| Test | Categories | Control Count | Control % | Exp. Count | Exp. % | Chi-Square Test (p Value) |
|---|---|---|---|---|---|---|
| 1 | No answer | 14 | 73.68 | 20 | 47.62 | 3.603 (0.0577) |
| | Facts | 5 | 26.32 | 22 | 52.38 | |
| | Explanation | 0 | 0 | 0 | 0 | |
| 2 | No answer | 6 | 31.58 | 10 | 23.81 | 6.531 (0.0382) |
| | Facts | 13 | 68.42 | 24 | 57.14 | |
| | Explanation | 0 | 0 | 8 | 19.05 | |
| 3 | No answer | 2 | 10.53 | 6 | 14.29 | n.s. |
| | Facts | 13 | 68.42 | 25 | 59.52 | |
| | Explanation | 4 | 21.05 | 11 | 26.19 | |
| 4 | No answer | 8 | 42.11 | 6 | 14.29 | 10.059 (0.0065) |
| | Facts | 10 | 52.63 | 19 | 45.24 | |
| | Explanation | 1 | 5.26 | 17 | 40.48 | |
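As a consistency check, the Test 4 chi-squared statistic can be reproduced from the counts reported above, assuming a standard Pearson chi-squared test on the 3 × 2 contingency table (`scipy` is used here for illustration):

```python
# Reproduce the Test 4 chi-squared statistic from the reported counts:
# rows = no answer / facts / explanation, columns = control / experimental.
from scipy.stats import chi2_contingency

test4 = [[8, 6],    # No answer
         [10, 19],  # Facts
         [1, 17]]   # Explanation

chi2, p, dof, expected = chi2_contingency(test4)
print(f"chi2 = {chi2:.3f}, df = {dof}, p = {p:.4f}")
# chi2 = 10.059, df = 2, p = 0.0065 -- matching the reported value
```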
| Quantitative Dependent Variable | Comparison between Tests | Control LSM Diff. | Control 95% CI Lower | Control 95% CI Upper | Control p Value | Exp. LSM Diff. | Exp. 95% CI Lower | Exp. 95% CI Upper | Exp. p Value |
|---|---|---|---|---|---|---|---|---|---|
| 1. Fact-finding task performance (total value: 14 points) | T1 vs. T2 | 1.7406 | 0.2176 | 3.2635 | 0.0195 | −0.3098 | −1.2670 | 0.6474 | 0.8320 |
| | T1 vs. T3 | −1.9694 | −3.4282 | −0.5106 | 0.0043 | −2.8788 | −3.6522 | −2.1054 | <0.0001 |
| | T1 vs. T4 | −0.8748 | −2.3740 | 0.6244 | 0.4115 | −2.3111 | −3.1602 | −1.4619 | <0.0001 |
| 2. Information-gathering task performance (total value: 7 points) | T1 vs. T2 | −1.1292 | −2.1397 | −0.1187 | 0.0232 | −1.6525 | −2.4248 | −0.8802 | <0.0001 |
| | T1 vs. T3 | −1.0808 | −1.9732 | −0.1883 | 0.0121 | −1.3526 | −1.9944 | −0.7107 | <0.0001 |
| | T1 vs. T4 | −1.4960 | −2.4916 | −0.5003 | 0.0013 | −2.3852 | −3.0685 | −1.7018 | <0.0001 |
| Qualitative Nominal Dependent Variable (Control vs. Experimental) | OR Estimate | 95% CI OR Lower | 95% CI OR Upper | p Value |
|---|---|---|---|---|
| Model 1. No answer vs. Answer | 1.96 | 1.09 | 3.51 | 0.0250 |
| Model 2. No answer vs. Explanation | 5.14 | 1.75 | 15.07 | 0.0035 |
| Model 3. Facts vs. Explanation | 3.39 | 1.14 | 10.08 | 0.0286 |
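For readers less familiar with logistic regression output, an odds ratio and its 95% confidence interval follow directly from the model coefficient, assuming a Wald-type interval. The coefficient and standard error below are illustrative values back-calculated from Model 2's reported OR; they are not estimates taken from the paper:

```python
# Sketch: odds ratio and 95% CI from a logistic-regression coefficient
# (Wald-type interval assumed). beta and se are illustrative values
# back-calculated from Model 2's reported OR of 5.14, not paper estimates.
import math

beta = math.log(5.14)   # log-odds coefficient implied by OR = 5.14
se = 0.549              # implied standard error (illustrative)

odds_ratio = math.exp(beta)
ci_lower = math.exp(beta - 1.96 * se)
ci_upper = math.exp(beta + 1.96 * se)

print(f"OR = {odds_ratio:.2f}, 95% CI [{ci_lower:.2f}, {ci_upper:.2f}]")
```

With these values the computation recovers an OR of 5.14 and a CI close to the reported [1.75, 15.07], which is why an interval whose bounds straddle 1 corresponds to a non-significant effect.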
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Pifarré, M.; Argelagós, E. Embedded Information Problem-Solving Instruction to Foster Learning from Digital Sources: Longitudinal Effects on Task Performance. Sustainability 2020, 12, 7919. https://doi.org/10.3390/su12197919