Article

Embedded Information Problem-Solving Instruction to Foster Learning from Digital Sources: Longitudinal Effects on Task Performance

by Manoli Pifarré 1,* and Esther Argelagós 2

1 Department of Psychology, Universitat de Lleida, 25001 Lleida, Spain
2 Department of Psychology, Universitat de Girona, 17004 Girona, Spain
* Author to whom correspondence should be addressed.
Sustainability 2020, 12(19), 7919; https://doi.org/10.3390/su12197919
Submission received: 25 August 2020 / Revised: 9 September 2020 / Accepted: 21 September 2020 / Published: 24 September 2020

Abstract:
This paper reports a longitudinal study of how long-term, embedded, whole-task instruction can help students develop more efficient information problem-solving (IPS) skills, leading to better use of internet information for learning and to more effective resolution of digital tasks. To this end, we designed, implemented and evaluated a three-year instruction programme to promote students’ development of key IPS skills in real-life classroom settings. The research involved sixty-one secondary education students. Forty-two of them received the IPS instruction; their results were analysed longitudinally and compared with those of a control group that followed the regular courses. The results showed that students who received the IPS instruction improved their performance significantly in tasks that required IPS skills, and that these students organised and presented the information found on the internet critically and supported it with personal arguments. The findings also revealed that, throughout the three-year project, IPS task performance scores were statistically higher for the instructed students than for the control group students. Our study thus provides insight into how secondary students develop IPS skills with long-term instructional support and outlines a series of educational implications.

1. Introduction

The Internet Age has prompted a paradigm shift in education. Nowadays, most of our everyday learning involves drawing knowledge from a wide variety of electronic resources. Learners at all levels are required to search for, collect and understand information from external digital sources and to construct a solution to a task. This shift has never been more noticeable than amidst the current coronavirus pandemic. In this context, it is important to remember that educational research has identified information problem solving (henceforth IPS) as a complex process that requires the unfolding of higher-order cognitive skills, e.g., [1,2,3].
Although younger generations of students appear to master the skills needed to navigate online digital resources, educational research confirms that, without explicit instruction, students underuse or even lack the IPS skills needed to find correct and reliable online resources and to construct knowledge from them [4,5,6,7]. Therefore, educational research sees the need to provide students with adequate IPS skills to learn from online and digital resources. Furthermore, [8] claim that IPS skills instruction is crucial to promote quality, equality and sustainable education, because students’ performance in digital skills has been found to be initially associated with their socio-economic background, academic achievement and place of residence.
Various theoretical models have been proposed to characterise the phases and cognitive processes involved in IPS that are needed to transform retrieved web information into knowledge [9]. However, these models describe the stages and cognitive competences involved in the process but fail to show which specific student activities occur at each stage and how best to support them. As a consequence, educational institutions and teachers find it difficult to teach the key IPS skills that could help students take full advantage of the opportunities the internet provides for learning and building knowledge autonomously from online digital resources, and to find a suitable place and time for this instruction in the curriculum [3,5].
In recent years, research has been carried out to analyse the effectiveness of teaching IPS using the internet, e.g., [10,11,12,13]. However, further research is still needed to tailor the existing IPS models to specific groups of students and in specific learning contexts [3,9] and, by so doing, promote quality and sustainable education for all students.
This paper takes a first step towards supporting teachers in embedding IPS skills in educational curricula with the description of the design and empirical testing of instruction for IPS skills. Inspired by the Four-Component Instructional Design (4C/ID) model [14] for teaching complex skills, we have designed a long-term, embedded, whole-task IPS instruction programme to foster learning and meaning-making from digital sources and investigated its effects on students’ task performance.

1.1. Information Problem Solving

Information problem solving (IPS) is a complex cognitive process considered as an important 21st century skill in combination with critical thinking [15]. The authors of [1,2,3] have defined a five-step approach to solving information problems based on a decomposition of the IPS process into constituent skills and subskills. This approach highlights the fact that during the implementation of all skills, it is essential to activate regulation activities, such as orientation, monitoring, steering and evaluating [16,17].
Figure 1 shows this IPS model. Essentially, it represents that, when students are confronted with an information problem or challenge, they first have to define the problem, activate previous knowledge that helps them detect what information is needed, and formulate a clear question that guides the information searching process and the selection of search terms. Research shows that teenagers have problems when formulating questions [18], clarifying task requirements, activating prior knowledge and determining what information needs searching [16]. In addition, very few students start searching after prior reflection on the task at hand and a clear outline of their search [10,19,20].
The searching skill then starts with the definition of relevant search terms to be used in the search engine. Many students struggle with this second stage and tend to enter long sentences, which may reduce their success in finding relevant and reliable sources [18,21]. Therefore, the ability to formulate search terms and fine-tune them strongly influences whether a problem is solved in a satisfactory manner [22,23].
The third skill requires students to evaluate the relevance and reliability of information. However, students do not systematically judge the search results on the search engine results page (SERP) and often find it difficult to choose between reputable and questionable sources [16,20,21,24]. Sometimes the selected source may even be a commercial one [11]. As these activities are often challenging, novice students tend to pay little attention to the reliability of the sources and the information found [24,25].
The fourth skill then comes into play: analysing, selecting and processing the information found. Here, useful information is compared, contrasted and elaborated on in the light of students’ previous knowledge and of information from other resources. This skill is often claimed to require a high level of effort and motivation [26] and is essential for accomplishing an IPS task successfully [27]. In fact, some studies point out that learners can be instructed to generate more relevant search terms, enhance their evaluation habits and select better information and sources (e.g., [28]).
Finally, the fifth skill consists of organising and integrating the constructed information in a personal manner, using organisation and communication tools to answer the question or challenge posed at the beginning of the process. Regarding this skill, it is reported that students mainly use ineffective strategies such as copying and pasting information from the web [29] and tend to look for a simple answer on a single web page rather than reading web information in a critical and thorough way [12].
To sum up, since solving an information problem from online sources is a complex cognitive process [12,16] in which secondary students face many challenges, it is essential that they receive guidance and supervision through a well-designed educational intervention.
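For readers who find a concrete representation helpful, the minimal sketch below encodes the five constituent skills and the cross-cutting regulation activities described above as simple Python structures; the class and variable names are illustrative conveniences and are not part of the original model.

```python
from enum import Enum, auto

class IPSSkill(Enum):
    """The five constituent IPS skills described above (after [1,2,3])."""
    DEFINE_PROBLEM = auto()       # clarify the task, activate prior knowledge, formulate the question
    SEARCH_INFORMATION = auto()   # choose and refine search terms, query the search engine
    SELECT_SOURCES = auto()       # judge relevance and reliability of the results on the SERP
    PROCESS_INFORMATION = auto()  # analyse, compare and elaborate on the information found
    ORGANISE_AND_PRESENT = auto() # integrate the information and communicate a personal answer

# Regulation activities that accompany every skill [16,17].
REGULATION_ACTIVITIES = ("orientation", "monitoring", "steering", "evaluating")

if __name__ == "__main__":
    for step, skill in enumerate(IPSSkill, start=1):
        print(f"Step {step}: {skill.name.lower()} (regulated by: {', '.join(REGULATION_ACTIVITIES)})")
```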

1.2. Information Problem-Solving Instruction

It is often claimed that IPS skills are underdeveloped or absent without explicit instruction, even among “digital natives” [1,3,4,5,6,16]. However, educational research shows that students can be instructed to better define the problem and the information needed, generate more relevant search queries, adopt more evaluation criteria, select higher quality resources and deeply process and present information to answer an informational problem [10,25].
Over recent decades, much effort has been made to investigate efficient instructional approaches for IPS and to incorporate effective support for guiding students’ activity in searching, retrieving, evaluating and integrating information from multiple web sources (e.g., [3,4,10,11,12,13,16,30]). However, these efforts have so far proved insufficient, and further research is still needed to shed light on how formal IPS skills training could be designed to have a positive impact on students’ learning.
Our study is built on the basis of the four-component instructional design (4C/ID, for short) model [14] to design, implement and empirically test innovative IPS instruction in secondary education. The 4C/ID model advocates the design of four components:
  • Learning tasks are understood as authentic real-life tasks and their solution requires the integration and coordination of skills, knowledge and attitudes.
  • Both supportive information and guidance are needed to develop cognitive models and strategies in order to complete the learning task.
  • Procedural information has to be carefully designed by providing step-by-step instruction and explicit skills and procedures.
  • Part-task practice should be included to provide enough training for recurrent skills.
In the arena of IPS instruction, these four components have been translated according to the following principles: whole-task, embedded and long-term instruction.

1.2.1. Whole-Task IPS Instruction

Whole-task instruction proposes the resolution of ill-structured, authentic and complex real-life situations in which students have to perform all the steps of the IPS process, from beginning to end, and can find different ways to solve the task. Whole-task instruction has proved more effective for teaching complex IPS skills than fragmented part-task instruction [10,14]. It offers the possibility of supporting all the IPS skills and practicing them as a whole process in which one skill relates to and impacts on the others. By contrast, instructional approaches that focus only on practicing specific searching or evaluating skills, e.g., [23], offer students very few occasions to coordinate and integrate all five IPS skills [14] or to transfer them [31].
Regarding the support needed, research provides evidence that whole-task support can improve students’ IPS skills in demanding learning situations that are otherwise difficult to complete successfully [11,12,32,33,34]. The main approaches for giving support in IPS instruction, and the outcomes obtained, are the following five: driving questions, prompting, content representation tools, process worksheets, and writing and communication support.
  • Driving questions. Driving questions are open questions given in an initial phase. They guide students to accomplish the key phases of the whole task, help learners to activate prior knowledge and offer relevant resources to solve tasks efficiently. IPS research shows that driving questions turn out to be a useful supporting tool for assisting students through the phases and skills required to solve IPS tasks [35,36]. In this line, [23] found that driving questions had positive effects in regulating the IPS process in higher education. The authors of [36] also used driving questions to aid law students in the proper use of electronic resources and guide them through their learning process.
  • Prompting. The authors of [3] define prompts as a simple yet effective method to provide instructional feedback in an online environment. These researchers summarise the effectiveness of three timings of prompts: anticipative prompts delivered before execution of the targeted skill, instructional prompts provided during skill performance and reflection prompts given after execution of the targeted skill. Similarly, [12] used computer-embedded prompts efficiently in secondary education to help students assess and select relevant information, but also to reflect on and regulate the IPS process. In addition, [35] found that the use of metacognitive prompts in collaborative settings efficiently provides guidance in information-searching skills. The authors of [37] also highlighted that prompting laypersons when consulting the internet proved to be a positive tool for acquiring knowledge on health issues.
  • Content representation tools. Content representation tools provide learners with the category elements of an underlying ontology [38] and can help students structure a task domain. Different studies have used content representation tools to guide students’ search and navigation across multiple information sources [39] and to organise ill-structured web information [40], and have employed concept maps [41] or data tables [42] to assist in understanding and structuring web information retrieved from multiple digital resources.
  • Process worksheets. Activities such as breaking up the task into parts or steps and completing process worksheets that focus on key processes to solve the whole tasks are effectively used in IPS instruction research [3,43] with two main benefits: first, completing process worksheets stimulates the active processing of essential information to solve the IPS task. Second, they ensure that students will follow a correct systematic approach and generate schemas of correct solution strategies themselves [44]. In this line, [23] included process worksheets as scaffolds in IPS instruction in higher education and discovered that intervention students regulated the problem-solving process and judged the information found more often.
  • Writing and communication support. Most IPS tasks ask students to articulate a written response constructed from multiple digital resources. Writing from multiple documents entails complex cognitive processes and is a challenging activity in itself [12]. Despite its difficulty and importance, this aspect of IPS has been only marginally considered in most IPS studies [39]. Therefore, supporting the productive process of writing is necessary [35,37]. Our study tackles this issue by supporting students’ writing processes when solving an IPS task and by providing students with templates for organising information in a leaflet, flyer, brochure, letter or argumentative essay.

1.2.2. Embedded Instruction

Embedding IPS training within a meaningful context with domain-specific instruction has proved more effective than standalone courses [37,45,46]. Embedding instruction has the potential to increase engagement, motivation, transfer and deep learning [47]. Previous studies investigating embedded instruction have shown good results in primary education [48,49], secondary education [12,31,50] and higher education [13,51].
A literature review offers theoretical and empirical evidence on the effectiveness of whole-task and embedded IPS instruction. However, studies combining these two key instructional approaches are still scarce. For instance, [10] investigated an embedded IPS course designed according to a whole-task approach and instructed ten student teachers in a quasi-experimental intervention study, finding positive results in the development of IPS skills and task performance. In another study, [23] successfully applied embedded IPS instruction with psychology students; the students obtained good learning outcomes and increased the frequency of use of some of their IPS constituent skills and regulation activities. More recently, [52] investigated student teachers’ IPS skills through embedded whole-task instruction in a 20-week course and reported that the instruction succeeded in developing cognitive strategies to tackle an information problem.

1.2.3. Long-Term Instruction

Long-term instruction has been defined as instruction that lasts more than a quarter of the academic year [53], or even as an instructional course that takes place over two or three weeks [54]. In the specific field of IPS, long-term instruction has been related to a curriculum-wide approach [6,30,52]. Most IPS intervention studies apply short-term instruction, and these studies report that some of the improvements in IPS skills reached by the participants disappeared after completing the course [10]. In this vein, researchers have claimed that “a scaled-up version with more content, more task classes containing tasks of increasing complexity, offered over a longer period of time and embedded in a multitude of contexts, might prove very effective” [3] (p. 101). This claim is also shared by other studies, in which it is assumed that the whole-task approach to complex learning requires more learning tasks over longer periods than other kinds of instruction, but that such practice will lead to better transfer to new settings when designed and conducted adequately [10,55].
In summary, despite the existence of studies confirming that embedded whole-task IPS instruction improves students’ IPS skills, there is still a need to know to what extent the length of the IPS instruction period has a positive impact on students’ learning and performance [22,32,56]. Furthermore, while most educational institutions acknowledge that IPS is an essential academic skill in this digital and knowledge era, they struggle with its implementation, particularly in finding a suitable place and sufficient time in the curriculum for IPS integration [3,39]. IPS skills require domain-specific knowledge and, in order to guarantee their transfer to daily activities, long-term, embedded and supported IPS practice throughout the whole curriculum is needed [13,57].
Notwithstanding this necessity, IPS instruction is often implemented as a separate course only loosely connected to the curricular contents (e.g., [11]), and secondary education students still face difficulties in their daily school activities [58,59]. It is therefore desirable to further investigate how to embed IPS research and instruction in real secondary classrooms and in the learning of curricular contents, in order to provide best practices, approaches and conclusive results for quality education for all students. This paper tackles this objective and provides answers to this educational challenge by discussing the design, development and empirical testing of a long-term, embedded, whole-task IPS instruction programme in secondary education. Specifically, our research investigates the longitudinal effects of a three-year IPS instruction programme on students’ task performance when solving complex digital problems.

1.3. The Study

The present study is grounded in research by [50], who started to investigate the effects of long-term, embedded, whole-task instruction on the development of IPS skills in secondary education. Our study follows up on this research and takes a longitudinal approach that aims to answer the following question: what are the effects of long-term, embedded, whole-task IPS instruction on students’ task performance? While research shows discrete short-term learning effects, it is unclear whether there may be higher potential in a long-term situation [3,10]; our study aims to contribute new data. With this purpose, a three-year embedded IPS skills intervention programme was designed, during which students solved whole-task projects related to daily life challenges as well as to science, technology and mathematics (STEM) and social science curricular contents. In our quasi-experimental design, the digital task performance of students following the regular curriculum (i.e., control group) was compared to that of students following the three-year IPS instruction (i.e., experimental group).
As this long-term training makes use of whole tasks that address and support all constituent IPS skills, our expectations are that those students who follow IPS instruction will display deeper meaning-making from digital sources and better task performance than their counterparts who follow the regular curriculum. With a view to obtaining a more detailed description on the effects of IPS instruction on task performance results, the four evaluation tests carried out to assess task performance over the three-year project included three different tasks of varying difficulty, namely: (a) fact-finding task, (b) information-gathering task and (c) final essay. Our research aims to confirm or reject the following four hypotheses:
Hypothesis 1 (H1).
Students following IPS instruction will carry out a fact-finding task better, as measured by the number of correct answers presented within.
Hypothesis 2 (H2).
Students following IPS instruction will solve an information-gathering task better, as measured by the number of correct argumentative concepts presented within.
Hypothesis 3 (H3).
Students following IPS instruction will write a better final essay, as measured by the level of explanation of the ideas written up.
Hypothesis 4 (H4).
There will be longitudinal differences between the two groups—control and experimental groups—on IPS task performance throughout the three-year project. These differences will be more noticeable in more complex tasks that involve information gathering as well as in the final essay.

2. Materials and Methods

2.1. Participants

The participants in our study were involved in a larger research project aimed at promoting digital literacy among secondary education students in real-life classroom settings. For this reason, our sample comprised only the sixty-one students (32 girls and 29 boys) who remained fully committed during our ambitious three-year IPS instruction and completed the four tests of the longitudinal research. Of these, 42 belonged to the experimental group and followed the long-term IPS instruction, while the remaining 19 formed the control group and followed regular classes. At the beginning of the project, the students were aged 12 or 13; by the end, they were 15 or 16. They attended three urban schools in the city of Lleida (Spain). In order to preserve the natural classroom environment, and for ethical reasons (to ensure that all students of the same school could benefit from our long-term IPS instruction), one school was established as the control group, while the other two schools acted as the intervention group.
The control group students did not follow the IPS instruction; this group was used to study the natural development of IPS skills in students who live in a digital society and use digital information in their daily life. The control group students were therefore free to use internet information to solve school assignments, depending on the teacher’s learning objectives. However, the teachers did not provide guided internet use, nor did the students participate in any specific instruction, course or workshop related to IPS skills or internet navigation.
The research complied with the ethical code: consent was obtained from the school authorities, from the teachers and from the parents for the participation of their children in the study. The research team guaranteed confidentiality and data protection to all participants.

2.2. Study Design and Procedure

This is a longitudinal study with a quasi-experimental design, including an experimental group, i.e., the group of interest, and a control group, used to establish the natural development of students who did not receive explicit IPS instruction. All the participants completed four tests carried out at four different moments during the three-year project.
The longitudinal study design process consisted of three main actions, as seen in Figure 2:
  • Action 1. Initial evaluation of control and experimental students at the beginning of the research project, namely, Test 1.
  • Action 2. Implementation of the IPS instruction: only the experimental group followed the three-year intervention.
  • Action 3. Follow-up evaluation: at the end of every academic year, control and experimental students were tested, namely, Test 2, Test 3 and Test 4.
To carry out the three-year project, we also counted on the collaboration of eighteen secondary teachers of four school disciplines (namely, science, technology and maths (STEM), and social sciences), who worked hand in hand with our research group in designing the IPS tasks, the supporting tools provided for each task to promote the development of IPS skills, and the embedding of the digital tasks in the school curriculum. During this collaboration, our research group ensured that the teachers became aware of the importance of IPS skills for solving information problems and of promoting these skills in a real-life classroom setting.

2.3. Materials and Characteristics of the IPS Instruction

The long-term, embedded, whole-task IPS instruction consisted of the resolution of 24 web-based learning tasks. Each task was an authentic, ill-structured whole task carried out over four sessions of 60 min each. Therefore, students received approximately 96 hours of sustained IPS instruction over a period of three years.
All the IPS learning tasks were designed following the three key instructional and methodological principles grounded on the IPS literature review presented previously: long-term, embedded and whole task.
  • Long-term instruction. Students were exposed to a wide range of learning tasks and domain-specific instruction [60] and had plenty of opportunities to practice the different skills and subskills in an extensive curriculum over a period of three academic years, which facilitates the transfer of the IPS skills [13].
  • Embedded instruction. This principle aims to design authentic curricular tasks and teach the current subjects more meaningfully, because the learning tasks must be fully integrated in the regular curriculum. Since the role of the domain knowledge is an important factor that can be analysed in IPS [61] and instruction could be more effective by tackling more subjects and settings to facilitate the transfer of IPS skills [3,10], our long-term instruction was embedded in the contents of four curricular areas: science, technology, maths (STEM) and social science.
  • Whole-task instruction. Students solved problems and challenges in which they covered the entire IPS process and had to use all the constituent skills and subskills from beginning to end, including the practice of IPS as a whole process in which one skill was coordinated and integrated with the rest of the skills [10]. The instruction provided students with the five-step structure, whereby each skill and its respective subskills functioned in an ordered and iterative way [62]. In addition, the learning tasks designed during the longitudinal project were optimised by means of technological support displayed on screens during the learning process. All these measures guaranteed students proper guidance and support to learn specific IPS skills and subskills [11,12,33] and the following five types of educative support were included: driving questions, prompting, content representation tools, process worksheets and writing and communicating support. Figure 3 shows and illustrates the five types of educational support designed to promote the IPS development.

2.4. Data Collection

The instrument used to collect the data for this study was an authentic, web-based whole task that required the unfolding of learning skills. We designed two versions of the web-based task; both were about astronomy and were similar in terms of style, complexity and structure. Web-based task 1 was used as the basis for Test 1 and Test 3; its content was about the planet Mars. Web-based task 2 was used as the basis for Test 2 and Test 4; its content was about the Moon. All the participants solved the web-based tasks individually, in a real classroom context, within 50 min, and their answers were stored on a web server.

2.5. Measurements

In order to obtain a more detailed description of the effects of the IPS instruction on solving learning tasks of different levels of complexity, the web-based task was divided into three inter-related tasks of varying difficulty, namely: (a) fact-finding task; (b) information-gathering task and (c) final essay. These resulted in different variable types (quantitative for the first two and qualitative nominal for the third), each with its own grading scheme.
Dependent variable 1: Fact-finding task performance. This was a carefully structured task in which the 61 participants were asked to complete a conceptual map by searching for factual information about a planet (e.g., physical characteristics, orography). The task focused on searching for and locating relatively simple pieces of information usually found on a single website [59]. Students received 1 point for each question answered correctly and 0 points for incorrect answers. This was a quantitative variable with a total score of 14 points.
Dependent variable 2: Information-gathering task performance. This was an ill-structured task in which the 61 participants had to answer seven questions argumentatively. Specifically, this task encouraged students to gather and integrate information from different web sources to find an answer. Information-gathering tasks are more difficult to solve because collecting and integrating information from different sources requires remembering pieces of information while searching. Moreover, this type of task involves complex cognitive processes and IPS skills to generate meaning out of complex electronic documents [33]. Students received 1 point for each question answered correctly and 0 points for incorrect answers. This variable was quantitative, with a total score of 7 points.
Dependent variable 3: Final essay task performance. This was a concluding, summing-up task in which each student was asked to write a short argumentative essay (of approximately 300 words) that integrated web information comprehensively to build a personal argument. To be specific, students were asked to hypothesise whether it would be possible to establish a human colony on Mars or the Moon and, if so, what problems humans would encounter. This variable was qualitative nominal; a rubric scale was constructed to capture the level of explanation of the content ideas presented in the essay. The rubric started with the category “no answer”, which meant that students had failed to write the essay, followed by two categories related to describing facts: (1) “separate pieces of facts” and (2) “organised facts”; the rubric also established two categories related to explaining: (3) “partial explanation” and (4) “explanation” [19,63].
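As an illustration only, the rubric can be written as an ordered coding scheme; the numeric codes below are hypothetical conveniences for analysis, while the category labels are those listed above.

```python
# Illustrative coding of the final-essay rubric; the numeric codes are arbitrary.
ESSAY_RUBRIC = {
    0: "no answer",
    1: "separate pieces of facts",
    2: "organised facts",
    3: "partial explanation",
    4: "explanation",
}

LABEL_TO_CODE = {label: code for code, label in ESSAY_RUBRIC.items()}

print(LABEL_TO_CODE["organised facts"])  # -> 2
```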
The study considered reliability and validity issues. For variables 1 and 2, two raters, familiar with both the IPS tasks and the materials, coded 15% of all the answer protocols. Interrater reliability computed on this subsample of protocols yielded a Cohen’s kappa higher than .80. One rater scored the remaining protocols. For variable 3, each participant’s essay was reviewed by two raters, and discrepancies were resolved using a consensus-based approach. Member checking is a well-established procedure to build “trustworthiness” in qualitative research [64].
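A minimal sketch of the interrater agreement check follows; the rater scores are invented for illustration, and only the cohen_kappa_score function from scikit-learn is assumed.

```python
# Hypothetical 0/1 scores given by two raters to the same 15% subsample of answer protocols.
from sklearn.metrics import cohen_kappa_score

rater_1 = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]
rater_2 = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]

kappa = cohen_kappa_score(rater_1, rater_2)
print(f"Cohen's kappa = {kappa:.2f}")  # values above .80 are conventionally read as strong agreement
```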

2.6. Data Analysis

The statistical methods used in the analyses are described as follows:
  • Descriptive statistics were computed for each dependent variable and each test. For quantitative variables, the following were shown: median, mean, minimum and maximum values. For qualitative variables, the following were shown: counts and percentages.
  • In order to compare the results shown by the control and intervention groups, each test was subjected to a bivariate analysis. A non-parametric Wilcoxon test for comparing medians was used for quantitative dependent variables, and a chi-squared test for qualitative dependent variables.
  • To analyse the longitudinal effect of long-term IPS instruction on the different dependent variables, a general linear model with repeated measures, e.g., [65], was established for the quantitative dependent variables. The results were given in terms of least square mean differences. For qualitative nominal dependent variables, on account of three nominal categories (no answer, facts, explanation), three logistic regression models, e.g., [66], were established. For these models, the results were displayed using the odds ratio (OR) and the corresponding 95% confidence interval reported.
The statistical significance was defined as p < 0.05. All the results were obtained using SAS software, v9.3, SAS Institute Inc. (Cary, NC, USA), 2007.
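The original analyses were run in SAS; for readers who prefer open-source tools, the sketch below reproduces the same analysis plan in Python. The file name and column names (student, group, test, score, essay_cat) are assumptions made for illustration, and the mixed model with a random intercept per student is an approximation of the repeated-measures general linear model described above.

```python
import pandas as pd
from scipy.stats import ranksums, chi2_contingency
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per student per test.
df = pd.read_csv("ips_scores.csv")

# 1. Descriptive statistics per group and test (quantitative variables).
print(df.groupby(["group", "test"])["score"].agg(["median", "mean", "min", "max"]))

# 2. Bivariate comparisons at a single test point.
t2 = df[df["test"] == 2]
w_stat, w_p = ranksums(t2.loc[t2["group"] == "experimental", "score"],
                       t2.loc[t2["group"] == "control", "score"])                # non-parametric two-sample test
chi2, chi_p, _, _ = chi2_contingency(pd.crosstab(t2["group"], t2["essay_cat"]))  # qualitative variable

# 3. Longitudinal effect: repeated measures approximated by a mixed model
#    with a random intercept per student.
model = smf.mixedlm("score ~ C(test) * group", data=df, groups=df["student"]).fit()
print(model.summary())
```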

3. Results

We have organised the presentation of the results into two subsections: (1) descriptive results obtained by the control and experimental group students in IPS task performance variables in each test and bivariate analyses; (2) longitudinal analyses of long-term instruction effects on task performance.

3.1. Descriptive Results and Bivariate Analyses

Table 1 and Table 2 present the main descriptive statistics for the three IPS task performance variables obtained by the students in the four tests.
The results presented in Table 1 and in Figure 4 show that both groups of students obtained lower scores in information-gathering than in fact-finding tasks (in line with Hypotheses 1 and 2). From these results, we can infer that information-gathering tasks are more complex and difficult than fact-finding tasks.
As the project progressed (see Figure 4), both groups obtained, on average, better scores in fact-finding as well as in information-gathering tasks. However, this increase was significantly higher in the experimental group, which followed the long-term IPS instruction. The experimental group also performed better in solving the information-gathering tasks: from Test 1 through to Test 4, the median increased by 3.3 points (out of 7) in this type of task, whereas the increase for the control group students was only 2 points.
Figure 5 shows the results for the different categories identified in the final essay. At the outset of the study (Test 1), no student’s essay fell into an explanation category. As the project progressed, we identified different response patterns in control and experimental students: while the experimental group students drastically reduced the percentage of “no answer” responses and steadily increased the percentage of “explanation” responses (from 0% in Test 1 to 40.48% in Test 4), the control group students maintained high rates of “no answer” and very low rates of “explanation” throughout the project.
Bivariate analyses were carried out to study the effect of IPS instruction on students’ IPS task performance at each evaluation point. At the outset of the study, in Test 1, no statistically significant differences were found between groups (control vs. experimental) for either fact-finding or information-gathering task performance (see Table 1 and Figure 4).
When solving fact-finding tasks, the bivariate analyses showed no statistically significant differences between groups, with the exception of Test 2 (Wilcoxon two-sample test = 400.5, p = 0.0028). However, in solving information-gathering tasks, experimental students obtained higher scores than control students in Tests 2, 3 and 4, and these differences were statistically significant in Test 2 (Wilcoxon two-sample test = 418, p = 0.0074) and Test 4 (Wilcoxon two-sample test = 421, p = 0.0085) compared with the control group (see Table 1 and Figure 4).
Regarding the final essay (Hypothesis 3), the bivariate analyses revealed statistically significant differences in the quality of the responses between the groups in Tests 1, 2 and 4 (see Table 2).
In summary, these results point out that those students who followed the long-term, embedded, whole-task IPS instruction showed higher improvement in the performance of complex IPS tasks and were more capable of generating explanations from web information than control group students.

3.2. Longitudinal Analyses of IPS Instruction Effects on Task Performance

A stratified general linear model with repeated measures analysis was used in order to investigate how the long-term, embedded, whole-task IPS instruction improved the experimental students’ IPS task performance throughout the project. Three analyses were carried out and the results obtained in Test 1 were compared to those in Test 2 (T1 vs. T2), Test 3 (T1 vs. T3) and Test 4 (T1 vs. T4). The results of these three analyses are presented in Table 3.
The longitudinal analysis points out that IPS instruction has a highly statistically significant impact on the evolution of experimental students’ IPS fact-finding and information-gathering task performance (Hypotheses 1 and 2). This positive impact is significantly higher in the resolution of complex tasks. Thus, the estimated least square mean (henceforth, LSM) difference between Test 1 and Test 4 is 2.31 units (out of 14) for fact-finding tasks and 2.38 units (out of 7) for information-gathering tasks.
Control group students also showed a longitudinal increment in IPS task performance scores; the estimated LSM difference between Test 1 and Test 4 is only 0.87 (out of 14) for fact-finding tasks and 1.5 units (out of 7) for information-gathering tasks. However, these increments are lower than the ones obtained by intervention group students.
To verify whether there were any statistical differences between control and intervention groups in IPS task performance scores across time (Hypothesis 4), a stratified general linear model with repeated measures was performed. Statistically significant differences between the two groups were found in information-gathering task performance; the estimated LSM difference between the two groups in this variable was −0.9695 (95% CI: −1.553, −0.386). However, no statistically significant differences between control and experimental groups were found in fact-finding task performance. This result reveals that the positive longitudinal effect of long-term IPS instruction was higher when solving complex IPS tasks.
A logistic regression model was carried out to investigate the longitudinal effect of long-term, embedded, whole-task IPS instruction on final essay task performance (Hypothesis 3). As this is a qualitative nominal variable with three nominal categories (no answer, facts, explanation), three logistic regression models were established. Model 1 analysis (no answer vs. answer) showed that in the control group, the odds ratio of no answer was almost two times higher than that of the intervention group. Model 2 analysis (no answer versus explanation) showed that the odds of obtaining no answer in the control group were five times higher than in the intervention group. Finally, Model 3 analysis (facts vs. explanation) revealed that intervention group students were 3.39 times more likely to generate explanation responses in their essays (see all the results in Table 4) than control group students. This finding shows a positive longitudinal effect of IPS instruction on integrating and using web information comprehensively.
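To illustrate how odds ratios and their 95% confidence intervals, such as those reported in Table 4, can be derived from a logistic regression, a sketch follows; the data file, column names and contrast are hypothetical and correspond to only one of the three models described above.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per essay, with its rubric category and the student's group.
essays = pd.read_csv("final_essays.csv")
essays["no_answer"] = (essays["category"] == "no answer").astype(int)

# Model 1: odds of "no answer" for the control group relative to the intervention group.
fit = smf.logit("no_answer ~ C(group, Treatment('intervention'))", data=essays).fit()

odds_ratios = np.exp(fit.params)   # exponentiated coefficients are odds ratios
conf_int = np.exp(fit.conf_int())  # 95% confidence intervals on the OR scale
print(pd.concat([odds_ratios, conf_int], axis=1))
```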

4. Discussion and Conclusions

Even though schools experience difficulties in embedding IPS skills across the curriculum, our study successfully implemented and evaluated long-term, embedded, whole-task IPS instruction in real-life classroom settings. Consequently, the instructional approach used in this study can make a valuable contribution to providing good practice and quality education for all students and to overcoming some of the difficulties reported in previous research [59,60].
Moreover, this paper has investigated the longitudinal effect of IPS instruction on students’ task performance and learning in real-life classroom settings. Our results reveal that long-term, embedded, whole-task IPS instruction has a highly positive effect on students’ IPS performance over time. Thus, students following the IPS instruction produced a better fact-finding task (Hypothesis 1), a better information-gathering task (Hypothesis 2) and a better final essay (Hypothesis 3), as verified by the quality of the explanation of the ideas expressed. In addition, we found longitudinal differences between the control and experimental groups in IPS task performance throughout the three-year project, with instructed students outperforming their counterparts in the more complex tasks, namely the information-gathering task and the final essay (Hypothesis 4).
These findings coincide with the results of previous research in which participants’ engagement in web-search instruction improved their learning and their efficiency in solving the task [12,50]. Other authors have also pointed out that the better the strategy and the reflection on each skill, the better the efficiency, e.g., [22,25].
Our results also reveal that, as the project progressed, the students who followed the long-term, embedded, whole-task IPS instruction solved information problems better than the control students. Hence, the positive effects of participation in IPS instruction are greater when solving complex tasks. In information-gathering tasks, in which complex cognitive processes come into play and students must extract meaning from complex digital documents, experimental students outperformed control students as the project progressed, and these differences increased over time. This is in line with previous studies showing that the more complex the task, the deeper the processing of information and the more expert the pattern required to solve the problem successfully [10,32,33]. We can therefore conclude that the IPS instruction designed in this study promoted the students’ development of the informational skills that helped them learn from internet resources.
In addition, our research contributes experimental evidence that the IPS instruction designed in this study helped students to generate more and better explanations in their final essays. Thus, our students learned to integrate information from digital resources and to construct meaning in order to provide a successful answer to an informational problem. Previous research had already demonstrated that the process of explanation increases critical thinking and understanding by pushing an agent to explain the consequences of his/her view and to search for the new information needed to answer questions and achieve his/her cognitive goals [67]. To conclude, our study extends previous results and gives experimental evidence that long-term, embedded, whole-task instruction in real-life classroom settings is desirable to efficiently help students learn while solving complex digital tasks.
In the light of the results obtained in our study, we can claim that embedded support across long-term instruction boosts students’ IPS skill development. The students receiving the instruction mastered IPS skills that helped them draw relevant conclusions from internet resources and succeeded in using this information meaningfully to construct their own arguments in the final essay. This confirms previous findings by [68] that factors such as students’ use of adequate search strategies, their adoption of strategies for assessing online information and the quality of the online resources they obtained were essential to explaining the successful development of the science-related conceptual understandings expressed in a final essay. Along the same line of argument, [11] related the greater source evaluation achieved by instructed secondary school students to a deeper level of information comprehension when solving an information task.
Our study strengthens the claim, highlighted by previous research, that natural interaction with digital information and non-explicit support are not enough to furnish our students with the IPS skills needed to solve complex information problems [3,14,30,69].

Limitations and Implications for Future Research

Important difficulties had to be overcome to recruit the students and teachers who participated in this demanding three-year study. Especially worth noting are the difficulties in recruiting the control group, which derived fewer benefits from participating in the project. This resulted in a small sample, which could be a weakness in some of the analyses and may not allow us to draw general and transferable conclusions. Furthermore, our research did not study the impact of pedagogical variables, such as school teaching approaches or school teaching programmes, on students’ results. In future studies, possible differences between the participating schools in key teaching-related variables should be monitored and considered.
One could argue that our study did not analyse the students’ results in relation to their initial IPS skills, prior knowledge and interest in the topic at hand, even though these variables have been considered important in previous IPS skills research (e.g., [11,12]). Despite this limitation, the in-depth long-term analyses of the progression of individual students’ IPS skills show that participation in a sheltered learning environment over time can be beneficial to all students, regardless of their background knowledge. Indeed, as the project progressed, we could see that the intervention students performed better in complex tasks. This result is in line with previous research revealing that providing scaffolds to develop cognitive skills is beneficial to children with both higher and lower skills [11].
Another limitation of this study is that, although our IPS instruction offered diverse supporting tools to promote the development of IPS skills, other kinds of support could have been more effective, such as modelling examples, as previous educational research has shown their efficacy in promoting IPS learning [30] and web page evaluation [70]. A modelling example involves an expert solving a problem while thinking aloud and describing the details and decisions related to each skill during the IPS process. A recorded video can show the expert’s screen and actions while concurrently playing the expert’s explanation, and online learning instruction provides an easy opportunity for embedding such modelling example videos [30]. In subsequent studies, modelling examples can therefore be included in the repertoire of tools to support learning tasks, as can eye-movement modelling examples (EMMEs), videos in which a dot represents the eye movements of the model [71], which are also effective in a digital educational context [72].
Although we have hypothesised that the progressive development of IPS skills in the experimental group could account for the effectiveness of their task performance throughout the project [20,58], in this study we have not analysed the impact of IPS instruction on the development of IPS skills and subskills. In future research, our intention is to obtain more empirical evidence to support this statement, using techniques for collecting and analysing data such as log files, eye tracking, think-aloud protocols, cued retrospective reports or a combination of them [73], and, by so doing, to be able to apprehend the strengths and weaknesses of the execution of the IPS process, skills and subskills in greater detail and/or in a qualitative way. The ongoing learning activities during the web-search process could also be analysed in greater depth [74].
Despite its limitations, our research is significant for educational research on IPS instruction because it shows that long-term, embedded, whole-task IPS instruction for learning curricular content in real classroom settings is feasible. In a nutshell, our study contributes to education the design of a series of learning tasks and supporting tools that can be effective in developing IPS skills within a secondary education curriculum and that have a positive impact on students’ learning.

Author Contributions

M.P. and E.A. have contributed equally. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Ministerio de Economía y Competitividad of the Spanish Government (projects number: EDU2012–32415 and EDU2016–80258-R).

Acknowledgments

The authors would like to thank the eighteen teachers that participated in the implementation of the longitudinal project in the classrooms and especially all the secondary students involved in the study.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Brand-Gruwel, S.; Wopereis, I.; Vermetten, Y. Information problem solving by experts and novices: Analysis of a complex cognitive skill. Comput. Hum. Behav. 2005, 21, 487–508. [Google Scholar] [CrossRef]
  2. Brand-Gruwel, S.; Wopereis, I.; Walraven, A. A descriptive model of information problem solving while using internet. Comput. Educ. 2009, 53, 1207–1217. [Google Scholar] [CrossRef]
  3. Frerejean, J.; van Strien, J.L.H.; Kirschner, P.A.; Brand-Gruwel, S. Completion strategy or emphasis manipulation? Task support for teaching information problem solving. Comput. Hum. Behav. 2016, 62, 90–104. [Google Scholar] [CrossRef] [Green Version]
  4. Kirschner, P.A.; van Merriënboer, J.J. Do learners really know best? Urban legends in education. Educ. Psychol. 2013, 48, 169–183. [Google Scholar] [CrossRef]
  5. Van Deursen, A.J.A.M.; van Diepen, S. Information and strategic Internet skills of secondary students: A performance test. Comput. Educ. 2013, 96, 218–226. [Google Scholar] [CrossRef]
  6. Rosman, T.; Mayer, A.-K.; Krampen, G. A longitudinal study on information-seeking knowledge in psychology undergraduates: Exploring the role of information literacy instruction and working memory capacity. Comput. Educ. 2016, 96, 94–108. [Google Scholar] [CrossRef]
  7. Spisak, J. Secondary Student Information Literacy Self-efficacy vs. Performance. Ph.D. Dissertation, University Richmond, Richmond, VA, USA, November 2018. [Google Scholar]
  8. Hinostroza, J.E.; Ibieta, A.; Labbé, C.; Soto, M.T. Browsing the internet to solve information problems: A study of students’ search actions and behaviours using a ‘think aloud’protocol. Int. J. Inf. Educ. Technol. 2018, 23, 1933–1953. [Google Scholar] [CrossRef]
  9. Dinet, J.; Chevalier, A.; Tricot, A. Information search activity: An overview. Eur. Rev. Appl. Psychol. 2012, 62, 49–62. [Google Scholar] [CrossRef]
  10. Frerejean, J.; Velthorst, G.J.; van Strien, J.L.; Kirschner, P.A.; Brand-Gruwel, S. Embedded instruction to learn information problem solving: Effects of a whole task approach. Comput. Hum. Behav. 2019, 90, 117–130. [Google Scholar] [CrossRef]
  11. Mason, L.; Junyent, A.A.; Tornatora, M.C. Epistemic evaluation and comprehension of web-source information on controversial science-related topics: Effects of a short-term instructional intervention. Comput. Educ. 2014, 76, 143–157. [Google Scholar] [CrossRef]
  12. Raes, A.; Schellens, T.; De Wever, B.; Vanderhoven, E. Scaffolding information problem solving in web-based collaborative inquiry learning. Comput. Educ. 2012, 59, 82–94. [Google Scholar] [CrossRef]
  13. Wopereis, I.; Brand-Gruwel, S.; Vermetten, Y. The effect of embedded instruction on solving information problems. Comput. Hum. Behav. 2008, 24, 738–752. [Google Scholar] [CrossRef]
  14. Van Merriënboer, J.J.G.; Kirschner, P.A. Ten Steps to Complex Learning: A Systematic Approach to Four-Component Instructional Design, 3rd ed.; Routledge: New York, NY, USA, 2008. [Google Scholar]
  15. Donnelly, A.; Leva, M.C.; Tobail, A.; Valantasis Kanellos, N. A Generic Integrated and Interactive Framework (GIIF) for Developing Information Literacy Skills in Higher Education. PG Diploma in Practitioner Research Projects, DIT. 2018. Available online: https://api.semanticscholar.org/CorpusID:69560549 (accessed on 8 August 2020).
  16. Walraven, A.; Brand-Gruwel, S.; Boshuizen, H.P. Information-problem solving: A review of problems students encounter and instructional solutions. Comput. Hum. Behav. 2008, 24, 623–648. [Google Scholar] [CrossRef]
  17. Winne, P. Enhancing self-regulated learning and information problem solving with ambient big data. In Contemporary Technologies in Education: Maximizing Student Engagement, Motivation, and Learning; Adesope, O.O., Rud, A.G., Eds.; Palgrave Macmillan: New York, NY, USA, 2019; pp. 145–162. [Google Scholar]
  18. Wallace, M.R.; Kupperman, J.; Krajcik, J.; Soloway, E. Science on the Web: Students online in a sixth-grade classroom. J. Learn. Sci. 2000, 9, 75–104. [Google Scholar] [CrossRef]
  19. Argelagós, E.; Pifarré, M. Unravelling Secondary Students’ Challenges in Digital Literacy: A Gender Perspective. J. Educ. Train. Stud. 2017, 5, 42–55. [Google Scholar] [CrossRef] [Green Version]
  20. Ladbrook, J.; Probert, E. Information skills and critical literacy: Where are our digikids at with online searching and are their teachers helping? Australas. J. Educ. Technol. 2011, 27, 105–121. [Google Scholar] [CrossRef]
  21. Van Deursen, A.J.A.M.; Görzig, A.; van Delzen, M.; Perik, H.T.M.; Stegeman, A.G. Primary school children’s internet skills: A report on performance tests of operational, formal, information, and strategic internet skills. Int. J. Commun. Syst. 2014, 8, 1343–1365. [Google Scholar]
  22. Argelagós, E.; Pifarré, M. Key Information-Problem Solving Skills to Learn in Secondary Education: A Qualitative, Multi-Case Study. Int. J. Educ. Learn. 2016, 5, 1–14. [Google Scholar] [CrossRef]
  23. Gerjets, P.; Kammerer, Y.; Werner, B. Measuring spontaneous and instructed evaluation processes during Web search: Integrating concurrent thinking-aloud protocols and eye-tracking data. Learn. Instr. 2011, 21, 220–231. [Google Scholar] [CrossRef]
  24. Walraven, A.; Brand-Gruwel, S.; Boshuizen, H.P. How students evaluate information and sources when searching the World Wide Web for information. Comput. Educ. 2009, 52, 234–246. [Google Scholar] [CrossRef] [Green Version]
  25. Brand-Gruwel, S.; Kammerer, Y.; van Meeuwen, L.; van Gog, T. Source evaluation of domain experts and novices during Web search. J. Comput. Assist. Learn. 2017, 33, 234–251. [Google Scholar]
  26. Brante, E.W.; Strømsø, H.I. Sourcing in text comprehension: A review of interventions targeting sourcing skills. Educ. Psychol. Rev. 2018, 30, 773–799. [Google Scholar] [CrossRef]
  27. Barzilai, S.; Zohar, A.R.; Mor-Hagani, S. Promoting integration of multiple texts: A review of instructional approaches and practices. Educ. Psychol. Rev. 2018, 30, 973–999. [Google Scholar] [CrossRef]
  28. Kroustallaki, D.; Kokkinaki, T.; Sideridis, G.D.; Simos, P.G. Exploring students’ affect and achievement goals in the context of an intervention to improve web searching skills. Comput. Hum. Behav. 2015, 49, 156–170. [Google Scholar] [CrossRef]
  29. Probert, E. Information literacy skills: Teacher understandings and practice. Comput. Educ. 2009, 53, 24–33. [Google Scholar] [CrossRef]
  30. Frerejean, J.; van Strien, J.L.; Kirschner, P.A.; Brand-Gruwel, S. Effects of a modelling example for teaching information problem solving skills. J. Comput. Assist. Learn. 2018, 34, 688–700. [Google Scholar]
  31. Walraven, A.; Brand-Gruwel, S.; Boshuizen, H.P. Fostering students’ evaluation behaviour while searching the internet. Instr. Sci. 2013, 41, 125–146. [Google Scholar]
  32. Badia, A.; Becerril, L. Collaborative solving of information problems and group learning outcomes in secondary education. Int. J. Educ. Dev. 2015, 38, 67–101. [Google Scholar]
  33. Walhout, J.; Brand-Gruwel, S.; Jarodzka, H.; van Dijk, M.; de Groot, R.; Kirschner, P.A. Learning and navigating in hypertext: Navigational support by hierarchical menu or tag cloud? Comput. Hum. Behav. 2015, 46, 218–227. [Google Scholar]
  34. Wedderhoff, O.; Chasiotis, A.; Mayer, A.K. Information Preferences when Facing a Health Threat-The Role of Subjective Versus Objective Health Information Literacy. In Proceedings of the Sixth European Conference on Information Literacy (ECIL), Oulu, Finland, 24–27 September 2018. [Google Scholar]
  35. Gagnière, L.; Betrancourt, M.; Détienne, F. When metacognitive prompts help information search in collaborative setting. Eur. Rev. Appl. Psychol. 2012, 62, 73–81. [Google Scholar] [CrossRef]
  36. Nadolski, R.J.; Kirschner, P.A.; van Merriënboer, J.J. Process support in learning tasks for acquiring complex cognitive skills in the domain of law. Learn. Instr. 2006, 16, 266–278. [Google Scholar] [CrossRef] [Green Version]
  37. Stadtler, M.; Bromme, R. Effects of the metacognitive computer-tool met. a. ware on the web search of laypersons. Comput. Hum. Behav. 2008, 24, 716–737. [Google Scholar] [CrossRef]
  38. Suthers, D.D.; Hundhausen, C.D. An experimental study of the effects of representational guidance on collaborative learning processes. J. Learn. Sci. 2003, 12, 183–218. [Google Scholar] [CrossRef] [Green Version]
  39. Lazonder, A.W.; Rouet, J.F. Information problem solving instruction: Some cognitive and metacognitive issues. Comput. Hum. Behav. 2008, 24, 753–765. [Google Scholar] [CrossRef]
  40. Salmerón, L.; García, A.; Vidal-Abarca, E. The development of adolescents’ comprehension-based Internet reading skills. Learn. Individ. Differ. 2018, 61, 31–39. [Google Scholar] [CrossRef]
  41. Alpert, S.R.; Grueneberg, K. Concept mapping with multimedia on the web. J. Educ. Multimed. Hypermedia 2000, 9, 313–330. [Google Scholar]
  42. Zentner, A.; Covit, R.; Guevarra, D. Exploring effective data visualization strategies in higher education. 2020. Available online: https://ssrn.com/abstract=3322856 (accessed on 8 August 2020).
  43. Hancock-Niemic, M.A.; Lin, L.; Atkinson, R.K.; Renkl, A.; Wittwer, J. Example-based learning: Exploring the use of matrices and problem variability. Educ. Technol. Res. Dev. 2016, 64, 115–136. [Google Scholar] [CrossRef]
  44. Van Merriënboer, J.J.G. Training Complex Cognitive Skills: A Four-Component Instructional Design Model for Technical Training; Educational Technology Publications: Englewood Cliffs, NJ, USA, 2008. [Google Scholar]
  45. Farrell, R.; Badke, W. Situating information literacy in the disciplines: A practical and systematic approach for academic librarians. Ref. Serv. Rev. 2015, 43, 319–340. [Google Scholar] [CrossRef]
  46. Tricot, A.; Sweller, J. Domain-specific knowledge and why teaching generic skills does not work. Educ. Psychol. Rev. 2014, 26, 265–283. [Google Scholar] [CrossRef]
  47. Perin, D. Facilitating student learning through contextualization: A review of evidence. Community Coll. Rev. 2011, 39, 268–295. [Google Scholar] [CrossRef]
  48. Spink, A.; Danby, S.; Mallan, K.; Butler, C. Exploring young children’s web searching and technoliteracy. J. Doc. 2010, 66, 191–206. [Google Scholar] [CrossRef] [Green Version]
  49. Wang, C.H.; Ke, Y.T.; Wu, J.T.; Hsu, W.H. Collaborative action research on technology integration for science learning. J. Sci. Educ. Tech. 2012, 21, 125–132. [Google Scholar] [CrossRef]
  50. Argelagós, E.; Pifarré, M. Improving information problem solving skills in secondary education through embedded instruction. Comput. Hum. Behav. 2012, 28, 515–526. [Google Scholar] [CrossRef] [Green Version]
  51. Squibb, S.D.; Mikkelsen, S. Assessing the value of course-embedded information literacy on student learning and achievement. Coll. Res. Libr. 2016, 77, 164–183. [Google Scholar] [CrossRef]
  52. Frerejean, J.; van Merriënboer, J.J.; Kirschner, P.A.; Roex, A.; Aertgeerts, B.; Marcellis, M. Designing instruction for complex learning: 4C/ID in higher education. Eur. J. Educ. 2019, 54, 513–524. [Google Scholar] [CrossRef] [Green Version]
  53. Koni, I. The perception of issues related to instructional planning among novice and experienced teachers. Ph.D. Dissertation, University of Tartu, Tartu, Estonia, 2017. [Google Scholar]
  54. Rohrer, D. Student instruction should be distributed over long time periods. Educ. Psychol. Rev. 2015, 27, 635–643. [Google Scholar] [CrossRef]
55. Van Merriënboer, J.J.; Kester, L.; Paas, F. Teaching complex rather than simple tasks: Balancing intrinsic and germane load to enhance transfer of learning. Appl. Cogn. Psychol. 2006, 20, 343–352. [Google Scholar] [CrossRef]
  56. Tran, T.; Ho, M.-T.; Pham, T.-H.; Nguyen, M.-H.; Nguyen, K.-L.P.; Vuong, T.-T.; Nguyen, T.-H.T.; Nguyen, T.-D.; Nguyen, T.-L.; Khuc, Q.; et al. How Digital Natives Learn and Thrive in the Digital Age: Evidence from an Emerging Economy. Sustainability 2020, 12, 3819. [Google Scholar] [CrossRef]
  57. Salmerón, L.; Kammerer, Y.; García-Carrión, P. Searching the Web for conflicting topics: Page and user factors. Comput. Hum. Behav. 2014, 29, 2161–2171. [Google Scholar] [CrossRef]
  58. Crary, S. Secondary Teacher Perceptions and Openness to Change Regarding Instruction in Information Literacy Skills. Sch. Libr. Res. 2019, 22, 1–26. [Google Scholar]
  59. Stubeck, C.J. Enabling Inquiry Learning in Fixed-Schedule Libraries: An Evidence-Based Approach. Knowl. Quest 2015, 43, 28–34. [Google Scholar]
  60. Gerjets, P.; Hellenthal-Schorr, T. Competent information search in the World Wide Web: Development and evaluation of a web training for pupils. Comput. Hum. Behav. 2008, 24, 693–715. [Google Scholar] [CrossRef]
61. Brand-Gruwel, S.; Stadtler, M. Solving information-based problems: Evaluating sources and information. Learn. Instr. 2011, 21, 175–179. [Google Scholar] [CrossRef]
  62. Garcia, C.; Badia, A. Information problem-solving skills in small virtual groups and learning outcomes. J. Comput. Assist. Learn. 2017, 33, 382–392. [Google Scholar] [CrossRef]
  63. Hakkarainen, K. Emergence of progressive-inquiry culture in computer-supported collaborative learning. Learn. Environ. Res. 2003, 6, 199–220. [Google Scholar] [CrossRef]
  64. Toma, J.D. Approaching rigor in applied qualitative research. In The Sage Handbook for Research in Education: Engaging Ideas and Enriching Inquiry; Conrad, C.F., Serlin, R.C., Eds.; Sage Publications, Inc.: Thousand Oaks, CA, USA, 2006; pp. 405–423. [Google Scholar]
  65. McCullagh, P.; Nelder, J.A. Generalized Linear Models; Chapman & Hall: London, UK, 1991. [Google Scholar]
  66. Hosmer, D.W.; Lemeshow, S. Applied Logistic Regression; John Wiley & Sons: New York, NY, USA, 2000. [Google Scholar]
  67. Scardamalia, M.; Bereiter, C. Knowledge building and knowledge creation: Theory, pedagogy, and technology. In The Cambridge Handbook of the Learning Sciences, 2nd ed.; Sawyer, R.K., Ed.; Cambridge University Press: New York, NY, USA, 2014; pp. 397–417. [Google Scholar]
  68. Hoffman, J.L.; Wu, H.K.; Krajcik, J.S.; Soloway, E. The nature of middle school learners’ science content understandings with the use of on-line resources. J. Res. Sci. Teach. 2003, 40, 323–346. [Google Scholar] [CrossRef] [Green Version]
  69. Henkel, M.; Grafmüller, S.; Gros, D. Comparing information literacy levels of Canadian and German university students. In International Conference on Information; Springer: Cham, Switzerland, 2018; pp. 464–475. [Google Scholar]
  70. Hämäläinen, E.K.; Kiili, C.; Marttunen, M.; Räikkönen, E.; González-Ibáñez, R.; Leppänen, P.H. Promoting sixth graders’ credibility Evaluation of Web pages: An intervention study. Comput. Hum. Behav. 2020, 110, 106372. [Google Scholar]
  71. Jarodzka, H.; Balslev, T.; Holmqvist, K.; Nyström, M.; Scheiter, K.; Gerjets, P.; Eika, B. Conveying clinical reasoning based on visual observation via eye-movement modelling examples. Instr. Sci. 2012, 40, 813–827. [Google Scholar] [CrossRef] [Green Version]
  72. Salmerón, L.; Delgado, P.; Mason, L. Using eye-movement modeling examples to improve critical reading of multiple webpages on a conflicting topic. J. Comput. Assist. Learn. 2020, 1, 1–14. [Google Scholar]
  73. Argelagós, E.; Brand-Gruwel, S.; Jarodzka, H.M.; Pifarré, M. Unpacking cognitive skills engaged in web-search: How can log files, eye movements, and cued-retrospective reports help? An in-depth qualitative case study. Int. J. Innov. Learn. 2018, 24, 152–175. [Google Scholar] [CrossRef]
  74. Kammerer, Y.; Brand-Gruwel, S.; Jarodzka, H. The future of learning by searching the Web: Mobile, social, and multimodal. Frontline Learn. Res. 2018, 6, 81–91. [Google Scholar] [CrossRef]
Figure 1. Five-step systematic approach to information problem solving, based on [1,2,3].
Figure 2. Procedure followed during the three academic years of the longitudinal study.
Figure 3. Support provided during the information problem-solving instruction.
Figure 4. Means of fact-finding and information-gathering task performance results (dependent variables 1 and 2); * stands for p < 0.05.
Figure 5. Final essay task performance (dependent variable 3): percentages of qualitative nominal categories.
Table 1. Fact-finding and information-gathering task performance (dependent variables 1 and 2): descriptive and Wilcoxon test results.

                Control                        Experimental                   Wilcoxon Two-Sample Test
Test    Mean   Median   Min.   Max.    Mean   Median   Min.   Max.    p Value

1. Fact-finding task performance (total value: 14 points)
1       11     11       7      14      10.3   11       5      14      n.s.
2       9      10       1      12      10.5   11       2      13      400.5 (0.0028)
3       13     13       10     14      12.9   13       8      14      n.s.
4       11.4   12       8      14      12.2   12.5     8      14      n.s.

2. Information-gathering task performance (total value: 7 points)
1       1.5    1        0      3       1.7    1.7      0      5       n.s.
2       2.4    2        0      5.5     3.7    3.7      0      7       418 (0.0074)
3       2.5    2.5      0      5       3.6    3        0      7       n.s.
4       3.3    3        0      7       4.5    5        1      7       421 (0.0085)

Note: N = 61 students; N control group = 19; N experimental group = 42.
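The group comparisons in Table 1 rely on Wilcoxon two-sample (rank-sum) tests. As a rough illustration of how such a comparison can be run, the Python sketch below uses SciPy; the score lists are invented placeholders rather than the study data, and the statistic reported in Table 1 appears to come from different software, so the exact numbers will not match.

```python
from scipy import stats

# Hypothetical fact-finding scores (0-14) for one test moment.
# These are illustrative values only, NOT the data summarised in Table 1.
control = [11, 7, 12, 10, 14, 9, 13, 11, 8, 12]
experimental = [10, 13, 14, 12, 11, 14, 13, 9, 12, 14]

# Wilcoxon rank-sum test for two independent samples
statistic, p_value = stats.ranksums(control, experimental)
print(f"z = {statistic:.2f}, p = {p_value:.4f}")
```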
Table 2. Final essay task performance (dependent variable 3): descriptive and chi-squared test results.

                       Control          Experimental     Chi-Squared Test
Test  Category         Count    %       Count    %       (p Value)
1     No answer        14       73.68   20       47.62
      Facts             5       26.32   22       52.38
      Explanation       0       0        0       0       3.603 (0.0577)
2     No answer         6       31.58   10       23.81
      Facts            13       68.42   24       57.14
      Explanation       0       0        8       19.05   6.531 (0.0382)
3     No answer         2       10.53    6       14.29
      Facts            13       68.42   25       59.52
      Explanation       4       21.05   11       26.19   n.s.
4     No answer         8       42.11    6       14.29
      Facts            10       52.63   19       45.24
      Explanation       1       5.26    17       40.48   10.059 (0.0065)

Note: N = 61 students; N control group = 19; N experimental group = 42.
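The chi-squared tests in Table 2 compare the distribution of essay categories between groups. A minimal sketch with SciPy's chi2_contingency, applied to the Test 4 counts from Table 2, should closely reproduce the reported value of 10.059 (p ≈ 0.0065); the authors' exact procedure is not detailed here, so minor differences are possible.

```python
from scipy.stats import chi2_contingency

# Test 4 category counts from Table 2
# (rows: control, experimental; columns: no answer, facts, explanation).
observed = [[8, 10, 1],
            [6, 19, 17]]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2({dof}) = {chi2:.3f}, p = {p:.4f}")
```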
Table 3. Fact-finding and information-gathering task performance (dependent variables 1 and 2): stratified general linear models with repeated measures for different evaluation points.

                          Control                                        Experimental
Comparison         LSM           95% CI      95% CI      p          LSM           95% CI      95% CI      p
between Tests      Differences   Lower       Upper       Value      Differences   Lower       Upper       Value

1. Fact-finding task performance (total value: 14 points)
T1 vs. T2          1.7406        0.2176      3.2635      0.0195     −0.3098       −1.2670     0.6474      0.8320
T1 vs. T3          −1.9694       −3.4282     −0.5106     0.0043     −2.8788       −3.6522     −2.1054     <0.0001
T1 vs. T4          −0.8748       −2.3740     0.6244      0.4115     −2.3111       −3.1602     −1.4619     <0.0001

2. Information-gathering task performance (total value: 7 points)
T1 vs. T2          −1.1292       −2.1397     −0.1187     0.0232     −1.6525       −2.4248     −0.8802     <0.0001
T1 vs. T3          −1.0808       −1.9732     −0.1883     0.0121     −1.3526       −1.9944     −0.7107     <0.0001
T1 vs. T4          −1.4960       −2.4916     −0.5003     0.0013     −2.3852       −3.0685     −1.7018     <0.0001
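Table 3 reports least-squares mean (LSM) differences from general linear models with repeated measures, fitted separately for the control and experimental groups. A loosely analogous analysis in Python can be sketched with a linear mixed model in statsmodels, treating each student as a random intercept; the data frame below is hypothetical, and the covariance structure and contrast machinery of the original analysis may well differ.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data for one group: one score per student per test moment.
students = [s for s in range(1, 7) for _ in range(4)]
tests = ["T1", "T2", "T3", "T4"] * 6
scores = [1.5, 2.0, 2.5, 3.0, 1.0, 2.5, 3.0, 3.5, 2.0, 3.0, 3.5, 4.5,
          0.5, 1.5, 2.0, 3.0, 1.0, 2.0, 2.5, 4.0, 2.5, 3.5, 4.0, 5.0]
df = pd.DataFrame({"student": students, "test": tests, "score": scores})

# Random-intercept model; the fixed-effect coefficients for C(test) estimate
# mean differences of T2-T4 relative to T1, analogous to (but not identical
# with) the LSM differences reported in Table 3.
model = smf.mixedlm("score ~ C(test)", df, groups=df["student"]).fit()
print(model.summary())
```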
Table 4. Final essay task performance (dependent variable 3): three logistic regression models established for the qualitative nominal dependent variable.

                                                 Control vs. Experimental
Qualitative Nominal Dependent Variable 3    OR Estimate    95% CI OR Lower    95% CI OR Upper    p Value
Model 1. No answer vs. Answer               1.96           1.09               3.51               0.0250
Model 2. No answer vs. Explanation          5.14           1.75               15.07              0.0035
Model 3. Facts vs. Explanation              3.39           1.14               10.08              0.0286
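The odds ratios in Table 4 come from logistic regression models contrasting control and experimental students on the essay categories. As a minimal sketch (assuming a Model 1-style contrast, no answer vs. answer), a binary logistic regression can be fitted with statsmodels as below; the outcome and group vectors are made up for illustration, and the published models may additionally account for the repeated test moments.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: 'answered' = 1 if a student gave any answer (facts or
# explanation), 0 for no answer; 'experimental' = 1 for the instructed group.
df = pd.DataFrame({
    "answered":     [0, 1, 1, 0, 1, 0, 1, 1, 1, 0, 1, 1, 1, 0, 1, 1],
    "experimental": [0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1],
})

fit = smf.logit("answered ~ experimental", data=df).fit(disp=False)

# Exponentiating the group coefficient yields the odds ratio and its 95% CI,
# the quantities reported in Table 4.
odds_ratio = np.exp(fit.params["experimental"])
ci_lower, ci_upper = np.exp(fit.conf_int().loc["experimental"])
print(f"OR = {odds_ratio:.2f}, 95% CI [{ci_lower:.2f}, {ci_upper:.2f}]")
```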
