Article

Green STEM to Improve Mathematics Proficiency: ESA Mission Space Lab

by Manuel Garcia-Piqueras 1,2,* and José-Reyes Ruiz-Gallardo 3,4
1 Department of Mathematics, Faculty of Education of Albacete, University of Castilla-La Mancha, 02071 Albacete, Spain
2 Laboratory for the Integration of ICT in the Classroom Research Group, University of Castilla-La Mancha, 02071 Albacete, Spain
3 Department of Pedagogy—Science Education, Faculty of Education of Albacete, University of Castilla-La Mancha, 02071 Albacete, Spain
4 Botany, Ethnobiology and Education Research Group, Botanic Institute, University of Castilla-La Mancha, 02071 Albacete, Spain
* Author to whom correspondence should be addressed.
Mathematics 2021, 9(17), 2066; https://doi.org/10.3390/math9172066
Submission received: 19 July 2021 / Revised: 22 August 2021 / Accepted: 23 August 2021 / Published: 26 August 2021
(This article belongs to the Special Issue STEAM Teacher Education: Problems and Proposals)

Abstract

The main goal of this study was to improve students' outcomes and perception of mathematics. To this end, 12 out of 34 students volunteered to take part in an international contest: the European Space Agency (ESA) Mission Space Lab. The experience was organized as STEM, under guided PjBL. Students identified an environmental problem, implemented a way to monitor it from the International Space Station (ISS), and interpreted the data received. The students' final report received an award from ESA. Additionally, participants improved their performance in their final math exams compared to the control group. Furthermore, the perception of students and their families about the usefulness of mathematics was very positive. The only drawback detected was the increase in workload. Thus, Green STEM, using direct instruction and guidance in PjBL, may be a good tool to improve students' grades and their opinion about the importance of mathematics.

1. Introduction

The appropriateness of mathematics for formulating the laws of nature is widely acknowledged [1,2]. Furthermore, mathematical concepts reach beyond the scope for which they were created [3]. However, many students in secondary or high school do not get involved in math classes and may develop a persistent rejection of mathematics [4,5]. Many students do not understand the importance of mathematics and just want to pass their math tests [6,7]. That is why it is important to look for alternatives that help them understand the importance of mathematics. This is the real challenge if the intention is to democratize math education. Accordingly, this was the teachers' fundamental goal in the experience presented here.

1.1. STEM Education

There are diverse perspectives on STEM (Science, Technology, Engineering, and Mathematics) Education, though there is some agreement in considering STEM Education as both curriculum and pedagogy [8], and it has been identified as effective for student learning [9].
STEM Education is not only about the knowledge related to the topics included; it also develops key competencies and skills [9,10,11,12], which is a very important contribution to the workforce in STEM-related fields [13]. For these reasons, educational reforms at the national level in European and Asian countries and in the USA, among others, enact and promote STEM Education [9,14].
However, STEM Education is not a well-defined field yet [8,15], and its implementation is not completely delimited [16,17]. Science and mathematics are crucial, but they are not the only bases for STEM Education. More appropriately, STEM Education should improve students' understanding of how things work, increase their technological literacy, and involve more engineering [13]. That is why we identify STEM Education with a continuously growing panel that involves different pedagogical perspectives, technological tools, and methodologies [15,18].

1.2. Project-Based Learning in STEM Education

Problem-Based Learning (PBL), Project-Based Learning (PjBL), and Service Learning or Collaborative Learning are, among others, the methodologies employed in STEM Education [19,20,21]. It is worth highlighting that PjBL does not focus on concepts and procedures, but rather on real-world problem solving [22]. PjBL distinguishes four categories: knowledge acquisition, enjoying an aesthetic experience, problem solving, and making a product [23,24]. Regarding the latter, students set an external learning objective. This approach uses a real-world situation as the focal point of teaching [25].
Originally, PjBL was a teaching method based on constructivist theories [26]. The constructivist model advocates for learning with no guidance or only minimal assistance, so that students self-learn and construct the basic information by themselves. The environment provides inputs to the students, and they build the essential information [27,28,29].
Nevertheless, some researchers [30,31,32,33,34,35,36] suggest that the environment should provide direct instruction and guidance on the concepts and procedures. The cognitive load theory of John Sweller [37] considers that the environment must provide not only inputs but also outputs and the specific procedures of a particular discipline. Thus, although PjBL has some general principles, in practice it can assume a variety of forms depending on pedagogical, political, or ethical reasons [38].
PjBL is an interesting method because it seems to foster students' higher-order thinking skills [39], as well as their motivation [40]. However, these types of methods have attracted some criticism, even being labeled a failure [41]. In this vein, recent reviews disagree: some find inconclusive results and many methodological flaws [42], while others report medium to large mean effect sizes for students' achievement [43,44]. The key may lie in the amount of teacher guidance [41], as noted previously. In fact, the meta-analysis of Walker and Leary [45] for PBL found that when students received more guidance there was a rather good effect size (d = 0.74), while less guidance had a negative impact (d = −0.18). Therefore, it seems relevant to properly guide students in PjBL too. Along this line, regarding computational thinking (one of the key points in our study), some research showed that students without direct instruction or guidance lost too much time in trial-and-error loops or even became blocked [46,47,48]. That is why such researchers recommend specific instructional interventions, though they acknowledge certain limitations and call for new research thereon.

1.3. Green STEM

On the one hand, there are three transversal dimensions in STEM Education: Inclusion, Creativity, and Citizenship [49]. Inclusion strives to attract students towards scientific-technological areas [50]. It also aims at promoting the active participation of students from low socio-economic backgrounds [51]. Research activities in classrooms, such as the exploration of a given phenomenon, foster creativity, as does the integration of Art into the scientific-technological scope [52]. Citizenship is addressed, for example, through socio-scientific controversies or Environmental Education [53].
On the other hand, an essential condition in PjBL is that students generate a product that solves the original driving question [54]. Furthermore, students are encouraged to solve real, meaningful problems similar to those that professionals address [26]. Real-world situations facilitate learning science and mathematics [55]. Within the scope of technological instruments, the use of remote data, such as remote sensing or geolocation [56], is particularly desirable; these are essential tools for monitoring environmental issues at a large scale, and anthropogenic global warming is one of the problems that concern citizens the most.
An active field of educational research is how to improve mathematics education through real-world problems and STEM Education [57,58,59,60,61]. Systematic research reviews do not recognize Green STEM, i.e., the intersection between STEM and Environmental Education, as a major trend [8,15,62,63,64,65]. However, the dissemination of Green STEM in the academic ecosystem is an expanding tendency [66,67,68]. For instance, there are studies that deal with computational thinking and software development in the context of climate change [69], forest regeneration using remote sensing and intelligent seeds [70], or the relationship between photonics and a green future [71], among others. Above all, there has lately been an increase in contests addressed to children or young students regarding STEM and environmental awareness, such as Climate Detectives [72] or the Astro Pi Challenge [73], whether international (https://www.cstl.org/cleantech/the-challenge/, accessed on 24 August 2021), national (https://www.fundacionendesa.org/es/premios-innovacion-educativa, accessed on 24 August 2021), or local (https://a21escolarab.es/, accessed on 24 August 2021), which promote social and ecological awareness through competitions. These programs promote educational trends [74,75].
Accordingly, the research questions of this study are:
-
Is it possible to increase math performance through STEM, supported by PjBL with direct instruction and guidance, in a context of real environmental problems?
-
What is the perception of students and their families towards this methodology and the importance of mathematics?

2. Materials and Methods

This proposal is a quasi-experimental case study with an intervention in secondary and high school education in the subject of mathematics. STEM Education is implemented through PjBL, using direct instruction and guidance, in the category of making a product. Participants were selected by non-probabilistic convenience sampling.

2.1. Participants

Students belonged to bilingual (English) mathematics groups in the 4th year of secondary education (16 years old) and the first year of high school (17 years old) at the Tomás Navarro Tomás high school (Albacete, Spain). The first author was the math teacher of both classrooms, and the physics and chemistry teacher also participated. They led the experience in collaboration with a professor of science education from a Faculty of Education located close to the school. Twelve out of 34 students voluntarily joined the experimental group by participating in the Astro Pi Challenge [73], organized by the European Space Agency (ESA) and the Raspberry Pi Foundation. It was intended for students under 19 years of age.
Aware of the constraints of volunteerism and self-selection bias, before the experience we ensured that both groups were comparable regarding the dependent variable (math performance) [76,77]. Both groups (control and experimental) were compared, and no statistical differences between students were found prior to the experience, so the volunteer group was comparable to its counterpart [78]. In short, before the experience, the experimental students' math marks did not differ from those of the control students. Overall, any change in this variable may be attributed to participation (or not) in the new learning methodology (independent variable).
The competition had two categories, and the volunteers chose Mission Space Lab (https://astro-pi.org/mission-space-lab/, accessed on 24 August 2021) to study life on Earth's surface. They agreed to make a proposal related to environmental problems, using geolocation and remote sensing data obtained from the International Space Station (ISS). Students first organized themselves in pairs and afterwards formed two teams, A and B, of six students each, following their own criteria of affinity and friendship. Team A had four students in secondary education (one female) and two female students in high school. Team B had three students in secondary education (one female) and three female students in high school.

2.2. Organization of the Proposal

The rules of the contest are described on the website https://www.esa.int/Education/AstroPI/European_Astro_Pi_Challenge_is_back_for_its_2018_2019_edition (accessed on 24 August 2021). It was compulsory to use the infrared camera, without the possibility of targeting a predetermined location in advance. In addition, the spatial resolution does not allow objects such as buildings, cars, or people to be seen. The camera operated through a computer called Izzy (Astro Pi computer), fitted with several sensors, such as a magnetometer, an accelerometer, and a hygrometer.
In this category, the phases of the contest are as follows:
(1)
Design (selective): the idea of the experiment; it is valued according to its viability, scientific value, and creativity.
(2)
Creation: the design and development of a computer program to run the experiment onboard ISS.
(3)
Validation (selective): ESA staff validate the program to ensure that it will not fail when executed on Izzy and it is given flight status.
(4)
Analysis (selective): the teams’ analyses of the data collected on ISS; specifically, the submission of a report that ESA staff evaluate.
Regarding the organization of teaching, once a week, the experimental students were separated from their counterparts to work on different activities related to the contest. Meanwhile, the control students worked on math in an ordinary classroom, doing reinforcement tasks that did not include new math concepts. It is important to highlight that both groups spent the same contact time with math teachers. Specifically, the overall time dedicated to the experience was 29 sessions of 50 min each.

2.2.1. Approach to the Problem

Firstly, to gain prior knowledge about satellite data and their possibilities, there were two sessions with a university professor specializing in remote sensing applied to the environment. In the first session, he explained the basics of remote sensing and the light spectrum, reflectivity, and spectral indices useful for determining different characteristics of the environment. In the second session, he showed real cases to contextualize the new learning and provide feasible ideas to inspire new ones (armed with those concepts and a set of related examples, the students could research a broad set of environmental issues, such as wildland fires, vegetation stress, and the evolution of green cover, among others).
After brainstorming and analyzing feasible alternatives, they agreed on two topics: (1) studying oceans as carbon sinks and (2) the health tendency of forests.
Regarding the first idea, it was a question of studying the health tendency of plant life over continental shelves, which helps monitor the health of the seas through phytoplankton. Team A's hypothesis was that marine life was being affected by pollution. Regarding the second idea, the research question was: Is the health tendency of forests in urban areas (or close to them) equal to that of remote woodlands? Initially, Team B's hypothesis was that forests in remote areas evolved better than those closer to human activity.
One of the essential factors that helped in choosing both experiments was their parallel software requirements, since they would share a large part of the program code. Thus, Teams A and B could collaborate at both the design and implementation stages. When the ideas were clear, the teams wrote a proposal with the help of their teachers and submitted it to ESA staff (Phase 1). This phase was selective; since both proposals were accepted, the teams moved on to the next phase of the competition: Creation (Phase 2).

2.2.2. Design of the Algorithm

Firstly, students had to design the algorithm. It was not yet necessary to express this solution in Python, the programming language in which the experiment would be run on the Astro Pi computer on board the ISS. The students had no previous knowledge of software development. Therefore, the math teacher chose the required concepts and their distribution. The selection of concepts and their order was the following:
(1)
Variables, arrays, matrices, and basic mathematical utilities.
(2)
Conditions: If-else-then statements.
(3)
Loops.
(4)
Objects and methods.
(5)
Access and management of the Pi camera.
(6)
Access and management of geolocation.
(7)
Data storage: generation and access to spreadsheets.
(8)
Image analysis: how to obtain the pixels of a photograph.
A set of worked examples was provided for each of the basic elements of the programming language (see examples in Appendix A). For instance, the teacher did not give the description of a variable, but provided different input-output worked examples, such as the following Code example 1.
Code Example 1. Variables I.
INPUT (Python code)
a = 3
b = "Richard"
a = 5
b = "Parker"
print(a)
print(b)
OUTPUT
5
Parker
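Conditions and loops (topics (2) and (3)) were introduced in the same input-output fashion. The following short example is not taken from the original materials; it is only an illustrative sketch of the format, showing how a loop might have been presented:
INPUT (Python code)
total = 0
for k in range(1, 5):
    total = total + k   # adds 1, 2, 3 and 4
print(total)
OUTPUT
10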
Other concepts, such as procedures involving access to data storage or the Pi camera, were provided directly to the students as Python subroutines. Therefore, if they wanted to access the Pi Camera, they just executed a subroutine (see Appendix A, Code example 3).
Teachers decided to teach the first four sets of concepts mentioned above through worked examples, and the last four by providing the descriptions of the concepts themselves.
At the end of each topic, students applied the new knowledge to some practical exercises. For example, after teaching content (1), they had to define a variable called overall_time, initialize it to 3 h, and then subtract 5 min. Students had access to an emulator to check both the correctness of the language and the expected execution.
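As an illustration, a minimal sketch of the kind of answer expected for that exercise is shown below; the choice of minutes as the working unit is an assumption, since the original statement does not fix it.
overall_time = 3 * 60              # initialize to 3 h, expressed in minutes
overall_time = overall_time - 5    # subtract 5 min
print(overall_time)                # expected output: 175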
Afterwards, students worked together to model a first approach to the algorithm using diagrams and/or pseudocode. They had to consider a set of constraints previously established by Mission Space Lab: (a) an overall time limit of 3 h (T), (b) data storage of 3 GB maximum, (c) the display of a message on the LED matrix of the Astro Pi computer, and (d) the strict use of libraries categorized as flight status.
Thus, through a brainstorming process, students proposed ideas and analyzed their feasibility. The outcome was organized and materialized through the following algorithm (pseudocode):
(1)
Acquire a frame.
(2)
Set the geographical coordinates.
(3)
Get the acquisition time.
(4)
Store the data recorded in the previous steps.
(5)
Show a message on the LED screen.
(6)
Subtract a period of t = 5 min from the overall limit T.
(7)
Sleep for a period of t = 5 min and return to the beginning until the overall time T = 3 h is reached.
The program underwent many changes until it ran properly on an Astro Pi computer (ESA sent two Astro Pi computers to the teams when they passed Phase 1). Algorithm A1 (see Appendix A) shows a standard solution.
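For illustration, the following is a minimal Python sketch of such a solution. It is not the teams' actual program: it assumes the picamera and sense_hat libraries available on the Astro Pi, the file names and LED message are hypothetical, and the ISS geolocation call is reduced to a placeholder.
from datetime import datetime, timedelta
from time import sleep
from picamera import PiCamera
from sense_hat import SenseHat

def get_coordinates():
    # Placeholder: the real experiment used a flight-status library to read the ISS position.
    return None, None

camera = PiCamera()
camera.resolution = (1296, 972)
sense = SenseHat()
records = []

start = datetime.now()
overall_limit = timedelta(hours=3)   # constraint (a): overall time T = 3 h
period = timedelta(minutes=5)        # t = 5 min between acquisitions

while datetime.now() - start < overall_limit:
    timestamp = datetime.now()                                         # (3) acquisition time
    camera.capture("frame_" + timestamp.strftime("%H%M%S") + ".jpg")   # (1) acquire a frame
    lat, lon = get_coordinates()                                       # (2) geographical coordinates
    records.append((timestamp, lat, lon))                              # (4) store the data (the real program wrote a spreadsheet)
    sense.show_message("Hello from the ISS")                           # (5) message on the LED matrix (text is hypothetical)
    sleep(period.total_seconds())                                      # (6)-(7) wait t = 5 min and loop until T is reached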
At this point, Teams A and B adapted this basic program to their specific goals: sea surfaces and vegetation cover. In this process, to ensure the proper running of the program aboard the ISS, students located and analyzed images of similar characteristics. Considering the reflectivity of the RGB bands and guided by their teachers, they devised a strategy to recognize land or sea frames. Namely, the algorithm used the blue/red ratio as a criterion. Code example 4 in Appendix A summarizes their approach.
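For readers who prefer runnable code, the following is a minimal Python sketch of that blue/red criterion using Pillow; the sample of ten pixels and the 1.1 threshold follow Code example 4 in Appendix A, while the file name and the pixel-sampling step are illustrative assumptions.
from PIL import Image

def categorize(path, threshold=1.1, n_samples=10):
    # Sample a handful of pixels and compare accumulated blue vs. red intensity.
    img = Image.open(path).convert("RGB")
    pixels = list(img.getdata())
    step = max(1, len(pixels) // n_samples)
    sample = pixels[::step][:n_samples]
    total_red = sum(r for r, g, b in sample)
    total_blue = sum(b for r, g, b in sample)
    return "SEA" if total_blue / total_red > threshold else "LAND"

print(categorize("frame_0001.jpg"))  # hypothetical file name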
Finally, each team submitted the generated program for evaluation by ESA staff; this constituted the third phase, which was also selective. Since ESA's software tests validated the submitted programs, Teams A and B qualified for Phase 4.

2.2.3. Final Report

The last phase of the competition was the analysis of the data collected during the experiment. It was compulsory to write a report divided into the following parts: (1) Introduction, (2) Method, (3) Results, and (4) Conclusion.
Teachers reminded the students about spectral indices, such as NDVI, before moving on to data analysis. In addition, they explained how to obtain images of the Earth's surface or draw perimeters using Google Maps (https://maps.google.com/, accessed on 24 August 2021) and/or Google Earth (https://earth.google.com/web/, accessed on 24 August 2021). Teachers directly provided the description, in terms of Maps and Earth commands, of how to draw a perimeter, either by means of video tutorials or by reproducing the task themselves. Similarly, they explained how to export perimeters to EO Browser (https://www.sentinel-hub.com/explore/eobrowser/, accessed on 24 August 2021) and how to obtain the tendency of those areas for a specific spectral index.
Specifically, the perimeter of the Earth's surface area was drawn using Google Earth, which allows the user to draw a polygon on the surface of the Earth and export it in '.kml' format. EO Browser can import '.kml' files and recognizes the enclosed perimeter as an area of study. Thus, for that study region, EO Browser shows the tendency of a given spectral index (chart icon; https://www.sentinel-hub.com/explore/eobrowser/user-guide/, accessed on 24 August 2021) for a certain period (Figure 1). Teachers gave a full description of how to perform this task for any given region in '.kml' format.
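For reference, a minimal sketch of the kind of '.kml' polygon exported from Google Earth and imported into EO Browser is shown below; the coordinates (longitude, latitude, altitude) are hypothetical and only illustrate the structure of the file.
<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>Study area (illustrative)</name>
    <Polygon>
      <outerBoundaryIs>
        <LinearRing>
          <coordinates>
            -119.65,37.70,0 -119.50,37.70,0 -119.50,37.80,0 -119.65,37.80,0 -119.65,37.70,0
          </coordinates>
        </LinearRing>
      </outerBoundaryIs>
    </Polygon>
  </Placemark>
</kml>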
The interpretation of such charts was introduced through a set of worked examples. The teachers provided the students with a spectral index graph as input, and the output was a natural process occurring throughout the given period: a wildfire, snowfall, or spring bloom, among others. For instance, teachers showed Figure 1 as input, and "severe wildfire" as the output for that input.
Had it not been for EO Browser, the study of the tendency of spectral indices, using strictly numerical data, would have been almost unfeasible. EO Browser helped the students carry out their remote sensing analysis effectively and with ease. Since it was not necessary to know in detail how those graphs were created, they focused on their interpretation.
Each team analyzed the data obtained from the ISS, armed with the basics of geolocation and remote sensing. Team A identified a frame that showed the mouths of several rivers on the coast of Malaysia. The location of the picture was identified through its coordinates and confirmed using Google Maps. Team A detected photosynthetic activity in that area through EO Browser and studied its tendency over a period of several years. Their results were inconclusive regarding the health tendency of the sea.
Team B received just two valid frames from the ISS, since most of them were taken at night or were shrouded in clouds. The first usable photograph, Figure 2, showed Yosemite Valley in the Sierra Nevada (USA), which stood for a forest sample away from human activity. The other valid photograph showed a large part of the border between Canada and the USA. The students spotted the Lost River Environmental Reserve (Canada), which illustrated a forest close to human activity, since it is surrounded by intensive farming.
Team B obtained, for both areas of study (Yosemite and Lost River), the tendency over the past five years of the following spectral indices: NDVI, Moisture Index (MI), Normalized Difference Water Index (NDWI), and Normalized Difference Snow Index (NDSI) (see Appendix A for more information).
Some of the graphs obtained, such as the NDVI tendency, showed no clear trends. The students' first hypothesis was that, at such spatial and spectral resolution, changes were not significant enough to be distinguished. The second hypothesis was that there were no changes in the ecosystem. Thus, further research would be necessary. Interested readers can find more information in the Supplementary Materials and in the students' report: https://esamultimedia.esa.int/docs/edu/AstroPi_Go.pdf (accessed on 24 August 2021).
However, for the previous two years, the NDWI of both samples indicated that, despite seasonal changes, the minimums were decreasing. Team B concluded that there was a moisture deficit in both areas and even backed their findings with scientific surveys and press releases.
The teams wrote their reports and submitted them to receive ESA feedback; this was the end of the challenge for the teams involved.

2.3. Data Collection

To assess this study, we used the following combination of quantitative and qualitative instruments:
(1) ESA's evaluations and feedback: after every phase, ESA staff gave feedback and determined whether the team qualified or the proposal was rejected. We employed this external assessment as evidence of the students' skills and competence acquisition.
(2) Program execution: the math teacher evaluated the correctness of the developed software and its use of mathematical language. The program was assessed using the following criteria, each evaluated from 1 (minimum) to 5 (maximum) points:
(a)
Eligibility: the software meets the requirements previously set by ESA, such as showing a message on the LED matrix or using the infrared camera. Scored according to whether the instructions included meet those requirements totally, partially, or not at all.
(b)
Efficiency: less is more; the use of mathematical language to reduce the number of instructions is an asset. Scored according to the length of the overall code employed to perform the algorithm: longer code implies fewer points. The benchmark was the smallest program that the teacher was able to write.
(c)
Clearness: the program should be well structured, with sufficiently explanatory comments. Scored according to the organization of the code: if-then-else blocks properly aligned, variables clearly named and declared, among others.
(3) Students' grades: math assessments made it possible to analyze the impact of the experience, since participants took the same final test as their classmates. A statistical analysis compares the control and experimental groups' achievement in math over a period of years.
(4) Students' and families' opinions: the reward for the winning teams of Mission Space Lab 2018/19 was a webinar with ESA astronaut Frank De Winne. The webinar took place on 18 June 2019, and each team had the chance to ask Frank two questions related to his expertise and experiences as an astronaut. Teachers asked for and recorded the students' and families' opinions about the experience.

2.4. Data Analysis

Students' marks were contrasted using non-parametric statistics due to the low number of participants. The Mann-Whitney U test was used to compare the control and experimental groups (independent groups), and the Wilcoxon test for pre-post contrasts within the same group (paired samples). When there were more than two samples to contrast, the Friedman test (paired groups) was used (χ², chi-squared). For post hoc analysis, Bonferroni's correction was applied.
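As an illustration, a minimal sketch of these contrasts with SciPy is shown below; the mark lists are hypothetical placeholders and not the study's data.
from scipy import stats

control = [5.0, 6.5, 4.0, 7.0, 5.5, 6.0]          # hypothetical final marks
experimental = [6.0, 7.5, 8.0, 6.5, 9.0, 7.0]

# Independent groups: Mann-Whitney U test
u_stat, p_u = stats.mannwhitneyu(control, experimental, alternative="two-sided")

# Paired samples (pre vs. post within the same group): Wilcoxon signed-rank test
pre = [5.5, 6.0, 7.0, 6.5, 8.0, 7.5]
post = experimental
w_stat, p_w = stats.wilcoxon(pre, post)

# Three repeated measures (e.g., marks in 2018, 2019 and 2020): Friedman test
chi2, p_f = stats.friedmanchisquare(pre, post, control)

alpha_post_hoc = 0.05 / 3   # Bonferroni correction for three pairwise post hoc contrasts
print(p_u, p_w, p_f, alpha_post_hoc)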
Interview comments were transcribed and analyzed using an inductive approach; there were no pre-established categories, but rather they emerged from the data [79,80]. The protocol followed the steps described in [81]: segmentation, coding, and category development. The analysis, segmentation, and codification of the questions allowed an objective categorization of the answers. The outcome of that process was a code list that classified the responses into topics. Finally, we calculated frequencies and percentages based on those topics.
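As an illustration of the last step, the frequencies and percentages of the coded topics can be computed as follows; the code list is a hypothetical placeholder, not the study's data.
from collections import Counter

codes = ["motivation", "workload", "usefulness of math", "motivation", "future career", "workload", "motivation"]
counts = Counter(codes)
total = sum(counts.values())
for topic, n in counts.most_common():
    print(f"{topic}: {n} ({100 * n / total:.1f}%)")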

3. Results

The action generated a set of outcomes that, as stated above, we assessed from four points of view.

3.1. ESA Assessment

The European Space Agency reported: “a record-breaking number of more than 12,500 people from all 22 ESA Member States, Canada, Slovenia and Malta took part in this year’s challenge across both Mission Space Lab and Mission Zero” (https://www.esa.int/Education/AstroPI/European_Astro_Pi_Challenge_Mission_Space_Lab_winners2, accessed on 24 August 2021).
In its different sequential phases, the results were:
Phase 1 (Design): ESA qualified 381 out of 471 teams from 22 ESA members plus Canada, Slovenia, and Malta.
Phase 2 (Creation): ESA selected 135 teams to participate in phase 3 (https://www.esa.int/Education/AstroPI/Astro_Pi_Mission_Space_Lab_381_teams_selected_for_Phase_2, accessed on 24 August 2021).
Phase 3 (Validation): Here, 82 teams qualified based “on their experiment quality, their code quality, and the feasibility of their experiment idea” (https://www.esa.int/Education/AstroPI/Astro_Pi_Mission_Space_Lab_-_135_teams_will_run_their_experiments_on_the_ISS, accessed on 24 August 2021).
Phase 4 (Analysis): 11 teams out of the original 471 were awarded prizes; "the teams were asked to submit a short scientific report to highlight their results and the conclusions from their experiments. The quality of the reports was truly impressive, showing a high level of scientific merit" (https://www.esa.int/Education/AstroPI/European_Astro_Pi_Challenge_Mission_Space_Lab_winners2, accessed on 24 August 2021). Team B was among the winners; Team A completed Phase 4 but did not win (Team A received a diploma since they submitted their final report).

3.2. Software Quality

As already mentioned, students worked in pairs and developed software to be executed on board the ISS to carry out their experiments. Table 1 shows their evaluations in Eligibility, Efficiency, and Clearness for each pair, before each team agreed on a common solution.
All pairs except one failed to properly consider the 3 h overall limit. Those pairs calculated the time necessary to obtain a frame and the geographical coordinates, and then repeated those instructions enough times to approximately meet the overall time limit, in a formative-assessment fashion. However, the execution of their programs aboard the ISS could differ significantly from their tests. That is why most of the pairs did not comply with the Eligibility and Efficiency criteria.
There was a pair in Team B that, using a loop, got a solution similar to Algorithm 1.
Algorithm 1 Basic functioning for running the experiment (Team B)
Start_time = get_time_now()
Now_time = get_time_now()
Limit_time = 3 h
While (Now_time < Start_time + Limit_time):
  Get frame
  Get geographical coordinates
  Sleep 3 min
  Now_time = get_time_now()
End While

3.3. Students’ Marks

Table 2 shows the analysis of students' outcomes on their final tests, contrasting the control and experimental groups before and after the experience, and a year later. As can be seen, before the experience, students had statistically similar results, but after the experience, the experimental group obtained better results than those who did not participate, with a medium effect size. These differences were maintained even a year after the experience.
Accordingly, the contrasts within each group, both experimental and control, showed statistical differences after applying Friedman's test for repeated measures (experimental: χ²(2) = 7.090, p = 0.028; control: χ²(2) = 14.174, p < 0.001). The post hoc analysis is shown in Table 3.
It shows that the students in the experimental group significantly increased their results compared with the year before the experience (see Table 2 for average marks), and this improvement was maintained a year later. On the contrary, the control group's marks decreased significantly in 2020 with respect to 2018. Regarding the contrast with 2019, the effect size is even higher; however, using Bonferroni's correction (0.05/3 = 0.016), this difference is not statistically significant, although it may be practically relevant.

3.4. Students and Families Interviews

Table 4 shows the categorization of the ideas expressed by the students and their families in the final webinar.
For instance, student B5 (secondary) commented: "It was like an adventure. I was quite nervous about ESA's feedback." His mother observed: "We were delighted with the experience and observed that student B5 was more and more motivated after every qualifying round". In this regard, student B6 (secondary) observed: "We have worked a lot, but it was worth it. Now, I am pretty sure that I want to be a computer engineer". Furthermore, student B2 (high school) declared: "I never thought that math could be so useful. I didn't expect that we could employ it to develop programs and analyze data from the ISS".
About their future, student A6 (secondary) expressed: "I enjoyed the experience a lot; now I am thinking about which degree I should study: Mathematics or Physics". Her mother pointed out: "We wanted her to participate in the experience. She is very good at math and physics, but now she is even more motivated. We are extraordinarily proud of her performance but, above all, we are proud that she is quite a hard worker!". There were also comments such as: "We are very sorry, but next year she will not be able to join these activities. She must focus on the university entrance tests, since she wants to get into medical school".

4. Discussion

The results indicate that the students involved in this project obtained better results than their counterparts. This coincides with similar STEM educational actions [10,11], but differs from other experiences whose outcomes were not so positive [12,82,83]. Perhaps one reason is that the action was fully guided [84], in contrast with Dewey's philosophy [85]; the former relies on teacher guidance and direct instruction of key concepts [86], while the latter is based on students' self-learning, i.e., research-based learning (RBL) [87]. The main difference between PjBL (or PBL) implementations lies mainly in the role of the teacher. It is noteworthy that the action was fully guided [84], in contrast with the perspectives of Ausubel [88] or Bruner [27], which are based on the learner's curiosity.
Concerning the final product, and considering that participants were beginners in fields such as software development or the electromagnetic spectrum, among others, we should highlight the effectiveness of the methodology employed: PjBL under direct instructional guidance through worked examples.
This kind of methodology is backed by experiments with humans [89], but it is also supported by machine teaching, which is defined purely in terms of algorithms and the optimization of the necessary information within the teacher-learner protocol [90]. Such optimization of the teacher-learner protocol helps teachers and learners choose the best way of dealing with a given concept: either by means of worked examples or by providing the concept description [91]. For instance, some concepts are more effectively learned through worked examples, such as the one shown in Section 2.2.2 related to 'variables' in programming languages. On the other hand, sometimes a concept description is simply employed as a tool. This is the case, for instance, of the spectral index charts in Section 2.2.3 and their usefulness in identifying natural phenomena. The students exposed to such a procedure increased their performance in math.
On another note, it is worth highlighting the teachers' background in STEM literacy; it was extremely useful throughout the action and supports the importance of improving teacher education [92,93].
Although it was not specifically analyzed in the results, the only weakness highlighted by families and students was the increase in workload. Some research has already warned about this issue [94] because it has serious negative consequences for students [95]. Furthermore, this extra work affects teachers: some studies estimate that preparing this type of lesson implies 75% more work than ordinary lessons [96]. Therefore, it is necessary to consider this amount of work when planning the teaching. Consequently, it would be necessary to measure this time during the experience in order not to overload students and teachers and to avoid harmful consequences.

5. Conclusions

After analyzing the results, the conclusions of the study are the following:
(1) Students’ performance in math improves through STEM supported by PjBL with direct instructional guidance, within the context of real environmental problems.
(2) Students were able to translate mathematical learning into real world applications. Conversely, real-life problems were the entry point to increase mathematical knowledge. Teacher guidance may be key for such a two-way trip.
(3) The quality and interest of these educational actions and the product of the students' skills were recognized by external evaluations (ESA's contest in this case), which provided additional value as a stimulus for increasing students' learning in mathematics.
(4) The feedback from the students and families confirms that these actions are highly valued and motivating for them, as well as raising the visibility of the importance of mathematics. The increase in students' and teachers' workload was the only weakness identified.
The implication of this case study is that, using real environmental problems in a PjBL environment with proper teacher guidance, students are able not only to learn more mathematics than their classmates, but also to develop a useful product recognized by external entities.
However, these conclusions must be taken with caution due to the nature of the study itself, which concerns a particular locality and specific pupils with distinct characteristics. In addition, the sample is small and non-random, so further studies along this line should be carried out to confirm the results.
Such realistic settings are key in educational studies in the sciences, since they aim to study learning in realistic contexts rather than in artificial experiments [97,98]. As Brown [99] illustrates, we looked for "improved cognitive productivity under the control of the learners, eventually with minimal expense, and with a theoretical rationale for why things work". That is why we used common environments and approaches. As a result, not every variable involved (such as motivation) could be controlled. This gives verisimilitude to our study and an opportunity for the conclusions to be transferred to other teacher education contexts.

Supplementary Materials

The following are available online at https://www.mdpi.com/article/10.3390/math9172066/s1, Reports Teams A and B; Spreadsheets S1, S2: Final Math Tests, Assessments Programs; Data from ISS Z1: zipped data experiments aboard ISS.

Author Contributions

Conceptualization, M.G.-P.; methodology, M.G.-P. and J.-R.R.-G.; writing—original draft preparation, M.G.-P.; writing—review and editing, M.G.-P. and J.-R.R.-G.; supervision, M.G.-P. and J.-R.R.-G. All authors have read and agreed to the published version of the manuscript.

Funding

This work has been partially financed by funds from the University of Castilla-La Mancha’s own research program, by the Castilla-La Mancha Regional Administration under grant SBPLY/19/180501/000278 and by the European Regional Development Fund (ERDF), within its call for grants to research groups: Ethnobiology, Botany and Education Research Group—Botanic Institute University of Castilla-La Mancha under grant number 2021-GRIN-30982 and Laboratory for the integration of ICT in the classroom under grant number 2021-GRIN-31060.

Institutional Review Board Statement

Ethical review and approval were waived for this study due to the formal protocol established in the Educational Project of the Centre, as well as in the Annual General Programming and in the Department of Mathematics of the Tomás Navarro Tomás high school (Albacete, Spain).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available in the Supplementary Materials.

Acknowledgments

We would like to thank the ESA Education Team (ESERO) for their commitment to STEM Education and their excellent educational initiatives. We are also very grateful to the entire educational community of the Tomás Navarro Tomás high school (Albacete, Spain) for their support.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

In this section, we include materials that illustrate some of the code examples employed with the experimental group. In addition, at the end, the reader will find a summary of certain basic concepts related to the electromagnetic spectrum that were also provided to the experimental group.

Appendix A.1. Worked Examples and Algorithms

Regarding the basics of programming, the teacher provided code examples like the ones in this section. For instance, Code example 2 extended the use of variables stated in Code example 1 (see Section 2.2.2).
Code Example 2. Variables II
INPUT (Python code)
a = 3
b = "Richard"
c = 10
print("a:", a)
print("b: " + b)
print("c:", c)
a = 5
b = b + " " + "Parker"
c = a + c
print("a:", a)
print("b: " + b)
print("c:", c)
OUTPUT
a: 3
b: Richard
c: 10
a: 5
b: Richard Parker
c: 15
Code example 3 directly provided a subroutine to take a picture.
Code Example 3. Subroutine for pictures
# Getting a PiCamera reference (the import is needed for the snippet to run on its own)
from picamera import PiCamera

cam = PiCamera()
cam.resolution = (1296, 972)
cam.capture("picture_where_the_image_is_stored.jpg")  # file where the image is stored
Algorithm A1 shows a standard solution to run the experiment.
Algorithm A1. Standard solution for experiment
Overall Time = T
Frequency = f
Time = 0
While Time < T
  Frame shot and storage
  Geographical coordinates retrieval and storage
  Print 'Hello, how are you today?' on the LED screen matrix
  Time = Time + f
End While
Code example 4 shows an attempt to categorize images (SEA/LAND) based on the image’s RGB information stored in its pixels.
Code Example 4. Land/Sea categorization
Total_red = 0
Total_blue = 0
Let pixel_1, pixel_2, ..., pixel_10 be the elements of pixels
Total_red = red pixel_1 + ... + red pixel_10
Total_blue = blue pixel_1 + ... + blue pixel_10
If Total_blue/Total_red > 1.1
  then picture is categorized as SEA
  else picture is LAND

Appendix A.2. Electromagnetic Spectrum

Visible light corresponds to a relatively narrow interval of wavelengths in the electromagnetic spectrum. That interval is divided into three bands: red, green, and blue. Photosynthesis mainly uses visible light, while other wavelengths of the spectrum, such as near infrared (NIR), are largely reflected. That is why healthy vegetation absorbs a substantial portion of the visible light it receives while reflecting NIR, whereas senescent or stressed vegetation absorbs more NIR. These two simple characteristics are used to calculate spectral indices such as the Normalized Difference Vegetation Index (NDVI), a ratio given by the following formula:
NDVI = (NIR − RED)/(NIR + RED),
where NIR and RED are the reflectance values measured in the near-infrared and red bands, respectively.
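As an illustration, NDVI can be computed per pixel as follows; the band values are hypothetical reflectances, not data from the experiment.
import numpy as np

def ndvi(nir, red, eps=1e-9):
    # Elementwise (NIR - RED) / (NIR + RED); eps avoids division by zero.
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

print(ndvi([0.50, 0.45], [0.08, 0.30]))  # healthy vegetation yields values closer to 1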
There are other useful spectral indices (ratios), such as the Moisture Index (MI), the Normalized Difference Water Index (NDWI), the Normalized Difference Snow Index (NDSI), or the Normalized Burn Ratio (NBR). With these indices, it is possible to research wildland fires (NBR), vegetation stress (NDVI), and water availability (MI, NDWI, or NDSI), among others.

References

  1. Burgin, M. Ideas of Plato in the Context of Contemporary Science and Mathematics. Athens J. Humanit. Arts 2017, 4, 161–182. [Google Scholar] [CrossRef]
  2. Wilson, P. What the applicability of mathematics says about its philosophy. In Technology and Mathematics; Hansson, S., Ed.; Philosophy of Engineering and Technology; Springer: Cham, Switzerland, 2018; Volume 30, pp. 345–373. ISBN 978-3-319-93778-6. [Google Scholar]
  3. Keener, J.P. Principles of Applied Mathematics: Transformation and Approximation; Addison–Wesley: Reading, MA, USA, 1988; ISBN 978-0-429-98314-6. [Google Scholar]
  4. Das, K. Action Research On Mathematics Phobia Among Secondary School Students. Int. J. Indones. Educ. Teach. 2020, 4, 239–250. [Google Scholar] [CrossRef]
  5. Khoshaim, H.B. Mathematics Teaching Using Word-Problems: Is It a Phobia! Int. J. Instr. 2020, 13, 855–868. [Google Scholar] [CrossRef]
  6. Luttenberger, S.; Wimmer, S.; Paechter, M. Spotlight on Math Anxiety. Psychol. Res. Behav. Manag. 2018, 11, 311–322. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  7. Xu, D.; Dadgar, M. How Effective Are Community College Remedial Math Courses for Students with the Lowest Math Skills? Community Coll. Rev. 2018, 46, 62–81. [Google Scholar] [CrossRef]
  8. Margot, K.C.; Kettler, T. Teachers’ Perception of STEM Integration and Education: A Systematic Literature Review. Int. J. STEM Educ. 2019, 6, 1–16. [Google Scholar] [CrossRef] [Green Version]
  9. Wahono, B.; Lin, P.-L.; Chang, C.-Y. Evidence of STEM Enactment Effectiveness in Asian Student Learning Outcomes. Int. J. STEM Educ. 2020, 7, 1–18. [Google Scholar] [CrossRef]
  10. Sorby, S.; Veurink, N.; Streiner, S. Does Spatial Skills Instruction Improve STEM Outcomes? The Answer Is ‘Yes’. Learn. Individ. Differ. 2018, 67, 209–222. [Google Scholar] [CrossRef]
  11. Saw, G. The Impact of Inclusive STEM High Schools on Student Outcomes: A Statewide Longitudinal Evaluation of Texas STEM Academies. Int. J. Sci. Math. Educ. 2019, 17, 1445–1457. [Google Scholar] [CrossRef]
  12. Siregar, N.C.; Rosli, R.; Maat, S.M.; Capraro, M.M. The Effect of Science, Technology, Engineering and Mathematics (STEM) Program on Students’ Achievement in Mathematics: A Meta-Analysis. Int. Electron. J. Math. Educ. 2019, 15, 1–12. [Google Scholar] [CrossRef] [Green Version]
  13. Bybee, R.W. What Is STEM Education? Science 2010, 329, 996. [Google Scholar] [CrossRef] [Green Version]
  14. Yata, C.; Ohtani, T.; Isobe, M. Conceptual Framework of STEM Based on Japanese Subject Principles. Int. J. STEM Educ. 2020, 7, 1–10. [Google Scholar] [CrossRef] [Green Version]
  15. Li, Y.; Wang, K.; Xiao, Y.; Froyd, J.E. Research and Trends in STEM Education: A Systematic Review of Journal Publications. Int. J. STEM Educ. 2020, 7, 1–16. [Google Scholar] [CrossRef] [Green Version]
  16. McDonald, C.V. STEM Education: A Review of the Contribution of the Disciplines of Science, Technology, Engineering and Mathematics. Sci. Educ. Int. 2016, 27, 530–569. [Google Scholar]
  17. Martín-Páez, T.; Aguilera, D.; Perales-Palacios, F.J.; Vílchez-González, J.M. What Are We Talking about When We Talk about STEM Education? A Review of Literature. Sci. Educ. 2019, 103, 799–822. [Google Scholar] [CrossRef]
  18. Maass, K.; Geiger, V.; Romero Ariza, M.; Goos, M. The Role of Mathematics in Interdisciplinary STEM Education. ZDM Math. Educ. 2019, 51, 869–884. [Google Scholar] [CrossRef]
  19. Gough, A.; Gough, N. Beyond Tinkering and Tailoring: Re-de/Signing Methodologies in STEM Education. Can. J. Sci. Math. Technol. Educ. 2018, 18, 284–290. [Google Scholar] [CrossRef]
  20. Scaradozzi, D.; Screpanti, L.; Cesaretti, L.; Storti, M.; Mazzieri, E. Implementation and Assessment Methodologies of Teachers’ Training Courses for STEM Activities. Technol. Knowl. Learning 2019, 24, 247–268. [Google Scholar] [CrossRef] [Green Version]
  21. Johnson, C.C.; Mohr-Schroeder, M.J.; Moore, T.J.; English, L.D. Handbook of Research on STEM Education; Routledge: London, UK, 2020; ISBN 978-0-367-07562-0. [Google Scholar]
  22. Aksela, M.; Haatainen, O. Project-Based Learning (PBL) in Practise: Active Teachers’ Views of Its’ Advantages and Challenges. In Proceedings of the Integrated Education for the Real World 5th International STEM in Education Conference Post-Conference Proceedings, Brisbane, Australia, 21–23 November 2018; Queensland University of Technology: Brisbane, Australia, 2018; pp. 9–16. Available online: https://stem-in-ed2018.com.au/wp-content/uploads/2019/02/5th-International-STEM-in-Education-Post-Conference-Proceedings-2018_1502.pdf (accessed on 24 August 2021).
  23. Kilpatrick, W.H. The Project Method. Teach. Coll. Rec. 1918, 19, 319–335. [Google Scholar]
  24. Hanney, R. Doing, Being, Becoming: A Historical Appraisal of the Modalities of Project-Based Learning. Teach. High. Educ. 2018, 23, 769–783. [Google Scholar] [CrossRef]
  25. Mutakinati, L.; Anwari, I.; Kumano, Y. Analysis of Students’ Critical Thinking Skill of Middle School through Stem Education Project-Based Learning. J. Pendidik. IPA Indones. 2018, 7, 54–65. [Google Scholar] [CrossRef]
  26. Krajcik, J.S.; Blumenfeld, P.C. Project-based learning. In The Cambridge Handbook of the Learning Sciences; Cambridge Handbooks in Psychology; Cambridge University Press: Cambridge, UK, 2006; pp. 317–334. ISBN 978-1-107-03325-2. [Google Scholar]
  27. Bruner, J.S. The Act of Discovery. Harv. Educ. Rev. 1961, 31, 21–32. [Google Scholar]
  28. Papert, S.A. Mindstorms: Children, Computers, and Powerful Ideas; Harvester Studies in Cognitive Science; Harvester Press: Sussex, UK, 1980; ISBN 978-0-7108-0472-3. [Google Scholar]
  29. Steffe, L.P.; Gale, J.E. Constructivism in Education; Lawrence Erlbaum Associates, Inc.: Mahwah, NJ, USA, 1995; ISBN 978-0-8058-1096-7. [Google Scholar]
  30. Cronbach, L.J.; Snow, R.E. Aptitudes and Instructional Methods: A Handbook for Research on Interaction; Irvington Publishers: New York, NY, USA, 1977; ISBN 978-0-470-15066-5. [Google Scholar]
  31. Klahr, D.; Nigam, M. The Equivalence of Learning Paths in Early Science Instruction: Effects of Direct Instruction and Discovery Learning. Psychol. Sci. 2004, 15, 661–667. [Google Scholar] [CrossRef]
  32. Sweller, J. Evolution of human cognitive architecture. In Psychology of Learning and Motivation; Elsevier, Ed.; Academic Press: San Diego, CA, USA, 2003; Volume 43, pp. 215–266. ISBN 978-0-08-052275-3. [Google Scholar]
  33. Mayer, R.E. Should There Be a Three-Strikes Rule against Pure Discovery Learning? Am. Psychol. 2004, 59, 14–19. [Google Scholar] [CrossRef] [Green Version]
  34. Cuspilici, A.; Monforte, P.; Ragusa, M.A. Study of Saharan Dust Influence on PM10 Measures in Sicily from 2013 to 2015. Ecol. Indic. 2017, 76, 297–303. [Google Scholar] [CrossRef]
  35. Hernández-Sabaté, A.; Albarracín, L.; Sánchez, F.J. Graph-Based Problem Explorer: A Software Tool to Support Algorithm Design Learning while Solving the Salesperson Problem. Mathematics 2020, 8, 1595. [Google Scholar] [CrossRef]
  36. Lamb, J.; Marimekala, S.K.V. STEM Projects Using Green Healthcare, Green IT, and Climate Change. In Proceedings of the 2018 9th IEEE Annual Ubiquitous Computing, Electronics & Mobile Communication Conference (UEMCON), New York, NY, USA, 8–10 November 2018; IEEE: New York, NY, USA, 2018; pp. 95–101. [Google Scholar]
  37. Sweller, J. Cognitive load theory: Recent theoretical advances. In Cognitive Load Theory; Plass, J.L., Moreno, R., Brünken, R., Eds.; Cambridge University Press: Cambridge, UK, 2010; pp. 29–47. ISBN 978-0-511-84474-4. [Google Scholar]
  38. Helle, L.; Tynjälä, P.; Olkinuora, E. Project-Based Learning in Post-Secondary Education–Theory, Practice and Rubber Sling Shots. High. Educ. 2006, 51, 287–314. [Google Scholar] [CrossRef]
  39. Santoso, A.; Primandiri, P.; Zubaidah, S.; Amin, M. The Development of Students’ Worksheets Using Project Based Learning (PjBL) in Improving Higher Order Thinking Skills (HOTs) and Time Management Skills of Students. In Proceedings of the Journal of Physics: Conference Series; IOP Publishing: Jawa Barat, Indonesia, 2020; Volume 1806, pp. 1–5. [Google Scholar]
  40. Shin, M.-H. Effects of Project-Based Learning on Students’ Motivation and Self-Efficacy. Engl. Teach. 2018, 73, 95–114. [Google Scholar] [CrossRef]
  41. Kirschner, P.; Sweller, J.; Clark, R.E. Why Minimal Guidance during Instruction Does Not Work: An Analysis of the Failure of Constructivist, Discovery, Problem-Based, Experiential, and Inquiry-Based Teaching. Educ. Psychol. 2006, 41, 75–86. [Google Scholar] [CrossRef]
  42. Ferrero, M.; Vadillo, M.A.; León, S.P. Is Project-Based Learning Effective among Kindergarten and Elementary Students? A Systematic Review. PLoS ONE 2021, 16, e0249627. [Google Scholar] [CrossRef]
  43. Chen, C.-H.; Yang, Y.-C. Revisiting the Effects of Project-Based Learning on Students’ Academic Achievement: A Meta-Analysis Investigating Moderators. Educ. Res. Rev. 2019, 26, 71–81. [Google Scholar] [CrossRef]
  44. Balemen, N.; Keskin, M.Ö. The Effectiveness of Project-Based Learning on Science Education: A Meta-Analysis Search. Int. Online J. Educ. Teach. 2018, 5, 849–865. [Google Scholar]
  45. Walker, A.; Leary, H. A Problem Based Learning Meta Analysis: Differences across Problem Types, Implementation Types, Disciplines, and Assessment Levels. Interdiscip. J. Probl.-Based Learn. 2009, 3, 6–28. [Google Scholar] [CrossRef]
  46. Chevalier, M.; Giang, C.; Piatti, A.; Mondada, F. Fostering Computational Thinking through Educational Robotics: A Model for Creative Computational Problem Solving. Int. J. STEM Educ. 2020, 7, 1–18. [Google Scholar] [CrossRef]
  47. Atmatzidou, S.; Demetriadis, S. Advancing Students’ Computational Thinking Skills through Educational Robotics: A Study on Age and Gender Relevant Differences. Robot. Auton. Syst. 2016, 75, 661–670. [Google Scholar] [CrossRef]
  48. Ioannou, A.; Makridou, E. Exploring the Potentials of Educational Robotics in the Development of Computational Thinking: A Summary of Current Research and Practical Proposal for Future Work. Educ. Inf. Technol. 2018, 23, 2531–2544. [Google Scholar] [CrossRef]
  49. Allina, B. The Development of STEAM Educational Policy to Promote Student Creativity and Social Empowerment. Arts Educ. Policy Rev. 2018, 119, 77–87. [Google Scholar] [CrossRef]
Figure 1. NDVI trend, displayed in EO Browser, for an area affected by a severe wildfire.
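The NDVI trend in Figure 1 follows the standard definition NDVI = (NIR − Red)/(NIR + Red). The following minimal sketch is illustrative only and not part of the teams' ESA submission: it shows how the index could be computed from two reflectance arrays (for instance, Sentinel-2 bands B08 for NIR and B04 for Red); the toy arrays are invented placeholders.

```python
# Illustrative sketch (not the teams' code): computing the Normalised Difference
# Vegetation Index, NDVI = (NIR - Red) / (NIR + Red), from two reflectance arrays.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Per-pixel NDVI in [-1, 1]; pixels where NIR + Red == 0 are returned as NaN."""
    nir = nir.astype(float)
    red = red.astype(float)
    denom = nir + red
    out = np.full_like(denom, np.nan)
    np.divide(nir - red, denom, out=out, where=denom != 0)
    return out

# Toy 2x2 example: higher NDVI values indicate denser, healthier vegetation.
nir_band = np.array([[0.45, 0.50], [0.30, 0.05]])
red_band = np.array([[0.10, 0.08], [0.20, 0.04]])
print(ndvi(nir_band, red_band))
```

Values close to 1 indicate dense, healthy vegetation, whereas values near zero or below correspond to bare soil, burnt surfaces or water; this is why post-fire regeneration appears in Figure 1 as a gradually recovering NDVI curve.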
Figure 2. Photo of the Yosemite Valley in the Sierra Nevada (USA), taken from the ISS by Team B.
Table 1. Results of the program's evaluation (score range: 1–5).

| Team A | Grade | Eligibility | Efficiency | Clearness | Mean |
| --- | --- | --- | --- | --- | --- |
| A1 and A2 | High school | 4 | 3 | 5 | 4.00 |
| A3 and A4 | Secondary | 4 | 2 | 3 | 3.00 |
| A5 and A6 | Secondary | 4 | 3 | 4 | 3.67 |

| Team B | Grade | Eligibility | Efficiency | Clearness | Mean |
| --- | --- | --- | --- | --- | --- |
| B1 and B2 | High school | 4 | 3 | 5 | 4.00 |
| B3 and B4 | High school | 4 | 3 | 2 | 3.00 |
| B5 and B6 | Secondary | 5 | 5 | 5 | 5.00 |
Table 2. Analysis of students' assessments in final math tests.

| Year | Group | n | Mean | SD | U | Z | p | r |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 2018 | Control | 23 | 5.6 | 2.5 | 94.5 | −1.644 | 0.101 | |
| | Experimental | 12 | 7.1 | 2.5 | | | | |
| 2019 | Control | 24 | 5.6 | 2.5 | 314.5 | −2.190 | 0.028 | 0.37 |
| | Experimental | 12 | 7.9 | 2.0 | | | | |
| 2020 | Control | 23 | 4.9 | 2.4 | 242 | −1.974 | 0.048 | 0.34 |
| | Experimental | 11 | 8.1 | 2.1 | | | | |

Note: n = number of students in each group; SD = standard deviation; U = Mann–Whitney statistic; Z = comparison rank; p = significance value; r = effect size.
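For readers who wish to reproduce this kind of comparison, the sketch below shows how a Mann–Whitney U test and an effect size of the form r = |Z|/√N (with N the total number of students) could be computed with SciPy. It is a minimal illustration, not the authors' analysis script, and the mark lists are invented placeholders rather than study data.

```python
# Illustrative sketch (not the original analysis script): Mann-Whitney U test
# between two independent groups, with Z from the normal approximation and
# effect size r = |Z| / sqrt(N). The marks below are made-up placeholders.
import math
from scipy.stats import mannwhitneyu

control      = [5, 4, 6, 7, 5, 3, 6, 8, 5, 4]   # hypothetical final-exam marks
experimental = [8, 7, 9, 6, 8, 7, 9, 8]          # hypothetical final-exam marks

u_stat, p_value = mannwhitneyu(control, experimental, alternative="two-sided")

# Normal approximation of U to obtain Z (ignoring tie corrections).
n1, n2 = len(control), len(experimental)
mu_u = n1 * n2 / 2
sigma_u = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
z = (u_stat - mu_u) / sigma_u
r = abs(z) / math.sqrt(n1 + n2)   # one common effect-size convention

print(f"U = {u_stat:.1f}, Z = {z:.3f}, p = {p_value:.3f}, r = {r:.2f}")
```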
Table 3. Post-hoc analysis of students' marks in final math exams for the experimental and control groups.

| Group | Comparison | W | Z | p | r |
| --- | --- | --- | --- | --- | --- |
| Experimental | 2018 vs. 2019 | 1 | 2.701 | 0.007 | 0.78 |
| Experimental | 2018 vs. 2020 | 76 | −2.53 | 0.000 | 0.73 |
| Experimental | 2019 vs. 2020 | 15.5 | −0.83 | 0.618 | |
| Control | 2018 vs. 2019 | 54 | 1.065 | 0.284 | |
| Control | 2018 vs. 2020 | 38.5 | −2.48 | 0.013 | 0.51 |
| Control | 2019 vs. 2020 | 7 | −3.66 | 0.019 | 0.74 |

Note: W = Wilcoxon statistic; Z = comparison rank; p = significance value; r = effect size.
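Analogously, the paired year-to-year comparisons in Table 3 can be illustrated with a Wilcoxon signed-rank test. The sketch below is again a minimal, hypothetical example: the two mark lists stand for the same students in consecutive years and are not the study's data, and r = |Z|/√n (n pairs) is one common choice of effect size.

```python
# Illustrative sketch (not the authors' analysis): paired Wilcoxon signed-rank
# comparison between two school years, with Z from the normal approximation.
# The mark lists are made-up placeholders for the same students in both years.
import math
from scipy.stats import wilcoxon

marks_year1 = [5.0, 6.5, 7.0, 4.5, 8.0, 6.0, 5.5, 7.5, 6.0, 5.0, 7.0, 6.5]
marks_year2 = [6.5, 7.0, 8.5, 6.0, 9.0, 7.5, 6.0, 8.0, 7.5, 6.5, 8.0, 7.0]

w_stat, p_value = wilcoxon(marks_year1, marks_year2)

# Normal approximation of the signed-rank statistic to obtain Z,
# then r = |Z| / sqrt(n) with n the number of pairs.
n = len(marks_year1)
mu_w = n * (n + 1) / 4
sigma_w = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
z = (w_stat - mu_w) / sigma_w
r = abs(z) / math.sqrt(n)

print(f"W = {w_stat:.1f}, Z = {z:.3f}, p = {p_value:.3f}, r = {r:.2f}")
```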
Table 4. Categories obtained from the analysis of the interviews, with example comments from students and families.

| Category | % Students | % Families | Quotes |
| --- | --- | --- | --- |
| Motivation | 83.33 | 66.67 | B6 is pretty much motivated in the math class, and this is something quite surprising for us (B6's father). |
| It is worth | 58.33 | 75.00 | B5 worked a lot, but it was worth it, since it is something important for his curriculum (B5's father). |
| Utility | 100.00 | 75.00 | Math is useful to understand how nature works (A1). |
| Importance of math | 83.33 | 75.00 | It is surprising how useful basic math is for monitoring climate change (B3). |
| Enjoy | 58.33 | 75.00 | We always want more 'special' math sessions (B5). |
| Pride | 50.00 | 66.67 | A5 is a very good student, and we are very proud of his achievements (A5's father). |
| Workload | 66.67 | 91.67 | We worked hard, and the workload was, at some moments, exhausting (A2). |