Education in Programming and Mathematical Learning: Functionality of a Programming Language in Educational Processes
Round 1
Reviewer 1 Report
- In the ANOVA tests: As far as I know, ANOVA tests only indicate that the means in some groups are different from the means in other groups, without specifying which. One then uses another method (such as Fisher's LSD) to find which means are different from the others.
- I found it unusual that the paper does not mention the use of machine learning for enhancing the learning and teaching experience of complex tasks. For example, I was thinking of the use of recommender systems to gradually adapt the difficulty of a series of questions or tasks that the students must perform. Please also reference relevant methods which characterize the memory of the human operator with fractional systems, which could be used for the task of establishing memorization patterns in human-machine systems: "Communication and interaction with semiautonomous ground vehicles by force control steering".
- Table 3 could be presented in a more compact manner.
- English grammar and style should be improved. Some sentences feel unusual.
Author Response
Dear Reviewer,
Thank you very much for your review and your suggestions. We attach the revised article with the corrections made.
We are at your disposal.
Best regards
------
- The contributions made by the reviewers are indicated in black.
- The clarifications of the authors are in yellow.
Reviewer 1
Comments and Suggestions for Authors
- In the ANOVA tests: As far as I know, ANOVA tests only indicate that the means in some groups are different from the means in other groups, without specifying which. One then uses another method (such as Fisher's LSD) to find which means are different from the others.
We have carried out a comparison of means (independent-samples t test) between the experimental group and the control group for each item of the instrument, and analyzed whether the difference in means was statistically significant (p), also reporting the effect size (Cohen's d). Post hoc tests such as Fisher's LSD are used after an ANOVA when comparing a polytomous variable, not a dichotomous one, so they do not apply here.
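For concreteness, the comparison described above (independent-samples t test with Cohen's d as effect size) can be sketched as follows; the score lists are hypothetical and for illustration only, not data from the study:

```python
from math import sqrt
from statistics import mean, stdev

def pooled_sd(a, b):
    """Pooled standard deviation of two independent samples."""
    na, nb = len(a), len(b)
    return sqrt(((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2) / (na + nb - 2))

def t_statistic(a, b):
    """Student's t for two independent samples (equal-variance form)."""
    return (mean(a) - mean(b)) / (pooled_sd(a, b) * sqrt(1 / len(a) + 1 / len(b)))

def cohens_d(a, b):
    """Effect size: standardized difference between the two sample means."""
    return (mean(a) - mean(b)) / pooled_sd(a, b)

# Hypothetical item scores (0-10 scale) for an experimental and a control group
experimental = [7, 8, 9, 6, 8, 9, 7, 8]
control = [6, 7, 7, 5, 6, 8, 6, 7]

print(f"t = {t_statistic(experimental, control):.2f}")
print(f"d = {cohens_d(experimental, control):.2f}")
```

The p-value for the resulting t statistic would normally be obtained from a statistics package (e.g. SPSS or scipy); the sketch only shows how the test statistic and effect size relate to the group means and pooled variability.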
- I found it unusual that the paper does not mention the use of machine learning for enhancing the learning and teaching experience of complex tasks. For example, I was thinking of the use of recommender systems to gradually adapt the difficulty of a series of questions or tasks that the students must perform. Please also reference relevant methods which characterize the memory of the human operator with fractional systems, which could be used for the task of establishing memorization patterns in human-machine systems: "Communication and interaction with semiautonomous ground vehicles by force control steering".
A mention is included in the Discussion section.
- Table 3 could be presented in a more compact manner.
Tables 3, 4 and 5 are presented in a more compact manner.
- English grammar and style should be improved. Some sentences feel unusual.
The English grammar and style have been improved.
Reviewer 2 Report
The introduction is a potpourri of different topics with hardly any mutual relationship: ICT, programming education, mathematics education, and the Sustainable Development Goals. The manuscript seems to ignore that computer science education (CSE) is a well-established research field with decades of research on programming education. Furthermore, the concept of computational thinking seems to be promoted by many people worldwide… with the notable exception of computer scientists. Consequently, a majority of CSE researchers consider that it is better to discard the hype associated with the term “computational thinking” and to speak about more tangible concepts, skills, or competences, such as programming education, debugging, or testing. In other words, they demand the same rigor as researchers in mathematics education, who do not use the term “mathematical thinking” as a wildcard.
I have several severe concerns about the manuscript. First, we may wonder whether the manuscript intends to make a contribution to the learning of programming, the learning of mathematics, or broader sustainability goals. Making a contribution to all three issues simultaneously looks very ambitious but blurred; its value, if any, is therefore controversial.
I am also concerned about the scopes of the study and of the journal. There is no sustainability in the research. The manuscript does not make any contribution to programming literacy either. The research may therefore only make a contribution to mathematics education, and should be submitted to a journal whose scope includes mathematics education or education in general.
A number of severe concerns deal with the report of the research conducted. For instance, it is not explained how the sample of students was selected, nor the criteria and procedure used to split the sample into the two groups.
However, the most severe drawback is the lack of detail about the instruction received by each group of students. In particular, I wonder whether both groups dedicated the same amount of time to study (both in class and out of class), which topics were taught to each one, how the students in the control group were trained in these topics by non-programming means, how the students in the experimental group were trained in these topics with Scratch, and whether the test and the instruction were aligned in both groups. Furthermore, given the very high number of students and instructors involved, there is no confidence in the homogeneity of each group. Under these circumstances, it seems that the research is not reproducible, and therefore the value of its findings is controversial.
The presentation of the results is also deficient. Four tables are included with details about pretest and posttest scores for each item. However, I have not found the average score of each group in the pretest and the posttest; therefore, “the trees don't let you see the forest”, and it is difficult to get a global view of the differences between the two groups.
Surprisingly, no discussion is provided about the items with higher or lower differences. Consequently, no specific conclusion is provided but a general improvement in mathematics learning.
Let me conclude my review with a general educational consideration. If you intend to enhance students' mathematics learning, you may think of innovative instructional procedures which actively engage students in the topic. However, if the use of innovative means implies that students have to learn concepts and skills from a different field, notice that they may experience a higher cognitive load. As a consequence, the potential benefits of your approach may be counteracted by such load. This is the case of “computational thinking”, which is just a hype term for computer programming. In this case, students must learn new concepts (concurrency, iteration, sequence, etc.). I would have liked to see a discussion of how the higher cognitive load of the students in the experimental group did not harm their learning of mathematics. Alternatively, we might wonder why the control group did not make use of their lower cognitive load to get more actively involved in the mathematics topics addressed, for instance by using software tools for learning geometry.
Finally, I expected to find an explanation of your differences with respect to the group who authored the Scratch Maths project. Their humbler results are more credible to me than yours.
With respect to the presentation of the manuscript, a severe authors’ fault is the inclusion of the following text: “Research manuscripts reporting large datasets that are deposited in a publicly available database should specify here the data have been deposited and provide the relevant accession numbers. If the accession numbers have not yet been obtained at the time of submission, please state that they will be provided during review. They must be provided prior to publication.” Obviously, this text is a remnant of a manuscript template and shows severe negligence on the authors’ side when preparing the manuscript.
Note that you may find on page 3, lines 122-124, the following Spanish text: “Real Decreto 126/2014, de 28 de febrero, por el que se establece el currículo básico de la Educación Primaria”.
Author Response
Dear Reviewer,
Thank you very much for your review and your suggestions. We attach the revised article with the corrections made.
We are at your disposal.
Best regards
------
- The contributions made by the reviewers are indicated in black.
- The clarifications of the authors are in yellow.
Reviewer 2
Comments and Suggestions for Authors
- The introduction is a potpourri of different topics with hardly any mutual relationship: ICT, programming education, mathematics education, and the Sustainable Development Goals. The manuscript seems to ignore that computer science education (CSE) is a well-established research field with decades of research on programming education. Furthermore, the concept of computational thinking seems to be promoted by many people worldwide… with the notable exception of computer scientists. Consequently, a majority of CSE researchers consider that it is better to discard the hype associated with the term “computational thinking” and to speak about more tangible concepts, skills, or competences, such as programming education, debugging, or testing. In other words, they demand the same rigor as researchers in mathematics education, who do not use the term “mathematical thinking” as a wildcard.
In the Introduction section, the suggested contributions have been integrated and an effort has been made to improve the connection between concepts. “Computational thinking” has been replaced by “programming education”. References 2 and 3 have been replaced and two others have been included.
- I have several severe concerns about the manuscript. First, we may wonder whether the manuscript intends to make a contribution to the learning of programming, the learning of mathematics, or broader sustainability goals. Making a contribution to all three issues simultaneously looks very ambitious but blurred; its value, if any, is therefore controversial.
The following is indicated in the Introduction: “The main contribution of this article is to offer a contribution to the learning of Mathematics, in this case through programming education”.
- I am also concerned about the scopes of the study and of the journal. There is no sustainability in the research. The manuscript does not make any contribution to programming literacy either. The research may therefore only make a contribution to mathematics education, and should be submitted to a journal whose scope includes mathematics education or education in general.
The article is directed to the Special Issue "ICT and Sustainable Education". This Special Issue aims to open a debate on how to integrate ICT into the educational context, seeking to achieve sustainable development from different perspectives, for example through innovative proposals for pedagogical practices and the use of ICT for inclusive education. With this article, we intend to show a learning experience in the area of Mathematics through programming education. In addition, a way of carrying out evaluation through ICT is developed, in this case by adapting an evaluation battery to a Google Docs questionnaire, which allows a comprehensive evaluation of a whole school population to determine their level of competence and potential in Mathematics (this last information is included in the article).
- A number of severe concerns deal with the report of the research conducted. For instance, it is not explained how the sample of students was selected, nor the criteria and procedure used to split the sample into the two groups.
The following is indicated in the Materials and Methods: “The sample was selected from 147 Spanish schools; each autonomous region and municipality willing to participate selected its participating schools, mainly based on the availability, in quality and quantity, of the ICT resources necessary to carry out the research. This sample was balanced between schools financed with public funds and those with private or state-subsidised (concertado) funds, and between urban and rural environments”. The number of schools in the experimental group and in the control group is also included.
- However, the most severe drawback is the lack of detail about the instruction received by each group of students. In particular, I wonder whether both groups dedicated the same amount of time to study (both in class and out of class), which topics were taught to each one, how the students in the control group were trained in these topics by non-programming means, how the students in the experimental group were trained in these topics with Scratch, and whether the test and the instruction were aligned in both groups. Furthermore, given the very high number of students and instructors involved, there is no confidence in the homogeneity of each group. Under these circumstances, it seems that the research is not reproducible, and therefore the value of its findings is controversial.
The following is indicated in the Materials and Methods: “Both the students in the experimental group and those in the control group received the same number of hours of mathematics class according to the established curriculum; however, the experimental group also learned programming with Scratch during those class hours. None of the participating students had worked with Scratch before”.
The following is indicated in the Discussion: “An attempt will be made to maintain the homogeneity of the characteristics and circumstances of the previous investigation as much as possible”.
- The presentation of the results is also deficient. Four tables are included with details about pretest and posttest scores for each item. However, I have not found the average score of each group in the pretest and the posttest; therefore, “the trees don't let you see the forest”, and it is difficult to get a global view of the differences between the two groups.
Tables 3, 4, 5 and 6 show the mean scores of each group, experimental and control, both in the pretest and in the post-test (row labeled "Total"). The authors of the research remain at the disposal of the Editorial Team and reviewers to better clarify these data.
- Surprisingly, no discussion is provided about the items with higher or lower differences. Consequently, no specific conclusion is provided but a general improvement in mathematics learning.
The following is indicated in the Discussion: “The items in which the differences between groups were greater, favorable to the experimental group, were 10, 11, 12, 19, 29 and 30; the first four items are of arithmetic content, the last two of geometric content”.
- Let me conclude my review with a general educational consideration. If you intend to enhance students' mathematics learning, you may think of innovative instructional procedures which actively engage students in the topic. However, if the use of innovative means implies that students have to learn concepts and skills from a different field, notice that they may experience a higher cognitive load. As a consequence, the potential benefits of your approach may be counteracted by such load. This is the case of “computational thinking”, which is just a hype term for computer programming. In this case, students must learn new concepts (concurrency, iteration, sequence, etc.). I would have liked to see a discussion of how the higher cognitive load of the students in the experimental group did not harm their learning of mathematics. Alternatively, we might wonder why the control group did not make use of their lower cognitive load to get more actively involved in the mathematics topics addressed, for instance by using software tools for learning geometry.
The following is indicated in the Discussion: “The inclusion of new didactic methodologies in the teaching and learning processes favors educational innovation and the assumption of active roles by the students. This requires a greater cognitive load from the students but, in the case of the present investigation, their interest in and motivation towards learning in the area of Mathematics were not affected. This was analyzed by asking the students, at both stages of the research, to rate their interest in and motivation towards the area of Mathematics from 0, the lowest value, to 10, the highest value. The results showed little variability between the stages of the investigation: in the pretest, a mean value of 7.71 (SD = 2.37) for the experimental group and 7.90 (SD = 2.63) for the control group; in the post-test, 7.75 (SD = 2.37) for the experimental group and 7.97 (SD = 2.20) for the control group”.
- Finally, I expected to find an explanation of your differences with respect to the group who authored the Scratch Maths project. Their humbler results are more credible to me than yours.
This research is part of the Scratch Maths Project developed by the National Institute of Educational Technologies and Teacher Training (INTEF), the National University of Distance Education (UNED), and the University of Castilla-La Mancha (UCLM). Further information is available at the following links:
- Presentation of the program: http://code.intef.es/aprende-matematicas-y-otras-cosas-con-scratch-3-0/
- First results report: http://code.intef.es/wp-content/uploads/2019/12/Impacto_EscueladePensamientoComputacional_Curso2018-2019.pdf
- With respect to the presentation of the manuscript, a severe authors’ fault is the inclusion of the following text: “Research manuscripts reporting large datasets that are deposited in a publicly available database should specify here the data have been deposited and provide the relevant accession numbers. If the accession numbers have not yet been obtained at the time of submission, please state that they will be provided during review. They must be provided prior to publication.” Obviously, this text is a remnant of a manuscript template and shows severe negligence on the authors’ side when preparing the manuscript.
This text is deleted. The database remains at the disposal of the Editorial Team and reviewers of the journal for its consultation.
- Note that you may find on page 3, lines 122-124, the following Spanish text: “Real Decreto 126/2014, de 28 de febrero, por el que se establece el currículo básico de la Educación Primaria”.
Translation included: “Royal Decree 126/2014, of 28 February 2014, on the enactment of the basic curriculum of the Primary Education”
Reviewer 3 Report
I suggest that the authors elaborate more on the results and not just present descriptive statistics. They intend to prove that the use and effectiveness of computational thinking enhance mathematical development, and this is challenging.
Author Response
Dear Reviewer,
Thank you very much for your review and your suggestions. We attach the revised article with the corrections made.
We are at your disposal.
Best regards
-------
Reviewer 3
Comments and Suggestions for Authors
- I suggest that the authors elaborate more on the results and not just present descriptive statistics. They intend to prove that the use and effectiveness of computational thinking enhance mathematical development, and this is challenging.
This section has been improved, and results for the sex variable have been included (also in the Discussion).
The following is indicated in the Discussion: “The inclusion of new didactic methodologies in the teaching and learning processes favors educational innovation and the assumption of active roles by the students. This requires a greater cognitive load from the students but, in the case of the present investigation, their interest in and motivation towards learning in the area of Mathematics were not affected. This was analyzed by asking the students, at both stages of the research, to rate their interest in and motivation towards the area of Mathematics from 0, the lowest value, to 10, the highest value. The results showed little variability between the stages of the investigation: in the pretest, a mean value of 7.71 (SD = 2.37) for the experimental group and 7.90 (SD = 2.63) for the control group; in the post-test, 7.75 (SD = 2.37) for the experimental group and 7.97 (SD = 2.20) for the control group”.
Reviewer 4 Report
The research does not reflect significant results regarding the 3 SDGs cited in the theoretical justification and objectives of the work.
It is assumed that the results of applying a mathematics programme in primary education to the SDGs, specifically to 5, 4, and 12, are presented.
However, the only thing that can be observed is that there have been changes; it is not described what those changes are, and even less how they affect gender, quality education, and sustainability in daily life (the SDGs). At the educational level, no results are provided, which suggests a need to review and assess whether the work can be approached from an educational perspective, or is better presented only as quantitative results related to the use of the Scratch programme.
For this reason, the objectives and title of the work do not match the results presented.
Author Response
Dear Reviewer,
Thank you very much for your review and your suggestions. We attach the revised article with the corrections made.
We are at your disposal.
Best regards
---------
Reviewer 4
Comments and Suggestions for Authors
- The research does not reflect significant results regarding the 3 SDGs cited in the theoretical justification and objectives of the work.
It is assumed that the results of applying a mathematics programme in primary education to the SDGs, specifically to 5, 4, and 12, are presented.
However, the only thing that can be observed is that there have been changes; it is not described what those changes are, and even less how they affect gender, quality education, and sustainability in daily life (the SDGs). At the educational level, no results are provided, which suggests a need to review and assess whether the work can be approached from an educational perspective, or is better presented only as quantitative results related to the use of the Scratch programme.
For this reason, the objectives and title of the work do not match the results presented.
Results are included for the sex variable.
Round 2
Reviewer 1 Report
The paper has improved with respect to the previous version. Generally, I am positive towards eventual publication. In particular, I appreciate the effort in enhancing the statistical analysis.
However, some issues still remain, mostly with respect to presentation quality. I still think that the tables should be made more compact, perhaps by reducing the font size and the spacing between rows and columns. For example, Table 6 could be set in a wider and shorter format. Perhaps some of the tables could also go in an appendix, since most readers will only read the relevant statistics. The figures are of very low resolution and quality. Furthermore, the English grammar and style require improvement.
Hence, I choose another major revision to allow the authors to make these changes.
Author Response
Dear Reviewer,
Thank you very much for your review and your suggestions. We attach the revised article with the corrections made.
We are at your disposal.
Best regards
- The contributions made by the reviewers are indicated in black.
- The clarifications of the authors are in yellow.
Reviewer 1
Comments and Suggestions for Authors
We appreciate the contributions made; they have been very valuable for improving the article.
- I still think that the tables should be made more compact, perhaps by reducing the font size and the spacing between rows and columns. For example, Table 6 could be set in a wider and shorter format. Perhaps some of the tables could also go in an appendix, since most readers will only read the relevant statistics. The figures are of very low resolution and quality.
The line spacing is changed from a minimum of 13 pt to single spacing.
The font size is changed from 10 to 9.
In Table 1, the full names Experimental Group (EG) and Control Group (CG) are given; in Tables 2-8, the abbreviations EG and CG are used.
Tables 2-8 are presented in a more compact manner.
- Further the English style and grammar style require improvement.
The text has been checked by a native English speaker.
Reviewer 2 Report
Thank you for your improvements to the original manuscript. However, I still have severe concerns about it. Some concerns are of a conceptual nature:
- The manuscript does not make a difference between ICT and informatics. Programming is at the core of informatics education, not of ICT. The difference can be more easily understood if you adopt two different points of view. Consider mathematics itself. All of us develop an intuitive understanding of counting and other basic math skills. However, there is a consensus about the benefits that the explicit learning of mathematics provides to all of us. Not only do we learn new concepts and procedures, but they provide us with a different means to think, analyze reality, and solve problems. The same can be claimed about many other fields, in particular about informatics. ICT deals with shallow skills that are useful in the new society. However, programming is a part (and a core) of education in informatics.
The difference between ICT and informatics has been explained in full detail in reports and manifestos by scientific societies. Their intent was to make education experts and politicians aware of this basic difference, which they systematically ignore when they develop public policies. Example reports were elaborated by the European and the Spanish scientific computing societies (Informatics Europe and the Sociedad Científica Informática de España, SCIE, respectively), which are the equivalents of the Royal Spanish Mathematical Society, RSME:
* Caspersen, M.E., Gal-Ezer, J., McGettrick, A., & Nardelli, E. (2018). Informatics for All: The Strategy. ACM Europe & Informatics Europe. Retrieved from https://www.informaticseurope.org/component/phocadownload/category/10-reports.html?download=75:informatics_for_all_2018
* Velázquez Iturbide, J. Á. (coord.) (2018). Informe del grupo de trabajo SCIE/CODDII sobre la enseñanza preuniversitaria de la informática. Sociedad Científica Informática de España (SCIE). Retrieved from http://www.scie.es/wp-content/uploads/2018/07/informe-scie-coddii-2018.pdf
Sadly, many official institutions, such as INTEF, are simply not aware of this basic distinction and of the relevance of informatics education for childhood and youth.
- The obvious consequence of considering mathematics a basic competence that all citizens must develop is to schedule Mathematics courses in the children's curriculum, not just extracurricular activities. The same can be claimed for informatics education (including programming) if we take it seriously. There even exists a report jointly elaborated by RSME and SCIE which advocates both interdisciplinary education in informatics and mathematics and the need for formal education in informatics.
Luis J. Rodríguez Muniz, Ana Serrado Bayes, Elena Thibaut Tadeo, J. Angel Velázquez Iturbide, David López Álvarez & Antonio Bahamonde (2020). Hacia una nueva educación en matemáticas e informática en la Educación Secundaria. Retrieved from https://www.rsme.es/wp-content/uploads/2020/06/Educacion-matematicas-e-informatica-secundaria.pdf
- In this context, “computational thinking” is a hype term coined by J. Wing that has been successful worldwide. However, a number of reputed researchers in informatics and informatics education have criticized its deficient definition in past years, which only diverts educators from the humbler and more honest target of advocating informatics education (including programming). For instance:
Hemmendinger (2010): “A plea for modesty”, ACM Inroads, 1(2):4-7, June 2010, DOI 10.1145/1805724.1805725.
Armoni (2016): “Computing in schools: Computer science, computational thinking, programming, coding: The anomalies of transitivity in K-12 computer science education”, ACM Inroads, 7(4):24-27, DOI 10.1145/3011071.
P. J. Denning (2017): “Remaining trouble spots with computational thinking”, Communications of the ACM, 60(6):33-39, DOI 10.1145/2998438.
I still have several concerns about the research conducted:
- Hardly any detail is given about the instruction provided in the experimental and control groups. The manuscript only reports that “Both the students in the experimental group and the control group received the same hours of mathematics class according to the established curriculum, however, the experimental group in those class hours also learned programming education with Scratch.”
However, what math topics were addressed in the study? The new version of the manuscript suggests that the instruction of the children lasted three months in both groups. Is this right?
With respect to the experimental group, what kinds of Scratch projects were the students asked to construct? What was their relation to the topics under study? Did the students spend extra time out of school working on their projects?
With respect to the control group, how did they practice the concepts under study? Did they solve problems? Did they use alternative educational tools aimed at math education? Did their instructors receive any training or clarifications about the study?
- Hardly any information is given about the questionnaire. Please explain its main features: the total number of items, and the number of items per topic.
- The conclusions contain the following claim: “The results show that the 5th grade students of Primary Education who participated in the project and who worked on mathematical competence through computer programming activities, developed this competence to a greater extent compared to the students who did so with other activities and usual resources in the Mathematics area”. However, according to Tables 3 and 4, no significant global differences exist between the two groups in the pretest (p=0.454) or in the posttest (p=0.07). Differences do exist for several items. The conclusion should be rewritten in a more nuanced way.
Author Response
Dear Reviewer,
Thank you very much for your review and your suggestions. We attach the revised article with the corrections made.
We are at your disposal.
Best regards
- The contributions made by the reviewers are indicated in black.
- The clarifications of the authors are in yellow.
Reviewer 2
Comments and Suggestions for Authors
We appreciate the contributions made; they have been very valuable for improving the article.
- The manuscript does not make a difference between ICT and informatics. Programming is at the core of informatics education, not of ICT. The difference can be more easily understood if you adopt two different points of view. Consider mathematics itself. All of us develop an intuitive understanding of counting and other basic math skills. However, there is a consensus about the benefits that the explicit learning of mathematics provides to all of us. Not only do we learn new concepts and procedures, but they provide us with a different means to think, analyze reality, and solve problems. The same can be claimed about many other fields, in particular about informatics. ICT deals with shallow skills that are useful in the new society. However, programming is a part (and a core) of education in informatics.
The difference between ICT and informatics has been explained in full detail in reports and manifestos by scientific societies. Their intent was to make education experts and politicians aware of this basic difference, which they systematically ignore when they develop public policies. Example reports were elaborated by the European and the Spanish scientific computing societies (Informatics Europe and the Sociedad Científica Informática de España, SCIE, respectively), which are the equivalents of the Royal Spanish Mathematical Society, RSME:
* Caspersen, M.E., Gal-Ezer, J., McGettrick, A., & Nardelli, E. (2018). Informatics for All. The Strategy. ACM Europe & Informatics Europe. Retrieved from https://www.informaticseurope.org/component/phocadownload/category/10-reports.html?download=75:informatics_for_all_2018
* Velázquez Iturbide, J. Á. (coord.) (2018). Informe del grupo de trabajo SCIE/CODDII sobre la enseñanza preuniversitaria de la informática. Sociedad Científica Informática de España (SCIE). Retrieved from http://www.scie.es/wp-content/uploads/2018/07/informe-scie-coddii-2018.pdf
The following is indicated in the Introduction: “The essential core of computer education is the incorporation of programming into school contexts, rather than ICT, which is more focused on useful skills for the knowledge society. Various reports influence this conceptual differentiation [2,3]”. Two references provided by the reviewer are included.
- The obvious consequence of considering mathematics a basic competence that all citizens must develop is to schedule Mathematics courses in the children's curriculum, not just to schedule extracurricular activities. The same can be claimed for informatics education (including programming) if we take it seriously. There even exists a report jointly elaborated by RSME and SCIE which advocates both interdisciplinary education in informatics and mathematics and the need for formal education in informatics.
Luis J. Rodríguez Muñiz, Ana Serradó Bayés, Elena Thibaut Tadeo, J. Ángel Velázquez Iturbide, David López Álvarez & Antonio Bahamonde (2020). Hacia una nueva educación en matemáticas e informática en la Educación Secundaria. Retrieved from https://www.rsme.es/wp-content/uploads/2020/06/Educacion-matematicas-e-informatica-secundaria.pdf
The reference provided has been included where the text notes, for education in programming, its instrumental and transversal nature in the acquisition of other skills.
- In this context, “computational thinking” is a hype term coined by J. Wing that has been successful worldwide. However, a number of reputed researchers in informatics and informatics education research have criticized its deficient definition in past years, which only diverts attention from the humbler and more honest goal, as educators, of advocating informatics education (including programming). For instance:
Armoni (2016): “Computing in schools: Computer science, computational thinking, programming, coding: The anomalies of transitivity in K-12 computer science education”, ACM Inroads, 7(4):24-27, DOI 10.1145/3011071.
The reference provided has been included where the text notes that there are multiple experiences of integrating programming in schools.
- Hardly any detail is given about the instruction provided in the experimental and control groups. The manuscript only reports that “Both the students in the experimental group and the control group received the same hours of mathematics class according to the established curriculum, however, the experimental group in those class hours also learned programming education with Scratch.” However, what math topics were addressed in the study? The new version of the manuscript suggests that instruction to children lasted three months in both groups. Is this right? With respect to the experimental group, what kinds of Scratch projects were students asked to construct? What was their relation to the topics under study? Did they spend extra time out of school working on their projects?
The following information is included in the article:
- During the study, the students worked mainly on the contents of the Geometry block.
- The work with Scratch by the experimental group lasted three months.
- The work with Scratch was divided into three modules: mosaic patterns, the Geometry of the beetle, and interacting objects. Each module included practice activities for students to learn how to use the program, together with the key vocabulary worked on. All of this learning content was delivered exclusively at school.
- With respect to the control group, how did they practice the concepts under study? Did they solve problems? Did they use alternative educational tools aimed at math education? Did their instructors receive any training or clarifications about the study?
The following information is included in the article:
- “The control group were taught via the regular teaching-learning process without using any alternative educational tools for mathematics education, such as GeoGebra. Education in programming was only given to the experimental group”.
- Hardly any information is given about the questionnaire. Please, explain its main features: total number of items, and number of items of the different topics.
The following information is included in the article:
- The questionnaire contains 30 items, divided into content blocks: Arithmetic (14 items), Geometry (5 items), Magnitudes and Proportionality (6 items) and Statistics and Probability (5 items).
- The conclusions contain the following claim: “The results show that the 5th grade students of Primary Education who participated in the project and who worked on mathematical competence through computer programming activities, developed this competence to a greater extent compared to the students who did so with other activities and usual resources in the Mathematics area”. However, according to Tables 3 and 4, no significant global differences exist between both groups in the pretest (p=0.454) or in the posttest (p=0.07). Differences do exist for several items. A more nuanced conclusion should be rewritten.
It is clarified: “These differences were more apparent in specific items than in the global differences between the two groups at pretest (p = .454) and at post-test (p = .070)”.
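As background to the per-item comparisons described in these responses, the procedure the authors report (an independent-samples t-test with Cohen's d as the effect size) can be sketched as follows. This is an illustrative implementation using Python's standard library only; the item scores shown are hypothetical, not data from the study, and the pooled-variance formulas are the standard textbook ones rather than the authors' exact computation.

```python
from statistics import mean, stdev
from math import sqrt

def t_and_cohens_d(group_a, group_b):
    """Student's t statistic (pooled variance) and Cohen's d
    for two independent samples."""
    na, nb = len(group_a), len(group_b)
    ma, mb = mean(group_a), mean(group_b)
    va, vb = stdev(group_a) ** 2, stdev(group_b) ** 2
    # Pooled standard deviation across the two groups
    sp = sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    t = (ma - mb) / (sp * sqrt(1 / na + 1 / nb))
    d = (ma - mb) / sp  # standardized mean difference (effect size)
    return t, d

# Hypothetical item scores for an experimental and a control group
experimental = [3, 4, 4, 5, 4, 3, 5, 4]
control = [2, 3, 3, 4, 3, 2, 3, 3]
t, d = t_and_cohens_d(experimental, control)
print(f"t = {t:.2f}, d = {d:.2f}")
```

The p-value for the reported significance decisions would then be obtained from the t distribution with na + nb − 2 degrees of freedom (e.g. via a statistical package), which the standard library does not provide.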
Round 3
Reviewer 1 Report
The authors have implemented most of the recommendations. I accept the paper but I think the presentation should be improved prior to publication (especially the quality of the images).
Author Response
Dear reviewer,
All proposed changes have been incorporated, including improving the quality of the images. We thank you very much for the review process; it has been very helpful to us in improving our article.
We are at your disposal.
Best regards
Reviewer 2 Report
I am glad to see that the changes have been made in the new manuscript. If this is the version to be published, I would agree with accepting it.
Author Response
Dear reviewer,
All proposed changes have been incorporated. We thank you very much for the review process; it has been very helpful to us in improving our article.
We are at your disposal.
Best regards