1. Introduction
A holistic understanding of educational research is important for researchers and educational developers, as well as for anyone mentoring and tutoring pre- and postgraduate students. Gaining such knowledge calls for years of systematic research across the different fields of education. Due to the scope of such work, most researchers specialize in a narrow area of expertise in order to gain deep insight into that area. However, for the research community as a whole, achieving a versatile and holistic understanding of the field is a much more feasible goal. This may seem self-evident, but in this paper, we wish to raise the question of whether it actually holds.
A holistic understanding can be interpreted in several different ways. It can concern content, i.e., whether teaching and learning of various topical areas and key concepts in science are well covered in the research literature, e.g., [
1]. It can concern cross-cutting themes in the education of a specific topic, e.g., incorporating sustainability [
2], teaching argumentation [
3] or scaffolding in the education of a science topic [
4]. It can also concern how that specific topic, e.g. electricity, is taught at various levels of education, from pre-school education to universities and teacher training [
1,
5]. Moreover, it can concern various aspects of the instructional process, including relevant actors, their relations and carried out activities. Additional points of view can cover research in different countries, e.g., [
6,
7,
8], and their educational systems and cultures, as well as whether the research focuses on activities carried out in a single classroom session or course, or in some wider context, such as curriculum or program-level activities or even national-level initiatives.
In this paper, our focus is on surveying the science education literature from the point of view of the instructional process. We have taken a holistic view of the process by considering the most relevant actors (teachers, students and learning goals/content) and their various relations in the didactical triangle, which we call didactic foci. We augmented the results by also considering the perspectives of the level of education, the breadth of context, and the research methodologies used.
Previous analyses of focus areas in science education research (SER) have been based on data-driven approaches, in which the categorization frameworks were derived from the existing data. Such methods obviously show what research is being published. However, the major limitation of such approaches is that they can reveal only those aspects that are found in the data. In contrast, our approach is theory-driven. We identified categories based on what a theoretical framework of the instructional process suggests: what could be found, or even should be found? Theory-driven analysis can thus reveal gaps or overlooked areas in the research literature in a way which is not generally possible in data-driven analyses. Moreover, data-driven analysis methods typically map rare findings into categories such as “other”, treating them as unimportant. In our approach, we are specifically interested in identifying these areas, because they allow us to generate relevant new research questions and points of view on researching the instructional process.
Our methodological framework is based on the didactic triangle [
9] and it has been used successfully to analyze many research papers in computing education [
10], engineering education [
11], and some venues in science education [
12,
13,
14,
15]. The previous findings have clearly indicated that in these areas, research is heavily biased toward investigating the relations between students and learning goals/content, such as students’ motivation, initial knowledge of the subject domain, learning practices and learning results. Another widely covered aspect is the various pedagogical actions designed to support learning. While these are certainly highly relevant topics for investigation, it is somewhat surprising how little some other aspects have been researched. For example, the research cited above has shown that little has been published about in-service teachers’ backgrounds, their relation to students, or their perceptions of students’ learning or the learning content. However, teachers are a significant factor in the instructional process and thus gaps in this area are worth investigating.
The goal of this study is to extend the previous analyses in SER by identifying the most and the least studied didactic foci in two important publication venues in Europe and how these relate to other attributes of research, i.e., discipline, the scope of data collection, educational level, teacher education, and the applied research methodology. This study is based on the previous analysis of the papers published in the
Nordic Studies in Science Education (NorDiNa) journal in the years from 2006 to 2013 [
14], which is the leading science education publication in the Nordic countries (Denmark, Finland, Iceland, Norway, and Sweden). That analysis revealed several interesting findings about the most and the least studied aspects, and we became curious about whether similar trends appear more widely in European forums and perhaps beyond. Therefore, we selected another key venue for further analysis: the proceedings of the most recent European Science Education Research Association (ESERA) conference within the same period. Our historical survey also complements the recent survey by O’Toole and others [
16], which analyzed some similar aspects of SER papers in different venues, while using a different categorization system, over the period 2005–2014.
Our main research questions were:
What are the didactical focus areas in science education research in our data pool?
How does the research appear in different disciplines (biology, chemistry, physics)?
Which scopes for data collection are used (course, organization, society, international)?
How is the research distributed to different educational levels (International Standard Classification of Education, ISCED)?
How does the teacher education context appear in the data pool?
What research methodologies are used?
We see that our results could benefit the field in several ways. For individual researchers, our work can pinpoint areas where there is space for interesting and relevant new work on a research topic. Moreover, the applied analysis framework helps to build new research questions which focus on identified research gaps or little-studied areas. For instance, Kinnunen and others [
10] used the method to categorize research papers on phenomena related to dropping out of an introductory programming course and found that all papers focused only on student-related issues (e.g., students’ characteristics and what students do), leaving other aspects, such as the teacher’s role in the teaching and learning process or curriculum planning, unstudied. While our analysis in this paper does not focus on any sub-areas of physics, chemistry or biology, the method is straightforward enough to be applied in narrower areas to discuss gaps in current research and generate research ideas. On the other hand, the results could be used by educational decision-making bodies to identify aspects of the instructional process which merit further investigation on a wider scale and thus could be used for targeted research funding.
2. Findings from Earlier Studies
Below, we discuss relevant related work, looking separately at the various dimensions along which the SER literature has been analyzed.
2.1. Foci Areas
Tsai and Wen [
6], Lee and others [
7], and Lin and others [
8] analyzed 2661 papers published in the years 1998–2012 in the
International Journal of Science Education (IJSE),
Science Education (SE), and
Journal of Research in Science Teaching (JRST). They developed the following categorization scheme, hereafter called Tsai’s and Wen’s categorization method, which was later used by several other researchers.
Teacher Education;
Teaching (e.g., teacher cognition, pedagogical content knowledge, leadership, teacher behaviors and strategies);
Learning—Students’ Conceptions and Conceptual Change (Learning-Conception);
Learning—Classroom Contexts and Learner Characteristics (Learning-Context), e.g., student motivation, background factors, learning and laboratory environments, learning approaches, student–teacher and student–peer interactions, and soft skills in learning;
Goals and Policy, Curriculum, Evaluation, and Assessment;
Cultural, Social and Gender Issues;
History, Philosophy, Epistemology and Nature of Science;
Educational Technology;
Informal Learning.
The three most common topics in the period 2008–2012 were Teaching (19%), Learning-Conception (15%) and Learning-Context (37%). There were significant changes over the 15-year period. The Teaching aspect almost tripled from 7% to 19%, with the biggest increase happening in IJSE. In addition, the Learning-Context aspect doubled from 18% to 37%. The combined share of Learning-Conception, Goals, Policy and Curriculum, and Culture, Social, and Gender dropped from 50–60% to roughly 25% in all journals.
Tsai and others [
5] continued the work with a slightly modified analysis. They analyzed 228 papers published in four major journals (IJSE/JRST/SE/
Research in Science Education (RISE)) during 2000–2009 that reported work on Asian students (including Turkish). They used Tsai’s and Wen’s [
6] categories; however, in this case, each paper could be included in multiple categories, whereas in the earlier works only the best-fit category for each paper was counted. Despite this difference, studies of Teaching were about as common as in Tsai’s and Wen’s [
6], Lee’s and others’ [
7], and Lin’s and others’ [
8] studies. Learning-Conception studies were much more frequent, with over half of the papers included in this category; no decrease was visible during this period. Learning-Context was the most frequent category, with more than two-thirds of the papers included in it. There were no major differences between the four journals and no clear trends were visible during the 10-year period.
More recently, O’Toole and others [
16] analyzed ten years (2005–2014) of abstracts from IJSE/JRST/SE/RISE and
Studies in Higher Education. They used a different, but overlapping categorization method, identifying papers in five major categories: Scientific literacy (52%), Teaching methods (45%), Learning focus (42%), Teachers (39%), and Relation between Science and Education (16%). Comparing these results with the former works using Tsai’s and Wen’s categories [
6] is difficult because there is a lot of overlap. For example, scientific literacy was the largest group of papers, but it included works which would also fit within the Learning-Context category, as well as the Goals and Policy category, in the other system. Further, the Teachers category included teacher training and professional development, which accounted for half of the papers in this category. Finally, each abstract could be categorized within several categories, further complicating any numeric comparison. An interesting finding was that there were substantial differences between the journals, as well as between the periods 2005–2009 and 2010–2014, which likely reflect differences and changes in journal editorial policies.
A similar but smaller-scale study was done by Cavas [
17], also applying Tsai’s and Wen’s method [
6], who analyzed 126 papers published in the
Science Education International (SEI) journal between the years 2011 and 2015. In terms of research topics, the SEI journal has a different profile from IJSE/JRST/SE. The most common topics were Teacher Education (23%) and Learning-Conception (20%), followed by Teaching; Learning-Context; Goals, Policy, Curriculum; and Culture, Social and Gender, each with 10–11% of the total.
All these studies clearly indicated that pedagogical activities and conceptions, as well as students’ learning activities and results, have had a major role in research, with well over half of the papers falling into these areas, whereas other topics have received much less emphasis. For example, while educational technology could be an important tool for supporting learning, only 3–6% of the papers considered it [
7,
9]. Another little-studied area is informal learning.
A very different approach was used by Chang and others [
18]. They analyzed an overlapping data pool: papers published in IJSE, JRST, SE and RISE between 1990 and 2007, also focusing on research topics but using scientometric methods, which automatically clustered research papers based on the most common terms in the data. Since the clusters were quite different from Tsai’s and Wen’s [
6] categories, no direct comparison with their results is possible. However, the largest clusters in this work were Conceptual change & concept mapping (40%), Nature of science & socio-scientific issues (14%), Professional development (11%), and Conceptual change & analogy (11%). The remaining clusters (Scientific concept, Instructional practice, Reasoning skill & problem solving, Design based & urban education, Attitude & gender) each covered a small proportion of the papers. During this period, a clear increase appeared in the Professional development and Nature of science & socio-scientific issues clusters, while the proportions of the Scientific concept and Reasoning skill & problem solving clusters decreased to a fraction of their original size.
2.2. Discipline Characteristics and Trends
The previous surveys covered science education holistically. Two other major surveys have focused on chemistry and biology education (we did not find a similar survey for physics education).
Teo and others [
19] studied chemistry education research papers using Tsai’s and Wen’s [
6] categorization method. The data pool covered 650 papers published from 2004 to 2013—of which, 446 were from two major journals in the field,
Chemistry Education Research and Practice (CERP) and
Journal of Chemical Education (JCE), and the other 204 papers were from IJSE, JRST, SE, and RISE. The most popular topics were Teaching (20%), Learning-Conception (26%) and Learning-Context (19%), whereas Goals, Policy, Curriculum, Evaluation and Assessment (16%) and Educational Technology (11%) were studied less. The rest of the categories were small, amounting to a few percent each. The trends emerging from their data were an increase in papers focusing on Teaching and Learning-Conception. They observed no major differences between the chemistry education research journals and the SER journals.
Gul and Sozbilir [
1] studied 1376 biology education papers published in eight journals, including IJSE/SE/JRST/RISE and
Journal of Biology Education (JBE),
Journal of Science Education and Technology (JSET),
Research in Science & Technological Education (RSTE) and
Studies in Science Education (SSE), in the years 1997–2014. They used their own categorization scheme and found that the most common aspects in this area were Learning (21%), comprising learning results, learning styles and misconceptions; Teaching (19%), comprising effects on achievement, attitudes, scientific process skills, and comparisons of different methods; and Studies on attitude/perception/self-efficacy (17%); followed by Computer-aided instruction (9%), Studies on teaching materials (8%), Nature of science (7%), and several small categories. They did not report any trends concerning these results.
Thus, here too, teaching and learning comprise a large proportion of the papers. Interestingly, educational technology was more commonly researched in both chemistry education and biology education than in the previous analyses covering the science education field overall.
2.3. Scope of Data Collection
Earlier analyses of the literature present very little information about the scope of the data collection. Gul and Sozbilir [
1] are one of the few that report sample sizes. They report that sample sizes most commonly fell in the ranges of 11–30 (16%), 31–100 (23%), 101–300 (20%) or 301–1000 (11%) participants. Large-scale (N > 1000) and small-scale (N ≤ 10) studies were rare. Moreover, 17% of the papers analyzed did not report sample sizes. Similarly, there is little information about the organizational units from which the data were collected, e.g., a single course, a degree program, or a national or international survey. Teo and others [
19] reported on the educational levels and countries where data had been collected, but there is no information about the organizational scope.
2.4. Educational Level
A few survey papers have reported information about the sample populations in the target papers. To allow these results to be compared, we have presented them in terms of the ISCED reference framework, which is used to define the educational levels in different educational systems and disciplines. It is used by Eurostat, UNESCO and the OECD, and it has been implemented in all EU data collections since 2014.
In their analysis of biology education research papers, Gul and Sozbilir [
1] found that 20% of the study targets were at ISCED 1–2 levels (grades 1–8), 34% at ISCED 3 level (grades 9–12), and 23% at ISCED 6–7 levels (undergraduate). Further, educators were the focus in 18% of the studies, whereas postgraduate students (ISCED 8), parents, pre-schoolers, administrators and others were rarely among the target populations. Some papers reported more than one sample type. The study by Teo and others [
19] of chemistry education research papers gives different results: ISCED 2 level (years 7–9, 14%), ISCED 3 level (years 10–12/13, 25%), and ISCED 6–8 levels (54%). Pre-service teachers were the sample in 8% of the research papers and in-service teachers in 12%. Pre-school and elementary school samples were rare.
The study by Tsai and others [
5] (of science education papers in IJSE/JRST/SE/RISE with Asian students) reported that ISCED 1 level was the focus in 19% of the studies, ISCED 2 level in 33%, ISCED 3 level in 41%, ISCED 6 level (colleges) in 14%, and ISCED 6–7 levels (pre-service teachers) in 8% of the studies. Again, in some papers more than one sample type was used. They did not find major differences between the journals. This is in line with the work of O’Toole and others [16], which categorized the same journals, but all papers, within a somewhat overlapping period: 31% secondary (ISCED 2 and 3), 18% post-secondary (ISCED 6 and 7), 17% primary (ISCED 1) and 2% early childhood (ISCED 0).
Lin and others [
4] reviewed papers focusing on scaffolding for science education and reported that over half of the studies focused on the upper secondary level (ISCED 3). On the other hand, Wu and others [
20] reported in their review of intervention studies on technology-assisted instruction that in half of the studies the sample was from the tertiary level (ISCED 6–8), in 21% from the primary level (ISCED 1), and in 20% from the lower and upper secondary levels (ISCED 2–3).
Here, the results seem mixed. Some fields focus more on school-level education, whereas others, such as chemistry, focus on higher education. Other target groups, such as student teachers or in-service teachers, are studied less frequently, or are not reported separately.
2.5. Research Methodology
Tsai and Wen [
6], Lee and others [
7], and Lin and others [
8] analyzed the research methodologies used. They identified five categories: empirical work, theoretical work, reviews, position papers and others. Overall, some 90% of all papers were considered to be empirical research, and the other categories covered the balance. There were no clear differences between the IJSE, JRST and SE journals in this respect.
Gul and Sozbilir [
1], in turn, found that 53% of biology education research papers used qualitative designs. A closer analysis revealed that more than half of these were descriptive papers and case studies; the rest were distributed among a wide variety of more advanced methods. Quantitative approaches were used in 43% of the papers; less than one-third of these used an experimental method, and the rest were non-experimental, e.g., descriptive, survey, comparative or correlational designs. Only 4.2% of the papers used a mixed methods design. Lin and others [
8] reported a similar finding that qualitative methods were the dominant analysis method in SER papers. This matches the findings of O’Toole and others [
16].
Teo and others [
19] noticed that more than half of the chemistry education research papers (52%) used a mixed methods approach, while a purely qualitative approach was used in 22% of the papers and a purely quantitative approach in 26%.
In summary, it is clear that a large proportion of the papers in SER included empirical research and both quantitative and qualitative methods were widely used.
Table 1 summarizes the previous analyses.
3. Materials and Methods
3.1. Study Target
The focus of this study is on biology, chemistry and physics education research papers at the upper secondary and tertiary levels (ISCED 3, 6–8) published in the NorDiNa journal from 2005 to 2013 and in the ESERA conference proceedings in the year 2013. These educational levels and disciplines were selected as they match the scientific backgrounds of the authors of this paper. The researchers are experienced tertiary level teacher educators with expertise in biology, chemistry, physics, computer science and subject pedagogy. Jarkko Lampiselkä specializes in chemistry and physics didactics and Arja Kaasinen specializes in biology didactics at the University of Helsinki. Päivi Kinnunen, at the University of Helsinki, and Lauri Malmi, at Aalto University, both specialize in computing education research and are experts in the analysis methodology used.
The NorDiNa journal published 138 papers in the years 2005–2013—of which, 52 were included in our data pool. The ESERA proceedings papers are based on the presentations at the ESERA 2013 conference. The analysis started in 2015 and, at that time, the proceedings from 2013 were the latest release available. A total of 960 manuscripts were submitted—of which, 339 were included in the proceedings [
21], and 138 in our data pool. The data pool from the ESERA proceedings is comprehensive and extensive even though the data are from one year only.
The NorDiNa journal was selected as it is the leading peer-reviewed science education research journal in the Nordic countries. The ESERA conference proceedings were selected as they come from one of the largest SER conferences in the world, which attracts over 1500 attendees and over 1000 presentation proposals each time it is held. Neither of these publication venues had been analyzed in related works before this study. Together, these venues provide an excellent overview of the area of educational research and can reveal blind spots, if any exist.
3.2. Analysis Method
We used the didactic focus-based categorization method (DFCM) to analyze the papers. It has its origin in J. F. Herbart’s didactical triangle as presented by Kansanen and Meri [
22] and Kansanen [
9] and it describes the formal instructional process in a holistic manner (
Figure 1). The triangle presents the relationships between the three main actors in the instructional process: the content to be learned, the student and the teacher.
The didactic triangle was further developed by Kinnunen [
23] and Kinnunen and others [
14] to consider the teacher’s impact on the student–content relationship, the teacher’s self-reflection and the student’s feedback (shown as arrows 7 and 8 in
Figure 1). In addition, as part of this further development, the didactic triangle was placed in a wider educational scope, enabling researchers to discuss instructional processes at the institutional, societal and international levels. That is, whereas the original didactic triangle described the instructional process from a classroom point of view, the extended version of the triangle can also be used to describe instructional phenomena that take place at the educational organization level, the societal level, or even the international level. For instance, the extended didactic triangle can capture phenomena such as the goals a society sets for the level of education of its citizens (node 1 in
Figure 1) and the degree to which citizens achieve these goals (arrow 5 in
Figure 1).
The improved didactic triangle was used as the basis for the DFCM. It includes eight foci—of which, two have sub-foci (
Table 2). The sub-foci were developed to improve the resolution of the DFCM. The development of the categories based on the didactic triangle has been described in more detail in our previous publications [
10,
11,
14,
23].
3.3. Additional Classification Attributes
In addition to foci analysis, we also investigated the educational scope, educational level, teacher education, discipline characteristics and research methodology points of view.
The educational scope attribute is based on the educational context in which the research was carried out. The attribute values were the course, organization, society, and international levels. The course value means that the study was carried out in one or a couple of classes at the same educational institution. The organization value means that the study was carried out in two or more classes at two or more educational institutions. The society value means that the study was carried out in two or more regions in the country. The international value means that the study involved two or more countries. The educational scope of the studies analyzed may reveal something about how generalizable the results are; on the other hand, many education-related phenomena manifest themselves only if a wide enough scope is applied to the study. For instance, trends related to educational policy making and its consequences for the national science education curriculum require stepping outside the single-course context.
The educational level attribute records the school level at which the study was carried out. The classifying values were primary (ISCED 1), lower secondary (ISCED 2), upper secondary (ISCED 3) and tertiary (ISCED 6–8). On some occasions, lower levels were also incorporated in a study and therefore appear in the data pool.
The teacher education (TE) attribute is concerned with whether the research was carried out in a teacher education context. A typical context could be the pedagogically oriented subject studies given in the department of teacher education or the subject department, or the guided teaching practice carried out in a training school.
The discipline characteristic attribute concerns how the research foci are distributed among the disciplines. The classification values were biology, chemistry, physics and science. The classification was a straightforward process in most cases. Some research papers focused on several disciplines, such as biology and physics, and in these cases the paper was classified as a science paper. The number of papers in this category was small, and we judge that they did not have a significant impact on the reliability and validity of the distributions.
The research methodology attribute concerns the kind of research methodology used in the data collection and/or the data analysis of an empirical study. The categories were quantitative, qualitative, mixed and descriptive methodology. The classification was based on the information given in the methodology and research results sections. Some studies were marked as N/A, denoting that no empirical study design was applied. These could be, for example, a historical review of the evolution of the force concept over the centuries, a synopsis of a PhD thesis, or some other theoretical paper.
3.4. Using the Method
The team comprised four researchers who worked in pairs. First, each researcher read through all the papers assigned to him or her and did the preliminary coding. Next, the pair compared and discussed their codings and jointly agreed on which didactic foci described the paper best. If the pair could not reach consensus, or they were uncertain about the didactic foci of the paper, the whole research team read the paper and discussed it until a collective decision was reached. The research group worked in this way, volume by volume in the journal and strand by strand in the conference proceedings. We emphasize the importance of the collective analysis process: discussion with other researchers is essential to ensure the quality of the analysis.
It is important to read through the entire article; the classification should not be based only on the title and/or the abstract of the study. For example, even though we limited our data pool to the upper secondary and tertiary levels, some primary level education studies were included because they examined education from the tertiary level teaching practice point of view. On the other hand, STEM education covers several educational contexts—some of which were within the scope of our study, while others, such as geography or technology education, fell outside it.
On some occasions, the teacher education attribute was difficult to apply. The difficulties stem from the manifold roles that teachers and students have in the educational context. An in-service teacher can be in a student role in a continuing education course, and a tertiary level student can be in a teacher’s role in teaching practice. We included school education studies if their focus was the upper secondary or tertiary level. In contrast, a tertiary level teaching practice course focusing on the student teacher’s teaching in a primary level classroom was excluded unless the study focused on formal teacher education. Moreover, non-formal teacher activities, such as a summer camp in astronomy for physics teachers, were also excluded.
Other reasons for excluding papers were a lack of clarity about the educational level, a lack of empirical data, or the theoretical nature of the study, unless the topic was relevant to the secondary and tertiary levels. Justifying the educational level simply on the basis of the students’ age can be difficult. For example, a 16-year-old student could be at the lower or the upper secondary level depending on the country or their date of birth. Some students begin school earlier or later than expected, they can repeat or skip a grade, and immigrant students are sometimes placed at a lower educational level than their age group. Further, conceptual difficulties exist: the term “high school” could refer to the lower or the upper secondary level depending on the researcher’s vocabulary habits. If there was any unclarity, the paper was excluded. If there were no empirical data, it would be difficult to justify which educational level the study was focusing on; in many cases, this related to the theoretical nature of the paper. For example, a study on the historical evolution of quantum mechanical concepts was regarded as relevant to upper secondary and tertiary level education and was included in the data pool, whereas a study on the comprehensive school curriculum (ISCED 1–2 levels) was excluded.
5. Discussion
The results showed that the science education research in this data pool focused largely on students’ understanding, attitudes and learning results, and on the teacher’s impact on these aspects. Much less studied aspects were teacher characteristics, what happens in the classroom while students are studying, how teachers perceive students’ actions, attitudes and understanding, students’ feedback to the teacher, and teachers’ self-reflection. These findings are in line with our previous findings in computing education research [
10], science education research [
14], and engineering education research [
37].
This leads us to wonder why some aspects are studied more than others in our data pool (ESERA, NorDiNa). We sought an answer to this question, but our results did not provide a clear one. One explanation could be research policy and policy making. Ministries of Education, National Science Academies, and National Boards of Education can have a significant impact on what research is done at universities. Politicians need up-to-date information on teaching and learning, and research on these aspects may therefore be funded more often. Researchers might even reinforce this cycle by proposing topics that are most likely to be funded. It is also possible that the less studied aspects are published in other forums that focus more on general pedagogy, such as the European Conference on Educational Research (ECER). However, we consider that the ESERA conference should provide a forum broad enough for all kinds of research to appear at least to some extent. Alternative publication channels therefore do not seem a probable explanation, especially as the same aspects are absent from both the ESERA and NorDiNa forums.
Nonetheless, we are not alone in asking this question, as earlier researchers have also searched for an answer. For example, O’Toole and others [16] proposed that there may have been an ongoing generational shift of some kind, as the number of studies about teachers’ pedagogical content knowledge, constructivism and public understanding of science is declining while other topics are increasing. This seems plausible, and studies on the teacher’s pedagogical content knowledge were uncommon in our data pools as well. On the other hand, this is good news for those who are looking for less studied research topics: it is unlikely that these issues have been researched exhaustively, leaving nothing new to discover. On the contrary, there is much to be discovered, and one of the aims of our research was to identify the less studied aspects. A fundamental change in classroom dynamics is underway as learning environments are digitalized. Clearly, teachers’ pedagogical content knowledge and students’ study habits should be investigated much more.
The investigations published in NorDiNa and ESERA focus on the course and organizational levels, whereas society and international-level studies were scarce. This finding has some novelty, as this aspect has not been reported in earlier studies [16]. There are several tentative explanations, ranging from the ease and inexpensiveness of small-scale studies to the generalizability of the results. Readily collectable data and a faster publication pace are positive factors for small-scale studies. However, small scale is not a synonym for low quality: the results of a small-scale case study can be as relevant as those from a large survey. Society and international-level studies can be complex to arrange and expensive to carry out. Such studies require research groups in which each researcher has their own topic. Consequently, larger studies are potentially divided into smaller investigations with specific study designs, which in turn can increase the number of course- and organizational-level studies.
Our data pool was limited to specific educational levels, and therefore comparisons to earlier studies are tentative. We noticed that the studies in the NorDiNa journal focus more on school education and less on tertiary level education. The frequency of school education studies and the infrequency of continuing education studies in NorDiNa are in line with the study by O’Toole and others [16]. Most of the studies reported were carried out at the secondary and post-secondary levels (>40%), whereas primary (15%) and especially early childhood studies (2.5%) were less frequent. However, studies focusing on both the upper secondary and tertiary levels were rare, and the trend was similar in both forums. We conclude that this might indicate a blind spot in science education research in Europe.
We sought to establish whether there are any discipline-based characteristics. The distributions between chemistry and physics education were similar and concentrated on the same foci, but biology education papers seemed to focus more on conceptual understanding than chemistry and physics education research. This view is supported by Asshoff and Hammann [36], who showed in their study that the most frequently investigated topic was the pupils’ conceptual understanding, more specifically their understanding of genetics and ecology. We propose that this might originate from the subject matter itself, which has taken giant leaps in DNA technology, molecular biology, virology, climate change, cloning, and health care in recent decades. Our knowledge of human biology and environmental systems has been updated rapidly, and teachers, pupils and curricula have difficulty staying up to date. Consequently, research on conceptual understanding may play a more central role in biology education and is therefore a slightly more popular topic in our data pool as well.
We were also interested in determining whether the teacher education (TE) context appeared in the data pool in some way; in other words, we asked ourselves whether studies carried out in the TE context have any characteristic features. We found that the impact of the TE context appears in the data pool to some extent; however, the differences were not as apparent as one might have expected. The TE-related studies were distributed more evenly across the different foci in NorDiNa, but they were more frequent in ESERA. This could corroborate the above-mentioned finding that the NorDiNa journal focuses more on school education, with higher education studies being published elsewhere. This finding is new, as earlier researchers have not expressed much interest in the question.
Why are some methodologies used more frequently than others? Contrary to general belief, studies using quantitative methodologies are not as frequent as one might think, and qualitative methodologies have become predominant. There can be several reasons, such as the research tradition of using a particular methodology. For example, qualitative research has long been popular in the Nordic countries, perhaps partially due to the influence of Marton’s tradition in Sweden. We noticed the same trend in the ESERA data pool when we separated the Nordic studies from the other European studies. The infrequency of mixed-methodology studies is presumably due to their complexity, as they call for know-how in both quantitative and qualitative methodologies. These findings are in line with previous researchers’ findings [6,7,8,16], but dissimilarities also appeared. For example, others reported a small number of theoretical papers, but in the NorDiNa data pool we noticed a notably large proportion, reaching 19% of all studies. We took a closer look at the data pool and found that a curriculum review was underway in Norway during 2005–2013, which alone explained the anomaly. When we discounted this distortion effect, the proportion of descriptive and theoretical studies was small and in line with the other studies in the field. However, the editorial line of the NorDiNa journal may have some additional impact on the frequency of theoretical papers: the editorial board encourages authors to send descriptions of their ongoing projects and short abstracts of dissertations in the field, wherein methodology descriptions might be limited or absent.
The research team also discussed the possible impact of the editorial line on the distribution of papers in different forums. To some extent, the topics published can indeed be influenced by journal editors, and thus, with changes in editorship, the emphases of a journal or conference may change simply because of the views of the editors. However, we see this as merely a theoretical note, as journal editors do not change very often, and the editorial line is better seen as a joint vision of the editorial board rather than that of one editor. It is more plausible that the editorial board will try to keep the editorial line consistent from year to year despite changes in editorship. The situation might be different for conferences, which typically have a thematic emphasis that changes from conference to conference and can have some impact on the versatility and variety of the research papers.
6. Recommendations
We advise authors to pay more attention to internal coherence and clarity in their manuscripts. The didactic foci were not always clearly described in the paper, not even in the journal articles. The theoretical framework might introduce several foci, but not all of them were reported in the results section, or one focus was emphasized more than the others. In several cases, we noticed a disparity between the title, the stated research questions, and what the results were about. Consequently, a classification method based solely on the title, the keywords, the abstract, or the research questions is apt to lead to misinterpretation. Comprehensive reading is therefore suggested; the approach is time consuming but improves the quality of the analysis.
The theoretical underpinnings of the DFCM lie in school education, and it therefore works best for illuminating the relations between the teacher, the student, and the content in a formal education context. In the future, the DFCM will be further developed to cope better with informal learning phenomena. The resolution of the DFCM could also be improved with more subcategories. Currently, some interesting aspects, such as the teacher’s pedagogical knowledge and evaluation, do not appear in the results to an extent that reflects their significance in education. In particular, the evaluation aspect could be addressed better.
Based on the findings of this study, it seems that science education research could be more versatile and comprehensive, as it currently misses some important aspects from the didactic triangle point of view. The polarization of research onto a narrow set of aspects may not have been this apparent to the academic community before our study.