Article

Didactic Focus Areas in Science Education Research

1 Faculty of Educational Sciences, University of Helsinki, Siltavuorenpenger 5A, 00014 Helsinki, Finland
2 Department of Computer Science, School of Science, Aalto University, P.O. Box 15400, 00076 AALTO, Finland
* Author to whom correspondence should be addressed.
Educ. Sci. 2019, 9(4), 294; https://doi.org/10.3390/educsci9040294
Submission received: 24 November 2019 / Revised: 8 December 2019 / Accepted: 10 December 2019 / Published: 12 December 2019

Abstract

This study provides an overview of the didactic focus areas in educational research in biology, chemistry and physics, seeking to identify the focus areas that are investigated frequently and those that have been studied rarely or not at all. We applied the didactic focus-based categorization analysis method (DFCM), which is based on an extension of the didactic triangle. As the data set, we used 250 papers published in the Nordic Studies in Science Education (NorDiNa) between 2005 and 2013, and the European Science Education Research Association (ESERA) 2013 conference proceedings covering education at upper secondary and tertiary levels. The results show that the teacher’s pedagogical actions and the student–content relationship were the most frequently studied aspects. On the other hand, teachers’ reflections on the students’ perceptions and attitudes about goals and content, and teachers’ conceptions of the students’ actions towards achieving the goals were studied least. Irrespective of the publication forum, the distributions of foci to different categories were quite similar. Our historical analysis complements recent studies in the field, as it is based on a theory-driven categorization system instead of the data-driven approaches used by previous researchers. Moreover, our further observations on more recent publications suggest that no significant changes have taken place, and therefore wider discussion about the scope and the coverage of research in science education is needed.

1. Introduction

A holistic understanding of educational research is important for researchers and educational developers, as well as for anyone mentoring and tutoring pre- and postgraduate students. Gaining such knowledge calls for years of systematic research on the different fields of education. Due to the scope of such work, most researchers specialize in a narrow area of expertise in order to gain deep insight in that area. However, for the research community as a whole, achieving a versatile and holistic understanding of the field is a much more feasible goal. This may seem self-evident, but in this paper, we wish to raise the question of whether this actually holds.
A holistic understanding can be interpreted in several different ways. It can concern content, i.e., whether teaching and learning of various topical areas and key concepts in science are well covered in the research literature, e.g., [1]. It can concern cross-cutting themes in the education of a specific topic, e.g., incorporating sustainability [2], teaching argumentation [3] or scaffolding in the education of a science topic [4]. It can also concern how that specific topic, e.g., electricity, is taught at various levels of education, from pre-school education to universities and teacher training [1,5]. Moreover, it can concern various aspects of the instructional process, including the relevant actors, their relations and the activities carried out. Additional points of view can cover research in different countries, e.g., [6,7,8], and their educational systems and cultures, as well as whether the research focuses on activities carried out in a single classroom session or course, or in some wider context, such as curriculum or program-level activities or even national-level initiatives.
In this paper, our focus is on surveying the science education literature from the point of view of the instructional process. We have taken a holistic view of the process by considering the most relevant actors (teachers, students and learning goals/content) and their various relations in the didactical triangle, which we call didactic foci. We augmented the results by also considering the level of education, the breadth of context, and the research methodologies used.
Previous analyses of focus areas in science education research (SER) have been based on data-driven approaches, in which the categorization frameworks were derived from the existing data. Such methods obviously show what research is being published. However, the major limitation of such approaches is that they can reveal only those aspects that are found in the data. In contrast, our approach is theory-driven. We identified categories based on what a theoretical framework of the instructional process suggests: what could be found, or even should be found? Theory-driven analysis can thus reveal gaps or overlooked areas in the research literature in a way which is not generally possible in data-driven analyses. Moreover, data-driven analysis methods typically map rare findings into categories such as “other”, considering them unimportant. In our approach, we are specifically interested in identifying these areas, because they allow us to generate relevant new research questions and points of view on researching the instructional process.
Our methodological framework is based on the didactic triangle [9] and it has been used successfully to analyze many research papers in computing education [10], engineering education [11], and some venues in science education [12,13,14,15]. The previous findings have clearly indicated that in these areas, research is heavily biased towards investigating the relations between students and learning goals/content, such as students’ motivation, initial knowledge of the subject domain, learning practice and learning results. Another aspect widely covered is the various pedagogical actions designed to support learning. While these are certainly highly relevant topics for investigation, it is somewhat surprising that some aspects have been researched very little. For example, the research cited above has shown that little research has been published about in-service teachers’ background, their relation to students, or their perceptions of students’ learning or the learning content. However, teachers are a significant factor in the instructional process and thus gaps in this area are worth investigating.
The goal of this study is to extend the previous analyses in SER by identifying the most and the least studied didactic foci in two important publication venues in Europe and how these relate to other attributes of research, i.e., discipline, the scope of data collection, educational level, teacher education, and applied research methodology. This study is based on the previous analysis of the papers published in the Nordic Studies in Science Education (NorDiNa) journal in the years from 2006 to 2013 [14], which is the leading science education publication in the Nordic countries (Denmark, Finland, Iceland, Norway, and Sweden). That paper revealed several interesting findings about the most and the least studied aspects, and we became curious about whether similar trends appear more widely in European forums and perhaps beyond. Therefore, we selected another key venue for further analysis, the latest European Science Education Research Association (ESERA) conference proceedings in the same period. Our historical survey also complements the recent survey by O’Toole and others [16], which analyzed some similar aspects in SER papers in different venues, while using a different categorization system over the period 2005–2014.
Our main research questions were:
  • What are the didactical focus areas in science education research in our data pool?
  • How does the research appear in different disciplines (biology, chemistry, physics)?
  • Which scopes for data collection are used (course, organization, society, international)?
  • How is the research distributed across different educational levels (International Standard Classification of Education, ISCED)?
  • How does the teacher education context appear in the data pool?
  • What research methodologies are used?
We see that our results could benefit the field in several ways. For individual researchers, our work can pinpoint areas where there is space for interesting and relevant work on a research topic. Moreover, the applied analysis framework helps to build new research questions which focus on identified research gaps or little-studied areas. For instance, Kinnunen and others [10] used the method to categorize research papers on phenomena related to dropping out of an introductory programming course and found that all papers focused only on student-related issues (e.g., students’ characteristics and what students do) and left other aspects, such as the teachers’ role in the teaching and learning process or curriculum planning, unstudied. While our analysis in this paper does not focus on any sub-areas of physics, chemistry and biology, the method is straightforward enough to be applied in some narrow areas to discuss gaps in current research and generate research ideas. On the other hand, the results could be used by educational decision-making bodies to identify aspects of the instructional process which merit further investigation on a wider scale and thus could be used for targeted research funding.

2. Findings from Earlier Studies

Below, we discuss relevant related work looking separately at various dimensions of how the SER literature has been analyzed.

2.1. Foci Areas

Tsai and Wen [6], Lee and others [7], and Lin and others [8] analyzed 2661 papers published in the years 1998–2012 in the International Journal of Science Education (IJSE), Science Education (SE), and Journal of Research in Science Teaching (JRST). They developed the following categorization scheme, hereafter called Tsai’s and Wen’s categorization method, which was later used by several other researchers.
  • Teacher Education;
  • Teaching (e.g., teacher cognition, pedagogical content knowledge, leadership, teacher behaviors and strategies);
  • Learning—Students’ Conceptions and Conceptual Change (Learning-Conception);
  • Learning—Classroom Contexts and Learner Characteristics (Learning-Context), e.g., student motivation, background factors, learning and laboratory environments, learning approaches, student–teacher and student–peer interactions, soft skills in learning;
  • Goals and Policy, Curriculum, Evaluation, and Assessment;
  • Cultural, Social and Gender Issues;
  • History, Philosophy, Epistemology and Nature of Science;
  • Educational Technology;
  • Informal Learning.
The three most common topics in the period 2008–2012 were Teaching (19%), Learning-Conception (15%) and Learning-Context (37%). There were significant changes over the 15-year period. The Teaching aspect almost tripled from 7% to 19%, with the biggest increase happening in IJSE. In addition, the Learning-Context aspect doubled from 18% to 37%. The combined share of Learning-Conception, Goals, Policy and Curriculum, and Culture, Social, and Gender dropped from 50–60% to roughly 25% in all journals.
Tsai and others [5] continued the work with a slightly modified analysis. They analyzed 228 papers published in four major journals (IJSE/JRST/SE/Research in Science Education (RISE)) during 2000–2009 that reported work on Asian students (including Turkish students). They used Tsai’s and Wen’s [6] categories; however, in this case, each paper could be included in multiple categories, whereas in the earlier works only the best-fit category for each paper was counted. Despite the difference, studies of Teaching were equally common compared with Tsai’s and Wen’s [6], Lee’s and others’ [7], and Lin’s and others’ [8] studies. Learning-Conception studies were much more frequent, and over half of the papers were included in this category; no decrease was visible during this period. Learning-Context was the most frequent category, with more than two-thirds of the papers being included in this category. There were no major differences between the four journals and no clear trends were visible during the 10-year period.
More recently, O’Toole and others [16] analyzed ten years (2005–2014) of abstracts from IJSE/JRST/SE/RISE and Studies in Higher Education. They used a different, but overlapping categorization method, identifying papers in five major categories: Scientific literacy (52%), Teaching methods (45%), Learning focus (42%), Teachers (39%), and Relation between Science and Education (16%). Comparing these results with the former works using Tsai’s and Wen’s categories [6] is difficult because there is a lot of overlap. For example, scientific literacy was the largest group of papers, but it included works which would also fit within the Learning-Context category, as well as the Goals and Policy category in the other system. Further, the Teachers category included teacher training and professional development, which accounted for half of the papers in this category. Finally, each abstract could be categorized within several categories, further complicating any numeric comparison. An interesting finding was that there were substantial differences between the journals, as well as between the periods 2005–2009 and 2010–2014, which likely reflect differences and changes in journal editorial policies.
Another similar but smaller-scale study was done by Cavas [17], also applying Tsai’s and Wen’s method [6], who analyzed 126 papers published in the Science Education International (SEI) journal between the years 2011 and 2015. Considering the research topics, the SEI journal has a different profile from IJSE/JRST/SE. The most common topics were Teacher Education (23%) and Learning-Conception (20%), followed by Teaching, Learning-Context, Goals, Policy, Curriculum and Culture, Social and Gender, each with 10–11% of the total.
All these studies clearly indicated that pedagogical activities and conceptions, as well as students’ learning activities and results, have had a major role in research, with well over half of the papers falling into these areas, whereas other topics have much less emphasis. For example, while educational technology could be an important tool for supporting learning, only 3–6% of the papers considered it [7,9]. Another little-studied area is informal learning.
A very different approach was used by Chang and others [18]. They analyzed an overlapping data pool: papers published in IJSE, JRST, SE and RISE between 1990 and 2007, also focusing on the research topics but using scientometric methods, which automatically clustered research papers based on the most common terms in the data. Since the clusters were quite different from Tsai’s and Wen’s [6] categories, no direct comparison with their results is possible. However, the largest clusters in this work were Conceptual change & concept mapping (40%), Nature of science & socio-scientific issues (14%), Professional development (11%), and Conceptual change & analogy (11%). The remaining clusters (Scientific concept, Instructional practice, Reasoning skill & problem solving, Design based & urban education, Attitude & gender) each covered a small proportion of the papers. During this period, a clear increase appeared in the Professional development and Nature of science & socio-scientific issue clusters, while the proportion of the Scientific concept and Reasoning skill & problem solving clusters decreased to a fraction of their original size.

2.2. Discipline Characteristics and Trends

The previous surveys covered science education holistically. Two other major surveys have focused on chemistry and biology education (we did not find a similar survey for physics education).
Teo and others [19] studied chemistry education research papers using Tsai’s and Wen’s [6] categorization method. The data pool covered 650 papers published from 2004 to 2013, of which 446 were from two major journals in the field, Chemistry Education Research and Practice (CERP) and Journal of Chemical Education (JCE), and the other 204 papers from IJSE, JRST, SE, and RISE. The most popular topics were Teaching (20%), Learning-Conception (26%) and Learning-Context (19%), whereas Goals, Policy, Curriculum, Evaluation and Assessment (16%), and Educational Technology (11%) were studied less. The rest of the categories were small, amounting to a few percent each. Trends emerging from their data were an increase in papers focusing on Teaching and Learning-Conception. They observed no major differences between the chemistry education research journals and the SER journals.
Gul and Sozbilir [1] studied 1376 biology education papers published in eight journals, including IJSE, SE, JRST and RISE, as well as the Journal of Biology Education (JBE), Journal of Science Education and Technology (JSET), Research in Science & Technological Education (RSTE) and Studies in Science Education (SSE), in the years 1997–2014. They used their own categorization scheme and found that the most common aspects in this area were Learning (21%), comprising learning results, learning styles and misconceptions; Teaching (19%), comprising effects on achievement, attitudes, scientific process skills, and comparisons of different methods; and Studies on attitude/perception/self-efficacy (17%); followed by Computer-aided instruction (9%), Studies on teaching materials (8%), Nature of science (7%), and several small categories. They did not report any trends concerning these results.
Thus, here too, teaching and learning comprise a large proportion of the papers. Interestingly, educational technology was more commonly researched in both chemistry education and biology education than in the previous analyses focusing on the science education field overall.

2.3. Scope of Data Collection

Earlier analyses of the literature present very little information about the scope of the data collection. Gul and Sozbilir [1] are among the few who report sample sizes. They report that sample sizes were most commonly in the ranges of 11–30 (16%), 30–100 (23%), 101–300 (20%) or 301–1000 (11%) participants. Large-scale (N > 1000) and small-scale studies (N ≤ 10) were rare. Moreover, 17% of the papers analyzed did not report sample sizes. Similarly, there is little information about the organizational units from which the data were collected, e.g., a single course, a degree program, or a national or international survey. Teo and others [19] reported on the educational levels and countries where data had been collected, but there is no information about the organizational scope.

2.4. Educational Level

A few survey papers have reported information about the sample population in the target papers. To allow these results to be compared, we have presented them in terms of the ISCED reference framework, which is used to define the educational levels in different educational systems and disciplines. It is used by Eurostat, UNESCO and the OECD, and it has been implemented in all EU data collections since 2014.
In their analysis of biology education research papers, Gul and Sozbilir [1] found that 20% of the study targets were at ISCED 1–2 levels (grades 1–8), 34% at ISCED 3 level (grades 9–12), and 23% at ISCED 6–7 levels (undergraduate). Further, educators were in focus in 18% of the studies, whereas postgraduate students (ISCED 8), parents, pre-school children, administrators and others were rarely among the target population. Some papers reported more than one sample type. The study by Teo and others [19] of chemistry education research papers gives different results: ISCED 2 level (years 7–9) in 14%, ISCED 3 level (years 10–12/13) in 25%, and ISCED 6–8 levels in 54%. Preservice teachers were used in 8% of research papers and in-service teachers in 12%. Pre-school and elementary school samples were rare.
The study by Tsai and others [5] (of science education papers in IJSE/JRST/SE/RISE with Asian students) reported that ISCED 1 level was in focus in 19% of the studies, ISCED 2 level in 33%, ISCED 3 level in 41%, ISCED 6 level (colleges) in 14%, and ISCED 6–7 (pre-service teachers) in 8% of the studies. Again, in some papers more than one sample type was used. They did not find major differences between the journals. This is in line with the work of O’Toole and others [16], which categorized the same journals, but all papers, within a somewhat overlapping period: 31% secondary (ISCED 2 and 3), 18% post-secondary (ISCED 6 and 7), 17% primary (ISCED 1) and 2% early childhood (ISCED 0).
Lin and others [4] reviewed papers focusing on scaffolding for science education and reported that over half of the studies focused on the upper secondary level (ISCED 3). On the other hand, in their review of intervention studies on technology-assisted instruction, Wu and others [20] reported that in half of the studies the sample was from the tertiary level (ISCED 6–8), 21% from the primary level (ISCED 1) and 20% from the lower and upper secondary levels (ISCED 2–3).
Here, it seems that the results are mixed. Some fields focus more on school-level education, whereas others, such as chemistry, focus more on higher education. Other target groups, such as student teachers or in-service teachers, are studied less frequently, or are not reported separately.

2.5. Research Methodology

Tsai and Wen [6], Lee and others [7], and Lin and others [8] analyzed the research methodologies that have been used. They identified five categories, including empirical work, theoretical work, reviews, position papers and others. Overall, some 90% of all papers were considered to be empirical research and the other categories covered the balance. There were no clear differences between the IJSE, JRST and SE journals in this aspect.
Gul and Sozbilir [1], in turn, found that 53% of biology education research papers used qualitative designs. A closer analysis revealed that more than half of these were descriptive papers and case studies; the rest of the papers were distributed among a wide variety of more advanced methods. Quantitative approaches were used in 43% of the papers, and less than one-third of these used an experimental method and the rest were non-experimental, e.g., descriptive, surveys, comparative or correlational. Only 4.2% of the papers used a mixed methods design. Lin and others [8] reported a similar finding that qualitative methods were the dominant analysis method in SER papers. This matches the findings of O’Toole and others [16].
Teo and others [19] noticed that more than half the chemistry education research papers (52%) used a mixed methods approach, while a pure qualitative approach was used in 22% of the papers and a pure quantitative approach in 26% of the papers.
In summary, it is clear that a large proportion of the papers in SER included empirical research and both quantitative and qualitative methods were widely used. Table 1 summarizes the previous analyses.

3. Materials and Methods

3.1. Study Target

The focus of this study is on biology, chemistry and physics education research papers at the upper secondary and tertiary levels (ISCED 3, 6–8) published in the NorDiNa journal from 2005 to 2013 and in the ESERA conference proceedings in the year 2013. These educational levels and disciplines were selected as they match the scientific background of the authors of this paper. The researchers are experienced tertiary level teacher educators, and they have expertise in biology, chemistry, physics, computer science and subject pedagogy. Jarkko Lampiselkä specializes in chemistry and physics didactics instruction and Arja Kaasinen specializes in biology didactics instruction at the University of Helsinki. Both Päivi Kinnunen at the University of Helsinki, and Lauri Malmi at Aalto University, specialize in computing education research and they are experts in the analysis methodology used.
The NorDiNa data pool comprises 138 papers from the years 2005–2013, of which 52 were included in our data pool. The ESERA proceedings papers are based on the presentations at the ESERA 2013 conference. The analysis started in 2015 and, at that time, the proceedings from 2013 were the latest release available. A total of 960 manuscripts were submitted, of which 339 were included in the proceedings [21], and 138 in our data pool. The data pool from the ESERA proceedings is comprehensive and extensive even though the data are from one year only.
The NorDiNa journal was selected as it is the leading peer-reviewed science education research journal in the Nordic countries. The ESERA conference proceedings were selected as they come from one of the largest SER conferences in the world, which attracts over 1500 participants and over 1000 presentation proposals each time the conference is held. Neither of these publication venues had been analyzed in related works before this study. These venues provide an excellent overview of the area of educational research, and show the blind spots, if such exist.

3.2. Analysis Method

We used the didactic focus-based categorization method (DFCM) to analyze the papers. It has its origin in J. F. Herbart’s didactical triangle as presented by Kansanen and Meri [22] and Kansanen [9] and it describes the formal instructional process in a holistic manner (Figure 1). The triangle presents the relationships between the three main actors in the instructional process: the content to be learned, the student and the teacher.
The didactic triangle was further developed by Kinnunen [23] and Kinnunen and others [14] to consider the teacher’s impact on the student–content relationship, the teacher’s self-reflection and the student’s feedback (shown as arrows 7 and 8 in Figure 1). In addition, as a part of further development, the didactic triangle was placed in a wider educational scope, which enabled the researchers to discuss the instructional processes at the institution, society and international levels. That is, whereas the original didactic triangle described the instructional process from a classroom point of view, the extended version of this triangle can also be used to describe instructional phenomena that take place at the educational organization level, society level, or even the international level. For instance, the extended didactic triangle can capture phenomena such as the goals a society sets for the level of education for its citizens (node 1 in Figure 1) and the degree to which citizens achieve these goals (arrow 5 in Figure 1).
The improved didactic triangle was used as the basis for the DFCM. It includes eight foci, of which two have sub-foci (Table 2). The sub-foci were developed to improve the resolution of the DFCM. The development of the categories based on the didactic triangle has been described in more detail in our previous publications [10,11,14,23].
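To illustrate the structure of the coding scheme, the following minimal sketch in Python (not the authors’ implementation) shows one way the DFCM focus codes could be represented and tallied across coded papers. The focus labels are paraphrased from the running text, the full definitions are given in Table 2, and the paper identifiers are invented for illustration.

from collections import Counter

# Partial mapping of DFCM focus codes to labels paraphrased from the text;
# Table 2 contains the authoritative definitions of all foci and sub-foci.
DFCM_FOCI = {
    "F2":   "Student characteristics",
    "F3":   "Teacher characteristics",
    "F4":   "Teacher-student relationship",
    "F5":   "Student-content/goals relationship",
    "F5.1": "Students' preconceptions and attitudes",
    "F6":   "Teacher-content/goals relationship",
    "F7":   "Teacher's impact on the student-content relationship",
    "F7.3": "Teacher's pedagogical actions",
    "F7.4": "Teacher's self-reflection",
    # ... remaining foci and sub-foci as defined in Table 2
}

def tally_foci(coded_papers):
    # Count how often each focus code occurs; a paper may carry several codes.
    counts = Counter()
    for paper in coded_papers:
        counts.update(paper["foci"])
    return counts

# Hypothetical coded records (identifiers invented for illustration).
papers = [
    {"id": "nordina-2007-12", "foci": ["F5", "F7.3"], "discipline": "chemistry"},
    {"id": "esera-2013-238",  "foci": ["F5.1"],       "discipline": "biology"},
]
print(tally_foci(papers))  # Counter({'F5': 1, 'F7.3': 1, 'F5.1': 1})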

3.3. Additional Classification Attributes

In addition to the foci analysis, we also investigated the papers from the educational scope, educational level, teacher education, discipline and research methodology points of view.
The educational scope attribute is based on the educational context in which the research was carried out. The attributes were the course, organization, society, and international levels. The course attribute means that the study was carried out in one or a couple of classes at the same educational institution. The organization attribute means that the study was carried out in two or more classes at two or more educational institutions. The society attribute means that the study was carried out in two or more regions of the country. The international attribute means that the study involved two or more countries. The educational scope of the studies analyzed may reveal something about how generalizable the results are; on the other hand, many education-related phenomena manifest themselves only if a wide enough scope is applied to the study. For instance, trends related to educational policy making and its consequences for the national science education curriculum require stepping out of the single-course context.
The educational level attribute concerns the school level at which the study was carried out. The classifying attributes were primary (ISCED 1), lower secondary (ISCED 2), upper secondary (ISCED 3) and tertiary (ISCED 6–8). On some occasions, the lower levels were also incorporated in the study and therefore appear in the data pool.
The teacher education (TE) attribute is concerned with whether the research was carried out in a teacher education context. A typical context could be the pedagogically oriented subject studies given in the department of teacher education or the subject department, or the guided teaching practice carried out in a training school.
The discipline characteristic attribute concerns how the research foci are distributed among the disciplines. The classification attributes were biology, chemistry, physics and science. The classification was a straightforward process in most cases. Some research papers focused on several disciplines, such as biology and physics, and in these cases the paper was classified as a science paper. The number of papers in this category was small and we see that they did not have a significant impact on the reliability and validity of the distributions.
The research methodology attribute concerns the kind of research methodology used in the data collection and/or the data analysis of an empirical study. The categories were quantitative, qualitative, mixed and descriptive methodology. The classification was based on the information given in the methodology and research results sections. Some studies were marked as N/A, denoting that no empirical study design was applied. These could be, for example, a historical review of the evolution of the force concept over the centuries, a synopsis of a PhD thesis, or some other theoretical paper.
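As a companion to the previous sketch, the short example below (again a hypothetical Python sketch, not the authors’ tooling) shows how these additional attributes could be attached to the coded paper records and cross-tabulated, for instance discipline against methodology, in the spirit of Tables 4–8. The attribute vocabularies follow the definitions above, while the records themselves are invented.

from collections import Counter

SCOPES      = ("course", "organization", "society", "international")
LEVELS      = ("ISCED 1", "ISCED 2", "ISCED 3", "ISCED 6-8")
DISCIPLINES = ("biology", "chemistry", "physics", "science")
METHODS     = ("quantitative", "qualitative", "mixed", "descriptive", "N/A")

def crosstab(papers, row_attr, col_attr):
    # Count papers per (row, column) pair, e.g., discipline vs. methodology.
    table = Counter()
    for paper in papers:
        table[(paper[row_attr], paper[col_attr])] += 1
    return table

# Hypothetical records; 'te' marks the teacher education context.
papers = [
    {"discipline": "physics",   "scope": "course",       "level": "ISCED 3",
     "te": False, "method": "qualitative"},
    {"discipline": "chemistry", "scope": "organization", "level": "ISCED 6-8",
     "te": True,  "method": "mixed"},
]
print(crosstab(papers, "discipline", "method"))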

3.4. Using the Method

The team comprised four researchers who worked in pairs. First, each researcher read through all the papers assigned to him or her and did the preliminary coding. Next, the pair compared and discussed their coding, and jointly agreed on which didactic foci described the paper best. If the pair could not reach consensus or were uncertain about the didactic foci of the paper, the whole research team read the paper and discussed it for as long as needed to reach a collective decision. The research group worked in this way volume by volume in the journal and strand by strand in the conference proceedings. We emphasize the importance of the collective analysis process. Discussion with other researchers is essential to ensure the quality of the analysis process.
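For readers who wish to replicate this kind of paired coding, the fragment below is a minimal, hypothetical sketch of how disagreements between two coders could be flagged for joint discussion; the paper identifiers and codings are invented.

def disagreements(coding_a, coding_b):
    # Return the ids of papers whose focus-code sets differ between two coders.
    return [pid for pid in coding_a
            if set(coding_a[pid]) != set(coding_b.get(pid, []))]

# Hypothetical codings by two researchers for three papers.
coder_1 = {"p1": ["F5"], "p2": ["F7.3"], "p3": ["F5", "F6"]}
coder_2 = {"p1": ["F5"], "p2": ["F7.4"], "p3": ["F5", "F6"]}
print(disagreements(coder_1, coder_2))  # ['p2'] -> discussed until consensus is reached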
It is important to read through the entire article; the classification should not be based only on the title and/or the abstract of the study. For example, even though we limited our data pool to the upper secondary and tertiary levels, some primary level education studies were included because they approached the education from the tertiary level teaching practice point of view. On the other hand, STEM education covers several educational contexts, some of which were within the scope of our study, while others, such as geography or technology education, fell outside it.
On some occasions, the teacher education attribute was difficult to apply. The difficulties arise from the manifold roles that teachers and students have in the educational context. An in-service teacher can be in a student role in a continuing education course, and a tertiary level student can be in a teacher’s role in teaching practice. We included school education studies if their focus was the upper secondary or tertiary level. In contrast, a tertiary level teaching practice course focusing on the student teacher’s teaching in a primary level classroom was excluded unless the study focused on formal teacher education. Moreover, non-formal teacher activities, such as a summer camp in astronomy for physics teachers, were also excluded.
Other reasons for excluding papers were a lack of clarity about the educational level, a lack of empirical data, or the theoretical nature of the study, unless the topic was relevant to the secondary and tertiary levels. Justifying the educational level based simply on the students’ age can be difficult. For example, a 16-year-old student could be at the lower or the upper secondary level depending on the country or the date of birth. Some students begin school earlier or later than expected, they can repeat or skip a grade, and immigrant students are sometimes placed at a lower educational level than their relevant age group. Further, conceptual difficulties exist. The term “high school” could refer to the lower or upper secondary level depending on the researcher’s vocabulary habits. If there was any ambiguity, the paper was excluded. If there were no empirical data, it would be difficult to justify which educational level the study was focusing on. In many cases, this related to the theoretical nature of the paper. For example, a study on the historical evolution of quantum mechanical concepts was regarded as relevant to upper secondary and tertiary level education and was included in the data pool, whereas a study on the comprehensive school curriculum (ISCED 1–2 levels) was excluded.

4. Results

4.1. General Trends

The distributions within ESERA and NorDiNa are in line with each other. The most frequently studied aspects were the student–content/goals relationship (Focus 5, abbreviated F5) and teachers’ impact on this relationship (F7) (see Table 3).
Gul and Sozbilir [1] refer to Asshoff and Hammann [36], who found that the distributions of biology education research papers at the ERIDOB conference and in the IJSE differ considerably from each other. The publications from the ERIDOB conference focused more on learning (similar to focus 5 in our study), whereas the papers in the IJSE were distributed more equally between categories. In general, our finding shows that the ESERA conference and the NorDiNa journal are more closely in line with each other. However, the students’ learning (focus 5) is to some extent more frequently reported in the ESERA conference (51%) than in the NorDiNa journal (42%). Gul and Sozbilir explained the difference by noting the different kinds of audiences: journals are aimed at international readers, whereas the aim of conferences is sharing recent findings among the participants. Taking this as a hypothesis, it seems that the audiences of NorDiNa and ESERA do not differ much from each other.
The least studied aspects were student characteristics (F2), teacher characteristics (F3), and the teacher–student relationship (F4). The proportion of the most frequently studied aspects was in line with the studies produced by Tsai and Wen [6], Lee and others [7], Lin and others [8], Tsai and others [5], Cavas [17] and Chang and others [18], but to some extent contradicted them regarding the less-studied aspects. For example, students’ understanding was one of the most frequently studied aspects in our study, but infrequent in the earlier studies. On the other hand, the educational goal was one of the less-studied aspects in our study, the above-mentioned studies, and Cavas’s [17] study. Nevertheless, we were not able to find a coherent pattern. In addition, we were not able to compare the missing gaps, as the earlier studies did not pay attention to this aspect.
On the other hand, comparison with the study by O’Toole and others [16] is challenging, as they used a categorization system more abstract than ours. For example, they use ‘Scientific Literacy’ as a main category but divide it into several sub-categories, some of which resemble our foci areas, such as curriculum, while several others fall beyond the scope of our categorization system. Some other similarities also exist; for example, they have ‘Teacher’ and ‘Student’ categories as we have, but their category system comprises more sub-categories than ours (three in the student foci areas 5.1–5.3 and four in the teacher foci areas 7.1–7.4). Despite the differences in the categorization systems, they also found that teachers and students are among the more frequently studied aspects. They show that teaching strategies are one of the more frequently studied aspects, which is similar to our finding (focus 7.3, teacher’s pedagogical actions).

4.2. Discipline Characteristics

The results show (Table 4) that there are differences between the ESERA and NorDiNa forums as well as among disciplines. In general, the research in ESERA seems to be more versatile than in NorDiNa. There are fewer empty cells in Table 4 in the ESERA data pool than in the NorDiNa data pool, and hence the research in ESERA has better coverage than in NorDiNa. On the other hand, it can be said that the research is more focused in the NorDiNa forum than in the ESERA forum. The difference is even more apparent if we compare the distributions of the subject-oriented papers between ESERA and NorDiNa and leave out the science-oriented papers. If we compare the subject orientation within and between forums, it seems that the distribution of the biology-oriented papers differs from the others to some extent. There, the research seems to focus more on pupils’ preconceptions and attitudes (F5.1) and on the teachers’ relation to the content (F6) than in the other disciplines.
The distribution of chemistry education papers is in line with the findings of Teo and others [19]. They found that conceptual understanding is the most popular topic, reaching 26% of all papers, which is similar to our data pool. Further, they found that teachers’ teaching was the second most frequently studied focus area, which is also in line with our finding. The similarity goes even deeper, as Teo and others show that one of the most frequently studied specific focus areas was the participants’ attitudes and beliefs, which is in line with our finding. Even though the number of chemistry education research papers in the study by Teo and others (N = 650) is much bigger than in ours, the many similarities suggest that research in chemistry education focuses on the same foci in different publication forums. Similar coherence appears between our data and Gul and Sozbilir’s [1] study on biology education. Common to their study and ours is the emphasis on the students’ understanding, their learning results, and the teachers’ teaching.

4.3. Educational Scope

Table 5 shows that course-level studies were more frequent in ESERA than in NorDiNa. In general, it seems that the science education studies in our data pool focused on smaller course- and organization-level studies, whereas society- and international-level studies are rare. The earlier studies do not provide a comprehensive database for comparison, as few studies have paid any attention to this topic. Some studies give information on the educational levels, such as Gul and Sozbilir [1], Teo and others [19] and O’Toole and others [16]. However, this is a different classification attribute than the educational scope. Nor are the sample sizes [1] comparable with the educational scope, as a small sample size does not self-evidently indicate a course-level study, nor a large sample size an international-level study; both depend on the practical arrangements and the context of the study. Hence, the following finding has some novelty, as it has not been well reported before our study. First, most of the studies in ESERA focus on the course level and less on the other levels, whereas organization-level studies were the most frequent in the NorDiNa journal. Secondly, international-level studies are rare in both publication forums.
There are several plausible explanations for the differences, but we propose that the foundational difference lies in the characteristic features of the publication channels. It is typical among SER forums that the quality requirements and the threshold for a manuscript to be published in a journal are higher than in conference proceedings. This means that studies with smaller sample sizes and a more local context are more likely to exceed the threshold of conference proceedings than that of a journal. More comprehensive data and bigger sample sizes are needed in order to meet the journal threshold, which is more readily achieved with a study design covering two or more classrooms or schools. This trend produces a larger proportion of organization- and society-level studies in NorDiNa.
The small number of international-level studies may originate from the reporting style. When international-level studies are reported as a whole, they tend to be massive reports and are therefore published as monographs or similar publications, such as a PISA or a TIMSS report. When international-level studies comprise research teams from different countries, such as in EU projects, it is plausible that each country reports its own case studies, which are therefore classified as either society-level or organization-level studies. Consequently, international-level studies may be rare, but this does not mean that international co-operation is scarce. Nevertheless, international-level studies are underrepresented in these publication forums, which represents a gap in the data.

4.4. Educational Level

Table 6 shows that tertiary level studies are more frequent in ESERA than in NorDiNa, whereas secondary level studies are more frequent in NorDiNa than in ESERA. Studies focusing on both the upper secondary and tertiary levels, and from primary to tertiary, are uncommon in both forums. Our data pool has been restricted to the upper secondary and tertiary levels and therefore the comparison to earlier studies is limited. The distribution of studies in the NorDiNa journal seems to be more in line with the studies by Gul and Sozbilir [1], Tsai and others [5] and Lin and others [8] than with the ESERA proceedings. The conclusion is tentative, but it seems that NorDiNa focuses more on school education whereas the tertiary level is better represented in the ESERA conference proceedings. Further, this is corroborated by the findings of O’Toole and others [16] that studies focusing on secondary level education are the most frequent.
It seems that the ESERA conference focuses more on tertiary level education than the NorDiNa journal. This could be explained by the different audiences. Conferences are aimed at a more research-oriented audience and the participants are typically researchers working in the tertiary sector, while the journal audience is more international and is also aimed at school educators. However, the findings of the previous studies are not strictly comparable in terms of the relationship between the distribution of educational levels and the nature of the audience, and therefore the conclusions are highly tentative. This topic could be investigated more and represents a gap in the field of educational research.

4.5. Teacher Education

Table 7 shows that the impact of the teacher education (TE) context appears in both data pools. In general, 53% of the research papers in the ESERA proceedings focused on TE, and 29% in NorDiNa. The teachers’ impact on how the students study (F7.3) is more frequent in the TE context than in the non-TE context. The trend is similar in the NorDiNa journal, but more emphasis is placed on teachers’ self-reflection (F7.4), the teacher–content relationship (F6), teacher characteristics (F3) and teachers’ understanding about students’ actions towards achieving goals (F7.2).
The proportions of TE studies in our data pool are notably bigger than in the studies by Gul and Sozbilir [1] on biology (2.8%), Teo and others [19] on chemistry (5%), Lin and others [8] (6%) or O’Toole and others [16] (7%). Only the Cavas [17] study (23%) is in line with ours, but it still reaches a smaller proportion than NorDiNa (29%) and an especially smaller proportion than ESERA (53%). The earlier studies do not provide comprehensive background information to allow a comparative analysis, but it seems that the NorDiNa and ESERA forums are oriented more towards teacher education publications than the IJSE, the JRST or SE [9], SSE or RSE [1], JSET or RSTE [2], or CERP or JCE [19].

4.6. Research Methodology

Table 8 shows that a clear majority of the studies published in the ESERA proceedings and the NorDiNa journal are empirical: 87% and 81%, respectively. However, the distributions in the ESERA and NorDiNa forums also differ in terms of research methodology. The quantitative and the qualitative research methodologies are equally represented in the ESERA proceedings, whereas the qualitative methodology is clearly the most frequent in NorDiNa. Mixed method studies are similarly represented in the ESERA and NorDiNa data pools, although their proportion differs from that reported for other publication channels. Descriptive studies were absent from NorDiNa, and studies with no empirical results (no methodology used, N/A) were much more frequent in NorDiNa than in ESERA.
The proportions of empirical studies compared with other research papers are in line with many previously mentioned studies, such as those by Tsai and Wen [6] (87%), Lee and others [7] (87%), and Lin and others [8] (91%). However, the distributions of quantitative and qualitative research methodologies vary between our data pools and between our research and that of other researchers. It seems that a quantitative methodology is more frequent in the ESERA data pool than in many of the reference studies. In this sense, the distribution in the NorDiNa journal is closer to the reference journals used in the study by O’Toole and others [16]. Lee and others [7] noted that most empirical studies among the highly cited papers were qualitative. The proportion of qualitative vs. quantitative studies in ESERA is closer to that found by Gul and Sozbilir [1] and Teo and others [19], where these methodologies are equally common. The mixed method studies were equally frequent in ESERA and NorDiNa, but the proportion of mixed methodology studies differs considerably across all forums, varying from Gul and Sozbilir’s [1] 4% to Teo and others’ 52% [19]. This may indicate journal characteristics or trends of some sort. As noted above, descriptive studies were absent from NorDiNa, and studies with no empirical results (no methodology used, N/A) were much more frequent in NorDiNa than in ESERA. However, this is easily explained by the relatively large number of curriculum studies reported in NorDiNa, as a curriculum reform was underway in Norway during 2005–2013.

5. Discussion

The results showed that research in science education in this data pool focused largely on students’ understanding, attitudes, and learning results, and on the teacher’s impact on these aspects. Much less studied aspects were teacher characteristics, what happens in the classroom while students are studying, how the teachers perceive the students’ actions, attitudes and understanding, the students’ feedback to the teacher, and teachers’ self-reflection. These findings are in line with our previous findings in computing education research [10], science education research [14], and engineering education research [37].
This leads us to wonder why some aspects are studied more than others in our data pool (ESERA, NorDiNa). We sought the answer to this question, but our results did not give clear answers. One explanation could be research policy and policy making. The Ministries of Education, the National Science Academies, and the National Boards of Education can have a significant impact on what research is done in the universities. Politicians need up-to-date information on teaching and learning, and therefore these aspects could be funded more often. The researchers might even reinforce this reciprocity by proposing topics that are likely to get the most funding. It is also possible that the less studied aspects are published in other forums that focus more on general pedagogy, such as the European Conference on Educational Research (ECER). However, we see that the ESERA conference should provide a forum broad enough for all kinds of research to appear, to some extent at least. Therefore, alternative publication channels do not seem to be a probable explanation, especially as the same aspects are absent from both the ESERA and the NorDiNa forums.
Nonetheless, we are not alone with our question, as earlier researchers have searched for an answer to it too. For example, O’Toole and others [16] proposed that perhaps there has been an ongoing generational shift of some kind, as the number of studies about teachers’ pedagogical content knowledge, constructivism and public understanding of science is declining while other topics are increasing. This seems plausible, and studies on the teacher’s pedagogical content knowledge were uncommon in our data pools as well. On the other hand, this is also good news for those who are looking for less-studied research topics. It is unlikely that these issues have been fully researched and there is nothing new to discover. On the contrary, there is much to be discovered, and one of the aims of our research was to find the less-studied aspects. A fundamental change in classroom dynamics is underway as learning environments are digitalizing. Clearly, teachers’ pedagogical content knowledge and students’ study habits should be investigated much more.
The investigations published in NorDiNa and ESERA focus on course- and organization-level investigations, whereas society- and international-level studies were scarce. The finding has some novelty, as this aspect has not been reported in earlier studies [16]. There are several tentative explanations, ranging from the ease and inexpensiveness of small-scale studies to the generalizability of the results. Readily collectable data and a faster publication pace are positive factors for small-scale studies. However, small scale is not a synonym for low quality. The results of a small-scale case study can be as relevant as those from a large survey. Society- and international-level studies can be complex to arrange and expensive to carry out. These studies require research groups in which each researcher has their own topic. Consequently, larger studies are potentially divided into smaller investigations and specific study designs, which in turn can increase the number of course- and organization-level studies.
Our data pool was limited to specific educational levels and therefore comparisons to earlier studies are tentative. We noticed that the studies in the NorDiNa journal focus more on school education and less on tertiary level education. The frequency of school education studies and the infrequency of continuing education studies in NorDiNa are in line with the study by O’Toole and others [16]. Most of the studies reported were carried out at the secondary and post-secondary levels (>40%), whereas primary (15%) and especially early childhood studies (2.5%) were less frequent. However, studies focusing on both the upper secondary and tertiary levels were rare, and the trend was similar in both forums. We conclude that this might indicate some sort of blind spot in science education research in Europe.
We sought to establish whether there are any discipline-based characteristics. The distributions between chemistry and physics education were similar and focused on the same foci areas, but biology education papers seemed to focus more on conceptual understanding than chemistry and physics education research. This view is supported by Asshoff and Hammann [36], who showed in their study that the most frequently investigated topic was the pupils’ conceptual understanding, more specifically their understanding of genetics and ecology. We propose that this might originate from the subject matter itself, which has taken giant leaps in DNA technology, molecular biology, virology, climate change, cloning, and health care in recent decades. Our knowledge of human biology and environmental systems has been updated rapidly, and the teachers, the pupils and the curricula have difficulties staying up to date. Consequently, perhaps, research on conceptual understanding has a more central role in biology education and is therefore a slightly more popular topic in our data pool as well.
We were interested to determine whether the teacher education (TE) context appeared in the data pool in some way. In other words, we asked ourselves whether the studies carried out in the TE context have any characteristic features. We found that the impact of the TE context appears in the data pool to some extent; however, the differences were not as apparent as one might have expected. The TE-related studies were distributed more evenly across different foci in NorDiNa, but they were more frequent in ESERA. This could corroborate the above-mentioned finding that the NorDiNa journal focuses more on school education, and higher education studies are published elsewhere. The finding is new, as earlier researchers have not expressed much interest in this aspect.
Why are some methodologies more frequently used than others? In contrast with the general belief, studies using quantitative methodologies are not as frequent as might be thought, and qualitative methodologies have become predominant. There can be several reasons, such as a research tradition of using a particular methodology. For example, qualitative research has been a popular research methodology in the Nordic countries for a long time, perhaps partially due to the influence of Marton’s tradition in Sweden. We noticed the same trend in the ESERA data pool when we isolated the Nordic studies from the other European studies. The infrequency of mixed methodology studies is presumably due to their complexity, as they call for know-how in both quantitative and qualitative methodologies. These findings are in line with those of previous researchers [6,7,8,16], but dissimilarities also appeared. For example, the others reported a small number of theoretical papers, but in the NorDiNa data pool we noticed a notably large proportion, reaching 19% of all studies. We took a closer look at the data pool and found that a curriculum reform was underway in Norway during 2005–2013, which alone explained the anomaly. When we deducted this distortion effect, the proportion of descriptive and theoretical studies was small and in line with the other studies in the field. However, the editorial line of the NorDiNa journal may have some additional impact on the frequency of publishing theoretical papers. The editorial board encourages authors to send descriptions of their ongoing projects and short abstracts of dissertations in the field, wherein methodology descriptions might be limited or absent.
The research team also discussed the possible impact of the editorial line on the distribution of papers in different forums. True, the topics published can to some extent be influenced by journal editors, and thus, with changes in editorship, the emphases of a journal or conference may change simply because of the views of the editors. However, we see this as merely a theoretical note, as journal editors do not change very often, and the editorial line is better seen as a joint vision of the editorial board rather than of one editor. On the contrary, it is more plausible that the editorial board tries to keep the editorial line the same from year to year rather than changing it with the editorship. The situation might be different for conferences, which typically have a thematic emphasis that changes from conference to conference and can have some impact on the versatility of the research papers.

6. Recommendations

We advise authors to pay more attention to internal coherence and clarity in their manuscripts. The didactic foci were not always clearly described in the papers, not even in the journal articles. The theoretical framework might introduce several foci, but not all of them were reported in the results section, or one focus was emphasized more than the others. In several cases, we noticed some disparity between the title, the written research questions and what the results were about. Consequently, a classification method based solely on the title, the keywords, the abstract, or the research questions is apt to lead to misinterpretation. Comprehensive reading is therefore suggested. The approach is time-consuming but improves the quality of the analysis.
The theoretical underpinnings of the DFCM lie in school education, and therefore it works best for illuminating the relations between the teacher, the students, and the content in a formal education context. In the future, the DFCM will be further developed to cope better with informal learning phenomena. The resolution of the DFCM could also be improved with additional subcategories. Currently, some interesting aspects, such as teachers’ pedagogical knowledge and evaluation, are not visible in the results to an extent that reflects their significance in education. In particular, the evaluation aspect could be addressed better.
Based on the findings of this study, it seems that science education research could be more versatile and comprehensive, as it currently misses some important aspects from the didactic triangle point of view. The polarization of research onto a few aspects may not have been this apparent to the academic community before our study.

Author Contributions

Writing—original draft, J.L., A.K., P.K. and L.M.

Funding

This research received no external funding.

Acknowledgments

Open access funding provided by University of Helsinki.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Gul, S.; Sozbilir, M. International Trends in Biology Education Research from 1997 to 2014: A Content Analysis of the papers in Selected Journals. Eurasia J. Math. Sci. Technol. Educ. 2016, 12, 1631–1651.
  2. Barth, M.; Rieckmann, M. State of the art in research on higher education for sustainable development. In Routledge Handbook of Higher Education for Sustainable Development, 1st ed.; Barth, M., Michelsen, G., Rieckmann, M., Thomas, I., Eds.; Routledge: London, UK, 2016; pp. 100–113.
  3. Erduran, S.; Ozdem, Y.; Park, J.Y. Research trends on argumentation in science education: A journal content analysis from 1998–2014. Int. J. STEM Educ. 2015, 2, 5.
  4. Lin, T.C.; Hsu, Y.S.; Lin, S.S.; Changlai, M.L.; Yang, K.Y.; Lai, T.L. A review of empirical evidence on scaffolding for science education. Int. J. Sci. Math. Educ. 2012, 10, 437–455.
  5. Tsai, C.-C.; Wu, Y.-T.; Lin, Y.-C.; Liang, J.-C. Research Regarding Science Learning in Asia: An Analysis of Selected Science Education Journals. APAC Educ. Res. 2011, 20, 352–363.
  6. Tsai, C.-C.; Wen, M.L. Research and trends in Science Education from 1998 to 2002: A content analysis of publication in selected journals. Int. J. Sci. Educ. 2005, 27, 3–14.
  7. Lee, M.-H.; Wu, Y.-T.; Tsai, C.-C. Research trends in Science Education from 2003 to 2007: A content analysis of publications in selected journals. Int. J. Sci. Educ. 2009, 31, 1999–2020.
  8. Lin, T.-C.; Lin, T.-J.; Tsai, C.-C. Research trends in Science Education from 2008 to 2012: A systematic content analysis of publications in selected journals. Int. J. Sci. Educ. 2014, 36, 1346–1372.
  9. Kansanen, P. Studying-the Realistic Bridge Between Instruction and Learning. An Attempt to a Conceptual Whole of the Teaching-Studying-Learning Process. Educ. Stud. 2003, 29, 221–232.
  10. Kinnunen, P.; Meisalo, V.; Malmi, L. Have we missed something? Identifying missing types of research in computing education. In Proceedings of the Sixth International Workshop on Computing Education Research, Aarhus, Denmark, 9–10 August 2010; ACM: New York, NY, USA, 2010; pp. 13–22.
  11. Kinnunen, P.; Malmi, L. Pedagogical Focus of Recent Engineering Education Research Papers. In Proceedings of the SEFI Conference, Leuven, Belgium, 16–20 September 2013; SEFI: Brussels, Belgium, 2013.
  12. Kinnunen, P.; Lampiselkä, J.; Malmi, L.; Meisalo, V. Pedagogical Aspects in Finnish Science Education Research Publications. In Proceedings of the 2012 Annual Conference of Finnish Mathematics and Science Education Research Association, Jyväskylä, Finland, 8–9 November 2012; Hähkiöniemi, M., Leppäaho, H., Nieminen, P., Viiri, J., Eds.; University of Jyväskylä: Jyväskylä, Finland, 2013; pp. 153–164.
  13. Kinnunen, P.; Lampiselkä, J.; Malmi, L.; Meisalo, V. Identifying Missing Types of Nordic Research in Science Education. In E-Book Proceedings of the ESERA 2013 Conference: Science Education Research for Evidence-Based Teaching and Coherence in Learning, Proceedings of the European Science Education Research Association Conference, Nicosia, Cyprus, 2–7 September 2013; Constantinou, C.P., Papadouris, N., Hadjigeorgiou, A., Eds.; European Science Education Research Association: Nicosia, Cyprus, 2014.
  14. Kinnunen, P.; Lampiselkä, J.; Meisalo, V.; Malmi, L. Research on teaching and learning in Physics and Chemistry in NorDiNa Papers. NorDiNa 2016, 12, 3–20.
  15. Lampiselkä, J.; Kaasinen, A.; Kinnunen, P.; Malmi, L. Research on Teaching and Learning in Biology, Chemistry and Physics. In ESERA 2013 Conference, Proceedings of the 12th ESERA Conference, Dublin, Ireland, 21–25 August 2017; Finlayson, O., McLoughlin, E., Erduran, S., Childs, P., Eds.; Dublin City University: Dublin, Ireland, 2018.
  16. O’Toole, J.M.; Freestone, M.; McKoy, K.S.; Duckworth, B. Types, Topics and Trends: A Ten-Year Review of Research Journals in Science Education. Educ. Sci. 2018, 8, 73.
  17. Cavas, B. Research Trends in Science Education International: A Content Analysis for the Last Five Years (2011–2015). Sci. Educ. Int. 2015, 26, 573–588.
  18. Chang, Y.-H.; Chang, C.-Y.; Tseng, Y.-H. Trends in Science Education research: An automatic content analysis. J. Sci. Educ. Technol. 2010, 19, 315–331.
  19. Teo, T.W.; Goh, M.T.; Yeo, L.W. Chemistry education research trends: 2004–2013. Chem. Educ. Res. Pract. 2014, 15, 470–487.
  20. Wu, Y.-T.; Hou, H.-T.; Hwang, F.-K.; Lee, M.-H.; Lai, C.-H.; Chiou, G.-L.; Lee, S.W.-Y.; Hsu, Y.-C.; Liang, J.-C.; Chen, N.-S.; et al. A Review of Intervention Studies on Technology-assisted Instruction from 2005–2010. Educ. Technol. Soc. 2013, 16, 191–203.
  21. Constantinou, C.P.; Papadouris, N.; Hadjigeorgiou, A. (Eds.) E-Book Proceedings of the ESERA 2013 Conference: Science Education Research for Evidence-Based Teaching and Coherence in Learning, Proceedings of the European Science Education Research Association Conference, Nicosia, Cyprus, 2–7 September 2013; European Science Education Research Association: Nicosia, Cyprus, 2014.
  22. Kansanen, P.; Meri, M. The didactic relation in the teaching-studying-learning process. In Didaktik/Fachdidaktik as the Science(-s) of the Teaching Profession? Hudson, B., Buchberger, F., Kansanen, P., Steel, H., Eds.; Thematic Network on Teacher Education in Europe: Umeå, Sweden, 1999; Volume 2, pp. 107–116.
  23. Kinnunen, P. Challenges of Teaching and Studying Programming at a University of Technology—Viewpoints of Students, Teachers and the University. Doctoral Dissertation, Helsinki University of Technology, Espoo, Finland, 2009.
  24. Simmie, G.M.; Lang, M. Deliberative Innovation in Science Education. In E-Book Proceedings of the ESERA 2013 Conference: Science Education Research for Evidence-Based Teaching and Coherence in Learning, Proceedings of the European Science Education Research Association Conference, Nicosia, Cyprus, 2–7 September 2013; Constantinou, C.P., Papadouris, N., Hadjigeorgiou, A., Eds.; European Science Education Research Association: Nicosia, Cyprus, 2014.
  25. Vulperhorst, J.P.; Wessels, K.R.; Bakker, A.; Akkerman, S.F. How Do Stem-interested Students Pursue Multiple Interests in Their Higher Educational Choice? Int. J. Sci. Educ. 2018, 40, 828–846.
  26. Kollas, S.; Halkia, K. Second Chance Schools in Greece: Science Teachers’ Views and Practices on Designing Scientific Literacy Curricula. In E-Book Proceedings of the ESERA 2013 Conference: Science Education Research for Evidence-Based Teaching and Coherence in Learning, Proceedings of the European Science Education Research Association Conference, Nicosia, Cyprus, 2–7 September 2013; Constantinou, C.P., Papadouris, N., Hadjigeorgiou, A., Eds.; European Science Education Research Association: Nicosia, Cyprus, 2014.
  27. Elstad, E.; Turmo, A. Kjønnsforskjeller I Motivasjon, Læringsstrategibruk Og Selvregulering I Naturfag [Gender Differences in Motivation, Learning Strategy Use and Self-Regulation in Science]. NorDiNa 2007, 3, 57–75.
  28. Young, A.M.; Wendel, P.J.; Esson, J.M.; Plank, K.M. Motivational Decline and Recovery in Higher Education Stem Courses. Int. J. Sci. Educ. 2018, 40, 1016–1033.
  29. Xenofontos, N.; Theocharous, M.; Manoli, C.; Zacharia, Z. Which Information Resources Do Students Use When They Produce Learning Artifacts in Science. In E-Book Proceedings of the ESERA 2013 Conference: Science Education Research for Evidence-Based Teaching and Coherence in Learning, Proceedings of the European Science Education Research Association Conference, Nicosia, Cyprus, 2–7 September 2013; Constantinou, C.P., Papadouris, N., Hadjigeorgiou, A., Eds.; European Science Education Research Association: Nicosia, Cyprus, 2014.
  30. Maidou, A.; Polatoglo, H.M. Motivating Weak Secondary Vocational School Students through Participating with Scientific Projects in A Students’ Conference. In E-Book Proceedings of the ESERA 2013 Conference: Science Education Research for Evidence-Based Teaching and Coherence in Learning, Proceedings of the European Science Education Research Association Conference, Nicosia, Cyprus, 2–7 September 2013; Constantinou, C.P., Papadouris, N., Hadjigeorgiou, A., Eds.; European Science Education Research Association: Nicosia, Cyprus, 2014.
  31. Tröger, H.; Strübe, M.; Sumfleth, E.; Tepner, O. Professional Knowledge of Chemistry Teachers: Video Analysis of Chemistry Lessons. In E-Book Proceedings of the ESERA 2013 Conference: Science Education Research for Evidence-Based Teaching and Coherence in Learning, Proceedings of the European Science Education Research Association Conference, Nicosia, Cyprus, 2–7 September 2013; Constantinou, C.P., Papadouris, N., Hadjigeorgiou, A., Eds.; European Science Education Research Association: Nicosia, Cyprus, 2014.
  32. Broman, K.; Ekborg, M.; Johnels, D. Chemistry in Crisis? Perspectives on Teaching and Learning Chemistry in Swedish Upper Secondary Schools. NorDiNa 2011, 7, 43–60.
  33. Nilsson, P. Recognizing the Needs—Student Teachers’ Learning to Teach from Teaching. NorDiNa 2008, 4, 92–107.
  34. Bogner, F.X.; Sotiriou, S. Pathway towards a Standard-Based Approach to Teaching Science by Inquiry. In E-Book Proceedings of the ESERA 2013 Conference: Science Education Research for Evidence-Based Teaching and Coherence in Learning, Proceedings of the European Science Education Research Association Conference, Nicosia, Cyprus, 2–7 September 2013; Constantinou, C.P., Papadouris, N., Hadjigeorgiou, A., Eds.; European Science Education Research Association: Nicosia, Cyprus, 2014.
  35. Bölsterli, K.; Rehm, M.; Wilhelm, M. The Importance of Competence Oriented Textbook Standards: The Gap Between Teacher Trainers’ and Teachers’ Points of View. In E-Book Proceedings of the ESERA 2013 Conference: Science Education Research for Evidence-Based Teaching and Coherence in Learning, Proceedings of the European Science Education Research Association Conference, Nicosia, Cyprus, 2–7 September 2013; Constantinou, C.P., Papadouris, N., Hadjigeorgiou, A., Eds.; European Science Education Research Association: Nicosia, Cyprus, 2014.
  36. Asshoff, R.; Hammann, M. Content analysis of the ERIDOB proceedings and its comparison to an international journal of science education. In Proceedings of the VIIth Conference of European Researchers in Didactics of Biology (ERIDOB), Utrecht, The Netherlands, 16–20 September 2008; Hammann, M., Boersma, K., Waarlo, A.J., Eds.; Utrecht University Press: Utrecht, The Netherlands, 2009.
  37. Malmi, L.; Adawi, T.; Curmi, R.; de Graaff, E.; Duffy, G.; Kautz, C.; Kinnunen, P.; Williams, B. How authors did it—A methodological analysis of recent engineering education research papers in the European Journal of Engineering Education. Eur. J. Eng. Educ. 2018, 43, 171–189.
Figure 1. The didactic focus-based categorization method (DFCM) is based on the didactic triangle (left). The analysis units are described as foci and comprise the main actors of the instructional process (1 = content/goal, 2 = student, 3 = teacher) and the interactions between them (4–8).
Table 1. Summary of survey research papers in science education 1.
Reference | Journal | Years | Papers Analyzed | Field/Subarea | Focus of Analysis/Method
Tsai and Wen [6] | IJSE, JRST, and SE | 1998–2002 | 802 | Science education | authors, research type, and research topic
Lee, Wu and Tsai [7] | IJSE, JRST, and SE | 2003–2007 | 869 | Science education | authors, research type, research topic, and highly cited papers
T.-C. Lin, T.-J. Lin and Tsai [8] | IJSE, JRST, and SE | 2008–2012 | 990 | Science education | authors, research type, research topic, and highly cited papers
Y.-H. Chang, C.-Y. Chang and Tseng [18] | IJSE, JRST, RISE, and SE | 1990–2007 | 3039 | Science education | research topic (scientometric analysis)
Cavas [17] | SEI | 2011–2015 | 126 | Science education | authors and research topic
Tsai, Wu, Lin and Liang [5] | IJSE, JRST, RISE, and SE | 2000–2009 | 228 | Science education/Asian students | authors, research topic, sample’s educational level, and highly cited papers
Gul and Sozbilir [1] | IJSE, JBE, JSET, JRST, RISE, RSTE, SE, and SSE | 1997–2014 | 1376 | Biology education research | research topics, subject matter in BER, research designs, data collection tools, sample’s educational level, and data analysis methods
Teo, Goh, and Yeo [19] | CERP, IJSE, JRST, JCE, RISE, and SE | 2004–2013 | 650 | Chemistry education research | research topic, research methods, sample’s educational level and location, and most contributing authors
O’Toole, Freestone, McKoy and Duckworth [16] | IJSE, JRST, RISE, SE, and SSE | 2005–2014 | 2294 | Science education | research topic, research type, research methods, and sample’s educational level
1 CERP = Chemistry Education Research & Practice, IJSE = International Journal of Science Education, IS = Instructional Science, JBE = Journal of Biology Education, JCS = Journal of Curriculum Studies, JLS = Journal of the Learning Sciences, JRST = Journal of Research in Science Teaching, JTE = Journal of Teacher Education, JSET = Journal of Science Education and Technology, JCE = Journal of Chemical Education, RISE = Research in Science Education, RSTE = Research in Science & Technological Education, SE = Science Education, SEI = Science Education International, and SSE = Studies in Science Education.
Table 2. The list of didactic foci and their definitions. Each didactic focus operates at four levels: the course, organization, society, and international level. Examples include references to publications in which the corresponding focus has been a research topic.
Name of the Didactic Focus | Definition | Example
1. Goals and contents | The goals and/or contents of a course, study module, and goals of a degree program. | [24]
2. Students | The students’ characteristics (e.g., gender, level of education). | [25]
3. Teachers | The teachers’ characteristics. | [26] (also includes Focus 6)
4. Relations between students and teacher | How students perceive the teacher (e.g., studies on how competent students think the teacher is) or the teacher perceives the students. | [27] (also includes Focus 5.1 and 5.2)
5.1 Students’ understanding of and attitude about goals and contents | How students understand a central concept in the course or how interesting students find the topic. | [28]
5.2 The actions (e.g., studying) the students do to achieve the goals | Students’ actions include all actions/lack of actions that are in relation to learning and achieving the goals. | [29]
5.3 The results of the students’ actions | The outcome of the study process, e.g., a study that includes a discussion of the learning outcomes after using a new teaching method. | [30]
6. Relation between the goals/contents and the teacher | How teachers understand, perceive or value different aspects of the goals and contents. | [31]
7.1 Teachers’ conceptions of students’ understanding of/attitude to goals/contents | What teachers think about students’ perceptions and attitudes towards goals and content, e.g., studies on the knowledge teachers have about students’ understanding of some central concept/process. | [32]
7.2 Teachers’ conceptions of students’ actions towards achieving goals | Teachers’ perceptions of students’ actions (e.g., studying). | [33]
7.3 Teachers’ didactic activities | Teachers’ didactic actions (e.g., lecturing, providing a learning environment, and assessment methods). | [34]
7.4 Teachers’ reflections on his/her own didactic actions | E.g., to what degree teachers think the new teaching method was successful. | [35]
Table 3. Distribution of research foci in ESERA, NorDiNa and IJSE.
Publication | 1 | 2 | 3 | 4 | 5.1 | 5.2 | 5.3 | 6 | 7.1 | 7.2 | 7.3 | 7.4 | 8
ESERA (N = 176) | 4% | 3% | 2% | - | 24% | 7% | 22% | 6% | - | - | 24% | 2% | 6%
NorDiNa (N = 52) | 11% | 1% | 3% | 1% | 23% | 9% | 10% | 6% | 1% | 1% | 19% | 6% | 8%
Table 4. Distribution of the research foci in the ESERA and NorDiNa forums, based on emphasis on the disciplines.
Publication and Discipline | 1 | 2 | 3 | 4 | 5.1 | 5.2 | 5.3 | 6 | 7.1 | 7.2 | 7.3 | 7.4 | 8
ESERA, Biology (N = 23) | 8% | 3% | - | - | 32% | - | 16% | 13% | 3% | - | 26% | - | -
ESERA, Chemistry (N = 32) | 2% | 5% | - | - | 25% | 8% | 23% | 7% | - | - | 20% | 2% | 8%
ESERA, Physics (N = 40) | 5% | 1% | 1% | - | 29% | 5% | 24% | 3% | - | - | 27% | - | 4%
ESERA, Science (N = 81) | 3% | 3% | 4% | - | 18% | 8% | 22% | 5% | - | - | 24% | 3% | 8%
NorDiNa, Biology (N = 10) | - | - | - | - | 50% | 17% | 17% | 8% | - | - | 8% | - | -
NorDiNa, Chemistry (N = 4) | 17% | - | - | - | 33% | 17% | 17% | - | 17% | - | - | - | -
NorDiNa, Physics (N = 15) | 6% | - | - | - | 33% | 11% | 11% | - | - | - | 33% | 6% | -
NorDiNa, Science (N = 23) | 17% | 2% | 5% | 2% | 10% | 5% | 7% | 10% | - | 2% | 20% | 10% | 10%
Table 5. Distribution of research foci into different scopes.
Scope | ESERA | NorDiNa
Course | 61% | 29%
Organization | 20% | 48%
Society | 9% | 17%
International | 9% | 6%
Table 6. Distribution of foci based on educational level.
Educational Level | ESERA | NorDiNa
ISCED 1–3: Primary + secondary | 3% | 12%
ISCED 2–3: Secondary | 8% | 10%
ISCED 3: Upper secondary | 20% | 37%
ISCED 3, 6–8: Upper sec. + tert. | 1% | 0%
ISCED 1–3, 6–8: Prim. + sec. + tert. | 1% | 2%
ISCED 6–8: Tertiary (major subject) | 19% | 19%
ISCED 6–8: Tertiary (teacher ed.) | 24% | 15%
ISCED 6–8: Tertiary (continuing ed.) | 16% | 2%
Multiple mix of above mentioned | 7% | 4%
Table 7. Distribution of the research foci in ESERA and NorDiNa based on emphasis on teacher education (TE).
Publication | 1 | 2 | 3 | 4 | 5.1 | 5.2 | 5.3 | 6 | 7.1 | 7.2 | 7.3 | 7.4 | 8
ESERA, TE (N = 93) | 3% | 2% | 3% | - | 22% | 4% | 24% | 3% | - | - | 28% | 2% | 8%
ESERA, non-TE (N = 83) | 5% | 4% | 1% | - | 26% | 9% | 19% | 9% | 1% | - | 19% | 2% | 4%
NorDiNa, TE (N = 15) | 12% | - | 8% | - | 16% | - | 12% | 16% | - | 4% | 12% | 12% | 8%
NorDiNa, non-TE (N = 37) | 11% | 2% | - | 2% | 26% | 13% | 9% | 2% | 2% | - | 22% | 4% | 7%
Table 8. Distribution of research methodologies used in ESERA and NorDiNa.
Research Methodology | ESERA | NorDiNa
Quantitative | 36% | 13%
Qualitative | 32% | 56%
Mixed | 19% | 12%
Descriptive | 10% | 0%
N/A | 3% | 19%
