Article

Implementing Digital Competencies in University Science Education Seminars Following the DiKoLAN Framework

1 Chair of Science Education, University of Konstanz, 78464 Konstanz, Germany
2 Department of Physics, University of Konstanz, 78464 Konstanz, Germany
3 Chair of Science Education, Thurgau University of Education, 8280 Kreuzlingen, Switzerland
* Authors to whom correspondence should be addressed.
Educ. Sci. 2022, 12(5), 356; https://doi.org/10.3390/educsci12050356
Submission received: 20 March 2022 / Revised: 6 May 2022 / Accepted: 9 May 2022 / Published: 18 May 2022

Abstract:
Prospective teachers must acquire subject-specific digital competencies to design contemporary lessons and to promote digital competencies among students themselves. The DiKoLAN framework (Digital Competencies for Teaching in Science Education) describes basic digital competencies for the teaching profession in the natural sciences precisely for this purpose. In this article, we describe the development, implementation, and evaluation of a university course based on DiKoLAN which promotes the digital competencies of science teachers. As an example, the learning module Data Processing in Science Education is presented, and its effectiveness is investigated. For this purpose, we used a questionnaire developed by the Working Group Digital Core Competencies to measure self-efficacy, which can also be used in future efforts to promote digital competencies among pre-service teachers. The course evaluation showed a positive increase in the students’ self-efficacy expectations. Overall, the paper thus contributes to teacher education by offering the course as a best-practice example: a blueprint for designing new courses and for implementing a test instrument for a valid evaluation.

1. Introduction

More and more schools are equipped with a continuously improving digital infrastructure including school-wide wireless network access, school cloud storage, interactive whiteboards, video projectors, and devices such as computers, laptops, or tablet computers. This opens up many new opportunities but at the same time requires teachers to be trained in new or adapted competencies to fruitfully utilise these digital tools. These competencies are described in various frameworks such as UNESCO’s ICT Competency Framework for Teachers [1], the ISTE Standards for Educators [2], or the European Framework for the Digital Competence of Educators (DigCompEdu) [3], all of which focus on slightly different aspects of the competence needed by teachers for making maximum use of the digital environment. In addition to those generic non-subject-specific frameworks, the DiKoLAN framework (Digital Competencies for Teaching in Science Education) focuses on digital competence for teaching the natural sciences [4,5].
Despite belonging to the generation of so-called ‘digital natives’, today’s young teachers need explicit instruction on how to productively use digital technology in schools [6,7]. Most researchers agree that digital technology needs to be integrated into teacher education curricula, and numerous strategies have been proposed in the literature to facilitate this effort [8]. To address the specific needs of science teachers, the DiKoLAN framework (Figure 1) gives a comprehensive guideline on the topics to be addressed [5]. This guideline has been used to design, teach, and evaluate a course for students in teacher education in the three natural sciences at the University of Konstanz.
The aim of this research paper is to provide an overview of the current research on the DiKoLAN framework, as well as to present the design and the evaluation of a special pre-service teacher training course tailored to foster the digital competencies described in DiKoLAN. Additionally, the investigation of the effectiveness of the individual learning modules offers a blueprint for future research on the effectiveness of university teacher training on the subject-specific use of ICT in science education.

2. Research Following the DiKoLAN Framework

The DiKoLAN framework was first presented in 2020 by the Working Group Digital Core Competencies [4]. The framework was first developed for Germany and Austria and later introduced in Switzerland [9]. It was based on initiatives to promote digitisation in schools and the digital competencies of prospective teachers, as well as on DigCompEdu [3], the TPACK framework [10,11], and the DPaCK model [12,13].
The curricular integration of essential digital competencies into the first phase of teacher education requires specific preliminary considerations. To be able to integrate ICT-related elements of future-proof education into the teaching practices of all faculty involved in teacher training at universities, basic digital competencies need to be structured in advance [14].
Based on core elements of the natural sciences, the authors of DiKoLAN propose seven central competency areas [15]: Documentation, Presentation, Communication/Collaboration, Information Search and Evaluation, Data Acquisition, Data Processing, and Simulation and Modelling (Figure 1). These seven central competency areas are framed by Technical Core Competencies and the Legal Framework. The unique feature of DiKoLAN is that the DPaCK-related competencies are described in great detail and take into account subject-specific, subject-didactic (e.g., [16,17]), and pedagogical perspectives from all three natural sciences (biology, chemistry, and physics).
The framework thus coordinates and structures university curricula [14,15], as has been demonstrated, e.g., for the competency area Presentation [18], using the example of low-cost photometry with a smartphone [19], or by means of a project on scientific work [20,21]. Such coordination makes cooperation between different universities, as suggested by Zimmermann et al. [22], possible without any significant difficulties.
For an overview measurement of DiKoLAN competencies, the self-assessment tool DiKoLAN-Grid is available [5], which helps to illustrate respective learning goals in teacher training to pre-service teachers.
Initial empirical studies support the factorial separation of the application areas according to the TPACK and DPaCK frameworks into Teaching, Methods/Digitality, Content-specific context, and Special technology [5,18].

3. Methods

In this section, two important methodological aspects are presented: the design of the course and the evaluation of the course using an online self-assessment of digital competencies.

3.1. Design of the Master’s Course “Science Education III—DiKoLAN”

The aim of the seminar is to promote digital core competencies in science teaching following the DiKoLAN framework [5]. The students should be made aware of the individual competencies of digital teaching and learning and reflect on their own competencies. Skills that go beyond declarative knowledge are to be acquired through practical phases. Finally, students should reflect on the methods and tools used, and what has been learned should be transferred and related to the school context.
The seminar on didactics was implemented in the summer term of 2021 for advanced student teachers in the natural sciences at the University of Konstanz. Students received 5 ECTS credits for the module, which corresponds to an average weekly workload of 10 h. Two of these hours were spent on synchronous teaching with the entire course, while the remaining time was used for preparation and follow-up, including all exercises. Figure 2 illustrates the phase structure of the 14-week seminar. It starts with a synchronous initial phase, which aims to impart skills. At the beginning, the students are given an introduction to learning with and about digital media in science education, including the DiKoLAN competence framework. After the introductory week, one area of competency is highlighted in weekly meetings, which are partly framed by preparatory tasks and further exercises. In the subsequent asynchronous project phase with individual support and advice, the students design a learning scenario, consider the didactic function of the media used, and reflect on the skills required of the teacher and the pupils. In the final examination phase, the designed lesson is presented to the seminar plenum, the learning scenario is implemented in a trial lesson, and a written elaboration is submitted.

3.1.1. Introductory Module

In the first module, background information is given on the use of ICT in the science classroom, the current situation regarding digital media in schools is examined [23], and initial frameworks such as SAMR [24], ICAP [25], TPACK [10,11,26], and DPaCK [13] are presented and critically questioned. Moreover, the approach to the integration of digital media in the classroom is illuminated, and the didactic functions of digital media in science are explained [27].

3.1.2. Workshop Phase: Overview of Modules on Areas of Competencies

In the module on the competency area of Documentation (DOC), the data storage processes (from documentation to versioning to archiving) are scrutinised, the documentation of experiments with a digital worksheet is introduced [28], and the documentation of experiments by the students themselves using videos, a method called EXPlainistry [29], is presented. As it can be assumed from previous surveys that advanced students already have basic knowledge in the field of documentation [30], the focus in this module is less on the technical aspects and more on the subject-specific context, questions of methods and digitality, and, above all, the integration of documentation techniques into teaching.
The module on the second competency area, Presentation (PRE), includes a discussion of the available hardware at schools for presentation and possible scenarios in which digital media are used for presentation. Theoretical principles are presented on multimedia, especially multimodality (which, despite its proven effectiveness, is surprisingly rarely mentioned in physics teacher journals [31]) and multicodality [32,33], as well as cognitive load theory [34]. Recommendations for action on text and image design [33] are presented. Since a certain prior knowledge can also be assumed in this competency area [30], the focus is on presentation forms specific to the natural sciences and methodological aspects.
The third module on Communication and Collaboration (COM) revolves around planning collaborative learning settings [35]. Tools for the collaborative editing of texts, mind maps, pin boards, wikis, online whiteboards, and learning management systems are presented and tried out. Finally, different accompanying communication channels between students and the teacher are discussed.
The Information Search and Evaluation (ISE) module focuses on the five steps of digital research using the IPS-I model [36]. Various scientific and science didactic databases are presented, and examples of different types of literature are examined. Since it can be assumed that advanced students have a basic background in this area of competency [30], the focus in this module is on methodological issues and integration into lesson planning.
In the module for Data Acquisition (DAQ), the possibilities of data acquisition are discussed, especially using a smartphone (e.g., [19,20,21,37,38,39,40]). Various options such as video analysis or taking measurements using an app are tried out. Experimentation in the Remote Lab is also introduced [41]. Furthermore, the necessary steps of teaching with digital data acquisition and the possibilities and challenges of teaching in this manner are discussed.
The penultimate module, Data Processing (DAP), presents different coding options for characters and numbers as well as typical problems that arise when importing data, which the students test using an iPad. The differences between pixel and vector graphics are discussed. The focus is on the structure of the formats, e.g., XML and MP4.
In the last module, digital tools for Simulation and Modelling (SIM) are presented along with the competence expectations listed in DiKoLAN and tested in the exercises. Tools are discussed for which empirical findings are available [42,43,44,45,46] or which have already been successfully integrated in other DiKoLAN-oriented teaching concepts [47,48]. The tool types presented are spreadsheet programs, modelling systems, computer simulations, StopMotion programs [49], and programs for digital modelling and animation. In addition, Augmented Reality (AR) is discussed as a technique for representing models [50,51,52].

3.1.3. Free-Work Phase: Designing a Lesson Plan

In the free work phase, teams of two students design a lesson on a scenario of their own choosing. In doing so, they are asked to consider what the benefit of using digital media in the learning unit would be for the students and what skills the teaching staff need. During the process, the students write a seminar paper in which they present the scenario and the associated planning and also explain their approach and why they considered the planning to be didactically appropriate. Throughout the 4 weeks before the exam, the supervisors are available for individual coaching, which is used by students to varying degrees. All materials needed for the lesson are to be created and turned in, even if the lesson is not completely implemented.

3.1.4. Presenting the Lesson Plan in a Mock Trial

Finally, the students present their plans at a block meeting. Each participant in the seminar plenum is asked to try out the digital elements of the teaching scenario for themselves as completely as possible. For the supervisors, the following questions play a role in the evaluation:
  • Is the lesson a realistic lesson? Is it planned realistically?
  • Is the lesson well-founded from a didactic point of view?
  • Material created:
    a. Did the students actually create material on their own?
    b. How much effort was invested in terms of content/time?
  • Is the methodological approach adequately justified?
    a. Is there a specific purpose served by the digitalisation?
    b. Is the media use didactically sensible?
  • How are the digital literacy skills of the students addressed?
  • How is DiKoLAN taken into account?
Both the presentations and the written assignments, which have to be handed in before the first presentation, are considered in the evaluation.

3.2. Design of the Individual Modules (Using the Example of Data Processing)

For each workshop, the areas to be covered in the module are selected based on the competency expectations defined in the DiKoLAN framework. When deriving the learning objectives of a module from the orientation framework, three categories were distinguished: main learning objectives, secondary learning goals, and non-addressed competency expectations (see Figure 3 for an example). Using the area of data processing as an example, the majority of competencies on the level of Name and Describe are covered in a lecture. For instance, relevant software is introduced, and data types common in the context of teaching the natural sciences are shown. Additionally, typical scenarios for the application of digital data processing appropriate to the school curriculum are shown. To accompany this part, in-lecture and at-home activities are designed to allow for the timely application of the topics learned. This includes working through an example from data processing: exporting data from digital data acquisition applications and importing the data into spreadsheet software. There, the data are manipulated by performing various analyses.
To get a first impression of the students’ previous experience, the students are asked in the introductory phase to identify which data processing software they have used before and which data manipulations they already know. In the next step, relevant software is introduced, and data types common in the context of teaching the natural sciences are shown. For this purpose, the export and import of data is presented in the first input phase using the example of csv files in the MeasureAPP app [53]. In the following phase, common issues related to tablets and data storage locations are addressed. In this context, the difference between csv and Excel files is highlighted. Examples are used to introduce the integer and float number formats. In particular, the coding of characters and numbers is discussed in this context. At the end of the first input phase, the visualisation of data using Excel [54] is demonstrated.
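The import issues addressed here (field separators and decimal marks) can be made concrete with a short script. The following sketch is purely illustrative, and the file content and column names are invented: it reads a semicolon-separated csv export that uses a decimal comma, as produced by many German-locale measurement apps, and converts the values to floating-point numbers.

```python
import csv
from io import StringIO

# Hypothetical export from a measurement app using German number
# formatting: semicolon as field separator, comma as decimal mark.
raw = "t;T\n0,0;21,5\n1,0;20,9\n"

def parse_german_csv(text):
    """Parse a semicolon-separated csv whose numbers use a decimal comma."""
    reader = csv.reader(StringIO(text), delimiter=";")
    header = next(reader)
    # Replace the decimal comma with a point before converting to float.
    rows = [[float(cell.replace(",", ".")) for cell in row] for row in reader]
    return header, rows

header, rows = parse_german_csv(raw)
print(header, rows)  # ['t', 'T'] [[0.0, 21.5], [1.0, 20.9]]
```

A naive `float("21,5")` would raise a `ValueError`, which is exactly the kind of pitfall the module lets students run into and resolve.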
Using the integrated microphone of the iPad, the students individually record a sung vowel sound with the audio oscilloscope of the phyphox app [55] during the practice phase. They then export the measurement data as a csv file and import it into Excel to display the data graphically.
In the second input phase, ways of calculating new data in Excel and using spreadsheets to analyse data are demonstrated, including the aspects of measurement uncertainties, statistics, and regression. The instruction is concluded with an introduction to the differentiation of formats for images into vector and pixel graphics and to the structure of video formats as containers.
In a final step, the challenges the students have encountered so far during the acquisition and processing of measurement data are discussed, and possible solutions are shown.
As a follow-up task, the students record a series of measurements of a cooling teacup, from which they are to determine the mean decay constant using a spreadsheet program of their choice.
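The same analysis the students carry out in a spreadsheet can be sketched in a few lines of code. The data below are synthetic (generated from Newton’s law of cooling with an assumed ambient temperature of 21 °C and a decay constant of about 0.09 per minute; real student measurements will of course differ): the decay constant is recovered by linearising the temperature difference and fitting a least-squares line.

```python
import math

# Synthetic cooling data (minutes, °C); illustrative values only.
T_env = 21.0  # assumed ambient temperature
times = [0, 2, 4, 6, 8, 10]
temps = [81.0, 71.1, 62.9, 56.0, 50.2, 45.4]

# Linearise Newton's law of cooling, T(t) = T_env + (T0 - T_env)*exp(-k*t),
# by taking the logarithm of the temperature difference ...
ys = [math.log(T - T_env) for T in temps]

# ... and determine the decay constant k as the negated slope of an
# ordinary least-squares line through the points (t, ln(T - T_env)).
n = len(times)
mean_t = sum(times) / n
mean_y = sum(ys) / n
slope = sum((t - mean_t) * (y - mean_y) for t, y in zip(times, ys)) \
        / sum((t - mean_t) ** 2 for t in times)
k = -slope
print(f"decay constant k ≈ {k:.3f} per minute")  # → k ≈ 0.090
```

The linearisation avoids a nonlinear fit and mirrors what students can do with a logarithm column and the built-in trend-line function of a spreadsheet.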
With these initial practical experiences and theoretical foundations from the areas of Name and Describe, the students then set about working out teaching scenarios in the further course of the seminar to consolidate and extend the skills they have acquired in each module.

3.3. Evaluation

To investigate the effectiveness of the newly designed teaching-learning modules, the change in the participants’ self-efficacy expectations is used as a measure of effectiveness and is measured with an online test provided by the Working Group Digital Core Competencies [5]. The question to be answered is thus: Is it possible to measure a significant increase in students’ self-efficacy expectations in relation to the competences covered in the course? Due to the structure of the seminar, a large effect on students’ self-efficacy expectations is assumed for the main learning objectives, a medium effect for the secondary learning goals, and no effects for the areas not addressed.
The measurement of self-efficacy expectation was chosen for two reasons. First, it is precisely self-efficacy expectation that is influenced by experiences during studies and thus ultimately also has an effect on motivational orientation towards the later use of ICT and digital media in one’s own teaching [30]. Second, the subject-specific self-efficacy expectation can be assessed much more economically than a specific competency itself [31]. Accordingly, most of the digital competence questionnaires published so far measure self-efficacy expectations, e.g., [5,56,57,58,59,60,61,62].
The individual items are based on the competence expectations contained in DiKoLAN and are designed as Likert items. The participants indicate on an eight-point scale their agreement with a statement that describes their ability in the corresponding competence expectation, e.g.,
  • “I can name several computer-aided measurement systems developed for school use (e.g., for ECG, pH, temperature, current, voltage or motion measurements),”
  • “I can describe several systems of wireless mobile sensors for digital data acquisition with mobile devices such as smartphones and tablets, including the necessary procedure with reference to current hardware and software,” or
  • “I can perform measurement acquisition using a system of wireless mobile sensors for digital measurement acquisition with mobile devices such as smartphones and tablets.”
The items of the questionnaire can each be directly assigned to a single competence expectation. The naming of the items in the data set created in the survey follows the nomenclature in the tables with competence expectations listed in DiKoLAN (Figure 4).
Many competency expectations cover several individual aspects or are described using several examples. In such cases, several items were created, which, taken together, cover the competence expectation as a whole.
The questionnaire was implemented as an online survey with LimeSurvey [63] and made available to the participants of the course in each case as a pre-test in the week before the synchronous seminar session via individual e-mail invitation. Seven days later, the students received the same questionnaire again as a post-test.
It was hypothesised that the participants would have higher self-efficacy expectations in the competency areas addressed in the respective modules after the intervention than before. It is further assumed that large effects can be measured for the main learning objectives, whereas at least medium effects can be measured for the secondary learning objectives, to which only a brief learning time was devoted in the seminar.

4. Results

4.1. Sample

The participants included N = 16 pre-service German Gymnasium teachers for science subjects who participated in the newly designed seminar on promoting digital core competencies for teaching in science education according to the DiKoLAN framework. The course is designed for Master’s students in the 1st or 2nd semester but is also open to Bachelor’s students in the 5th or 6th semester. More than three quarters of the students participated in the voluntary pre- and post-test surveys. However, three participants did not complete all of the individual surveys. Hence, the data from these participants were removed, resulting in a final total of n = 13 participants (5 male, 8 female, aged M = 23.5 (SD = 2.9) years). These 13 participants indicated that they studied the following science subjects (multiple answers possible; usually, students must study two subjects): 10 Biology (76.9%), 6 Chemistry (46.2%), 1 Physics (7.7%), and 1 Mathematics (7.7%). They were attending the following semesters at the time of the study: 5th BEd (1; 7.7%), 6th BEd (1; 7.7%), 1st MEd (6; 46.2%), 2nd MEd (4; 30.8%), or 3rd MEd (1; 7.7%).

4.2. Statistical Analysis

The responses were analysed using R statistical software [64]. Means and standard deviations were computed for each item in the pre-tests and post-tests. Wilcoxon signed-rank tests were conducted for each pre-test post-test item pair to test for growth in item means.
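As an illustration of this per-item analysis: the study’s computations were carried out in R [64], but the procedure can be sketched in a few lines of Python. The version below uses the normal approximation to the Wilcoxon signed-rank statistic (omitting the tie correction for the variance) and reports the effect size r = |z|/√n; the pre/post ratings are invented eight-point Likert responses, not data from the study.

```python
import math

def wilcoxon_signed_rank(pre, post):
    """Wilcoxon signed-rank test (normal approximation) for paired ratings;
    returns (z, r) with effect size r = |z| / sqrt(n).
    Simplification: the tie correction for the variance is omitted."""
    diffs = [b - a for a, b in zip(pre, post) if b != a]  # drop zero differences
    n = len(diffs)
    # Rank the absolute differences, averaging the ranks of tied values.
    abs_sorted = sorted(abs(d) for d in diffs)
    def avg_rank(value):
        first = abs_sorted.index(value) + 1
        last = first + abs_sorted.count(value) - 1
        return (first + last) / 2
    # Rank sum of the positive differences (post > pre).
    w_plus = sum(avg_rank(abs(d)) for d in diffs if d > 0)
    mean_w = n * (n + 1) / 4
    sd_w = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w_plus - mean_w) / sd_w
    return z, abs(z) / math.sqrt(n)

# Invented pre/post ratings of one item on the eight-point scale (n = 13).
pre  = [2, 3, 4, 2, 3, 5, 4, 3, 2, 4, 3, 5, 2]
post = [5, 6, 6, 4, 5, 7, 6, 5, 4, 6, 5, 7, 5]
z, r = wilcoxon_signed_rank(pre, post)
print(f"z = {z:.2f}, r = {r:.2f}")  # z = 3.18, r = 0.88
```

With uniformly higher post-test ratings, the rank sum of positive differences is maximal, which is why the example yields a large effect size.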
The results of the descriptive and inferential statistics are listed in tables in the Appendices A–G. As an example, the results for the competency area Data Processing (DAP) are also presented here.

4.2.1. Data Processing (DAP)

Table 1 shows the results for the main learning objectives, and the results for the secondary learning goals are listed in Table 2 (for an overview, the main and secondary learning objectives are marked in the respective table of competence expectations, Figure 3). If several items of the questionnaire can be assigned to a competence expectation listed in DiKoLAN, a mean effect size averaged over the associated Wilcoxon signed-rank tests (in italics) is given in addition to the effect sizes of the individual Wilcoxon signed-rank tests. For example, the competency expectation DAP.S.N2 (“Name digital tools […]”) is assessed with seven items, DAP.S.N2a-g, which reflect the individual examples mentioned in DiKoLAN (e.g., “Filtering”, “Calculation of new variables”, …).
The results show that there is an increase in self-efficacy expectations in all of the competency expectations addressed as the main learning objectives in the module. All of the tested hypotheses can be accepted.
According to Cohen, the effect sizes determined as correlation coefficient r can be roughly interpreted as follows: 0.10 → small effect, 0.30 → medium effect, and 0.50 → large effect [65] (p. 532). However, it must be taken into account that the interpretation of effect sizes should always depend on the context [65]. Since the learning goals addressed in the intervention and the tested self-efficacy expectations were both derived from the competency expectations defined in DiKoLAN and thus correlate very highly, larger overall effects are to be expected than in other studies. Therefore, we raise the thresholds for classifying the observed effects as small, medium, and large for the following evaluations: 0.20 → small effect, 0.40 → medium effect, and 0.60 → large effect.
The effect sizes of the intervention in this area are always 0.62 or higher if, for sub-competencies broken down into several items, the mean effect size is considered. Hence, the hypothesised growth in self-efficacy can be observed with large effects of the intervention.
The results of the Wilcoxon signed-rank tests for the secondary learning goals show significant increases in self-efficacy for most of the hypotheses tested. Where single hypotheses must be rejected, only partial aspects of a competence expectation were addressed, as can be expected for a secondary learning objective. The averaged effect sizes mostly show medium effects of the intervention on self-efficacy expectations in these areas, as hypothesised.
For comparison, the mean values of the self-efficacy expectations in sub-competencies not explicitly addressed in the course are listed and examined for differences in mean values (Table 3). As expected, no significant differences are observed between the two test times.
For a better overview, the averaged effect sizes are summarised in Table 4.

4.2.2. Documentation (DOC)

Due to the students’ previous experience, which is expected to be well developed (the comparatively high item means in the pre-test support this assumption), the focus in this module is less on the technical aspects and more on the areas of Teaching, Methods/Digitality, and Content-specific context (Figure A1). For the main learning objectives, large effects of the intervention are observed, in line with the expectations (Table A1). As expected, mostly medium (average) effects were measured for the secondary learning objectives (Table A2). The measured effects also show, for example, that within the sub-competency DOC.S.N1, the focus was specifically on versioning management and the possibilities of using corresponding tools, which is why a particularly large effect is measurable for item DOC.S.N1c (“I can name technical options for version management and file archiving (e.g., file naming with sequential numbering, date-based file names, Windows file version history, Apple Time Machine, Subversion, Git, etc.).”) but not for DOC.S.N1a (“I can name technical possibilities for digital documentation of e.g., protocols, experiments, data or analysis processes (e.g., using a word processor, a spreadsheet, OneNote, Etherpad).”) and DOC.S.N1b (“I can name technical options for permanent data storage and corresponding software offers/archives (e.g., network storage, archiving servers, cloud storage).”). As expected, there were no significant differences in the pre-test and post-test results for the sub-competencies that were not addressed (Table A3).

4.2.3. Presentation (PRE)

In the competency area of presentation, as expected, the item mean values in the pre-test are also quite high in some cases, and the students rate their own competencies in this area quite highly. Hence, the main learning objectives are in the areas of Teaching, Methods/Digitality, and Content-specific context (Figure A2). The intervention achieved strong (averaged) effects on the self-efficacy expectations for all main learning objectives (Table A5). Even if not all facets of a sub-competency can always be recorded (PRE.C.N1, PRE.C.D1), a clear increase can still be observed on average. As expected, mostly medium effects are achieved for the secondary learning goals (Table A6). The sub-competencies that were not addressed show no differences except for one (Table A7). Only the item PRE.S.A1c (“I can set up and use at least one tool/system to represent processes on different time scales.”) shows a clear increase in self-efficacy expectations.

4.2.4. Communication and Collaboration (COM)

In the module on the competency area of Communication/Collaboration, three central topics are placed in the foreground: firstly, the use of digital technologies for joint work on documents (by students as well as among colleagues) and the associated requirements, secondly, the instruction of students to communicate with each other, and thirdly, the exemplary integration into lesson planning. While mainly technical issues and tools are discussed and tested as the main learning objectives, methodological-didactic issues can only be considered on the basis of individual examples. Accordingly, the main learning objectives concentrate on the area of special tools (Figure A3).
The results show no significant improvement in self-efficacy expectations in the learning areas of the main learning objectives (Table A9). For the secondary learning goals, the picture is mixed (Table A10). Although there is a significant effect of the intervention on the assessment of the ability to integrate communication and collaboration into lesson planning (COM.T.A1), it is precisely in the case of the very complex learning objectives (COM.M.N1 and COM.M.D1) that no (or only smaller) effects can be observed in individual sub-aspects. In the competence expectations that were not addressed, no significant differences between the test times can be measured (Table A11).
Overall, it should be noted that the participants already assess their abilities as comparatively high in the pre-test.

4.2.5. Information Search and Evaluation (ISE)

The focus of the module Information Search and Evaluation is clearly on methodology and lesson planning (Figure A4). The analyses show large effects of the intervention in almost all sub-competencies addressed as the main learning objective (Table A13). As expected, medium effects were observed for the secondary learning objectives (Table A14). In areas that were not addressed, no differences were found between pre-test and post-test (Table A15).

4.2.6. Data Acquisition (DAQ)

In the Data Acquisition module, a variety of possibilities for the acquisition of measurement data—especially in distance learning—are presented, discussed, and tried out as examples (Figure A5). Accordingly, the contents of the main learning objectives, which all lie in the technical area, can only be briefly touched upon. Pronounced effects can be seen in individual sub-aspects of the sub-competencies, but the average effect sizes are in the range of medium effects (Table A17). Medium effects of the intervention on self-efficacy expectations can also be observed for the secondary learning goals (Table A18). As expected, in the sub-competencies that were not addressed, no differences are registered between the two test times (Table A19).

4.2.7. Simulation and Modelling (SIM)

Figure A7 shows the competency expectations addressed in the module Simulation and Modelling and distinguishes between main and secondary learning objectives. In the main learning objectives, the intervention results in an increase in self-efficacy expectations with large effect sizes (Table A25). For the secondary learning goals, the intervention had medium to large effects, exceeding expectations (Table A26). For the competence expectations that were not addressed, no significant differences can be determined between the test times (Table A27).

5. Discussion

This section first discusses the effects observed across all modules and the general classification into main and secondary learning objectives. Then, the individual modules are discussed, and implications for improving the teaching-learning modules as well as for designing and developing similar teaching-learning units to promote digital competences are given.

5.1. Joint Discussion of the Results of all Modules and the Separation in Main and Secondary Learning Objectives

5.1.1. Effectiveness of the Interventions for the Main Learning Objectives

Overall, the results are largely in line with expectations. In five of the seven central competency areas (DOC, PRE, ISE, DAP, and SIM), the expected increase in the students’ self-efficacy expectation was observed in all main learning objectives with large effects (r = 0.60 to 0.91). However, it should be noted that, in some cases, not all aspects of a main learning objective can be addressed, so the effect sizes for individual items may well be lower (r = 0.26 to 0.91), even if the effect averaged over all items depicting the competence expectation can nevertheless be considered large.
Only in the competency area Communication/Collaboration (COM) does the intervention not lead to a significant increase in self-efficacy expectations in the main learning objectives. It should be noted that the item mean values are already extremely high in the pre-test, which means that the students consider their own abilities in this area to be very high even before the intervention. A similar picture emerges for the secondary learning goals, even though an effect of the intervention can certainly be recognised. Therefore, the competency area Communication/Collaboration (COM) will not be considered in the following observations, and this module will be discussed again afterwards.

5.1.2. Effectiveness of the Intervention in the Secondary Learning Objectives

For the secondary learning objectives, the expected picture also emerges for five of the seven central competency areas (DOC, PRE, DAQ, DAP, SIM). For learning objectives that are only tested with one item, the observed effect sizes are in the medium range, as expected (r = 0.40 to 0.67). In the module Information Search and Evaluation (ISE), contrary to the hypothesis, no significant increase in self-efficacy expectations was observed for the learning objective ISE.C.N2 (“Name several literature databases or search engines […]”), although this was clearly part of the course content. However, the students already indicated a comparatively high level of prior knowledge in the pre-test.
In the case of secondary learning objectives, which are regarded as such because only individual selected examples are deepened within the sub-competency areas, the effect sizes to be expected vary accordingly when comparing the items assigned to such a learning objective. This applies, for example, to DOC.S.N1 (“Name technical approaches […]”) in the competency area of Documentation. In the associated module, less emphasis was placed on word processing (DOC.S.N1a) and permanent data storage (DOC.S.N1b); instead, the possibilities of digital version management (DOC.S.N1c) were discussed in depth, so a significant increase can only be recorded for the third item (DOC.S.N1c). The selection of this sub-aspect was based on the assumption that the students would have less prior knowledge of digital version management than of the other sub-aspects. The pre-test item mean values support this assumption (DOC.S.N1a: 5.46 (1.90), b: 5.69 (1.89), c: 4.00 (2.35)).

5.1.3. Differences between the Test Times in Sub-Areas which Were Not Addressed

Differences between the test times belonging to a module (pre-test and post-test) can only be found for one item (PRE.S.A1c: “I can initialise and use at least one tool/system to represent processes on different time scales.”). The results from the pre-test (M = 4.92, SD = 1.71) and post-test (M = 6.00, SD = 1.91) indicate that the intervention resulted in an improvement in self-efficacy expectation, V = 51.5, p = 0.014, r = 0.71. This is understandable, since the creation of stop-motion videos was specifically practised here, but not all of the presentation forms expected in this sub-competency were covered in the module.
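The statistics reported above follow the Wilcoxon signed-rank convention used throughout the appendix tables: V is the sum of the positive ranks, and r is the effect size Z/√N. As a minimal, stdlib-only Python sketch (not the authors’ actual analysis code; the function name and the participant data below are invented for illustration), such values could be computed from paired pre/post ratings as follows:

```python
import math

def signed_rank_test(pre, post):
    """One-sided paired Wilcoxon signed-rank test (normal approximation).

    Returns (V, p, r): V is the sum of positive ranks (as reported by R's
    wilcox.test), p the one-sided p-value for H1 "post > pre", and r the
    effect size Z / sqrt(N). Zero differences are dropped (Wilcoxon's
    method); the tie correction of the variance is omitted for brevity.
    """
    d = [b - a for a, b in zip(pre, post) if b != a]
    n = len(d)
    # Mid-ranks of |d|: tied absolute differences share their average rank.
    order = sorted(range(n), key=lambda i: abs(d[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(d[order[j + 1]]) == abs(d[order[i]]):
            j += 1
        mid = (i + j) / 2 + 1  # average of rank positions i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = mid
        i = j + 1
    v = sum(rk for rk, x in zip(ranks, d) if x > 0)
    mu = n * (n + 1) / 4
    sigma = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (v - mu) / sigma
    p = 0.5 * math.erfc(z / math.sqrt(2))  # one-sided p-value, 1 - Phi(z)
    return v, p, z / math.sqrt(n)

# Invented example: 13 participants whose post-test ratings all increased.
pre  = [4, 5, 3, 4, 5, 4, 3, 5, 4, 3, 4, 5, 4]
post = [6, 7, 5, 6, 6, 6, 5, 7, 6, 5, 6, 7, 6]
v, p, r = signed_rank_test(pre, post)
print(f"V = {v}, p = {p:.4f}, r = {r:.2f}")
```

Because all 13 invented differences are positive, V equals the full rank sum (91) and the sketch yields p < 0.01 with r ≈ 0.88, i.e., a large effect in the convention used here.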

5.1.4. Overall Comparison of the Observed Effects

Figure 5 shows boxplots of the observed (averaged) effect sizes r for the main learning objectives and secondary learning goals for each competency area.
Except for the competency areas of Communication/Collaboration and Data Processing, there are clear separations between the effect sizes of the main learning objectives and those of the secondary learning goals, which supports the division into main and secondary learning objectives.

5.2. Discussion of the Individual Teaching-Learning Modules

In the following section, the results of the individual learning modules are examined in more detail separately.

5.2.1. Data Processing (DAP)

Out of the 26 sub-competencies in the DAP competency area, 13 were selected as main and 9 as secondary learning objectives. Less prior experience was assumed in the areas of Content-specific Context and Special Tools, which is why more attention was paid to these areas in the design of the unit. Large effects (r = 0.62 … 0.86) were found between the pre- and post-test for all main learning objectives, as well as medium to large effects for the secondary learning objectives (r = 0.50 … 0.63), except for the test items DAP.S.A1b (“I can apply procedures for calculating new quantities in data processing.”), DAP.S.A1d (“I can apply procedures for statistical analysis in data processing.”), and DAP.S.A1e (“I can apply image/audio and video analysis procedures in data processing.”). This reflects the structure of the session, in which the application level played a minor role; the same applies to the secondary learning objective test item DAP.T.D1a (“I can describe the didactic prerequisites of using digital data processing in the classroom.”). Looking at the averaged effect sizes in the module (Table 4), it can be confirmed that the areas with greater focus produced stronger effects. Consequently, the focus on the content-specific context and the specific tools has proven suitable and can be maintained for further courses. This evaluation considered the pre- and post-tests accompanying the synchronous session; a significant change in self-assessment in the area of application, however, is expected for the lesson design phase. Therefore, it can be said that the module promotes the competency area DAP very well and that it can serve as a basis for further modules for the promotion of digital competencies among prospective teachers at other locations.

5.2.2. Documentation (DOC)

From the 13 sub-competencies of the competency area DOC, 8 were selected as main learning objectives and 4 as secondary learning goals. Particular attention was paid to the levels Name and Describe. For all main learning objectives, the measuring instrument shows a large effect on the growth of the students’ self-efficacy expectations (r = 0.65 … 0.88). As discussed above, the secondary learning objectives in the area of DOC were chosen based on the students’ previous experience: less emphasis was placed on word processing (DOC.S.N1a) and permanent data storage (DOC.S.N1b), and instead the possibilities of digital version management (DOC.S.N1c) were discussed in depth, so a significant increase can only be recorded for the third item (DOC.S.N1c). Nevertheless, besides single items with a large effect (DOC.S.N1c), medium effects were found across all competencies of the secondary learning objectives (r = 0.53 … 0.67). As expected, no significant increases in students’ self-efficacy ratings were detected in the domains that were not addressed. Focusing on the individual results, it can be seen that high effect sizes were obtained especially in the Teaching (T) category, reflecting the structure of the session. It could thus be shown that the intervention has a great effect on the students’ self-efficacy expectation in the areas of the main learning objectives, which is why this session needs only minor adjustments for further implementations and can be used as a model example for courses at other universities. To be somewhat better prepared for the session on communication and collaboration (see below), further elaboration could be made in the area of specific technology (DOC.S.N1). Thus, the module fully covers the competency areas taken from the framework.

5.2.3. Presentation (PRE)

Out of the 17 sub-competencies that the competency area PRE comprises, only 8 were declared as main and 4 as secondary learning objectives, due to the limited time available and the assumed prior experience. Particular emphasis was placed on the competencies of the Name and Describe levels and, as described before, mainly in the areas of Teaching, Methods/Digitality, and Content-specific Context (Table A5). Of the 36 test items used to assess the addressed sub-competencies, no significant effect on the students’ self-concept was found in 7 cases. In the area of the main learning objectives, these were one item at the Name level and two items at the Describe level (see Table A6), each a subitem of a supercategory (PRE.C.N1/D1). Nevertheless, averaging all effect sizes of these supercategories still yields a large effect (r = 0.61 … 0.90) for both. The same applies to the effect sizes of the superordinate sub-competencies (PRE.S.N1/D1) of the four non-significant items from the area of secondary learning objectives (r = 0.40 … 0.54). Thus, based on the results of the evaluation, an area-wide increase in self-efficacy expectations for the addressed competency domains can be determined. The individual results, which show comparatively high effect sizes in all areas of the category Methods/Digitality (TPK), reflect, on the one hand, the module structure, since this session focused more on the discussion among the students about the possible effects of use in the classroom. On the other hand, students estimated their prior experience in the context of Principles and Criteria for Designing Digital Presentation Media (PRE.M.N1/D1) to be comparatively low.
Thus, the focus on individual items in the competencies has proven successful, and the unit on presentation can be used as a successful example for the area-wide integration of the promotion of digital competencies in a master’s seminar for student teachers.

5.2.4. Communication and Collaboration (COM)

For this module, due to time considerations, 4 of the 29 competency expectations were selected as main learning objectives and 11 as secondary learning objectives. Thus, only about half of the competencies could be covered. In order to gain a better overview of the entire competency area and to better link the areas of teaching, methods, context, and tools, it would certainly be advisable to extend this module to two sessions in future implementations. Nevertheless, for a first session, the focus on the use of digital technologies for joint work on documents (by students as well as among colleagues) and the associated requirements, on instructing students to communicate with each other, and ultimately on the exemplary integration into lesson planning is considered correct. A Dunning–Kruger effect [66,67] is suspected: in the area of the main learning objectives, no major effect on the students’ self-assessment could be achieved because they overestimated their previous experience. During the course, the students first had to learn that, although they experience themselves as very competent in everyday digital communication, guiding digital collaboration between pupils goes far beyond these everyday skills and that completely different tools can be used for corresponding learning activities. Due to this overestimation of their previous experience, mainly technical issues and tools were discussed and tested, whereas methodological-didactic issues could only be considered on the basis of individual examples. If, as described above, some technical tools and tricks are already presented in the Documentation module, more time remains for methodology and teaching at this point in the course.
The significant effect of the intervention on the assessment of being able to integrate communication and collaboration into lesson planning (COM.T.A1b) particularly shows that this module was able to achieve the goal of strengthening the students’ ability to use digital media in the classroom. With the changes described, this unit thus also serves as an adequate starting point for the development of similar modules elsewhere.

5.2.5. Information Search and Evaluation (ISE)

The focus of the module Information Search and Evaluation is clearly on methodology and lesson planning (Table A13). From the 32 sub-competencies of the competency area ISE, 21 were selected as main learning goals and 7 as secondary learning goals. As suspected, the students already rated their self-efficacy expectancy in the areas of Content-specific Context and Special Tools comparatively high at the Name level (Mpre = 5.23 (1.92) to 6.46 (1.61)), which is why these areas were given lower priority in the design of the unit. Large effects (r = 0.60 … 0.91) were found between the pre- and post-test for all main learning objectives, as well as medium effects for the secondary learning objectives (r = 0.42 … 0.60). As discussed before, the students already indicated a comparatively high level of prior knowledge in the pre-test. As in the previous competency areas, the module structure can also be recognised here in the individual results: particularly high effects are visible in the area of Methods/Digitality, which also played a major role in the course. Thus, the intervention was found to have a large effect on students’ self-efficacy expectations in the areas of the main learning objectives, which is why this session requires only minor adjustments for further implementations and can be used as a model for courses at other universities.

5.2.6. Data Acquisition (DAQ)

For this session, only 3 of 16 competencies were chosen as main learning objectives, and another two as secondary learning objectives. As suspected, students’ self-efficacy expectations were low in the area of specific technology, particularly at the Use/Apply level compared to other competency areas, which is why this area was emphasised. The guided application of the tools in the area of data acquisition requires considerable time in this module, which is necessary, however, because the students come with little previous experience. Looking at the results, the guidance on data collection can be considered successful. In order to integrate further competencies into this module, it would be conceivable to move the practical phases into a self-study unit so that the synchronous main session can focus even more on the areas of methodology and teaching. Likewise, an expansion to two sessions would be useful so that students can continue to receive guidance. This session is a good example of integrating the competencies from the area of special tools and can be used as a blueprint for such implementations.

5.2.7. Simulation and Modelling (SIM)

The finding of a significant effect of the module on the students’ self-concept in 22 of 25 sub-competencies suggests that, with the module, the students received a comprehensive overview of the basic competency area Simulation and Modelling according to the addressed competence expectations. The strong average effect of the module on the students’ self-efficacy confirms that a targeted promotion of digital competencies from DiKoLAN in university teaching-learning arrangements can in principle be successful. Looking at the individual results, comparatively high effect sizes were obtained in the category Special Technology. This is probably due to the students’ low assessment of their prior knowledge in this category compared to the other three (Table A26 and Table A28). Thus, the effectiveness measurement procedure identified a thematic area with great potential for development in this teaching–learning arrangement. The identified knowledge gap among the students is plausible, since prior knowledge of “special technology” cannot be expected from any of the previous stages of the teacher training programme in Konstanz, in contrast to its subject-specific, pedagogical, and subject-didactic overlapping fields. Thus, the intervention was found to have a large effect on students’ self-efficacy expectations in the domains of the main learning objectives, which is why this session requires only minor adjustments for further implementations and can be used as a model for courses at other universities.

5.3. Final Discussion of the Course Design

It has been helpful to dedicate a separate week to each competency area, allowing us to cover large areas of the DiKoLAN competency framework in one term, achieving a significant gain in all areas. In addition, it became apparent that some areas (for example, the sessions on Documentation—DOC and Communication—COM) offer the opportunity to link content across multiple sessions, which can be integrated in future courses. The accompanying tasks create further need for support but also allow for a deepening of the topics addressed in the sessions, for which there would otherwise have been no time. The design of teaching units in particular provides students with initial teaching concepts in which digital media are integrated into lessons.

5.4. Final Discussion of the Methodology of Evaluation

The detailed monitoring of all the modules through separate pre- and post-tests allowed for a very precise observation of the effect of each module on the students’ self-efficacy expectations in the different areas. Since a high response rate was achieved despite the voluntary nature of the pre- and post-test, the additional time required of the students is not considered to be too high, but the benefit generated for the further development and confirmation of the course structure is immense. With the help of the test instrument used, we were able to confirm the effectiveness of existing structures and diagnose areas in need of further development.

6. Conclusions

With the help of the test instrument provided by the Working Group Digital Core Competencies [5], it was possible to show that the newly designed course can specifically promote students’ self-efficacy expectations regarding their digital competencies. Accordingly, pre-service teachers feel more self-efficacious after the seminar in large parts of the digital core competencies listed in the DiKoLAN framework. Thus, initial teaching and learning arrangements have been developed and implemented for all seven competency areas relevant to the science teaching profession. Repeating and adapting such teaching concepts in the university context can therefore be a proven way to address the current issues in the use of digital tools in schools. The piloting of the self-efficacy assessment instrument using the developed module as an example shows that it can be used to optimise such teaching concepts: for example, the content of a teaching–learning module could be adapted to the students’ prior knowledge, and thus made even more effective, by surveying the anticipated learning level in the pre-test. At the same time, the strengths and weaknesses of already-tested modules (as in the presented course) can be revealed so that the modules can be improved and re-tested. Furthermore, this work presents a course that, due to the effectiveness demonstrated here, can be used as a best-practice example for the development and design of new courses. Anyone interested in using and expanding on the material is invited to contact the corresponding author to obtain access to it.

Author Contributions

Conceptualisation, A.H., L.-J.T., P.M. and J.H.; methodology, A.H. and L.-J.T.; validation, L.-J.T.; formal analysis, L.-J.T.; investigation, L.-J.T.; data curation, L.-J.T.; writing—original draft preparation, A.H., L.-J.T., P.M. and J.H.; writing—review and editing, A.H., L.-J.T., P.M. and J.H.; visualisation, L.-J.T.; supervision, J.H.; project administration, J.H.; funding acquisition, J.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Federal Ministry of Education and Research (project “edu4.0” in the framework of the joint “Qualitätsoffensive Lehrerbildung”, grant number 01JA2011) and the German Chemical Industry Association (VCI) (project “Digitale Werkzeuge für den Chemieunterricht”). The APC was funded by the University of Konstanz.

Institutional Review Board Statement

All participants were students at a German university. They took part voluntarily and signed an informed consent form. Pseudonymisation of the participants was guaranteed during the study. Due to these measures in the implementation of the study, a review by an ethics committee was waived.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to the ongoing study.

Acknowledgments

The authors would especially like to thank our colleagues in the Digital Core Competencies Working Group (Sebastian Becker, Till Bruckermann, Alexander Finger, Lena von Kotzebue, Erik Kremser, Monique Meier, and Christoph Thyssen) for their support of this research project, the provision of the test instrument, and for the lively exchange on the research design and the evaluation of the results. We also thank Lukas Müller, who helped us as a student assistant during the seminar. Lastly, we thank the Joachim Herz Foundation, which supports all DiKoLAN projects.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Documentation (DOC)

Figure A1. Competence expectations defined in the DiKoLAN framework addressed in the respective teaching module Documentation (DOC). Main topics (magenta), side topics (blue).
Table A1. Results of Wilcoxon signed-rank tests for competencies in the area of Documentation (DOC) explicitly addressed as main learning objectives in the respective module and hypothesised to grow during intervention. n = 13 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
Competency    Pre M (SD)    Post M (SD)    V       p       r
DOC.T.N1      4.85 (1.82)   6.85 (0.55)    78.0    0.001   0.88
DOC.T.D1      4.00 (1.41)   5.85 (1.21)    66.0    0.002   0.86
DOC.T.A1 *                                                 0.77
  a           4.23 (1.64)   5.77 (1.36)    74.5    0.003   0.80
  b           3.77 (1.42)   5.23 (1.42)    83.0    0.004   0.74
DOC.M.N1      5.08 (1.80)   7.00 (0.91)    63.5    0.004   0.77
DOC.C.N1 °
DOC.C.N2      3.69 (2.10)   5.92 (1.61)    63.5    0.004   0.77
DOC.C.D1      3.31 (2.14)   5.08 (1.66)    55.0    0.003   0.83
DOC.S.D1 *                                                 0.65
  a           4.85 (1.99)   6.54 (0.88)    63.0    0.004   0.76
  b           4.77 (1.69)   6.00 (1.15)    48.0    0.018   0.58
  c           3.85 (2.12)   5.54 (1.61)    49.5    0.014   0.61
Note: ° not tested. * The average effect size is given for competencies assessed with more than one item.
Table A2. Results of Wilcoxon signed-rank tests for competencies in the area of Documentation (DOC) addressed as secondary learning goals in the respective module and hypothesised to grow during intervention. n = 13 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
Competency    Pre M (SD)    Post M (SD)    V       p       r
DOC.M.D1      5.23 (2.20)   6.62 (0.96)    58.0    0.013   0.67
DOC.S.N1 *                                                 0.53
  a           5.46 (1.90)   6.15 (1.63)    42.0    0.073   (0.40)
  b           5.69 (1.89)   6.69 (1.11)    43.0    0.061   (0.36)
  c           4.00 (2.35)   6.15 (1.34)    55.0    0.003   0.83
DOC.S.N2      6.00 (2.04)   7.46 (0.66)    40.0    0.021   0.54
* The average effect size is given for competencies assessed with more than one item.
Table A3. Results of Wilcoxon signed-rank tests for competencies in the area of Documentation (DOC) NOT explicitly addressed in the respective module and thus NOT hypothesised to change during intervention (for comparison). n = 13 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
Competency    Pre M (SD)    Post M (SD)    V       p       r
DOC.S.A1 *
  a           5.92 (1.71)   6.77 (0.83)    34.0    0.187
  b           3.92 (2.60)   5.69 (1.75)    68.0    0.120
* Assessed with more than one item.
Table A4. Overview of the (average) effect sizes of the effects of the intervention on the competence expectations in the area of Documentation (DOC). n = 13 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
Level       TPACK               TPK                TCK                TK
            Comp.         r     Comp.        r     Comp.        r     Comp.         r
Name        DOC.T.N1      0.88  DOC.M.N1     0.77  DOC.C.N1     °     DOC.S.N1 *    0.53
                                                   DOC.C.N2     0.77  DOC.S.N2      0.54
Describe    DOC.T.D1      0.86  DOC.M.D1     0.67  DOC.C.D1     0.83  DOC.S.D1 *    0.65
                                                                      DOC.S.D2      °
Use/App.    DOC.T.A1 *    0.77                                        DOC.S.A1 *    -
Main learning objectives (bold magenta), secondary learning goals (italic cyan), and non-addressed competencies (yellow). * The average effect size is given for competencies assessed with more than one item. ° Not tested.

Appendix B. Presentation (PRE)

Figure A2. Competence expectations defined in the DiKoLAN framework addressed in the respective teaching module Presentation (PRE). Main topics (magenta), side topics (blue).
Table A5. Results of Wilcoxon signed-rank tests for competencies in the area of Presentation (PRE) explicitly addressed as main learning objectives in the respective module and hypothesised to grow during intervention. n = 13 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
Competency    Pre M (SD)    Post M (SD)    V       p        r
PRE.T.N1      3.85 (1.72)   5.54 (1.71)    78.0    <0.001   0.90
PRE.T.D1 *                                                  0.67
  a           3.85 (1.34)   6.23 (1.79)    78.0    0.001    0.88
  b           4.92 (1.32)   6.38 (1.61)    69.5    0.008    0.68
  c           4.46 (1.71)   6.00 (1.68)    65.5    0.020    0.60
  d           4.15 (1.63)   5.77 (2.31)    61.0    0.043    0.51
PRE.M.N1      2.54 (1.94)   5.77 (2.01)    66.0    0.002    0.86
PRE.M.N2      3.85 (1.41)   6.38 (1.94)    90.0    <0.001   0.87
PRE.M.D1      2.77 (2.05)   5.85 (1.99)    74.0    0.003    0.76
PRE.M.D2      3.92 (1.61)   6.15 (1.95)    74.0    0.003    0.78
PRE.C.N1 *                                                  0.62
  a           5.62 (2.26)   6.69 (1.80)    48.5    0.016    0.65
  b           3.85 (2.08)   6.08 (2.10)    76.0    0.002    0.82
  c           3.31 (1.44)   6.38 (1.94)    91.0    <0.001   0.89
  d           5.92 (2.25)   6.46 (1.85)    31.0    0.169    (0.37)
  e           4.69 (2.06)   5.69 (1.97)    45.5    0.036    0.47
  f           4.54 (2.44)   5.54 (1.94)    54.0    0.030    0.51
PRE.C.D1 *                                                  0.61
  a           5.92 (2.10)   6.62 (1.80)    42.5    0.066    (0.47)
  b           3.92 (2.22)   5.54 (2.26)    75.5    0.018    0.59
  c           3.15 (1.57)   6.38 (1.89)    89.5    0.001    0.86
  d           5.69 (2.46)   6.46 (1.98)    34.5    0.080    (0.36)
  e           4.54 (2.07)   5.85 (1.82)    52.0    0.006    0.72
  f           4.00 (2.31)   5.54 (1.76)    70.5    0.007    0.67
° Not tested. * The average effect size is given for competencies assessed with more than one item.
Table A6. Results of Wilcoxon signed-rank tests for competencies in the area of Presentation (PRE) addressed as secondary learning goals in the respective module and hypothesised to grow during intervention. n = 13 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
Competency    Pre M (SD)    Post M (SD)    V       p       r
PRE.T.N2      4.62 (1.71)   5.92 (1.71)    45.0    0.040   0.52
PRE.T.A1 *                                                 0.54
  a           4.38 (1.26)   5.69 (2.10)    52.5    0.044   0.52
  b           4.23 (1.36)   5.54 (1.90)    63.0    0.031   0.55
PRE.S.N1 *                                                 0.40
  a           6.00 (1.83)   6.23 (1.30)    34.5    0.464   0.05
  b           3.77 (2.17)   5.23 (1.96)    79.0    0.010   0.66
  c           4.08 (1.85)   5.15 (2.08)    59.0    0.061   (0.43)
  d           6.38 (1.39)   6.69 (1.32)    27.0    0.312   (0.13)
  e           3.85 (1.21)   5.92 (1.98)    62.0    0.005   0.70
  f           4.38 (2.14)   5.62 (1.89)    44.5    0.042   0.45
PRE.S.D2 *                                                 0.54
  a           5.31 (1.89)   6.23 (1.74)    24.5    0.043   0.53
  b           3.54 (2.07)   5.15 (2.19)    58.5    0.013   0.64
  c           3.15 (1.46)   5.54 (1.81)    88.0    0.002   0.83
  d           3.46 (1.81)   4.92 (1.89)    59.0    0.010   0.65
  e           5.23 (1.79)   6.23 (1.79)    41.0    0.089   (0.38)
  f           4.38 (2.26)   4.92 (1.80)    41.5    0.234   (0.23)
* The average effect size is given for competencies assessed with more than one item.
Table A7. Results of Wilcoxon signed-rank tests for competencies in the area of Presentation (PRE) NOT explicitly addressed in the respective module and thus NOT hypothesised to change during intervention (for comparison). n = 13 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
Competency    Pre M (SD)    Post M (SD)    V       p       r
PRE.T.A2      5.38 (1.66)   5.92 (2.10)    17.5    0.609
PRE.M.A1      6.00 (1.47)   6.31 (1.97)    32.5    0.645
PRE.C.A1      6.92 (1.12)   6.38 (1.94)    9.0     0.430
PRE.S.D1
  a           6.77 (1.30)   6.62 (1.39)    25.0    0.836
  b           5.08 (2.40)   5.54 (2.26)    30.5    0.797
  c           4.23 (2.20)   5.62 (1.85)    52.5    0.089
  d           6.15 (2.03)   6.46 (1.85)    24.0    0.905
  e           4.46 (2.18)   5.62 (1.76)    37.5    0.083
  f           5.15 (2.23)   5.46 (2.03)    15.0    0.932
PRE.S.A1 *                                                 0.34
  a           6.85 (1.14)   6.85 (1.99)    15.5    0.865   (0.22)
  b           5.46 (2.11)   6.23 (2.05)    36.0    0.411   (0.28)
  c           4.92 (1.71)   6.00 (1.91)    51.5    0.014   0.71
  d           7.23 (1.01)   7.15 (1.07)    6.5     0.892   (0.07)
  e           5.00 (2.04)   5.85 (1.86)    52.5    0.086   (0.48)
  f           5.31 (2.02)   5.69 (2.29)    30.5    0.359   (0.28)
* Assessed with more than one item.
Table A8. Overview of the (average) effect sizes of the effects of the intervention on the competence expectations in the area of Presentation (PRE). n = 13 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
Level | TPACK Comp. r | TPK Comp. r | TCK Comp. r | TK Comp. r
Name | PRE.T.N1 0.90 | PRE.M.N1 0.86 | PRE.C.N1 * 0.62 | PRE.S.N1 * 0.40
| PRE.T.N2 0.54 | PRE.M.N2 0.87 | |
Describe | PRE.T.D1 * 0.67 | PRE.M.D1 0.76 | PRE.C.D1 * 0.61 | PRE.S.D1 -
| | PRE.M.D2 0.78 | | PRE.S.D2 * 0.54
Use/App. | PRE.T.A1 - | PRE.M.A1 - | PRE.C.A1 - | PRE.S.A1 0.34
| PRE.T.A2 - | | |
Main learning objectives (bold magenta), secondary learning goals (italic cyan), and non-addressed competencies (yellow). * The average effect size is given for competencies assessed with more than one item. ° Not tested.

Appendix C. Communication/Collaboration (COM)

Figure A3. Competence expectations defined in the DiKoLAN framework addressed in the respective teaching module Communication/Collaboration (COM). Main topics (magenta), side topics (blue).
Table A9. Results of Wilcoxon signed-rank tests for competencies in the area of Communication/Collaboration (COM) explicitly addressed as main learning objectives in the respective module and hypothesised to grow during intervention. n = 13 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
Competency | Pre M | Pre SD | Post M | Post SD | V | p | r
COM.S.N1 | 6.54 | 1.56 | 6.69 | 1.38 | 31.5 | 0.572
COM.S.N2 | 6.46 | 1.51 | 6.92 | 0.86 | 37.0 | 0.166
COM.S.A1 | 6.38 | 1.85 | 6.46 | 1.71 | 31.0 | 0.590
COM.S.A2 * (average r = 0.27)
a | 6.08 | 1.55 | 6.69 | 1.03 | 36.0 | 0.048 | 0.48
b | 4.77 | 2.20 | 5.38 | 2.02 | 27.5 | 0.291 | (0.06)
* The average effect size is given for competencies assessed with more than one item.
Table A10. Results of Wilcoxon signed-rank tests for competencies in the area of Communication/Collaboration (COM) addressed as secondary learning goals in the respective module and hypothesised to grow during intervention. n = 13 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
Competency | Pre M | Pre SD | Post M | Post SD | V | p | r
COM.T.N1 | 6.69 | 1.03 | 6.62 | 1.33 | 25.0 | 0.625
COM.T.N2 *
a | 6.62 | 1.12 | 6.85 | 0.99 | 17.5 | 0.302
b | 6.15 | 1.28 | 6.69 | 1.18 | 32.5 | 0.122
c | 6.23 | 1.48 | 6.54 | 1.20 | 22.0 | 0.310
COM.T.D2 | 5.77 | 1.30 | 6.08 | 1.38 | 34.0 | 0.268
COM.T.A1 * (average r = 0.50)
a | 5.54 | 1.20 | 6.08 | 1.26 | 27.5 | 0.100 | (0.39)
b | 5.00 | 1.22 | 5.85 | 1.14 | 33.5 | 0.016 | 0.61
COM.M.N1 *
a | 6.23 | 1.01 | 6.77 | 1.09 | 50.5 | 0.054
b | 6.31 | 0.75 | 6.31 | 1.49 | 27.0 | 0.541
c | 6.08 | 0.86 | 6.38 | 1.39 | 25.5 | 0.152
d | 5.85 | 0.90 | 6.31 | 1.18 | 54.5 | 0.113
e | 6.23 | 0.93 | 6.54 | 1.13 | 42.5 | 0.201
f | 5.62 | 1.80 | 5.85 | 1.63 | 29.5 | 0.439
g | 6.08 | 1.12 | 6.23 | 1.59 | 31.0 | 0.376
h | 5.08 | 2.72 | 5.92 | 1.75 | 40.5 | 0.097
COM.M.D1 * (average r = 0.42)
a | 6.15 | 1.21 | 6.62 | 1.12 | 49.5 | 0.060 | (0.44)
b | 6.31 | 0.95 | 6.85 | 1.07 | 22.5 | 0.083 | (0.49)
c | 5.92 | 0.95 | 6.54 | 1.27 | 52.5 | 0.039 | 0.49
d | 6.00 | 1.08 | 6.62 | 0.96 | 52.0 | 0.045 | 0.44
e | 6.15 | 0.80 | 6.54 | 0.97 | 18.0 | 0.060 | (0.46)
f | 5.77 | 1.74 | 6.54 | 1.05 | 39.0 | 0.122 | (0.35)
g | 5.77 | 1.48 | 6.31 | 1.25 | 42.0 | 0.221 | (0.20)
h | 4.92 | 2.84 | 6.15 | 1.72 | 61.0 | 0.043 | 0.47
COM.C.N1 | 5.69 | 1.49 | 6.85 | 1.14 | 51.0 | 0.056
COM.C.N3 | 4.77 | 1.59 | 5.77 | 1.74 | 61.0 | 0.042 | 0.49
COM.C.N5 °
COM.S.N6 °
COM.S.D1 *
a | 6.31 | 1.75 | 6.77 | 1.01 | 29.5 | 0.438
b | 6.54 | 1.33 | 6.77 | 0.60 | 20.0 | 0.411
c | 5.69 | 1.89 | 5.85 | 1.21 | 27.0 | 0.541
d | 4.85 | 2.34 | 5.69 | 1.60 | 34.5 | 0.080
* The average effect size is given for competencies assessed with more than one item. ° Not tested.
Table A11. Results of Wilcoxon signed-rank tests for competencies in the area of Communication/Collaboration (COM) NOT explicitly addressed in the respective module and thus NOT hypothesised to change during intervention (for comparison). n = 13 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
Competency | Pre M | Pre SD | Post M | Post SD | V | p | r
COM.T.N3 °
COM.T.D1 °
COM.T.D3 *
a | 5.46 | 1.33 | 5.92 | 1.12 | 26.5 | 0.250
b | 5.69 | 1.44 | 5.92 | 1.32 | 35.0 | 0.457
COM.T.A2 °
COM.M.D2 | 5.23 | 1.74 | 6.08 | 1.04 | 41.0 | 0.181
COM.C.N2 | 6.00 | 1.08 | 6.31 | 1.44 | 28.0 | 0.548
COM.C.N4 | 5.54 | 1.76 | 6.08 | 1.75 | 30.5 | 0.368
COM.C.D1 *
a | 6.00 | 1.58 | 6.85 | 0.90 | 29.0 | 0.124
b | 6.15 | 0.90 | 6.62 | 1.19 | 33.0 | 0.224
c | 5.08 | 2.02 | 5.77 | 1.36 | 39.5 | 0.591
d | 6.23 | 1.42 | 6.00 | 1.58 | 20.0 | 0.809
COM.S.N3 | 5.62 | 1.71 | 5.85 | 1.77 | 39.5 | 1.000
COM.S.N4 °
COM.S.N5 | 5.77 | 1.64 | 5.69 | 1.89 | 40.5 | 0.740
COM.S.A3 °
COM.S.A4 °
COM.S.A5 °
* Assessed with more than one item. ° Not tested.
Table A12. Overview of the (average) effect sizes of the effects of the intervention on the competence expectations in the area of Communication/Collaboration (COM). n = 13 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
Level | TPACK Comp. r | TPK Comp. r | TCK Comp. r | TK Comp. r
Name | COM.T.N1 - | COM.M.N1 * - | COM.C.N1 - | COM.S.N1 -
| COM.T.N2 * - | | COM.C.N2 - | COM.S.N2 -
| COM.T.N3 ° | | COM.C.N3 0.49 | COM.S.N3 -
| | | COM.C.N4 - | COM.S.N4 °
| | | COM.C.N5 ° | COM.S.N5 -
| | | | COM.S.N6 °
Describe | COM.T.D1 ° | COM.M.D1 * 0.42 | COM.C.D1 * - | COM.S.D1 * -
| COM.T.D2 - | COM.M.D2 - | |
| COM.T.D3 * - | | |
Use/App. | COM.T.A1 * 0.50 | | | COM.S.A1 -
| COM.T.A2 * - | | | COM.S.A2 * 0.27
| | | | COM.S.A3 °
| | | | COM.S.A4 °
| | | | COM.S.A5 °
Main learning objectives (bold magenta), secondary learning goals (italic cyan), and non-addressed competencies (yellow). * The average effect size is given for competencies assessed with more than one item. ° Not tested.

Appendix D. Information Search and Evaluation (ISE)

Figure A4. Competence expectations defined in the DiKoLAN framework addressed in the respective teaching module Information Search and Evaluation (ISE). Main topics (magenta), side topics (blue).
Table A13. Results of Wilcoxon signed-rank tests for competencies in the area of Information Search and Evaluation (ISE) explicitly addressed as main learning objectives in the respective module and hypothesised to grow during intervention. n = 13 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
Competency | Pre M | Pre SD | Post M | Post SD | V | p | r
ISE.T.N1 | 4.92 | 1.38 | 6.31 | 0.85 | 43.0 | 0.008 | 0.68
ISE.T.N2 | 5.23 | 1.17 | 6.77 | 0.73 | 66.0 | 0.002 | 0.86
ISE.T.N3 | 5.31 | 1.49 | 6.31 | 1.11 | 57.0 | 0.015 | 0.62
ISE.T.D1 * (average r = 0.76)
a | 5.23 | 1.24 | 6.23 | 0.93 | 68.0 | 0.009 | 0.67
b | 5.00 | 1.22 | 6.31 | 1.03 | 55.0 | 0.002 | 0.84
ISE.T.D2 | 5.23 | 1.30 | 6.23 | 0.93 | 45.0 | 0.004 | 0.81
ISE.T.A1 * (average r = 0.76)
a | 4.85 | 1.41 | 5.92 | 1.19 | 55.0 | 0.002 | 0.85
b | 4.69 | 1.55 | 5.54 | 1.33 | 41.5 | 0.012 | 0.66
ISE.T.A2 * (average r = 0.74)
a | 4.77 | 1.36 | 6.08 | 0.64 | 86.5 | 0.002 | 0.82
b | 4.77 | 1.59 | 5.69 | 1.11 | 41.5 | 0.012 | 0.66
ISE.M.N1 | 5.46 | 1.13 | 6.62 | 0.87 | 78.0 | <0.001 | 0.91
ISE.M.N2 | 5.62 | 1.04 | 6.69 | 0.85 | 55.0 | 0.002 | 0.85
ISE.M.D1 | 4.92 | 1.04 | 6.62 | 0.96 | 66.0 | 0.002 | 0.86
ISE.M.D2 | 5.31 | 1.18 | 6.54 | 0.52 | 52.0 | 0.006 | 0.72
ISE.M.A1 * (average r = 0.74)
a | 4.92 | 1.50 | 6.15 | 1.07 | 52.0 | 0.006 | 0.72
b | 4.54 | 1.61 | 5.85 | 0.99 | 62.5 | 0.004 | 0.76
ISE.C.D3 | 4.38 | 1.61 | 5.77 | 1.74 | 42.5 | 0.010 | 0.67
ISE.C.D4 | 4.85 | 2.08 | 6.46 | 0.88 | 63.0 | 0.004 | 0.77
ISE.C.A1 | 5.62 | 1.19 | 6.77 | 0.93 | 62.0 | 0.005 | 0.75
ISE.S.D2 | 5.46 | 1.51 | 6.54 | 0.88 | 49.0 | 0.015 | 0.60
ISE.S.D3 | 4.85 | 1.63 | 6.38 | 0.87 | 66.0 | 0.002 | 0.86
° Not tested. * The average effect size is given for competencies assessed with more than one item.
Table A14. Results of Wilcoxon signed-rank tests for competencies in the area of Information Search and Evaluation (ISE) addressed as secondary learning goals in the respective module and hypothesised to grow during intervention. n = 13 .
Competency | Pre M | Pre SD | Post M | Post SD | V | p | r
ISE.C.N1 | 5.23 | 1.92 | 6.23 | 1.24 | 53.5 | 0.034 | 0.54
ISE.C.N2 | 6.15 | 1.21 | 6.69 | 1.32 | 34.5 | 0.081
ISE.C.N3 | 5.69 | 1.32 | 6.54 | 0.88 | 49.0 | 0.014 | 0.60
ISE.C.N4 | 5.23 | 1.17 | 6.08 | 0.86 | 41.0 | 0.014 | 0.57
ISE.C.D2 | 5.62 | 1.45 | 6.54 | 1.05 | 31.0 | 0.039 | 0.56
ISE.S.N1 | 6.46 | 1.61 | 7.15 | 0.90 | 30.0 | 0.049 | 0.44
ISE.S.N2 | 5.85 | 1.14 | 6.54 | 1.20 | 37.5 | 0.040 | 0.42
Table A15. Results of Wilcoxon signed-rank tests for competencies in the area of Information Search and Evaluation (ISE) NOT explicitly addressed in the respective module and thus NOT hypothesised to change during intervention (for comparison). n = 13 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
Competency | Pre M | Pre SD | Post M | Post SD | V | p | r
ISE.C.D1 | 5.23 | 1.79 | 5.77 | 1.09 | 25.5 | 0.323
ISE.C.D5 | 5.69 | 1.84 | 6.23 | 1.48 | 56.0 | 0.166
ISE.S.N3 | 4.54 | 1.71 | 5.38 | 1.33 | 49.0 | 0.160
ISE.S.D1 | 5.54 | 1.56 | 6.15 | 1.46 | 41.0 | 0.174
Table A16. Overview of the (average) effect sizes of the effects of the intervention on the competence expectations in the area of Information Search and Evaluation (ISE). n = 13 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
Level | TPACK Comp. r | TPK Comp. r | TCK Comp. r | TK Comp. r
Name | ISE.T.N1 0.68 | ISE.M.N1 0.91 | ISE.C.N1 0.54 | ISE.S.N1 0.44
| ISE.T.N2 0.86 | ISE.M.N2 0.85 | ISE.C.N2 - | ISE.S.N2 0.42
| ISE.T.N3 0.62 | | ISE.C.N3 0.60 | ISE.S.N3 -
| | | ISE.C.N4 0.57 |
Describe | ISE.T.D1 * 0.76 | ISE.M.D1 0.86 | ISE.C.D1 - | ISE.S.D1 -
| ISE.T.D2 0.81 | ISE.M.D2 0.72 | ISE.C.D2 0.56 | ISE.S.D2 0.60
| | | ISE.C.D3 0.67 | ISE.S.D3 0.86
| | | ISE.C.D4 0.77 |
| | | ISE.C.D5 - |
Use/App. | ISE.T.A1 * 0.76 | ISE.M.A1 * 0.74 | ISE.C.A1 0.75 |
| ISE.T.A2 * 0.74 | | |
Main learning objectives (bold magenta), secondary learning goals (italic cyan), and non-addressed competencies (yellow). * The average effect size is given for competencies assessed with more than one item.

Appendix E. Data Acquisition (DAQ)

Figure A5. Competence expectations defined in the DiKoLAN framework addressed in the respective teaching module Data Acquisition (DAQ). Main topics (magenta), side topics (blue).
Table A17. Results of Wilcoxon signed-rank tests for competencies in the area of Data Acquisition (DAQ) explicitly addressed as main learning objectives in the respective module and hypothesised to grow during intervention. n = 10 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
Competency | Pre M | Pre SD | Post M | Post SD | V | p | r
DAQ.S.N1 * (average r = 0.56)
a | 5.00 | 1.41 | 6.20 | 1.03 | 21.0 | 0.017 | 0.76
b | 5.30 | 1.77 | 6.50 | 0.85 | 15.0 | 0.029 | 0.70
c | 5.60 | 1.78 | 6.50 | 1.18 | 17.0 | 0.101 | (0.49)
d | 4.90 | 2.02 | 5.90 | 1.60 | 17.0 | 0.102 | (0.35)
e | 4.70 | 2.06 | 6.20 | 1.32 | 29.5 | 0.061 | (0.50)
DAQ.S.D1 * (average r = 0.59)
a | 4.20 | 1.03 | 5.50 | 0.97 | 42.5 | 0.009 | 0.77
b | 4.80 | 1.62 | 5.80 | 0.79 | 25.5 | 0.029 | 0.63
c | 4.80 | 1.75 | 5.60 | 1.35 | 38.0 | 0.152 | (0.34)
d | 4.30 | 2.06 | 5.80 | 1.14 | 43.0 | 0.060 | (0.51)
e | 4.10 | 1.85 | 5.90 | 1.10 | 34.0 | 0.014 | 0.72
f | 4.10 | 1.85 | 5.30 | 1.34 | 31.0 | 0.038 | 0.56
DAQ.S.A1 * (average r = 0.58)
a | 4.20 | 1.23 | 5.50 | 1.27 | 33.0 | 0.019 | 0.69
b | 4.70 | 1.77 | 5.50 | 1.35 | 24.0 | 0.215 | (0.26)
c | 4.30 | 1.95 | 5.50 | 1.08 | 38.5 | 0.031 | 0.64
d | 4.40 | 2.01 | 5.70 | 1.06 | 23.5 | 0.062 | (0.56)
e | 3.80 | 1.81 | 5.10 | 1.60 | 42.0 | 0.011 | 0.75
* The average effect size is given for competencies assessed with more than one item.
Table A18. Results of Wilcoxon signed-rank tests for competencies in the area of Data Acquisition (DAQ) addressed as secondary learning goals in the respective module and hypothesised to grow during intervention. n = 10 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
Competency | Pre M | Pre SD | Post M | Post SD | V | p | r
DAQ.T.D1 * (average r = 0.41)
a | 5.00 | 1.25 | 5.70 | 0.82 | 33.0 | 0.106 | (0.44)
b | 5.10 | 1.29 | 5.40 | 1.26 | 17.0 | 0.333 | (0.15)
c | 4.60 | 1.78 | 5.40 | 1.26 | 15.0 | 0.198 | (0.29)
d | 5.00 | 1.33 | 6.00 | 1.25 | 41.5 | 0.012 | 0.75
DAQ.C.N4 | 4.00 | 1.70 | 5.30 | 1.77 | 36.0 | 0.058
* The average effect size is given for competencies assessed with more than one item.
Table A19. Results of Wilcoxon signed-rank tests for competencies in the area of Data Acquisition (DAQ) NOT explicitly addressed in the respective module and thus NOT hypothesised to change during intervention (for comparison). n = 10 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
Competency | Pre M | Pre SD | Post M | Post SD | V | p | r
DAQ.T.N1 | 4.60 | 1.35 | 5.30 | 1.16 | 25.5 | 0.320
DAQ.T.N2 | 4.80 | 1.48 | 5.60 | 1.17 | 13.0 | 0.170
DAQ.T.A1 *
a | 4.70 | 1.64 | 5.60 | 1.43 | 10.0 | 0.098
b | 4.40 | 1.65 | 5.10 | 1.79 | 13.0 | 0.170
DAQ.M.N1 | 5.90 | 0.74 | 5.90 | 1.73 | 19.5 | 0.887
DAQ.M.D1 | 5.60 | 0.97 | 5.90 | 1.20 | 22.0 | 0.613
DAQ.C.N1 | 5.70 | 1.77 | 6.20 | 1.62 | 23.5 | 0.478
DAQ.C.N2 | 5.20 | 1.62 | 5.80 | 1.40 | 34.5 | 0.491
DAQ.C.N3 °
DAQ.C.D1 | 5.70 | 1.06 | 5.90 | 0.99 | 27.0 | 0.608
DAQ.C.A1 | 4.90 | 1.45 | 4.90 | 1.73 | 23.0 | 1.000
DAQ.S.D2 °
* Assessed with more than one item. ° Not tested.
Table A20. Overview of the (average) effect sizes of the effects of the intervention on the competence expectations in the area of Data Acquisition (DAQ). n = 10 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
Level | TPACK Comp. r | TPK Comp. r | TCK Comp. r | TK Comp. r
Name | DAQ.T.N1 - | DAQ.M.N1 - | DAQ.C.N1 - | DAQ.S.N1 * 0.56
| DAQ.T.N2 - | | DAQ.C.N2 - |
| | | DAQ.C.N3 ° |
| | | DAQ.C.N4 - |
Describe | DAQ.T.D1 * 0.41 | DAQ.M.D1 - | DAQ.C.D1 - | DAQ.S.D1 * 0.59
| | | | DAQ.S.D2 °
Use/App. | DAQ.T.A1 * - | | DAQ.C.A1 - | DAQ.S.A1 * 0.58
Main learning objectives (bold magenta), secondary learning goals (italic cyan), and non-addressed competencies (yellow). * The average effect size is given for competencies assessed with more than one item. ° Not tested.

Appendix F. Data Processing (DAP)

Figure A6. Competence expectations defined in the DiKoLAN framework addressed in the respective teaching module Data Processing (DAP). Main topics (magenta), side topics (blue).
Table A21. Results of Wilcoxon signed-rank tests for competencies in the area of Data Processing (DAP) explicitly addressed as main learning objectives in the respective module and hypothesised to grow during intervention. n = 13 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
Competency | Pre M | Pre SD | Post M | Post SD | V | p | r
DAP.C.N2 | 4.08 | 1.93 | 5.75 | 1.06 | 42.0 | 0.011 | 0.62
DAP.C.D1 | 3.77 | 1.79 | 5.42 | 1.16 | 51.0 | 0.009 | 0.67
DAP.S.N1 °
DAP.S.N2 * (average r = 0.77)
a | 3.85 | 1.57 | 5.75 | 1.29 | 45.0 | 0.004 | 0.83
b | 4.00 | 1.73 | 5.75 | 1.29 | 60.5 | 0.008 | 0.73
c | 4.77 | 1.54 | 6.08 | 1.16 | 43.0 | 0.008 | 0.71
d | 5.08 | 1.66 | 5.67 | 1.50 | 32.5 | 0.020 | 0.62
e | 4.69 | 1.44 | 6.08 | 0.79 | 45.0 | 0.004 | 0.83
f | 4.00 | 1.78 | 5.75 | 1.06 | 63.0 | 0.004 | 0.79
g | 3.38 | 1.76 | 4.92 | 1.78 | 55.0 | 0.003 | 0.86
DAP.S.N3 | 3.62 | 1.94 | 5.67 | 1.67 | 55.0 | 0.003 | 0.86
DAP.S.N4 | 4.15 | 1.63 | 6.08 | 0.90 | 55.0 | 0.003 | 0.86
DAP.S.N5 | 4.46 | 1.90 | 6.00 | 1.13 | 63.0 | 0.004 | 0.79
DAP.S.D1 | 3.92 | 1.71 | 5.42 | 1.16 | 43.0 | 0.008 | 0.71
DAP.S.D2 * (average r = 0.70)
a | 3.38 | 1.39 | 5.00 | 1.71 | 63.0 | 0.004 | 0.79
b | 3.77 | 2.09 | 5.83 | 1.11 | 45.0 | 0.004 | 0.83
c | 4.38 | 1.61 | 5.92 | 1.31 | 50.0 | 0.012 | 0.65
d | 4.15 | 1.77 | 5.50 | 1.51 | 73.5 | 0.003 | 0.81
e | 4.54 | 1.51 | 5.58 | 1.16 | 46.0 | 0.031 | 0.52
f | 3.62 | 1.76 | 5.33 | 0.89 | 62.0 | 0.005 | 0.74
g | 3.38 | 1.71 | 4.50 | 1.73 | 55.5 | 0.023 | 0.58
DAP.S.D4 | 4.00 | 1.68 | 5.67 | 1.44 | 49.5 | 0.013 | 0.69
DAP.S.D5 | 2.92 | 2.22 | 5.00 | 1.86 | 61.0 | 0.007 | 0.72
DAP.S.A2 | 4.15 | 1.91 | 5.92 | 1.08 | 43.0 | 0.009 | 0.71
DAP.S.A3 | 4.46 | 1.51 | 5.67 | 1.50 | 51.5 | 0.007 | 0.74
° Not tested. * The average effect size is given for competencies assessed with more than one item.
Table A22. Results of Wilcoxon signed-rank tests for competencies in the area of Data Processing (DAP) addressed as secondary learning goals in the respective module and hypothesised to grow during intervention. n = 13 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
Competency | Pre M | Pre SD | Post M | Post SD | V | p | r
DAP.T.N1 | 4.15 | 1.57 | 5.42 | 1.31 | 45.5 | 0.036 | 0.50
DAP.T.N2 | 4.54 | 1.71 | 5.67 | 1.23 | 37.5 | 0.041 | 0.52
DAP.T.D1 * (average r = 0.57)
a | 4.62 | 1.71 | 5.50 | 1.09 | 42.5 | 0.066 | (0.48)
b | 4.46 | 1.61 | 5.58 | 1.24 | 58.0 | 0.012 | 0.66
DAP.T.D2 | 4.00 | 1.41 | 5.33 | 1.56 | 39.5 | 0.024 | 0.63
DAP.M.D1 | 4.54 | 1.51 | 5.33 | 1.15 | 40.0 | 0.019 | 0.58
DAP.M.D2 | 4.77 | 1.48 | 6.08 | 0.79 | 41.0 | 0.015 | 0.60
DAP.C.N1 | 4.15 | 1.86 | 5.33 | 1.07 | 51.5 | 0.053
DAP.S.D3 | 3.85 | 1.95 | 5.08 | 1.38 | 58.0 | 0.012 | 0.66
DAP.S.A1 * (average r = 0.50)
a | 3.23 | 1.79 | 5.25 | 1.86 | 50.5 | 0.010 | 0.71
b | 4.00 | 2.16 | 4.83 | 1.64 | 39.5 | 0.117 | (0.32)
c | 4.77 | 1.54 | 5.83 | 1.40 | 39.0 | 0.027 | 0.55
d | 4.38 | 1.94 | 4.92 | 2.02 | 29.5 | 0.217 | (0.27)
e | 4.31 | 1.55 | 5.25 | 1.48 | 35.5 | 0.068 | (0.47)
f | 3.00 | 1.78 | 4.75 | 1.42 | 68.5 | 0.011 | 0.67
g | 3.15 | 1.91 | 4.25 | 1.86 | 52.5 | 0.043 | 0.50
* The average effect size is given for competencies assessed with more than one item.
Table A23. Results of Wilcoxon signed-rank tests for competencies in the area of Data Processing (DAP) NOT explicitly addressed in the respective module and thus NOT hypothesised to change during intervention (for comparison). n = 13 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
Competency | Pre M | Pre SD | Post M | Post SD | V | p | r
DAP.T.A1 *
a | 4.62 | 1.50 | 5.08 | 1.38 | 41.5 | 0.145
b | 4.38 | 1.66 | 4.83 | 1.53 | 37.0 | 0.080
DAP.M.N1 | 4.77 | 1.74 | 5.92 | 1.31 | 58.5 | 0.133
DAP.M.N2 | 4.92 | 1.71 | 5.83 | 1.03 | 43.5 | 0.109
DAP.M.N3 °
* Assessed with more than one item. ° Not tested.
Table A24. Overview of (average) effect sizes of the effects of the intervention on the competence expectations in the area of Data Processing (DAP). n = 13 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
Level | TPACK Comp. r | TPK Comp. r | TCK Comp. r | TK Comp. r
Name | DAP.T.N1 0.50 | DAP.M.N1 - | DAP.C.N1 - | DAP.S.N1 °
| DAP.T.N2 0.52 | DAP.M.N2 - | DAP.C.N2 0.62 | DAP.S.N2 * 0.77
| | DAP.M.N3 ° | | DAP.S.N3 0.86
| | | | DAP.S.N4 0.86
| | | | DAP.S.N5 0.79
Describe | DAP.T.D1 * 0.57 | DAP.M.D1 0.58 | DAP.C.D1 0.67 | DAP.S.D1 0.71
| DAP.T.D2 0.63 | DAP.M.D2 0.60 | | DAP.S.D2 * 0.70
| | | | DAP.S.D3 0.66
| | | | DAP.S.D4 0.69
| | | | DAP.S.D5 0.72
Use/App. | DAP.T.A1 * - | | | DAP.S.A1 * 0.50
| | | | DAP.S.A2 0.71
| | | | DAP.S.A3 0.74
Main learning objectives (bold magenta), secondary learning goals (italic cyan), and non-addressed competencies (yellow). * The average effect size is given for competencies assessed with more than one item. ° Not tested.

Appendix G. Simulation and Modelling (SIM)

Figure A7. Competence expectations defined in the DiKoLAN framework addressed in the respective teaching module Simulation and Modelling (SIM). Main topics (magenta), side topics (blue).
Table A25. Results of Wilcoxon signed-rank tests for competencies in the area of Simulation and Modelling (SIM) explicitly addressed as main learning objectives in the respective module and hypothesised to grow during intervention. n = 13 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
Competency | Pre M | Pre SD | Post M | Post SD | V | p | r
SIM.T.A1 * (average r = 0.73)
a | 4.82 | 1.54 | 5.83 | 1.19 | 42.5 | 0.009 | 0.74
b | 4.45 | 1.75 | 5.50 | 1.09 | 42.0 | 0.011 | 0.72
SIM.M.N1 * (average r = 0.60)
a | 4.82 | 1.33 | 5.83 | 0.94 | 31.0 | 0.038 | 0.52
b | 4.36 | 1.29 | 5.92 | 1.16 | 36.0 | 0.006 | 0.82
c | 4.82 | 1.54 | 5.75 | 1.14 | 36.0 | 0.058 | (0.45)
SIM.C.N1 | 4.64 | 1.57 | 6.33 | 1.07 | 62.0 | 0.005 | 0.80
SIM.C.N2 | 4.36 | 1.50 | 6.08 | 1.00 | 52.5 | 0.006 | 0.79
SIM.S.N1 | 4.73 | 1.49 | 6.25 | 0.97 | 42.5 | 0.010 | 0.73
SIM.S.N3 * (average r = 0.72)
a1 | 4.27 | 1.49 | 5.92 | 0.79 | 35.0 | 0.010 | 0.71
a2 | 4.27 | 1.74 | 5.83 | 0.72 | 49.0 | 0.015 | 0.67
b | 4.36 | 1.43 | 6.00 | 0.85 | 52.0 | 0.006 | 0.77
SIM.S.N4 | 4.45 | 2.02 | 6.17 | 0.94 | 36.0 | 0.007 | 0.82
SIM.S.A1 | 4.73 | 1.62 | 6.83 | 1.27 | 55.0 | 0.003 | 0.88
* The average effect size is given for competencies assessed with more than one item.
Table A26. Results of Wilcoxon signed-rank tests for competencies in the area of Simulation and Modelling (SIM) addressed as secondary learning goals in the respective module and hypothesised to grow during intervention. n = 13 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
Competency | Pre M | Pre SD | Post M | Post SD | V | p | r
SIM.T.D1 * (average r = 0.60)
a | 5.00 | 0.89 | 5.83 | 1.03 | 46.0 | 0.026 | 0.60
b | 4.82 | 1.17 | 5.75 | 1.06 | 39.0 | 0.026 | 0.59
SIM.M.N2 | 5.27 | 1.10 | 6.17 | 1.34 | 25.0 | 0.032 | 0.59
SIM.M.D2 | 5.45 | 1.04 | 6.17 | 0.94 | 37.0 | 0.040 | 0.54
SIM.C.N5 | 4.91 | 1.38 | 6.00 | 0.95 | 32.5 | 0.020 | 0.65
SIM.C.N6 °
SIM.C.D1 | 4.82 | 1.33 | 6.08 | 1.24 | 48.0 | 0.018 | 0.64
SIM.S.N2 * (average r = 0.69)
a | 3.82 | 1.94 | 5.00 | 1.41 | 33.5 | 0.016 | 0.67
b | 4.00 | 1.90 | 5.33 | 1.23 | 41.5 | 0.012 | 0.71
SIM.S.D1 | 3.82 | 1.54 | 4.92 | 1.44 | 37.0 | 0.047 | 0.53
° Not tested. * The average effect size is given for competencies assessed with more than one item.
Table A27. Results of Wilcoxon signed-rank tests for competencies in the area of Simulation and Modelling (SIM) NOT explicitly addressed in the respective module and thus NOT hypothesised to change during intervention (for comparison). n = 13 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
Competency | Pre M | Pre SD | Post M | Post SD | V | p | r
SIM.T.N1 | 5.36 | 1.29 | 5.92 | 0.90 | 22.0 | 0.188
SIM.M.D1 | 5.18 | 1.17 | 5.67 | 1.07 | 39.0 | 0.244
SIM.C.N3 | 4.45 | 1.37 | 5.25 | 1.36 | 22.5 | 0.172
SIM.C.N4 | 5.73 | 1.27 | 6.42 | 1.16 | 16.5 | 0.242
Table A28. Overview of the (average) effect sizes of the effects of the intervention on the competence expectations in the area of Simulation and Modelling (SIM). n = 13 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
Level | TPACK Comp. r | TPK Comp. r | TCK Comp. r | TK Comp. r
Name | SIM.T.N1 - | SIM.M.N1 * 0.60 | SIM.C.N1 0.80 | SIM.S.N1 0.73
| | SIM.M.N2 0.59 | SIM.C.N2 0.79 | SIM.S.N2 * 0.69
| | | SIM.C.N3 - | SIM.S.N3 * 0.72
| | | SIM.C.N4 - | SIM.S.N4 0.82
| | | SIM.C.N5 0.65 |
| | | SIM.C.N6 ° |
Describe | SIM.T.D1 * 0.60 | SIM.M.D1 - | SIM.C.D1 0.64 | SIM.S.D1 0.53
| | SIM.M.D2 0.54 | |
Use/App. | SIM.T.A1 * 0.73 | | | SIM.S.A1 0.88
Main learning objectives (bold magenta), secondary learning goals (italic cyan), and non-addressed competencies (yellow). * The average effect size is given for competencies assessed with more than one item. ° Not tested.

References

  1. United Nations Educational, Scientific and Cultural Organization. UNESCO ICT Competency Framework for Teachers; UNESCO: Paris, France, 2011; Available online: http://unesdoc.unesco.org/images/0021/002134/213475e.pdf (accessed on 20 September 2021).
  2. Crompton, H. ISTE Standards for Educators. A Guide for Teachers and Other Professionals; International Society for Technology in Education: Washington, DC, USA, 2017. [Google Scholar]
  3. Redecker, C. European Framework for the Digital Competence of Educators: DigCompEdu; Publications Office of the European Union: Luxembourg, 2017. [Google Scholar]
  4. Becker, S.; Bruckermann, T.; Finger, A.; Huwer, J.; Kremser, E.; Meier, M.; Thoms, L.-J.; Thyssen, C.; von Kotzebue, L. Orientierungsrahmen Digitale Kompetenzen für das Lehramt in den Naturwissenschaften—DiKoLAN. In Digitale Basiskompetenzen—Orientierungshilfe und Praxisbeispiele für die universitäre Lehramtsausbildung in den Naturwissenschaften; Becker, S., Meßinger-Koppelt, J., Thyssen, C., Eds.; Joachim Herz Stiftung: Hamburg, Germany, 2020; pp. 14–43. [Google Scholar]
  5. Kotzebue, L.V.; Meier, M.; Finger, A.; Kremser, E.; Huwer, J.; Thoms, L.-J.; Becker, S.; Bruckermann, T.; Thyssen, C. The Framework DiKoLAN (Digital Competencies for Teaching in Science Education) as Basis for the Self-Assessment Tool DiKoLAN-Grid. Educ. Sci. 2021, 11, 775. [Google Scholar] [CrossRef]
  6. Tondeur, J.; Pareja Roblin, N.; van Braak, J.; Voogt, J.; Prestridge, S. Preparing beginning teachers for technology integration in education: Ready for take-off? Technol. Pedagog. Educ. 2017, 26, 157–177. [Google Scholar] [CrossRef]
  7. Kirschner, P.A.; de Bruyckere, P. The myths of the digital native and the multitasker. Teach. Teach. Educ. 2017, 67, 135–142. [Google Scholar] [CrossRef]
  8. Angeli, C.; Valanides, N. Epistemological and methodological issues for the conceptualization, development, and assessment of ICT–TPCK: Advances in technological pedagogical content knowledge (TPCK). Comput. Educ. 2009, 52, 154–168. [Google Scholar] [CrossRef]
  9. Thoms, L.-J.; Colberg, C.; Heiniger, P.; Huwer, J. Digital Competencies for Science Teaching: Adapting the DiKoLAN Framework to Teacher Education in Switzerland. Front. Educ. 2022, 7, 802170. [Google Scholar] [CrossRef]
  10. Mishra, P.; Koehler, M.J. Technological Pedagogical Content Knowledge: A new framework for teacher knowledge. Teachers College Record 2006, 108, 1017–1054. [Google Scholar] [CrossRef]
  11. Koehler, M.J.; Mishra, P.; Cain, W. What is Technological Pedagogical Content Knowledge (TPACK)? J. Educ. 2013, 193, 13–19. [Google Scholar] [CrossRef] [Green Version]
  12. Huwer, J.; Irion, T.; Kuntze, S.; Schaal, S.; Thyssen, C. Von TPaCK zu DPaCK-Digitalisierung des Unterrichts erfordert mehr als technisches Wissen. MNU J. 2019, 5, 358. [Google Scholar]
  13. Huwer, J.; Irion, T.; Kuntze, S.; Schaal, S.; Thyssen, C. From TPaCK to DPaCK – Digitalization in Education Requires more than Technical Knowledge. In Education Research Highlights in Mathematics, Science and Technology; Shelly, M., Kiray, A., Eds.; IRES Publishing: Des Moines, IA, USA, 2019; pp. 298–309. [Google Scholar]
  14. Thoms, L.-J.; Meier, M.; Huwer, J.; Thyssen, C.; von Kotzebue, L.; Becker, S.; Kremser, E.; Finger, A.; Bruckermann, T. DiKoLAN – A Framework to Identify and Classify Digital Competencies for Teaching in Science Education and to Restructure Pre-Service Teacher Training. In Proceedings of the Society for Information Technology & Teacher Education International Conference, Waynesville, NC, USA, 29 March–2 April 2021; Langran, E., Archambault, L., Eds.; Association for the Advancement of Computing in Education (AACE): Waynesville, NC, USA, 2021; pp. 1652–1657, ISBN 978-1-939797-55-1. [Google Scholar]
  15. Thyssen, C.; Thoms, L.-J.; Kremser, E.; Finger, A.; Huwer, J.; Becker, S. Digitale Basiskompetenzen in der Lehrerbildung unter besonderer Berücksichtigung der Naturwissenschaften. In Digitale Innovationen und Kompetenzen in der Lehramtsausbildung; Beißwenger, M., Bulizek, B., Gryl, I., Schacht, F.F., Eds.; Universitätsverlag Rhein-Ruhr KG: Duisburg, Germany, 2020; pp. 77–98. [Google Scholar]
  16. Hoyer, C.; Thoms, L.-J.; Girwidz, R. Lehren mit Multimedia, Fernlaboren und 3D-Druck im Physikunterricht. In Naturwissenschaftliche Kompetenzen in der Gesellschaft von morgen; Habig, S., Ed.; Universität Duisburg-Essen: Essen, Germany, 2020; pp. 979–982. [Google Scholar]
  17. Banerji, A.; Thyssen, C.; Pampel, B.; Huwer, J. Naturwissenschaftsunterricht und Informatik – bringt zusammen, was zusammen gehört?! ChemKon 2021, 28. [Google Scholar] [CrossRef]
  18. Meier, M.; Thyssen, C.; Becker, S.; Bruckermann, T.; Finger, A.; Kremser, E.; Thoms, L.-J.; von Kotzebue, L.; Huwer, J. Digitale Kompetenzen für das Lehramt in den Naturwissenschaften – Beschreibung und Messung von Kompetenzzielen der Studienphase im Bereich Präsentation. In Bildung in der digitalen Transformation; Wollersheim, H.-W., Pengel, N., Eds.; Waxmann: Münster, Germany, 2021; pp. 185–190. [Google Scholar]
  19. Meier, M.; Thoms, L.-J.; Becker, S.; Finger, A.; Kremser, E.; Huwer, J.; von Kotzebue, L.; Bruckermann, T.; Thyssen, C. Digitale Transformation von Unterrichtseinheiten – DiKoLAN als Orientierungs- und Strukturierungshilfe am Beispiel Low-Cost-Photometrie mit dem Smartphone. In Digitalisation in Chemistry Education. Digitales Lehren und Lernen an Hochschule und Schule im Fach Chemie; Graulich, N., Huwer, J., Banerji, A., Eds.; Waxmann Verlag GmbH: Münster, Germany, 2021; pp. 13–27. ISBN 9783830944188. [Google Scholar]
  20. Frank, T.; Thoms, L.-J. Digitale Kompetenzen beim Experimentieren fördern: Ortsfaktorbestimmung mit verschiedenen Sensoren im Physikunterricht. PhyDid B 2021, 13–20. [Google Scholar]
  21. Thoms, L.-J.; Finger, A.; Thyssen, C.; Frank, T. Digitale Kompetenzen beim Experimentieren fördern: Schülerexperimente zur Messung der Periodendauer eines Fadenpendels und zur Bestimmung des Ortsfaktors. Naturwissenschaften im Unterricht Physik 2020, 31, 23–27. [Google Scholar]
  22. Zimmermann, F.; Melle, I.; Huwer, J. Developing Prospective Chemistry Teachers’ TPACK–A Comparison between Students of Two Different Universities and Expertise Levels Regarding Their TPACK Self-Efficacy, Attitude, and Lesson Planning Competence. J. Chem. Educ. 2021. [Google Scholar] [CrossRef]
  23. BMBF. Verwaltungsvereinbarung DigitalPakt Schule 2019 bis 2024. Available online: https://www.bmbf.de/files/19-03-15_VV_DigitalPaktSchule_Wasserzeichen.pdf (accessed on 5 May 2022).
  24. Puentedura, R.R. As We May Teach: Educational Technology, From Theory Into Practice; 2009. Available online: http://www.hippasus.com/rrpweblog/archives/000025.html (accessed on 5 May 2022).
  25. Chi, M.T.H.; Wylie, R. The ICAP Framework: Linking Cognitive Engagement to Active Learning Outcomes. Educ. Psychol. 2014, 49, 219–243. [Google Scholar] [CrossRef]
  26. Archambault, L.; Crippen, K. Examining TPACK among K-12 online distance educators in the United States. Contemp. Issues Technol. Teach. Educ. 2019, 9, 71–88. [Google Scholar]
  27. Stuckey, M.; Hofstein, A.; Mamlok-Naaman, R.; Eilks, I. The meaning of ‘relevance’ in science education and its implications for the science curriculum. Stud. Sci. Educ. 2013, 49, 1–34. [Google Scholar] [CrossRef]
28. Huwer, J.; Banerji, A.; Thyssen, C. Digitalisierung – Perspektiven für den Chemieunterricht. Nachrichten Chemie 2020, 68, 10–16. [Google Scholar] [CrossRef]
  29. Huwer, J.; Seibert, J. EXPlain Chemistry–innovative Methode zur Erklärung und Visualisierung. Naturwissenschaften im Unterricht Chemie 2017, 160, 44–48. [Google Scholar]
  30. Vogelsang, C.; Finger, A.; Laumann, D.; Thyssen, C. Experience, Attitudes and Motivational Orientations as Potential Factors Influencing the Use of Digital Tools in Science Teaching. ZfDN 2019, 25, 115–129. [Google Scholar] [CrossRef]
  31. Girwidz, R.; Thoms, L.-J.; Pol, H.; López, V.; Michelini, M.; Stefanel, A.; Greczyło, T.; Müller, A.; Gregorcic, B.; Hömöstrei, M. Physics teaching and learning with multimedia applications: A review of teacher-oriented literature in 34 local language journals from 2006 to 2015. Int. J. Sci. Educ. 2019, 25, 1–26. [Google Scholar] [CrossRef]
  32. Girwidz, R.; Hoyer, C. Didaktische Aspekte zum Einsatz von digitalen Medien—Leitlinien zum Lernen mit Multimedia, veranschaulicht an Beispielen. In Naturwissenschaften Digital: Toolbox für den Unterricht; Maxton-Küchenmeister, J., Meßinger-Koppelt, J., Eds.; Joachim Herz Stiftung Verlag: Hamburg, Germany, 2018; pp. 6–23. [Google Scholar]
  33. Scheiter, K.; Richter, J. Multimediale Unterrichtsmaterialien gestalten. Ergebnisse der empirischen Lehr-Lernforschung. Nat. Unterr. Chem. 2015, 26, 8–11. [Google Scholar]
  34. Sweller, J. Implications of Cognitive Load Theory for Multimedia Learning. In The Cambridge Handbook of Multimedia Learning; Mayer, R.E., Ed.; Cambridge University Press: Cambridge, UK, 2005; pp. 19–30. [Google Scholar]
  35. Krause, U.M.; Stark, R.; Mandl, H. The effects of cooperative learning and feedback on e-learning in statistics. Learn. Instr. 2009, 19, 158–170. [Google Scholar] [CrossRef]
  36. Brand-Gruwel, S.; Wopereis, I.; Walraven, A. A descriptive model of information problem solving while using internet. Comput. Educ. 2009, 53, 1207–1217. [Google Scholar] [CrossRef]
  37. Thoms, L.-J.; Colicchia, G.; Girwidz, R. Using the Naked Eye to Analyze Polarized Light From a Smartphone. Phys. Teach. 2021, 59, 337–339. [Google Scholar] [CrossRef]
  38. Thoms, L.-J.; Colicchia, G.; Watzka, B.; Girwidz, R. Electrocardiography with a Smartphone. Phys. Teach. 2019, 57, 586–589. [Google Scholar] [CrossRef]
  39. Thoms, L.-J.; Colicchia, G.; Girwidz, R. Audiometric Test with a Smartphone. Phys. Teach. 2018, 56, 478–481. [Google Scholar] [CrossRef]
  40. Thoms, L.-J.; Colicchia, G.; Girwidz, R. Phonocardiography with a smartphone. Phys. Educ. 2017, 52, 23004. [Google Scholar] [CrossRef]
  41. Thoms, L.-J.; Girwidz, R. Virtual and remote experiments for radiometric and photometric measurements. Eur. J. Phys. 2017, 38, 55301–55324. [Google Scholar] [CrossRef]
  42. Rutten, N.; van Joolingen, W.R.; van der Veen, J.T. The learning effects of computer simulations in science education. Comput. Educ. 2012, 58, 136–153. [Google Scholar] [CrossRef]
  43. Farrokhnia, M.; Meulenbroeks, R.F.G.; van Joolingen, W.R. Student-Generated Stop-Motion Animation in Science Classes: A Systematic Literature Review. J. Sci. Educ. Technol. 2020, 29, 797–812. [Google Scholar] [CrossRef]
  44. van Borkulo, S.P.; van Joolingen, W.R.; Savelsbergh, E.R.; de Jong, T. What Can Be Learned from Computer Modeling? Comparing Expository and Modeling Approaches to Teaching Dynamic Systems Behavior. J. Sci. Educ. Technol. 2020, 21, 267–275. [Google Scholar] [CrossRef] [Green Version]
  45. van Joolingen, W. A Germ for Young European Scientists: Drawing-Based Modelling. In Simulation and Serious Games for Education. Gaming Media and Social Effects; Cai, Y., Goei, S., Trooster, W., Eds.; Springer: Singapore, 2017; pp. 13–28. [Google Scholar] [CrossRef]
  46. Probst, C.; Fetzer, D.; Lukas, S.; Huwer, J. Effects of using augmented reality (AR) in visualizing a dynamic particle model. CHEMKON 2021. [Google Scholar] [CrossRef]
  47. Becker, S.; Meßinger-Koppelt, J.; Thyssen, C. (Eds.) Digitale Basiskompetenzen—Orientierungshilfe und Praxisbeispiele für die Universitäre Lehramtsausbildung in den Naturwissenschaften; Joachim Herz Stiftung Verlag: Hamburg, Germany, 2020. [Google Scholar]
  48. Vogelsang, C.; Szabone Varnai, A. Modellierung und Analyse komplexer Alltagsphänomene. Herausford. Lehr. Innenbildung—Z. Konzept. Gestalt. Diskuss. 2018, 1, 120–146. [Google Scholar]
  49. Seibert, J.; Kay, C.; Huwer, J. EXPlainistry: Creating Documentation, Explanations, and Animated Visualizations of Chemistry Experiments Supported by Information and Communication Technology To Help School Students Understand Molecular-Level Interactions. J. Chem. Educ. 2019, 96, 2503–2509. [Google Scholar] [CrossRef]
  50. Huwer, J.; Lauer, L.; Dörrenbächer-Ulrich, L.; Thyssen, C.; Perels, F. Chemie neu erleben mit Augmented Reality. MNU J. 2019, 5, 420–427. [Google Scholar]
51. Tschiersch, A.; Krug, M.; Huwer, J.; Banerji, A. ARbeiten mit erweiterter Realität im Chemieunterricht – ein Überblick über Augmented Reality in naturwissenschaftlichen Lehr-Lernszenarien. CHEMKON 2021, 28, 6. [Google Scholar] [CrossRef]
  52. Krug, M.; Czok, V.; Huwer, J.; Weitzel, H.; Müller, W. Challenges for the design of augmented reality applications for science teacher education. INTED2021 Proc. 2021, 6, 2484–2491. [Google Scholar] [CrossRef]
  53. PHYWE MeasureAPP; PHYWE Systeme GmbH & Co. KG.: Göttingen, Germany, 2021.
  54. Excel; Microsoft Corporation: Redmond, WA, USA, 2021.
  55. Phyphox; RWTH Aachen University: Aachen, Germany, 2021.
56. Chai, C.S.; Koh, J.H.L.; Tsai, C.-C. A review of the quantitative measures of technological pedagogical content knowledge (TPACK). In Handbook of Technological Pedagogical Content Knowledge (TPCK) for Educators, 2nd ed.; Routledge: London, UK, 2016; pp. 87–106. [Google Scholar]
  57. Schmidt, D.A.; Baran, E.; Thompson, A.D.; Mishra, P.; Koehler, M.J.; Shin, T.S. Technological Pedagogical Content Knowledge (TPACK): The Development and Validation of an Assessment Instrument for Preservice Teachers. J. Res. Technol. Educ. 2009, 42, 123–149. [Google Scholar] [CrossRef]
  58. Vucaj, I. Development and initial validation of Digital Age Teaching Scale (DATS) to assess application of ISTE Standards for Educators in K–12 education classrooms. J. Res. Technol. Educ. 2020, 1–23. [Google Scholar] [CrossRef]
  59. Gomez, F.C.; Trespalacios, J.; Hsu, Y.-C.; Yang, D. Exploring Teachers’ Technology Integration Self-Efficacy through the 2017 ISTE Standards. TechTrends 2021, 66, 159–171. [Google Scholar] [CrossRef]
  60. Ghomi, M.; Redecker, C. Digital Competence of Educators (DigCompEdu): Development and Evaluation of a Self-assessment Instrument for Teachers’ Digital Competence. In Proceedings of the 11th International Conference on Computer Supported Education, Heraklion, Greece, 2–4 May 2019; Scitepress—Science and Technology Publications: Setúbal, Portugal, 2019; pp. 541–548. [Google Scholar]
  61. National Competence Center eEducation Austria. digi.check: PädagogInnenbildung. Available online: https://digicheck.at/paedagoginnenbildung (accessed on 30 April 2022).
62. European Schoolnet. Final Executive Report: MENTEP Global Self-Evaluation and TET-SAT as a Certification Tool; European Schoolnet: Brussels, Belgium, 2018.
  63. LimeSurvey: An Open Source Survey Tool; Limesurvey GmbH: Hamburg, Germany, 2021.
  64. R: A Language and Environment for Statistical Computing; R Foundation for Statistical Computing: Vienna, Austria, 2021.
  65. Cohen, J. Statistical Power Analysis for the Behavioral Sciences, 2nd ed.; Elsevier Science & Technology: New York, NY, USA, 1988. [Google Scholar]
  66. Kruger, J.; Dunning, D. Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments. J. Pers. Soc. Psychol. 1999, 77, 1121–1134. [Google Scholar] [CrossRef] [PubMed]
  67. Mahmood, K. Do People Overestimate Their Information Literacy Skills? A Systematic Review of Empirical Evidence on the Dunning-Kruger Effect. Comminfolit 2016, 10, 199. [Google Scholar] [CrossRef]
Figure 1. The DiKoLAN framework (https://dikolan.de/en) (accessed 10 May 2022) [5].
Figure 2. Phase structure of the seminar.
Figure 3. Competence expectations defined in the DiKoLAN framework that are addressed in the teaching module Data Processing (DAP). Main topics (magenta), side topics (blue).
Figure 4. The nomenclature of competence expectations used in DiKoLAN [4,5]. Adapted with permission from Ref. [4]. © 2020 Joachim Herz Stiftung.
Figure 5. Comparison of the observed (averaged) effect sizes for the main learning objectives and the secondary learning goals. Boxplots visualise the distribution of the (averaged) effect sizes within each category of learning goals. Green lines show the median effect size within a competency area. An adjusted threshold of 0.60 for large effects is chosen (yellow line).
Table 1. Results of Wilcoxon signed-rank tests for competencies in the area of Data Processing (DAP) explicitly addressed as main learning objectives in the respective module and hypothesised to grow during intervention. n = 13 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
| Competency  | Pre M | Pre SD | Post M | Post SD | V    | p     | r    |
|-------------|-------|--------|--------|---------|------|-------|------|
| DAP.C.N2    | 4.08  | 1.93   | 5.75   | 1.06    | 42.0 | 0.011 | 0.62 |
| DAP.C.D1    | 3.77  | 1.79   | 5.42   | 1.16    | 51.0 | 0.009 | 0.67 |
| DAP.S.N1 °  |       |        |        |         |      |       |      |
| DAP.S.N2 *  |       |        |        |         |      |       | 0.77 |
|   a         | 3.85  | 1.57   | 5.75   | 1.29    | 45.0 | 0.004 | 0.83 |
|   b         | 4.00  | 1.73   | 5.75   | 1.29    | 60.5 | 0.008 | 0.73 |
|   c         | 4.77  | 1.54   | 6.08   | 1.16    | 43.0 | 0.008 | 0.71 |
|   d         | 5.08  | 1.66   | 5.67   | 1.50    | 32.5 | 0.020 | 0.62 |
|   e         | 4.69  | 1.44   | 6.08   | 0.79    | 45.0 | 0.004 | 0.83 |
|   f         | 4.00  | 1.78   | 5.75   | 1.06    | 63.0 | 0.004 | 0.79 |
|   g         | 3.38  | 1.76   | 4.92   | 1.78    | 55.0 | 0.003 | 0.86 |
| DAP.S.N3    | 3.62  | 1.94   | 5.67   | 1.67    | 55.0 | 0.003 | 0.86 |
| DAP.S.N4    | 4.15  | 1.63   | 6.08   | 0.90    | 55.0 | 0.003 | 0.86 |
| DAP.S.N5    | 4.46  | 1.90   | 6.00   | 1.13    | 63.0 | 0.004 | 0.79 |
| DAP.S.D1    | 3.92  | 1.71   | 5.42   | 1.16    | 43.0 | 0.008 | 0.71 |
| DAP.S.D2 *  |       |        |        |         |      |       | 0.70 |
|   a         | 3.38  | 1.39   | 5.00   | 1.71    | 63.0 | 0.004 | 0.79 |
|   b         | 3.77  | 2.09   | 5.83   | 1.11    | 45.0 | 0.004 | 0.83 |
|   c         | 4.38  | 1.61   | 5.92   | 1.31    | 50.0 | 0.012 | 0.65 |
|   d         | 4.15  | 1.77   | 5.50   | 1.51    | 73.5 | 0.003 | 0.81 |
|   e         | 4.54  | 1.51   | 5.58   | 1.16    | 46.0 | 0.031 | 0.52 |
|   f         | 3.62  | 1.76   | 5.33   | 0.89    | 62.0 | 0.005 | 0.74 |
|   g         | 3.38  | 1.71   | 4.50   | 1.73    | 55.5 | 0.023 | 0.58 |
| DAP.S.D4    | 4.00  | 1.68   | 5.67   | 1.44    | 49.5 | 0.013 | 0.69 |
| DAP.S.D5    | 2.92  | 2.22   | 5.00   | 1.86    | 61.0 | 0.007 | 0.72 |
| DAP.S.A2    | 4.15  | 1.91   | 5.92   | 1.08    | 43.0 | 0.009 | 0.71 |
| DAP.S.A3    | 4.46  | 1.51   | 5.67   | 1.50    | 51.5 | 0.007 | 0.74 |
Note: ° Not tested. * The average effect size is given for competencies assessed with more than one item.
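The statistics reported in Tables 1–3 (V, p, and the matched-pairs effect size r) come from Wilcoxon signed-rank tests; the original analysis was carried out in R [64]. Purely as an illustration of how V and r = |Z|/√n relate to paired pre/post ratings, the following self-contained Python sketch applies the normal approximation (the function name and the toy ratings are our own, not the study data or the authors' code):

```python
import math

def wilcoxon_signed_rank(pre, post):
    """Paired Wilcoxon signed-rank test via the normal approximation.

    Returns (V, z, r): V is the sum of ranks of positive differences
    (post - pre), z the approximate standard score of V under H0,
    and r = |z| / sqrt(n) the matched-pairs effect size.
    Assumes at least one non-zero difference; no tie correction of sigma.
    """
    diffs = [b - a for a, b in zip(pre, post) if b != a]  # drop zero differences
    n = len(diffs)
    # Rank the absolute differences, averaging ranks within ties
    abs_sorted = sorted(abs(d) for d in diffs)
    ranks = {}
    i = 0
    while i < n:
        j = i
        while j < n and abs_sorted[j] == abs_sorted[i]:
            j += 1
        ranks[abs_sorted[i]] = (i + 1 + j) / 2  # average of ranks i+1 .. j
        i = j
    V = sum(ranks[abs(d)] for d in diffs if d > 0)
    mu = n * (n + 1) / 4                                # E[V] under H0
    sigma = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)   # SD[V] under H0
    z = (V - mu) / sigma
    return V, z, abs(z) / math.sqrt(n)

# Toy example with n = 13 pairs (hypothetical ratings, not the study data).
# Every post rating exceeds its pre rating, so V is maximal: n(n+1)/2 = 91.
pre  = [3, 4, 4, 5, 3, 4, 5, 4, 3, 4, 5, 4, 3]
post = [5, 5, 6, 6, 5, 5, 6, 6, 4, 5, 6, 5, 5]
V, z, r = wilcoxon_signed_rank(pre, post)
```

Note that for small samples such as n = 13, exact p-values from the permutation distribution of V (as R's wilcox.test computes by default) are preferable to this normal approximation.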
Table 2. Results of Wilcoxon signed-rank tests for competencies in the area of Data Processing (DAP) addressed as secondary learning goals in the respective module and hypothesised to grow during intervention. n = 13 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
| Competency  | Pre M | Pre SD | Post M | Post SD | V    | p     | r      |
|-------------|-------|--------|--------|---------|------|-------|--------|
| DAP.T.N1    | 4.15  | 1.57   | 5.42   | 1.31    | 45.5 | 0.036 | 0.50   |
| DAP.T.N2    | 4.54  | 1.71   | 5.67   | 1.23    | 37.5 | 0.041 | 0.52   |
| DAP.T.D1 *  |       |        |        |         |      |       | 0.57   |
|   a         | 4.62  | 1.71   | 5.50   | 1.09    | 42.5 | 0.066 | (0.48) |
|   b         | 4.46  | 1.61   | 5.58   | 1.24    | 58.0 | 0.012 | 0.66   |
| DAP.T.D2    | 4.00  | 1.41   | 5.33   | 1.56    | 39.5 | 0.024 | 0.63   |
| DAP.M.D1    | 4.54  | 1.51   | 5.33   | 1.15    | 40.0 | 0.019 | 0.58   |
| DAP.M.D2    | 4.77  | 1.48   | 6.08   | 0.79    | 41.0 | 0.015 | 0.60   |
| DAP.C.N1    | 4.15  | 1.86   | 5.33   | 1.07    | 51.5 | 0.053 |        |
| DAP.S.D3    | 3.85  | 1.95   | 5.08   | 1.38    | 58.0 | 0.012 | 0.66   |
| DAP.S.A1 *  |       |        |        |         |      |       | 0.50   |
|   a         | 3.23  | 1.79   | 5.25   | 1.86    | 50.5 | 0.010 | 0.71   |
|   b         | 4.00  | 2.16   | 4.83   | 1.64    | 39.5 | 0.117 | (0.32) |
|   c         | 4.77  | 1.54   | 5.83   | 1.40    | 39.0 | 0.027 | 0.55   |
|   d         | 4.38  | 1.94   | 4.92   | 2.02    | 29.5 | 0.217 | (0.27) |
|   e         | 4.31  | 1.55   | 5.25   | 1.48    | 35.5 | 0.068 | (0.47) |
|   f         | 3.00  | 1.78   | 4.75   | 1.42    | 68.5 | 0.011 | 0.67   |
|   g         | 3.15  | 1.91   | 4.25   | 1.86    | 52.5 | 0.043 | 0.50   |
Note: * The average effect size is given for competencies assessed with more than one item.
Table 3. Results of Wilcoxon signed-rank tests for competencies in the area of Data Processing (DAP) NOT explicitly addressed in the respective module and thus NOT hypothesised to change during intervention (for comparison). n = 13 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
| Competency  | Pre M | Pre SD | Post M | Post SD | V    | p     | r |
|-------------|-------|--------|--------|---------|------|-------|---|
| DAP.T.A1 *  |       |        |        |         |      |       |   |
|   a         | 4.62  | 1.50   | 5.08   | 1.38    | 41.5 | 0.145 |   |
|   b         | 4.38  | 1.66   | 4.83   | 1.53    | 37.0 | 0.080 |   |
| DAP.M.N1    | 4.77  | 1.74   | 5.92   | 1.31    | 58.5 | 0.133 |   |
| DAP.M.N2    | 4.92  | 1.71   | 5.83   | 1.03    | 43.5 | 0.109 |   |
| DAP.M.N3 °  |       |        |        |         |      |       |   |
Note: * Assessed with more than one item. ° Not tested.
Table 4. Overview of (average) effect sizes of the effects of the intervention on the competence expectations in the area of Data Processing (DAP). n = 13 . S: Special Tools, C: Content-specific Context, M: Methods/Digitality, T: Teaching, N: Name, D: Describe, A: Use/Apply.
| Level     | TPACK Comp. | r    | TPK Comp.  | r    | TCK Comp. | r    | TK Comp.   | r    |
|-----------|-------------|------|------------|------|-----------|------|------------|------|
| Name      | DAP.T.N1    | 0.50 | DAP.M.N1   | -    | DAP.C.N1  | -    | DAP.S.N1 ° |      |
|           | DAP.T.N2    | 0.52 | DAP.M.N2   | -    | DAP.C.N2  | 0.62 | DAP.S.N2 * | 0.77 |
|           |             |      | DAP.M.N3 ° |      |           |      | DAP.S.N3   | 0.86 |
|           |             |      |            |      |           |      | DAP.S.N4   | 0.86 |
|           |             |      |            |      |           |      | DAP.S.N5   | 0.79 |
| Describe  | DAP.T.D1 *  | 0.57 | DAP.M.D1   | 0.58 | DAP.C.D1  | 0.67 | DAP.S.D1   | 0.71 |
|           | DAP.T.D2    | 0.63 | DAP.M.D2   | 0.60 |           |      | DAP.S.D2 * | 0.70 |
|           |             |      |            |      |           |      | DAP.S.D3   | 0.66 |
|           |             |      |            |      |           |      | DAP.S.D4   | 0.69 |
|           |             |      |            |      |           |      | DAP.S.D5   | 0.72 |
| Use/Apply | DAP.T.A1 *  | -    |            |      |           |      | DAP.S.A1 * | 0.50 |
|           |             |      |            |      |           |      | DAP.S.A2   | 0.71 |
|           |             |      |            |      |           |      | DAP.S.A3   | 0.74 |
Note: main learning objectives (bold magenta), secondary learning goals (italic cyan), and non-addressed competencies (yellow). * The average effect size is given for competencies assessed with more than one item. ° Not tested.
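Cohen [65] suggests benchmarks of roughly 0.10 (small), 0.30 (medium), and 0.50 (large) for the matched-pairs effect size r, while Figure 5 raises the large-effect cutoff to an adjusted threshold of 0.60. A minimal helper applying these cutoffs (our own naming, for illustration only, not from the paper's analysis):

```python
def classify_effect_size(r, large=0.60):
    """Classify |r| using Cohen's benchmarks (0.10 small, 0.30 medium),
    with an adjustable large-effect cutoff; Figure 5 uses 0.60 instead
    of Cohen's conventional 0.50."""
    r = abs(r)
    if r >= large:
        return "large"
    if r >= 0.30:
        return "medium"
    if r >= 0.10:
        return "small"
    return "negligible"

# e.g. the DAP.C.N2 effect size from Table 1:
# classify_effect_size(0.62) → "large"
```

With the adjusted threshold, an effect of r = 0.50 that Cohen would already call large is classified as medium, which makes the "large" category in Figure 5 deliberately conservative.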
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Henne, A.; Möhrke, P.; Thoms, L.-J.; Huwer, J. Implementing Digital Competencies in University Science Education Seminars Following the DiKoLAN Framework. Educ. Sci. 2022, 12, 356. https://doi.org/10.3390/educsci12050356


Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
