A New TPACK Training Model for Tackling the Ongoing Challenges of COVID-19
Abstract
1. Introduction
1.1. Online Classroom for Teaching
1.1.1. Definitions of Online Classroom
1.1.2. Application of Online Classroom
1.2. Training of Online Classroom
1.2.1. The Training of Technological Pedagogical Content Knowledge (TPACK)
1.2.2. The Cases of Training of Online Classroom
1.2.3. Pre-Service Teacher Training
1.3. Research Aims
- To develop a new training model for pre-service science teachers’ TPACK and online-classroom technology.
- To evaluate pre-service science teachers’ TPACK with online-classroom technology through the training model.
- To verify the utility of the training model.
2. Decode Model and CloudClassRoom (CCR)
2.1. Training Model: Decode Model
- (1) The first stage is “Teacher Demonstration,” a technique with instructional characteristics such as “passive guidance or support, preparatory activities or tasks, concurrent activities, retrospective activities, and prospective activities” (p. 220, [54]). A demonstration is also a dynamic model that exemplifies task performance and links it with knowledge, skills, abilities, and learning content. The technique is further related to situated cognition and collaborative learning: it supports collaborative problem solving, the display of multiple roles, the confrontation of ineffective strategies and misconceptions, and the development of collaborative work skills [55].
- (2) The second stage is “Students Training,” which, guided by the TPACK model, opens a set of opportunities to experiment with ICT-specific knowledge areas and to deepen the integration of technology into teaching practices. Thus, technology should be “understood as a methodical and holistic process inserted in the initial teacher training, starting from a collaborative learning, active participation, and through the design of materials as a final product that leads to significant learning processes” [56]. Teacher training is also linked to improving learning practices such as planning, action and performance, and evaluation, and to metacognitive question prompts (comprehension, strategic, and reflection questions) [57].
- (3) The third stage is “Students Design Course,” a strategy based on collaborative design teams or peer design “for stimulating and supporting teacher learning. This approach to technology integration will help move pre-service teachers from being passive learners and consumers of technological resources to being more active learners and producer/designers of technology resources, thereby increasing user involvement and local ownership” (p. 561, [43]). This kind of participation is more attuned to learning-goal requirements, to what should be taught, and to technology integration: “The idea of learning by design is not new. However, we believe that the TPCK framework provides yet another argument for the pedagogical value of such activities, especially when considering the integration of educational technology in pedagogy” (p. 148, [58]).
- (4) The fourth stage is “Students Teach Course,” a strategy for developing a collaborative learning environment that follows from the stages above. In general terms, “cooperative teaching (co-teaching or teaching in pairs) contribute to the attainment of several objectives: increasing learning time by facilitating work in several groups at the same time, inclusion through heterogeneous classes, providing alternative teaching approaches adapted to learners by working in stations, providing alternative tasks, increasing personal attention to each pupil, and allowing experienced teachers to mentor young teachers” (p. 1402, [59]).
2.2. Online Classroom: CloudClassRoom (CCR)
- (1) Questioning: CloudClassRoom offers three basic types of questions: True/False, Multiple Choice, and Open-Ended (see Figure 1a). They form the fundamental dialogue and discussion strategy in CloudClassRoom. Questions support learning-oriented interactions, timely feedback, peer discussions, quizzes, flipped classes, and opportunities to participate in online classroom experiences. Public questions are an important guide for students, helping them connect their own pace with the collective pace of the class.
- (2) Analyzing: True/False and Multiple Choice responses can be analyzed as percentages and counts; Open-Ended responses can be analyzed with AI-based semantic analysis. Teachers and students, in different ways, have at their disposal outcomes, students’ responses, trends, tables, and graphics: useful data for dynamic analyses that support pedagogical reasoning and decisions about what should be taught (see Figure 1b and the sketch after this list).
- (3) Grouping: After answering, participants can be grouped by homogeneous or heterogeneous responses to foster discussion, check and explore other points of view, refine questions, enhance the comprehension of a specific subject matter, or plan tasks (see Figure 1c and the sketch after this list); this is a resource for supporting a collaborative learning environment.
- (4) Testing: A test composed of items can be published and used offline, or organized in a test bank (see Figure 1d). Assessments and feedback build on these tests to develop an integral learning-oriented interaction and to generate references for improving classroom performance and factors such as planning, roles, pacing, distribution, classification, competition, play, and reinforcement.
- (5) Managing: Several management functions are available: roll call, distribution of seats and positions, appointing an assistant, messaging, and a general question area that highlights the most important questions (see Figure 1e). These functions support classroom interaction through control tools and information about, for example, which students are present at a specific time, opportunities for grouping, several channels for teacher–student dialogue, and emerging trends.
- (6) Reviewing: File downloads make learning data available to teachers, including discussion records, students’ responses, and progress on tests (see Figure 1f). After a class, this information can be analyzed, displayed, evaluated, and used for decision-making, checking, and planning. Each class session produces qualitative and quantitative information about task and action results.
- (7) Free access: CloudClassRoom is a free, web-based tool that can be explored without registration through a guest profile, so teachers can get to know it and see how it might connect with their own learning spaces.
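CloudClassRoom’s internal implementation is not part of this paper, but the analyzing and grouping features above correspond to two simple operations on a set of clicker-style responses. The following is a minimal Python sketch under assumed data structures (a dictionary mapping student IDs to chosen options); the function names `answer_distribution` and `group_students` are illustrative, not CCR’s actual API.

```python
from collections import Counter, defaultdict

def answer_distribution(responses):
    """Count and percentage per option for a True/False or Multiple-Choice question.

    `responses` maps student id -> chosen option, e.g. {"s1": "A", "s2": "B"}.
    """
    counts = Counter(responses.values())
    total = sum(counts.values()) or 1  # guard against an empty response set
    return {option: {"count": n, "percent": round(100 * n / total, 1)}
            for option, n in counts.items()}

def group_students(responses, group_size=3, heterogeneous=True):
    """Form discussion groups from the latest responses.

    heterogeneous=True mixes different answers within each group;
    heterogeneous=False keeps students with the same answer together.
    """
    by_option = defaultdict(list)
    for student, option in responses.items():
        by_option[option].append(student)

    if heterogeneous:
        # Deal students round-robin across answer options so every group
        # contains as many different answers as possible.
        pools = [list(students) for students in by_option.values()]
        interleaved = []
        while any(pools):
            for pool in pools:
                if pool:
                    interleaved.append(pool.pop())
        groups = [interleaved[i:i + group_size]
                  for i in range(0, len(interleaved), group_size)]
    else:
        # Homogeneous groups: chunk each answer pool separately.
        groups = []
        for pool in by_option.values():
            groups.extend(pool[i:i + group_size]
                          for i in range(0, len(pool), group_size))
    return groups

if __name__ == "__main__":
    responses = {"s1": "A", "s2": "B", "s3": "A", "s4": "C", "s5": "B", "s6": "A"}
    print(answer_distribution(responses))          # e.g. {'A': {'count': 3, 'percent': 50.0}, ...}
    print(group_students(responses, group_size=3)) # mixed-answer discussion groups
```

In CCR itself, the results of such operations surface as the percentages, tables, graphics, and grouping views shown in Figure 1b,c.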
3. Methodology and Methods
3.1. Measurements
3.1.1. TPACK Questionnaire
- (1) Content knowledge: “I understand the content in my expert discipline.”
- (2) Pedagogical knowledge: “I understand the various teaching strategies and methods used in the class.”
- (3) Pedagogical content knowledge: “I understand how to present content in my expert discipline and the teaching method conforming to the subject content and students’ level.”
- (4) Technological knowledge: “I understand the interface, operation, and question-making methods in CloudClassRoom.”
- (5) Technological content knowledge: “I understand how to use CloudClassRoom to present the content in my expert discipline conforming to the subject content.”
- (6) Technological pedagogical knowledge: “I understand how to use CloudClassRoom to implement various teaching strategies and evaluations in the class.”
- (7) Technological pedagogical and content knowledge: “I understand what content is suitable for presentation with CloudClassRoom, and can convey knowledge truly in my expert discipline.” “I understand how to use CCR to present the content in my expert discipline and assist students in constructing knowledge well in class.”
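The items above are rated on a Likert-type scale and are presumably aggregated into one score per construct (CK, PK, PCK, TK, TCK, TPK, TPACK), which is the level at which the Results report means. The exact item counts and scale range are not reproduced here, so the following Python sketch uses a hypothetical item-to-construct mapping purely to illustrate the scoring step.

```python
from statistics import mean

# Hypothetical item-to-construct mapping; the real questionnaire's item counts
# and wording are not reproduced here.
CONSTRUCTS = {
    "CK":    ["ck1", "ck2"],
    "PK":    ["pk1", "pk2"],
    "PCK":   ["pck1", "pck2"],
    "TK":    ["tk1", "tk2"],
    "TCK":   ["tck1", "tck2"],
    "TPK":   ["tpk1", "tpk2"],
    "TPACK": ["tpack1", "tpack2"],
}

def subscale_scores(answers):
    """Mean Likert rating per TPACK construct for one respondent.

    `answers` maps item id -> rating on the Likert scale; unanswered
    items are skipped.
    """
    scores = {}
    for construct, items in CONSTRUCTS.items():
        ratings = [answers[item] for item in items if item in answers]
        scores[construct] = round(mean(ratings), 2) if ratings else None
    return scores

if __name__ == "__main__":
    respondent = {"ck1": 5, "ck2": 5, "tk1": 3, "tk2": 4}
    print(subscale_scores(respondent))  # e.g. {'CK': 5, 'TK': 3.5, 'PK': None, ...}
```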
3.1.2. The Questionnaire of Students’ Feedback for Each Group’s Presentation
3.1.3. The Questionnaire of Students’ Feedback for the Decode Model
3.2. Procedural and Statistical Analysis
4. Results
4.1. The Performance of Pre-Service Science Teachers’ TPACK
4.2. Pre-Service Science Teachers Designed the Course
4.3. The Participants’ Feedback on the DECODE Model
5. Discussion
5.1. TPACK through the Training Model
5.2. Adjustment of the Training Model
6. Conclusions and Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Sánchez Ruiz, L.M.; Moll-López, S.; Moraño-Fernández, J.A.; Llobregat-Gómez, N. B-learning and technology: Enablers for university education resilience. An experience case under COVID-19 in Spain. Sustainability 2021, 13, 3532. [Google Scholar] [CrossRef]
- Md Yunus, M.; Ang, W.S.; Hashim, H. Factors affecting teaching English as a Second Language (TESL) postgraduate students’ behavioural intention for online learning during the COVID-19 pandemic. Sustainability 2021, 13, 3524. [Google Scholar] [CrossRef]
- Nicolò, G.; Aversano, N.; Sannino, G.; Tartaglia Polcini, P. Investigating web-based sustainability reporting in Italian public universities in the era of COVID-19. Sustainability 2021, 13, 3468. [Google Scholar] [CrossRef]
- Liu, C.; McCabe, M.; Dawson, A.; Cyrzon, C.; Shankar, S.; Gerges, N.; Kellett-Renzella, S.; Chye, Y.; Cornish, K. Identifying predictors of university students’ wellbeing during the COVID-19 pandemic—A data-driven approach. Int. J. Environ. Res. Public Health 2021, 18, 6730. [Google Scholar] [CrossRef]
- Yang, Z.; Liu, Q. Research and development of web-based virtual online classroom. Comput. Educ. 2007, 48, 171–184. [Google Scholar] [CrossRef]
- Bickle, M.C.; Rucker, R. Student-to-student interaction: Humanizing the online classroom using technology and group assignments. Q. Rev. Distance Educ. 2018, 19, 1–11, 56. [Google Scholar]
- Palvia, S.; Aeron, P.; Gupta, P.; Mahapatra, D.; Parida, R.; Rosner, R.; Sindhi, S. Online Education: Worldwide Status, Challenges, Trends, and Implications; Taylor & Francis: Oxfordshire, UK, 2018. [Google Scholar]
- Zafar, H.; Akhtar, S.H. Analyzing the effectiveness of activity based teaching and traditional teaching method through students’ achievement in sub domain knowledge at secondary level. Lang. India 2021, 21, 149–162. [Google Scholar]
- Noreen, R.; Rana, A.M.K. Activity-based teaching versus traditional method of teaching in mathematics at elementary level. Bull. Educ. Res. 2019, 41, 145–159. [Google Scholar]
- Hokor, E.K.; Sedofia, J. Developing probabilistic reasoning in preservice teachers: Comparing the learner-centered and teacher-centered approaches of teaching. Int. J. Stud. Educ. Sci. 2021, 2, 120–145. [Google Scholar]
- Houston, L. Efficient strategies for integrating universal design for learning in the online classroom. J. Educ. Online 2018, 15, n3. [Google Scholar] [CrossRef]
- Davis, N.L.; Gough, M.; Taylor, L.L. Online teaching: Advantages, obstacles and tools for getting it right. J. Teach. Travel Tour. 2019, 19, 256–263. [Google Scholar] [CrossRef]
- Dumford, A.D.; Miller, A.L. Online learning in higher education: Exploring advantages and disadvantages for engagement. J. Comput. High. Educ. 2018, 30, 452–465. [Google Scholar] [CrossRef]
- Stöhr, C.; Demazière, C.; Adawi, T. The polarizing effect of the online flipped classroom. Comput. Educ. 2020, 147, 103789. [Google Scholar] [CrossRef]
- Tang, T.; Abuhmaid, A.M.; Olaimat, M.; Oudat, D.M.; Aldhaeebi, M.; Bamanger, E. Efficiency of flipped classroom with online-based teaching under COVID-19. Interact. Learn. Environ. 2020, 28, 1–2. [Google Scholar] [CrossRef]
- Muthuprasad, T.; Aiswarya, S.; Aditya, K.; Jha, G.K. Students’ perception and preference for online education in India during COVID-19 pandemic. Soc. Sci. Humanit. Open 2021, 3, 100101. [Google Scholar] [CrossRef]
- Dhawan, S. Online learning: A panacea in the time of COVID-19 crisis. J. Educ. Technol. Syst. 2020, 49, 5–22. [Google Scholar] [CrossRef]
- Steele, J.P.; Robertson, S.N.; Mandernach, B.J. Beyond Content: The value of instructor-student connections in the online classroom. J. Scholarsh. Teach. Learn. 2018, 18, 130–150. [Google Scholar]
- Yang, X.; Zhang, M.; Kong, L.; Wang, Q.; Hong, J.-C. The effects of scientific self-efficacy and cognitive anxiety on science engagement with the “question-observation-doing-explanation” model during school disruption in COVID-19 pandemic. J. Sci. Educ. Technol. 2021, 30, 380–393. [Google Scholar] [CrossRef]
- Almahasees, Z.; Mohsen, K.; Amin, M. Faculty’s and students’ perceptions of online learning during COVID-19. Front. Educ. 2021, 6, 638470. [Google Scholar] [CrossRef]
- Singh, V.; Thurman, A. How many ways can we define online learning? A systematic literature review of definitions of online learning (1988–2018). Am. J. Distance Educ. 2019, 33, 289–306. [Google Scholar] [CrossRef]
- Batubara, B.M. The problems of the world of education in the middle of the COVID-19 pandemic. Bp. Int. Res. Crit. Inst. Humanit. Soc. Sci. Humanit. Open 2021, 4, 450–457. [Google Scholar] [CrossRef]
- Husni Rahiem, M.D. Indonesian university students’ likes and dislikes about emergency remote learning during the COVID-19 pandemic. Asian J. Univ. Educ. 2021, 17, 1–18. [Google Scholar] [CrossRef]
- Moorhouse, B.L. Adaptations to a face-to-face initial teacher education course ‘forced’ online due to the COVID-19 pandemic. J. Educ. Teach. 2020, 46, 609–611. [Google Scholar] [CrossRef] [Green Version]
- Zhu, X.; Liu, J. Education in and after COVID-19: Immediate responses and long-term visions. Postdigital Sci. Educ. 2020, 2, 695–699. [Google Scholar] [CrossRef] [Green Version]
- Donnelly, R.; Patrinos, H.A. Learning loss during COVID-19: An early systematic review. Prospects 2021, 1–9. [Google Scholar] [CrossRef]
- Hartshorne, R.; Baumgartner, E.; Kaplan-Rakowski, R.; Mouza, C.; Ferdig, R.E. Special issue editorial: Preservice and inservice professional development during the COVID-19 pandemic. J. Technol. Teach. Educ. 2020, 28, 137–147. [Google Scholar]
- Koehler, M.J.; Mishra, P.; Yahya, K. Tracing the development of teacher knowledge in a design seminar: Integrating content, pedagogy and technology. Comput. Educ. 2007, 49, 740–762. [Google Scholar] [CrossRef]
- Angeli, C.; Valanides, N. Epistemological and methodological issues for the conceptualization, development, and assessment of ICT–TPCK: Advances in technological pedagogical content knowledge (TPCK). Comput. Educ. 2009, 52, 154–168. [Google Scholar] [CrossRef]
- Eshet, Y. Digital literacy: A conceptual framework for survival skills in the digital era. J. Educ. Multimed. Hypermedia 2004, 13, 93–106. [Google Scholar]
- Philipsen, B.; Tondeur, J.; Pareja Roblin, N.; Vanslambrouck, S.; Zhu, C. Improving teacher professional development for online and blended learning: A systematic meta-aggregative review. Educ. Technol. Res. Dev. 2019, 67, 1145–1174. [Google Scholar] [CrossRef]
- Díaz-Noguera, M.D.; Hervás-Gómez, C.; De la Calle-Cabrera, A.M.; López-Meneses, E. Autonomy, motivation, and digital pedagogy are key factors in the perceptions of Spanish higher-education students toward online learning during the COVID-19 pandemic. Int. J. Environ. Res. Public Health 2022, 19, 654. [Google Scholar] [CrossRef] [PubMed]
- Angeli, C.; Valanides, N. Technology mapping: An approach for developing technological pedagogical content knowledge. J. Educ. Comput. Res. 2013, 48, 199–221. [Google Scholar] [CrossRef]
- Oyedotun, T.D. Sudden change of pedagogy in education driven by COVID-19: Perspectives and evaluation from a developing country. Res. Glob. 2020, 2, 100029. [Google Scholar] [CrossRef]
- Jang, S.-J.; Tsai, M.-F. Exploring the TPACK of Taiwanese secondary school science teachers using a new contextualized TPACK model. Australas. J. Educ. Technol. 2013, 29, 566–580. [Google Scholar] [CrossRef] [Green Version]
- Mishra, P.; Koehler, M.J. Technological pedagogical content knowledge: A framework for teacher knowledge. Teach. Coll. Rec. 2006, 108, 1017–1054. [Google Scholar] [CrossRef]
- Koehler, M.; Mishra, P. What is technological pedagogical content knowledge (TPACK)? Contemp. Issues Technol. Teach. Educ. 2009, 9, 60–70. [Google Scholar] [CrossRef] [Green Version]
- Hechter, R.P.; Phyfe, L.D.; Vermette, L.A. Integrating technology in education: Moving the TPCK framework towards practical applications. Educ. Res. Perspect. 2012, 39, 136. [Google Scholar]
- Araújo Filho, R.; Gitirana, V. Pre-service teachers’ knowledge: Analysis of teachers’ education situation based on TPACK. Math. Enthus. 2022, 19, 594–631. [Google Scholar] [CrossRef]
- Supriatna, N.; Abbas, E.W.; Rini, T.P.W.; Subiyakto, B. Technological, pedagogical, content knowledge (TPACK): A discursions in learning innovation on social studies. Innov. Soc. Stud. J. 2020, 2, 135–142. [Google Scholar]
- Gómez-Trigueros, I.M.; Yáñez de Aldecoa, C. The digital gender gap in teacher education: The TPACK framework for the 21st century. Eur. J. Investig. Health Psychol. Educ. 2021, 11, 1333–1349. [Google Scholar] [CrossRef]
- Chuang, H.-H.; Weng, C.-Y.; Huang, F.-C. A structure equation model among factors of teachers’ technology integration practice and their TPCK. Comput. Educ. 2015, 86, 182–191. [Google Scholar] [CrossRef]
- Agyei, D.D.; Voogt, J. Developing technological pedagogical content knowledge in pre-service mathematics teachers through collaborative design. Australas. J. Educ. Technol. 2012, 28, 547–564. [Google Scholar] [CrossRef] [Green Version]
- Khan, S. A model for integrating ICT into teacher training programs in Bangladesh based on TPCK. Int. J. Educ. Dev. Using ICT 2014, 10, 21–31. [Google Scholar]
- Niess, M.L. Preparing teachers to teach science and mathematics with technology: Developing a technology pedagogical content knowledge. Teach. Teach. Educ. 2005, 21, 509–523. [Google Scholar] [CrossRef]
- Guzey, S.S.; Roehrig, G.H. Teaching science with technology: Case studies of science teachers’ development of technological pedagogical content knowledge (TPCK). Contemp. Issues Technol. Teach. Educ. 2009, 9, 25–45. [Google Scholar]
- Chang, C.-Y.; Chien, Y.-T.; Chang, Y.-H.; Lin, C.-Y. MAGDAIRE: A model to foster pre-service teachers’ ability in integrating ICT and teaching in Taiwan. Australas. J. Educ. Technol. 2012, 28, 983–999. [Google Scholar] [CrossRef] [Green Version]
- Tondeur, J.; Scherer, R.; Siddiq, F.; Baran, E. A comprehensive investigation of TPACK within pre-service teachers’ ICT profiles: Mind the gap! Australas. J. Educ. Technol. 2017, 33, 46–60. [Google Scholar] [CrossRef] [Green Version]
- Swallow, M.J.; Olofson, M.W. Contextual understandings in the TPACK framework. J. Res. Technol. Educ. 2017, 49, 228–244. [Google Scholar] [CrossRef]
- Srisawasdi, N. The role of TPACK in physics classroom: Case studies of pre-service physics teachers. Procedia-Soc. Behav. Sci. 2012, 46, 3235–3243. [Google Scholar] [CrossRef] [Green Version]
- Angeli, C.; Valanides, N. TPCK in pre-service teacher education: Preparing primary education students to teach with technology. In Proceedings of the AERA Annual Conference, New York, NY, USA, 24–28 March 2018. [Google Scholar]
- Pamuk, S. Understanding pre-service teachers’ technology use through TPACK framework. J. Comput. Assist. Learn. 2012, 28, 425–439. [Google Scholar] [CrossRef]
- Belda-Medina, J. ICTs and Project-Based Learning (PBL) in EFL: Pre-service teachers’ attitudes and digital skills. Int. J. Appl. Linguist. Engl. Lit. 2021, 10, 63–70. [Google Scholar] [CrossRef]
- Grossman, R.; Salas, E.; Pavlas, D.; Rosen, M.A. Using instructional features to enhance demonstration-based training in management education. Acad. Manag. Learn. Educ. 2013, 12, 219–243. [Google Scholar] [CrossRef]
- Brown, J.S.; Collins, A.; Duguid, P. Situated cognition and the culture of learning. Educ. Res. 1989, 18, 32–42. [Google Scholar] [CrossRef]
- Rodríguez Moreno, J.; Agreda Montoro, M.; Ortiz Colón, A.M. Changes in teacher training within the TPACK model framework: A systematic review. Sustainability 2019, 11, 1870. [Google Scholar] [CrossRef] [Green Version]
- Kramarski, B.; Michalsky, T. Three metacognitive approaches to training pre-service teachers in different learning phases of technological pedagogical content knowledge. Educ. Res. Eval. 2009, 15, 465–485. [Google Scholar] [CrossRef]
- Koehler, M.J.; Mishra, P. What happens when teachers design educational technology? The development of technological pedagogical content knowledge. J. Educ. Comput. Res. 2005, 32, 131–152. [Google Scholar] [CrossRef] [Green Version]
- Zach, S. Co-teaching—An approach for enhancing teaching-learning collaboration in physical education teacher education (PETE). J. Phys. Educ. Sport 2020, 20, 1402–1407. [Google Scholar]
- Chien, Y.-T.; Chang, C.-Y. Supporting socio-scientific argumentation in the classroom through automatic group formation based on students’ real-time responses. In Science Education in East Asia; Khine, M.S., Ed.; Springer: Cham, Switzerland, 2015; pp. 549–563. [Google Scholar]
- Liou, W.-K.; Bhagat, K.K.; Chang, C.-Y. Beyond the flipped classroom: A highly interactive cloud-classroom (HIC) embedded into basic materials science courses. J. Sci. Educ. Technol. 2016, 25, 460–473. [Google Scholar] [CrossRef]
- Chien, Y.-T.; Lee, Y.-H.; Li, T.-Y.; Chang, C.-Y. Examining the effects of displaying clicker voting results on high school students’ voting behaviors, discussion processes, and learning outcomes. Eurasia J. Math. Sci. Technol. Educ. 2015, 11, 1089–1104. [Google Scholar]
- Chien, Y.-T.; Chang, Y.-H.; Chang, C.-Y. Do we click in the right way? A meta-analytic review of clicker-integrated instruction. Educ. Res. Rev. 2016, 17, 1–18. [Google Scholar] [CrossRef] [Green Version]
- Alexander, B.; Ashford-Rowe, K.; Barajas-Murph, N.; Dobbin, G.; Knott, J.; McCormack, M.; Pomerantz, J.; Seilhamer, R.; Weber, N. Horizon Report 2019 Higher Education Edition; EDU19: Boulder, CO, USA, 2019. [Google Scholar]
- Archambault, L.M.; Barnett, J.H. Revisiting technological pedagogical content knowledge: Exploring the TPACK framework. Comput. Educ. 2010, 55, 1656–1662. [Google Scholar] [CrossRef]
- Schmidt, D.A.; Baran, E.; Thompson, A.D.; Mishra, P.; Koehler, M.J.; Shin, T.S. Technological pedagogical content knowledge (TPACK) the development and validation of an assessment instrument for pre-service teachers. J. Res. Technol. Educ. 2009, 42, 123–149. [Google Scholar] [CrossRef]
- Jeon, E.S.; Kim, S.L. Analyses of early childhood teachers’ concept maps on economic education. Int. J. Adv. Cult. Technol. 2019, 7, 43–48. [Google Scholar]
- Wu, P.-H. Effects of real-time diagnosis on mobile learning and self-regulation mechanism on students’ learning achievement and behavior of using concept mapping. Int. J. Digit. Learn. Technol. 2017, 9, 1–27. [Google Scholar] [CrossRef]
- Wang, Y.H. The effectiveness of integrating teaching strategies into IRS activities to facilitate learning. J. Comput. Assist. Learn. 2017, 33, 35–50. [Google Scholar] [CrossRef] [Green Version]
- Kurt, G.; Mishra, P.; Kocoglu, Z. Technological pedagogical content knowledge development of Turkish pre-service teachers of English. In Proceedings of the Society for Information Technology & Teacher Education International Conference, New Orleans, LA, USA, 25 March 2013; pp. 5073–5077. [Google Scholar]
- Agyei, D.D.; Keengwe, J. Using technology pedagogical content knowledge development to enhance learning outcomes. Educ. Inf. Technol. 2014, 19, 155–171. [Google Scholar] [CrossRef]
- Kleickmann, T.; Richter, D.; Kunter, M.; Elsner, J.; Besser, M.; Krauss, S.; Baumert, J. Teachers’ content knowledge and pedagogical content knowledge: The role of structural differences in teacher education. J. Teach. Educ. 2013, 64, 90–106. [Google Scholar] [CrossRef]
- Durdu, L.; Dag, F. Pre-service teachers’ TPACK development and conceptions through a TPACK-based course. Aust. J. Teach. Educ. 2017, 42, 10. [Google Scholar] [CrossRef]
- Young, J.R.; Young, J.L.; Hamilton, C. The use of confidence intervals as a meta-analytic lens to summarize the effects of teacher education technology courses on pre-service teacher TPACK. J. Res. Technol. Educ. 2013, 46, 149–172. [Google Scholar] [CrossRef]
- Chien, Y.-T.; Chang, C.-Y.; Yeh, T.-K.; Chang, K.-E. Engaging pre-service science teachers to act as active designers of technology integration: A MAGDAIRE framework. Teach. Teach. Educ. 2012, 28, 578–588. [Google Scholar] [CrossRef]
- Erna, M.; Elfizar, E.; Dewi, C. The development of E-worksheet using kvisoft flipbook maker software based on lesson study to improve teacher’s critical thinking ability. Int. J. Interact. Mob. Technol. 2021, 15, 39–55. [Google Scholar] [CrossRef]
- Ajzen, I. The theory of planned behavior. Organ. Behav. Hum. Decis. Processes 1991, 50, 179–211. [Google Scholar] [CrossRef]
- Jang, S.J. Development of a Research-Based Model for Enhancing PCK of Secondary Science Teachers; Nova Science Publishers Inc.: New York, NY, USA, 2009. [Google Scholar]
- Dalgarno, N.; Colgan, L. Supporting novice elementary mathematics teachers’ induction in professional communities and providing innovative forms of pedagogical content knowledge development through information and communication technology. Teach. Teach. Educ. 2007, 23, 1051–1065. [Google Scholar] [CrossRef]
Feature of the Training Model | Aim | Procedure |
---|---|---|
Teacher’s Demonstrations | Improving pre-service teachers’ TK about ICT. | The educator demonstrates the questioning function of ICT. |
Students Co-train the use of ICT | Improving pre-service teachers’ TCK with ICT. | The students are grouped into teams of 2–3 people. Within each group, students take turns acting as teacher and as student to practice operating the ICT. |
Students Co-design an ICT-integrated course | Improving pre-service teachers’ PCK in their major discipline. | Each group is asked to develop a course that uses the ICT. Each group needs to finish the concept map, teaching content, and questions for the ICT. |
Students Co-teach the course and receive feedback | Improving pre-service teachers’ TPACK. | Each group takes turns demonstrating its course to the other students with a briefing and the ICT. After each demonstration, the other students give feedback to that group. |
Week | Time (Min) | Activity |
---|---|---|
1st | 10 | Pre-test: TPACK questionnaire |
1st | 50 | DECODE: Teacher’s Demonstrations |
1st | 60 | DECODE: Students Co-train the CCR |
2nd | 120 | DECODE: Students Co-design a course |
3rd | 100 | DECODE: Students Co-teach the module and receive feedback |
3rd | 10 | Post-test: TPACK questionnaire |
3rd | 10 | Post-test: Feedback for DECODE |
TPACK Test | Pre-Test Mean | Pre-Test Std | Post-Test Mean | Post-Test Std | t | Effect Size |
---|---|---|---|---|---|---|
CK | 4.93 | 0.61 | 4.77 | 1.03 | −1.17 | −0.19 |
PK | 4.22 | 0.92 | 4.43 | 1.05 | 1.29 | 0.21 |
PCK | 4.52 | 0.91 | 4.58 | 0.96 | 0.46 | 0.06 |
TK | 3.58 | 1.33 | 4.50 | 0.93 | 5.06 ** | 0.80 |
TCK | 3.28 | 1.40 | 4.45 | 1.06 | 4.98 ** | 0.94 |
TPK | 3.22 | 1.30 | 4.38 | 1.06 | 5.09 ** | 0.98 |
TPACK | 3.42 | 1.32 | 4.51 | 0.88 | 5.08 ** | 0.97 |
TPACK test | 3.80 | 0.87 | 4.52 | 0.84 | 4.35 ** | 0.84 |
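The statistical procedure is reported only at this summary level, but the Effect Size column is consistent with Cohen’s d computed from the pooled pre- and post-test standard deviations; for TK, (4.50 − 3.58) / √((1.33² + 0.93²)/2) ≈ 0.80. The Python sketch below illustrates this kind of paired analysis; the helper names are ours, SciPy’s `ttest_rel` stands in for whichever statistics package was actually used, and raw score vectors would be needed to reproduce the t values.

```python
from math import sqrt
from statistics import mean, stdev
from scipy import stats

def cohens_d_from_summary(pre_mean, pre_sd, post_mean, post_sd):
    """Cohen's d using the SD pooled from the pre- and post-test scores."""
    pooled_sd = sqrt((pre_sd ** 2 + post_sd ** 2) / 2)
    return (post_mean - pre_mean) / pooled_sd

def paired_analysis(pre_scores, post_scores):
    """Paired-samples t-test plus Cohen's d for one TPACK subscale."""
    t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)
    d = cohens_d_from_summary(mean(pre_scores), stdev(pre_scores),
                              mean(post_scores), stdev(post_scores))
    return {"t": round(t_stat, 2), "p": round(p_value, 4), "d": round(d, 2)}

if __name__ == "__main__":
    # Reproducing the reported TK effect size from the summary statistics above:
    print(round(cohens_d_from_summary(3.58, 1.33, 4.50, 0.93), 2))  # 0.8
```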
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations. |
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).