How Concept Maps with and without a List of Concepts Differ: The Case of Statistics
Abstract
1. Introduction
1.1. Concept Maps as a Knowledge Structure Assessment Tool
1.2. Knowledge Structure in Statistics Domain
2. Materials and Methods
2.1. Data Collection
2.2. Analysis Strategy
3. Results
4. Discussion
Funding
Conflicts of Interest
Appendix A
| Measure | Without the List (Mean) | Without the List (SD) | With the List (Mean) | With the List (SD) | t-Statistic | One-Tailed t-Test (df = 10) |
|---|---|---|---|---|---|---|
| Number of concepts | 19.55 | 7.74 | 21.18 | 3.57 | 0.60 | p = 0.28 |
| Number of propositions | 19.64 | 7.68 | 23.73 | 4.76 | 1.31 | p = 0.11 |
| Concept–proposition ratio | 1.00 | 0.11 | 0.90 | 0.10 | 2.8 | p < 0.01 |
| Number of concepts with degree one | 10.91 | 5.74 | 7.91 | 3.45 | 1.81 | p < 0.05 |
| % of concepts with degree one | 53.60 | 14.29 | 37.58 | 17.10 | 3.18 | p < 0.01 |
| Number of concepts with degree three or more | 4.73 | 2.45 | 4.64 | 2.69 | 0.07 | p = 0.47 |
| % of concepts with degree three or more | 24.01 | 6.96 | 21.25 | 10.17 | 0.67 | p = 0.26 |
| Diameter | 5.73 | 1.79 | 7.73 | 2.24 | 2.8 | p < 0.01 |
| Share of diameter | 0.32 | 0.12 | 0.38 | 0.15 | 1.05 | p = 0.16 |
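The structural measures reported above can be computed by treating a concept map as an undirected graph whose nodes are concepts and whose edges are propositions. The sketch below, in pure Python, shows one plausible way to derive them; the function name, the input format (a list of concept pairs), and the definition of "share of diameter" as diameter divided by concept count are illustrative assumptions, not taken from the paper.

```python
from collections import defaultdict, deque

def map_measures(propositions):
    """Structural measures of a concept map given as (concept_a, concept_b) pairs."""
    # Build an undirected adjacency structure from the propositions.
    adj = defaultdict(set)
    for a, b in propositions:
        adj[a].add(b)
        adj[b].add(a)

    concepts = list(adj)
    n_concepts = len(concepts)
    n_props = len(propositions)

    # Degree = number of distinct concepts each concept is linked to.
    degrees = {c: len(adj[c]) for c in concepts}
    one_degree = sum(1 for d in degrees.values() if d == 1)
    three_plus = sum(1 for d in degrees.values() if d >= 3)

    def eccentricity(start):
        # BFS distances from `start`; assumes the map is one connected component.
        dist = {start: 0}
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        return max(dist.values())

    # Diameter = longest shortest path between any two concepts.
    diameter = max(eccentricity(c) for c in concepts)

    return {
        "concepts": n_concepts,
        "propositions": n_props,
        "concept_proposition_ratio": n_concepts / n_props,
        "one_degree": one_degree,
        "pct_one_degree": 100 * one_degree / n_concepts,
        "three_plus_degree": three_plus,
        "pct_three_plus_degree": 100 * three_plus / n_concepts,
        "diameter": diameter,
        "diameter_share": diameter / n_concepts,
    }

# Toy example: a small chain-like map of five statistics concepts.
toy = [("mean", "average"), ("mean", "data"),
       ("data", "sample"), ("sample", "population")]
print(map_measures(toy))
```

On the toy chain the diameter is the four-edge path from "average" to "population", and the two endpoints are the only degree-one concepts, which mirrors how sparse, chain-like maps inflate the one-degree share in the table above.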
© 2020 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Kapuza, A. How Concept Maps with and without a List of Concepts Differ: The Case of Statistics. Educ. Sci. 2020, 10, 91. https://doi.org/10.3390/educsci10040091