Design and Validation of a Novel Tool to Assess Citizens’ Netiquette and Information and Data Literacy Using Interactive Simulations
Abstract
1. Introduction
- To design a tool for the assessment of DC that supports dynamic formats such as interactive simulations, which are particularly relevant when measuring complex cognitive constructs such as DC in safe settings.
- To describe the design principles applied during the different steps of the development of the tests for evaluating the DCs selected, with the aim that they can be extended to the rest of the DCs included in the reference framework.
- Is it possible to assess IDL through a DBR-designed test using simulations?
- Is it possible to assess netiquette through a DBR-designed test using simulations?
1.1. Reference Framework for the Evaluation of DC
1.2. Information and Data Literacy
1.3. Netiquette
1.4. Item Response Theory (IRT)
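For reference, the core model behind the IRT analyses reported later is the one-parameter logistic (Rasch) model for dichotomous items; the formulation below is the standard textbook expression rather than an equation reproduced from this article.

```latex
% Rasch (1PL) model: probability that person n with ability \theta_n
% answers dichotomous item i with difficulty \delta_i correctly.
P(X_{ni} = 1 \mid \theta_n, \delta_i) = \frac{\exp(\theta_n - \delta_i)}{1 + \exp(\theta_n - \delta_i)}
```

The multidimensional models fitted with ConQuest extend this expression by modelling a vector of correlated latent dimensions instead of a single ability.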
2. Materials and Methods
2.1. Phase 1: Analysis of the Problem by Researchers and Practitioners in Collaboration
2.2. Phase 2: Development of Theoretical Framework Solutions Based on Existing Design Principles and Technological Innovations
- The shorter and simpler the better.
- Related to practical and common situations, especially real-world scenarios.
- Neutral with respect to commercial brands and specific technological solutions. If this is not possible, use the most commonly used solutions as a basis. In the simulations, provide “alt” messages (alternative text to images) when hovering over the different options to help users who do not normally work with this tool.
- Address the selected competence elements (knowledge and skills) and refer to the three macro proficiency levels (foundation, intermediate and advanced).
- Balance the number of knowledge and skill questions (k/s) in each test: 22/22 for the netiquette test and 25/35 for the IDL test.
- All items were dichotomous across all formats (1 for correct, 0 for incorrect). Task complexity and partial responses during the resolution of a task were not considered. We made this decision to keep the items easy to understand; otherwise, the assessment criteria for each item would have been considerably more complicated (a scoring sketch follows this list).
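To illustrate this scoring rule, the sketch below shows how outcomes from heterogeneous item formats (multiple-choice questions or interactive simulations) could be reduced to a single dichotomous score; the item identifiers and recorded actions are hypothetical and only serve as an example of the 1/0 coding.

```python
# Minimal sketch (assumption): dichotomous scoring across item formats.
# Each item, whether a question or an interactive simulation task, is reduced
# to 1 (correct) or 0 (incorrect); partial progress within a task is ignored.

def score_item(expected, observed):
    """Return 1 if the user's final action matches the expected outcome, else 0."""
    return int(observed == expected)

# Hypothetical answer key and recorded responses for three items.
answer_key = {"item_01": "B", "item_02": "use_bcc", "item_03": "report_post"}
responses = {"item_01": "B", "item_02": "reply_all", "item_03": "report_post"}

scores = {item: score_item(answer_key[item], responses.get(item)) for item in answer_key}
print(scores)                 # {'item_01': 1, 'item_02': 0, 'item_03': 1}
print(sum(scores.values()))   # raw (sum) score for this participant
```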
2.3. Phase 3: Iterative Cycles of Testing and Refinement of the Solution in Practice
2.3.1. First Iteration with DC Centre Facilitators
- Content and wording of the items.
- Facilitators’ suggestions to improve some items identified as “to be improved”.
- Facilitators’ suggestions to improve the questions/tests.
2.3.2. Second Iteration with Citizens
- The time needed to complete each test should be less than 30 min, to reduce the probability of users dropping out early. Accordingly, the test for the IDL competence area included 60 items and the test for the netiquette DC included 44 items.
- The distribution of items in the IDL test was similar for each DC. In the netiquette test, items were distributed so that all sub-competences were represented. The distribution of sub-competences followed the literature review carried out at the beginning of the study and the feedback received from the facilitators (see Table 4).
- According to the macro proficiency levels, we considered the following proportion for each test: 25% foundation level, 50% intermediate level and 25% advanced level. The proficiency levels were assigned to the items following a pragmatic approach that mapped the verbs of the item statements onto Bloom’s taxonomy [93] (see the sketch after this list).
- Evaluate the tests with end users.
- Analyse the data gathered from the participants using different item response theory models and investigate the appropriateness of the models by examining different indicators of model fit.
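As a concrete illustration of the verb-based mapping mentioned above, the sketch below assigns a macro proficiency level by matching the verbs of an item statement against Bloom-style verb lists; the specific verb sets are an illustrative assumption and do not reproduce the exact mapping used in the study.

```python
# Illustrative sketch (assumption): mapping item-statement verbs to the three
# macro proficiency levels via Bloom-style verb lists.

VERB_LEVELS = {
    "foundation":   {"identify", "recognise", "list", "recall", "define"},
    "intermediate": {"apply", "use", "organise", "search", "filter"},
    "advanced":     {"analyse", "evaluate", "compare", "assess", "create"},
}

def proficiency_level(statement: str) -> str:
    """Return the macro proficiency level suggested by the first matching verb."""
    words = [w.strip(",.;:") for w in statement.lower().split()]
    for level, verbs in VERB_LEVELS.items():
        if any(word in verbs for word in words):
            return level
    return "intermediate"  # fallback when no listed verb appears

print(proficiency_level("Recognise appropriate behaviours on social networks"))  # foundation
print(proficiency_level("Evaluate the credibility of sources of information"))   # advanced
```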
2.4. Phase 4: Reflection to Produce “Design Principles” and Enhance Solution Implementation
3. Results
3.1. Phase 3: Iterative Cycles of Testing and Refinement of the Solution in Practice
3.1.1. First Iteration with DC Centre Facilitators
3.1.2. Second Iteration with Citizens
- Examine the difficulty of the items (p-value, i.e., the proportion of correct responses) and the discrimination indices as starting indicators to justify the choice of the model (a computation sketch follows this list).
- Analyse the dimensional validity and reliability.
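A minimal sketch of how these classical indicators could be computed from a dichotomous response matrix is shown below; the toy matrix is invented for illustration, and the point-biserial variant shown is the corrected item-total correlation, which may differ from the exact variant used in the study.

```python
# Minimal sketch (assumption): classical item analysis on a dichotomous response matrix.
# Rows are participants, columns are items; entries are 1 (correct) or 0 (incorrect).
import numpy as np

responses = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 0, 0],
    [0, 0, 0, 1],
])

# Item difficulty (p-value): proportion of correct responses per item.
p_values = responses.mean(axis=0)

# Point-biserial correlation: correlation between each item and the total score
# computed from the remaining items (corrected item-total correlation).
def point_biserial(matrix: np.ndarray) -> np.ndarray:
    n_items = matrix.shape[1]
    rest_totals = matrix.sum(axis=1, keepdims=True) - matrix
    return np.array([
        np.corrcoef(matrix[:, j], rest_totals[:, j])[0, 1] for j in range(n_items)
    ])

print("p-values:", np.round(p_values, 2))
print("point-biserial:", np.round(point_biserial(responses), 3))
```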
4. Discussion and Conclusions
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- List, A.; Brante, E.W.; Klee, H.L. A framework of pre-service teachers’ conceptions about digital literacy: Comparing the United States and Sweden. Comput. Educ. 2020, 148, 103788. [Google Scholar] [CrossRef]
- O’Sullivan, K.; Clark, S.; Marshall, K.; MacLachlan, M. A Just Digital framework to ensure equitable achievement of the Sustainable Development Goals. Nat. Commun. 2021, 12, 6345. [Google Scholar] [CrossRef]
- Ala-Mutka, K. Mapping Digital Competence: Towards a Conceptual Understanding; Institute for Prospective Technological Studies: Sevilla, Spain, 2011; pp. 7–60. [Google Scholar]
- Abidoye, R.; Lim, B.T.H.; Lin, Y.C.; Ma, J. Equipping Property Graduates for the Digital Age. Sustainability 2022, 14, 640. [Google Scholar] [CrossRef]
- Portillo, J.; Garay, U.; Tejada, E.; Bilbao, N. Self-perception of the digital competence of educators during the COVID-19 pandemic: A cross-analysis of different educational stages. Sustainability 2020, 12, 10128. [Google Scholar] [CrossRef]
- Sá, M.J.; Santos, A.I.; Serpa, S.; Miguel Ferreira, C. Digitainability—Digital Competences Post-COVID-19 for a Sustainable Society. Sustainability 2021, 13, 9564. [Google Scholar] [CrossRef]
- Ferrari, A. Digital Competence in Practice: An Analysis of Frameworks; JRC IPTS: Seville, Spain, 2012. [Google Scholar] [CrossRef]
- Law, N.W.Y.; Woo, D.J.; de la Torre, J.; Wong, K.W.G. A Global Framework of Reference on Digital Literacy Skills for Indicator 4.4.2; UNESCO: Paris, France, 2018; p. 146. [Google Scholar]
- Santos, A.I.; Serpa, S. The importance of promoting digital literacy in higher education. Int. J. Soc. Sci. Stud. 2017, 5, 90. [Google Scholar] [CrossRef] [Green Version]
- Ferrari, A. DIGCOMP: A Framework for Developing and Understanding Digital Competence in Europe; Publications Office of the European Union: Brussels, Belgium, 2013. [Google Scholar] [CrossRef]
- Siddiq, F.; Hatlevik, O.E.; Olsen, R.V.; Throndsen, I.; Scherer, R. Taking a future perspective by learning from the past—A systematic review of assessment instruments that aim to measure primary and secondary school students’ ICT literacy. Educ. Res. Rev. 2016, 19, 58–84. [Google Scholar] [CrossRef] [Green Version]
- Kluzer, S.; Priego, L.P. Digcomp into Action: Get Inspired, Make it Happen. A User Guide to the European Digital Competence Framework; Joint Research Centre: Seville, Spain, 2018. [Google Scholar]
- Zhao, Y.; Llorente, A.M.P.; Gómez, M.C.S. Digital competence in higher education research: A systematic literature review. Comput. Educ. 2021, 168, 104212. [Google Scholar] [CrossRef]
- Saltos-Rivas, R.; Novoa-Hernández, P.; Rodríguez, R.S. On the quality of quantitative instruments to measure digital competence in higher education: A systematic mapping study. PLoS ONE 2021, 16, e0257344. [Google Scholar] [CrossRef] [PubMed]
- Greiff, S.; Wüstenberg, S.; Avvisati, F. Computer-generated log-file analyses as a window into students’ minds? A showcase study based on the PISA 2012 assessment of problem solving. Comput. Educ. 2015, 91, 92–105. [Google Scholar] [CrossRef]
- Osborne, R.; Dunne, E.; Farrand, P. Integrating technologies into “authentic” assessment design: An affordances approach. Res. Learn. Technol. 2013, 21, 21986. [Google Scholar] [CrossRef] [Green Version]
- Timmis, S.; Broadfoot, P.; Sutherland, R.; Oldfield, A. Rethinking assessment in a digital age: Opportunities, challenges and risks. Br. Educ. Res. J. 2016, 42, 454–476. [Google Scholar] [CrossRef] [Green Version]
- Binkley, M.; Erstad, O.; Herman, J.; Raizen, S.; Ripley, M.; Miller-Ricci, M.; Rumble, M. Defining twenty-first century skills. In Assessment and Teaching of 21st Century Skills; Springer: Dordrecht, The Netherlands, 2012; pp. 17–66. [Google Scholar] [CrossRef]
- Nguyen, Q.; Rienties, B.; Toetenel, L.; Ferguson, R.; Whitelock, D. Examining the designs of computer-based assessment and its impact on student engagement, satisfaction, and pass rates. Comput. Hum. Behav. 2017, 76, 703–714. [Google Scholar] [CrossRef] [Green Version]
- Rienties, B.; Toetenel, L. The impact of learning design on student behaviour, satisfaction and performance: A cross-institutional comparison across 151 modules. Comput. Hum. Behav. 2016, 60, 333–341. [Google Scholar] [CrossRef]
- Papamitsiou, Z.; Economides, A.A. Learning analytics for smart learning environments: A meta-analysis of empirical research results from 2009 to 2015. In Learning, Design, and Technology: An International Compendium of Theory, Research, Practice, and Policy; Springer: Berlin/Heidelberg, Germany, 2016; pp. 1–23. [Google Scholar] [CrossRef]
- Heer, R. A Model of Learning Objectives–Based on a Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives; Center for Excellence in Learning and Teaching, Iowa State University: Ames, IA, USA, 2012; Available online: www.celt.iastate.edu/wp-content/uploads/2015/09/RevisedBloomsHandout-1.pdf (accessed on 19 January 2022).
- Eurostat. Being Young in Europe Today–Digital World. 2017. Available online: https://ec.europa.eu/eurostat/statistics-explained/index.php?title=Being_young_in_Europe_today (accessed on 19 January 2022).
- BAIT—Evaluation and Certification System of Digital Competences. Available online: http://www.bait.eus (accessed on 19 January 2022).
- Vuorikari, R.; Punie, Y.; Carretero Gomez, S.; Van Den Brande, G. DigComp 2.0: The Digital Competence Framework for Citizens; EUR 27948 EN; Publications Office of the European Union: Luxembourg, 2016. [Google Scholar] [CrossRef]
- Carretero, S.; Vuorikari, R.; Punie, Y. DigComp 2.1: The Digital Competence Framework for Citizens with Eight Proficiency Levels and Examples of Use; EUR 28558 EN; Publications Office of the European Union: Luxembourg, 2017. [Google Scholar] [CrossRef]
- Laanpere, M. Recommendations on Assessment Tools for Monitoring Digital Literacy within UNESCO’s Digital Literacy Global Framework. Information Paper No. 56. 2019. Available online: https://unesdoc.unesco.org/ark:/48223/pf0000366740 (accessed on 3 March 2022).
- Bashir, S.; Miyamoto, K. Digital Skills: Frameworks and Programs; World Bank: Washington, DC, USA, 2020; Available online: https://openknowledge.worldbank.org/handle/10986/35080 (accessed on 3 March 2022).
- Fraillon, J. International large-scale computer-based studies on information technology literacy in education. In Second Handbook of Information Technology in Primary and Secondary Education; Springer: Berlin/Heidelberg, Germany, 2018; pp. 1161–1179. [Google Scholar]
- Sparks, J.R.; Katz, I.R.; Beile, P.M. Assessing digital information literacy in higher education: A review of existing frameworks and assessments with recommendations for next-generation assessment. ETS Res. Rep. Ser. 2016, 2016, 1–33. [Google Scholar] [CrossRef] [Green Version]
- Messick, S. Validity of psychological assessment: Validation of inferences from persons’ responses and performance as scientific inquiry into score meaning. Am. Psychol. 1995, 50, 741–749. [Google Scholar] [CrossRef]
- Reichert, F.; Zhang, D.J.; Law, N.W.; Wong, G.K.; de la Torre, J. Exploring the structure of digital literacy competence assessed using authentic software applications. Educ. Technol. Res. Dev. 2020, 68, 2991–3013. [Google Scholar] [CrossRef]
- Jin, K.Y.; Reichert, F.; Cagasan, L.P., Jr.; de la Torre, J.; Law, N. Measuring digital literacy across three age cohorts: Exploring test dimensionality and performance differences. Comput. Educ. 2020, 157, 103968. [Google Scholar] [CrossRef]
- Aesaert, K.; Van Nijlen, D.; Vanderlinde, R.; van Braak, J. Direct measures of digital information processing and communication skills in primary education: Using item response theory for the development and validation of an ICT competence scale. Comput. Educ. 2014, 76, 168–181. [Google Scholar] [CrossRef]
- Goldhammer, F.; Naumann, J.; Keßel, Y. Assessing individual differences in basic computer skills. Eur. J. Psychol. Assess. 2013, 29, 263–275. [Google Scholar] [CrossRef]
- Huggins, A.C.; Ritzhaupt, A.D.; Dawson, K. Measuring information and communication technology literacy using a performance assessment: Validation of the student tool for technology literacy (ST2L). Comput. Educ. 2014, 77, 1–12. [Google Scholar] [CrossRef]
- Pérez-Escoda, A.; Esteban, L.M.P. Retos del periodismo frente a las redes sociales, las fake news y la desconfianza de la generación Z. Rev. Lat. Comun. Soc. 2021, 79, 67–85. [Google Scholar] [CrossRef]
- Dessart, L. Social media engagement: A model of antecedents and relational outcomes. J. Mark. Manag. 2017, 33, 375–399. [Google Scholar] [CrossRef]
- Pérez-Escoda, A.; Pedrero-Esteban, L.M.; Rubio-Romero, J.; Jiménez-Narros, C. Fake News Reaching Young People on Social Networks: Distrust Challenging Media Literacy. Publications 2021, 9, 24. [Google Scholar] [CrossRef]
- Larrondo-Ureta, A.; Peña-Fernández, S.; Agirreazkuenaga-Onaindia, I. Hacia una mayor participación de la audiencia: Experiencias transmedia para jóvenes. Estud. Sobre Mensaje Periodístico 2020, 26, 1445–1454. [Google Scholar] [CrossRef]
- Castillo-Abdul, B.; Romero-Rodríguez, L.M.; Larrea-Ayala, A. Kid influencers in Spain: Understanding the themes they address and preteens’ engagement with their YouTube channels. Heliyon 2020, 6, e05056. [Google Scholar] [CrossRef] [PubMed]
- Vraga, E.K.; Bode, L. Defining misinformation and understanding its bounded nature: Using expertise and evidence for describing misinformation. Political Commun. 2020, 37, 136–144. [Google Scholar] [CrossRef]
- Masip, P.; Suau, J.; Ruiz-Caballero, C. Perceptions on media and disinformation: Ideology and polarization in the Spanish media system. Prof. Inf. 2020, 29, 1–13. [Google Scholar] [CrossRef]
- Viner, K. How Technology Disrupted the Truth. The Guardian. 12 July 2016. Available online: https://www.theguardian.com/media/2016/jul/12/how-technology-disrupted-the-truth (accessed on 19 January 2022).
- Orso, D.; Federici, N.; Copetti, R.; Vetrugno, L.; Bove, T. Infodemic and the spread of fake news in the COVID-19-era. Eur. J. Emerg. Med. 2020, 27, 327–328. [Google Scholar] [CrossRef]
- Kopecky, K.; Szotkowski, R.; Aznar-Díaz, I.; Romero-Rodríguez, J.M. The phenomenon of sharenting and its risks in the online environment. Experiences from Czech Republic and Spain. Child. Youth Serv. Rev. 2020, 110, 104812. [Google Scholar] [CrossRef]
- European Commission. Standard Eurobarometer 93. Summer 2020. Report. Public Opinion in the European Union. 2020. Available online: https://ec.europa.eu/commfrontoffice/publicopinion/index.cfm/ResultDoc/download/DocumentKy/91061 (accessed on 3 March 2022).
- Jones-Jang, S.M.; Mortensen, T.; Liu, J. Does media literacy help identification of fake news? Information literacy helps, but other literacies don’t. Am. Behav. Sci. 2021, 65, 371–388. [Google Scholar] [CrossRef]
- Walsh, A. Information literacy assessment: Where do we start? J. Libr. Inf. Sci. 2009, 41, 19–28. [Google Scholar] [CrossRef] [Green Version]
- Catalano, A. The effect of a situated learning environment in a distance education information literacy course. J. Acad. Libr. 2015, 41, 653–659. [Google Scholar] [CrossRef]
- Foo, S.; Majid, S.; Chang, Y.K. Assessing information literacy skills among young information age students in Singapore. Aslib J. Inf. Manag. 2017, 69, 335–353. [Google Scholar] [CrossRef]
- Kruger, J.; Dunning, D. Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments. J. Pers. Soc. Psychol. 1999, 77, 1121. [Google Scholar] [CrossRef] [PubMed]
- Mahmood, K. Do people overestimate their information literacy skills? A systematic review of empirical evidence on the Dunning-Kruger effect. Commun. Inf. Lit. 2016, 10, 3. [Google Scholar] [CrossRef]
- Leichner, N.; Peter, J.; Mayer, A.-K.; Krampen, G. Assessing information literacy among German psychology students. Ref. Serv. Rev. 2013, 41, 660–674. [Google Scholar] [CrossRef] [Green Version]
- Markowski, B.; McCartin, L.; Evers, S. Meeting students where they are: Using rubric-based assessment to modify an information literacy curriculum. Commun. Inf. Lit. 2018, 12, 5. [Google Scholar] [CrossRef] [Green Version]
- Association of College & Research Libraries [ACRL]. Framework for Information Literacy for Higher Education; American Library Association: Chicago, IL, USA, 2016; Available online: http://www.ala.org/acrl/standards/ilframework (accessed on 19 January 2022).
- Hollis, H. Information literacy as a measurable construct: A need for more freely available, validated and wide-ranging instruments. J. Inf. Lit. 2018, 12, 76–88. [Google Scholar]
- Catalano, A.J. Streamlining LIS Research: A Compendium of Tried and True Tests, Measurements, and Other Instruments; ABC-CLIO: Santa Barbara, CA, USA, 2016. [Google Scholar]
- Mahmood, K. A systematic review of evidence on psychometric properties of information literacy tests. Libr. Rev. 2017, 66, 442–455. [Google Scholar] [CrossRef]
- Vaterlaus, J.M.; Aylward, A.; Tarabochia, D.; Martin, J.D. “A smartphone made my life easier”: An exploratory study on age of adolescent Smartphone acquisition and well-being. Comput. Hum. Behav. 2021, 114, 106563. [Google Scholar] [CrossRef]
- Galera, M.D.C.G.; Muñoz, C.F.; Pedrosa, L.P. Youth empowerment through social networks. Creating participative digital citizenship. Commun. Soc. 2017, 30, 129–140. [Google Scholar] [CrossRef] [Green Version]
- Cabezas-González, M.; Casillas-Martín, S.; Muñoz-Repiso, A.G.V. Basic Education Students’ Digital Competence in the Area of Communication: The Influence of Online Communication and the Use of Social Networks. Sustainability 2021, 13, 4442. [Google Scholar] [CrossRef]
- Kozík, T.; Slivová, J. Netiquette in electronic communication. Int. J. Eng. Pedagog. 2014, 4, 67–70. [Google Scholar] [CrossRef] [Green Version]
- Soler-Costa, R.; Lafarga-Ostáriz, P.; Mauri-Medrano, M.; Moreno-Guerrero, A.J. Netiquette: Ethic, education, and behavior on internet—A systematic literature review. Int. J. Environ. Res. Public Health 2021, 18, 1212. [Google Scholar] [CrossRef]
- Brusco, J.M. Know your netiquette. AORN J. 2011, 94, 279–286. [Google Scholar] [CrossRef] [PubMed]
- Hammond, L.; Moseley, K. Reeling in proper “netiquette”. Nurs. Made Incred. Easy 2018, 16, 50–53. [Google Scholar] [CrossRef]
- McMurdo, G. Netiquettes for networkers. J. Inf. Sci. 1995, 21, 305–318. [Google Scholar] [CrossRef]
- Linek, S.B.; Ostermaier-Grabow, A. Netiquette between students and their lecturers on Facebook: Injunctive and descriptive social norms. Soc. Media + Soc. 2018, 4, 2056305118789629. [Google Scholar] [CrossRef]
- Arouri, Y.M.; Hamaidi, D.A. Undergraduate Students’ Perspectives of the Extent of Practicing Netiquettes in a Jordanian Southern University. Int. J. Emerg. Technol. Learn. 2017, 12, 84. [Google Scholar] [CrossRef] [Green Version]
- Muñiz Fernández, J. Introducción a la Teoría de Respuesta a los Ítems; Pirámide: Madrid, Spain, 1997. [Google Scholar]
- Baker, F.B.; Kim, S.H. (Eds.) Item Response Theory: Parameter Estimation Techniques; CRC Press: Boca Raton, FL, USA, 2004. [Google Scholar]
- Hambleton, R.K.; Jones, R.W. Comparison of classical test theory and item response theory and their applications to test development. Educ. Meas. Issues Pract. 1993, 12, 535–556. [Google Scholar]
- Wilson, M. Constructing Measures: An Item Response Modeling Approach; Routledge: London, UK, 2004. [Google Scholar] [CrossRef]
- Rasch, G. Probabilistic Models for Some Intelligence and Achievement Tests; Danish Institute for Educational Research: Copenhagen, Denmark; MESA Press: Chicago, IL, USA, 1983. [Google Scholar]
- Thissen, D. Marginal maximum likelihood estimation for the one-parameter logistic model. Psychometrika 1982, 47, 175–186. [Google Scholar] [CrossRef]
- Hambleton, R.K.; Swaminathan, H.; Rogers, H.J. Fundamentals of Item Response Theory; Sage: Newcastle upon Tyne District, UK, 1991; Volume 2. [Google Scholar]
- Reckase, M.D. Multidimensional Item Response Theory Models. In Multidimensional Item Response Theory; Springer: New York, NY, USA, 2009. [Google Scholar] [CrossRef]
- Adams, R.J.; Wilson, M.; Wang, W. The multidimensional random coefficients multinomial logit model. Appl. Psychol. Meas. 1997, 21, 1–23. [Google Scholar] [CrossRef]
- Adams, R.J.; Wu, M.L.; Wilson, M.R. ACER ConQuest 3.0.1; Computer Software; Australian Council for Educational Research: Melbourne, Australia, 2012. [Google Scholar]
- Wright, B.D.; Stone, M.H. Best Test Design; Australian Council for Educational Research: Melbourne, Australia, 1979. [Google Scholar]
- Sandoval, W. Conjecture mapping: An approach to systematic educational design research. J. Learn. Sci. 2014, 23, 18–36. [Google Scholar] [CrossRef]
- Herrington, J.; McKenney, S.; Reeves, T.; Oliver, R. Design-based research and doctoral students: Guidelines for preparing a dissertation proposal. In Proceedings of the ED-MEDIA 2007—World Conference on Educational Multimedia, Hypermedia & Telecommunications 2007, Vancouver, BC, Canada, 25–29 June 2007; Montgomerie, C., Seale, J., Eds.; Association for the Advancement of Computing in Education (AACE): Vancouver, BC, Canada, 2007; pp. 4089–4097. Available online: https://www.learntechlib.org/primary/p/25967/ (accessed on 19 January 2022).
- McKenney, S.; Reeves, T.C. Conducting Educational Design Research; Routledge: London, UK, 2018. [Google Scholar] [CrossRef]
- Reeves, T. Design research from a technology perspective. In Educational Design Research; Routledge: London, UK, 2006; pp. 64–78. [Google Scholar]
- All Digital. Available online: https://all-digital.org/ (accessed on 19 January 2022).
- Bartolomé, J.; Garaizar, P.; Larrucea, X. A Pragmatic Approach for Evaluating and Accrediting Digital Competence of Digital Profiles: A Case Study of Entrepreneurs and Remote Workers. Technol. Knowl. Learn. 2021, 1–36. [Google Scholar] [CrossRef]
- Bartolomé, J.; Garaizar, P.; Bastida, L. Validating item response processes in digital competence assessment through eye-tracking techniques. In Proceedings of the Eighth International Conference on Technological Ecosystems for Enhancing Multiculturality 2020, Salamanca, Spain, 21–23 October 2020; pp. 738–746. [Google Scholar] [CrossRef]
- Articulate Storyline 360. Available online: https://articulate.com/360/storyline (accessed on 19 January 2022).
- Kzgunea. Available online: https://www.kzgunea.eus/es/inicio (accessed on 19 January 2022).
- All Digital Week. Available online: https://alldigitalweek.org/ (accessed on 19 January 2022).
- IT Txartela, Sistema de Certificación de Competencias Básicas en Tecnologías de la Información. Available online: http://www.it-txartela.net (accessed on 19 January 2022).
- Van Deursen, A.J.; Helsper, E.J.; Eynon, R. Development and validation of the Internet Skills Scale (ISS). Inf. Commun. Soc. 2016, 19, 804–823. [Google Scholar] [CrossRef]
- Krathwohl, D.R. A revision of Bloom’s taxonomy: An overview. Theory Pract. 2002, 41, 212–218. [Google Scholar] [CrossRef]
- American Educational Research Association; American Psychological Association; National Council on Measurement in Education. Standards for Educational and Psychological Testing; American Educational Research Association: Washington, DC, USA, 2014. [Google Scholar]
- Mueller, R.O.; Knapp, T.R. Reliability and validity. In The Reviewer’s Guide to Quantitative Methods in the Social Sciences; Routledge: London, UK, 2018; pp. 397–401. [Google Scholar]
- Bandalos, D.L. Measurement Theory and Applications for the Social Sciences; Guilford Publications: New York, NY, USA, 2018. [Google Scholar]
- Scholtes, V.A.; Terwee, C.B.; Poolman, R.W. What makes a measurement instrument valid and reliable? Injury 2011, 42, 236–240. [Google Scholar] [CrossRef]
- Varma, S.; Simon, R. Bias in error estimation when using cross-validation for model selection. BMC Bioinform. 2006, 7, 91. [Google Scholar] [CrossRef] [Green Version]
- Wu, M.; Adams, R.J. Properties of Rasch residual fit statistics. J. Appl. Meas. 2013, 14, 339–355. [Google Scholar] [PubMed]
- Adams, R.J.; Khoo, S.T. Quest; ACER: Melbourne, Australia, 1996. [Google Scholar]
- Adams, R.J. Reliability as a measurement design effect. Stud. Educ. Eval. 2005, 31, 162–172. [Google Scholar] [CrossRef]
- Iglesias-Rodríguez, A.; Hernández-Martín, A.; Martín-González, Y.; Herráez-Corredera, P. Design, Validation and Implementation of a Questionnaire to Assess Teenagers’ Digital Competence in the Area of Communication in Digital Environments. Sustainability 2021, 13, 6733. [Google Scholar] [CrossRef]
- Clifford, I.; Kluzer, S.; Troia, S.; Jakobsone, M.; Zandbergs, U. DigCompSat. A Self-Reflection Tool for the European Digital Framework for Citizens (No. JRC123226); Joint Research Centre: Seville, Spain, 2020. [Google Scholar]
Digital Competence | Description |
---|---|
Browsing, searching and filtering data, information and digital content | To articulate information needs, to search for data, information and content in digital environments, to access and navigate between them. To create and update personal search strategies. |
Evaluating data, information and digital content | To analyse, compare and critically evaluate the credibility and reliability of sources of data, information and digital content. To analyse, interpret and critically evaluate the data, information and digital content. |
Managing data, information and digital content | To organise, store and retrieve data, information and content in digital environments. To organise and process them in a structured environment. |
Phase | Element |
---|---|
PHASE 1: analysis of the problem by researchers and practitioners in collaboration | Statement of problem |
 | Consultation with researchers and practitioners
 | Research questions
 | Literature review
PHASE 2: development of theoretical framework solutions based on existing design principles and technological innovations | Theoretical framework
 | Development of draft principles to guide the design of the solution
 | Description of proposed solution
PHASE 3: iterative cycles of testing and refinement of the solution in practice | Implementation of intervention (first iteration with digital competence centre facilitators and second iteration with citizens)
 | Participants
 | Data collection
 | Data analysis
PHASE 4: reflection to produce “design principles” and enhance solution implementation | Design principles
 | Designed artefact
Competence Area | Digital Competence | Sub-Competence |
---|---|---|
Communication and collaboration | Netiquette | Sub-competence1: apply basic netiquette guidelines when using email (e.g., use of blind carbon copy (BCC), forward an email/content, etc.).
 | | Sub-competence2: apply simple online writing rules (no capital letters, respect the spelling, referring to others by their aliases or nicknames, etc.) and use emoticons appropriately when communicating online.
 | | Sub-competence3: recognise appropriate behaviours on social networks, such as receiving permission from others before publishing (especially when children are involved), avoiding SPAM (e.g., sending invitations or other messages to everyone) and avoiding words or unclear language that may be misunderstood.
 | | Sub-competence4: recognise inappropriate online behaviour, such as stalking, trolling or cyberbullying, and be able to deal with negative behaviours, such as flagging disrespectful publications or notifying the police.
Information and data literacy | Browsing, searching and filtering data, information and digital content | Sub-competence5: analyse information needs; search for, filter and locate data and information in digital environments.
 | | Sub-competence6: define the search strategy required at any given moment.
Information and data literacy | Evaluating data, information and digital content | Sub-competence7: examine and evaluate the credibility and reliability of sources of data and information.
 | | Sub-competence8: examine and evaluate digital content, data and information.
Information and data literacy | Managing data, information and digital content | Sub-competence9: organise, store and process data, information and content in digital environments.
Test | N° of Items | Sub-Competence |
---|---|---|
Netiquette | 10 | Sub-competence1
 | 11 | Sub-competence2
 | 16 | Sub-competence3
 | 7 | Sub-competence4
Information and data literacy | 10 | Sub-competence5
 | 10 | Sub-competence6
 | 10 | Sub-competence7
 | 10 | Sub-competence8
 | 20 | Sub-competence9
Test | Gender | Age Range |
---|---|---|
Netiquette (n = 201) | Male 54.6% Female 46.4% | (16–24) 15.7%
 | | (25–54) 76.9%
 | | (55–74) 7.4%
IDL (n = 209) | Male 64.8% Female 35.2% | (16–24) 22.6%
 | | (25–54) 68.3%
 | | (55–74) 9.0%
Test | Mean | Standard Deviation |
---|---|---|
Netiquette (n = 201) | 24.70 | 8.70 |
IDL (n = 209) | 42.69 | 11.25 |
Information and Data Literacy Test | Netiquette Test | ||||
---|---|---|---|---|---|
Item | p-Value | Point-Biserial Correlations | Item | p-Value | Point-Biserial Correlations |
Item1 | 0.87 | 0.567 | Item2 | 0.72 | 0.541 |
Item2 | 0.83 | 0.435 | Item3 | 0.55 | 0.257 |
Item3 | 0.53 | 0.527 | Item4 | 0.37 | 0.458 |
Item4 | 0.82 | 0.468 | Item5 | 0.21 | 0.113 |
Item5 | 0.25 | 0.200 | Item6 | 0.61 | 0.311 |
Item6 | 0.68 | 0.553 | Item7 | 0.56 | 0.431 |
Item7 | 0.82 | 0.627 | Item8 | 0.31 | 0.398 |
Item8 | 0.46 | 0.386 | Item9 | 0.67 | 0.474 |
Item9 | 0.41 | 0.269 | Item10 | 0.59 | 0.422 |
Item10 | 0.66 | 0.444 | Item11 | 0.29 | 0.256 |
Item11 | 0.39 | 0.467 | Item13 | 0.67 | 0.413 |
Item12 | 0.62 | 0.450 | Item14 | 0.75 | 0.396 |
Item13 | 0.47 | 0.439 | Item15 | 0.67 | 0.615 |
Item14 | 0.51 | 0.450 | Item16 | 0.64 | 0.478 |
Item15 | 0.63 | 0.342 | Item17 | 0.44 | 0.183 |
Item16 | 0.73 | 0.500 | Item19 | 0.64 | 0.568 |
Item17 | 0.76 | 0.549 | Item20 | 0.64 | 0.466 |
Item18 | 0.71 | 0.431 | Item22 | 0.60 | 0.441 |
Item19 | 0.45 | 0.299 | Item23 | 0.43 | 0.233 |
Item20 | 0.67 | 0.383 | Item24 | 0.41 | 0.377 |
Item21 | 0.82 | 0.317 | Item25 | 0.69 | 0.604 |
Item22 | 0.89 | 0.560 | Item28 | 0.70 | 0.455 |
Item23 | 0.78 | 0.504 | Item29 | 0.66 | 0.659 |
Item25 | 0.87 | 0.541 | Item30 | 0.66 | 0.374 |
Item26 | 0.84 | 0.405 | Item31 | 0.29 | 0.164 |
Item27 | 0.84 | 0.453 | Item32 | 0.60 | 0.329 |
Item28 | 0.60 | 0.261 | Item33 | 0.62 | 0.571 |
Item29 | 0.71 | 0.414 | Item34 | 0.66 | 0.534 |
Item30 | 0.67 | 0.291 | Item35 | 0.50 | 0.554 |
Item31 | 0.81 | 0.626 | Item36 | 0.60 | 0.694 |
Item32 | 0.80 | 0.445 | Item37 | 0.41 | 0.243 |
Item33 | 0.88 | 0.506 | Item38 | 0.67 | 0.534 |
Item34 | 0.59 | 0.282 | Item39 | 0.76 | 0.501 |
Item35 | 0.76 | 0.575 | Item40 | 0.37 | 0.173 |
Item36 | 0.80 | 0.517 | Item41 | 0.81 | 0.622 |
Item37 | 0.87 | 0.460 | Item42 | 0.56 | 0.373 |
Item38 | 0.65 | 0.404 | Item43 | 0.55 | 0.537 |
Item39 | 0.70 | 0.350 | Item44 | 0.34 | 0.229 |
Item40 | 0.76 | 0.529 | Item45 | 0.61 | 0.545 |
Item41 | 0.78 | 0.415 | Item46 | 0.74 | 0.568 |
Item42 | 0.89 | 0.580 | Item47 | 0.60 | 0.433 |
Item43 | 0.69 | 0.344 | Item48 | 0.51 | 0.251 |
Item44 | 0.80 | 0.433 | Item49 | 0.50 | 0.431 |
Item45 | 0.71 | 0.552 | Item50 | 0.79 | 0.191 |
Item46 | 0.52 | 0.358 | |||
Item47 | 0.83 | 0.556 | |||
Item48 | 0.90 | 0.443 | |||
Item49 | 0.59 | 0.374 | |||
Item50 | 0.74 | 0.410 | |||
Item51 | 0.94 | 0.585 | |||
Item52 | 0.91 | 0.528 | |||
Item53 | 0.78 | 0.552 | |||
Item54 | 0.46 | 0.219 | |||
Item55 | 0.93 | 0.501 | |||
Item56 | 0.74 | 0.435 | |||
Item57 | 0.86 | 0.543 | |||
Item58 | 0.66 | 0.512 | |||
Item59 | 0.68 | 0.660 | |||
Item60 | 0.57 | 0.342 |
Model | Deviance | Number of Parameters |
---|---|---|
1-dim | 11,900.3 | 61 |
3-dim | 11,824.4 | 66 |
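For context, the sketch below applies a likelihood-ratio test to the nested unidimensional and three-dimensional models using the deviances and parameter counts from the table above; whether the authors used exactly this test (rather than, for example, information criteria) is an assumption made for illustration.

```python
# Sketch (assumption): likelihood-ratio comparison of nested IRT models.
# For nested models, the deviance difference is approximately chi-square
# distributed, with degrees of freedom equal to the number of extra parameters.
from scipy.stats import chi2

deviance_1dim, params_1dim = 11900.3, 61
deviance_3dim, params_3dim = 11824.4, 66

lr_statistic = deviance_1dim - deviance_3dim   # 75.9
df = params_3dim - params_1dim                 # 5
p_value = chi2.sf(lr_statistic, df)

print(f"LR = {lr_statistic:.1f}, df = {df}, p = {p_value:.2e}")
# A small p-value favours the less restrictive three-dimensional model.
```

The same comparison applies to the netiquette models reported further below (a deviance drop of 26.2 for nine additional parameters).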
Sample size | 209 |
Number of items in calibration | 60 |
Weighted fit MNSQ (0.75, 1.33) T sig. | 1 (0.74) |
Reliability estimates: EAP/PV reliability | |
DC 1.1 | 0.880 |
DC 1.2 | 0.840 |
DC 1.3 | 0.875 |
Dimensions | DC 1.1 | DC 1.2 | DC 1.3 |
---|---|---|---|
DC 1.1 | |||
DC 1.2 | 0.785 | ||
DC 1.3 | 0.929 | 0.847 |
Model | Deviance | Number of Parameters |
---|---|---|
1-dim | 10,100.0 | 44 |
4-dim | 10,073.8 | 53 |
Sample size | 201 |
Number of items in calibration | 43 |
Weighted fit MNSQ (0.75, 1.33) T sig. | none |
Reliability estimates: EAP/PV reliability | |
Sub-competence1 | 0.822 |
Sub-competence2 | 0.795 |
Sub-competence3 | 0.859 |
Sub-competence4 | 0.774 |
Dimensions | SC1 | SC2 | SC3 | SC4 |
---|---|---|---|---|
SC1 | ||||
SC2 | 0.854 | |||
SC3 | 0.913 | 0.845 | ||
SC4 | 0.827 | 0.806 | 0.862 |