Promoting Sustainable Data-Based Decision-Making in the Korean Educational Information Disclosure System
Abstract
1. Introduction
…To guarantee citizens’ right to know and promote academic studies and research on policies by providing for the duty to disclose information held and managed by each education-related institution… in order to encourage participation in school education and enhance the efficiency and transparency of educational administration.
2. Theoretical Background
2.1. Data Quality
Many educators questioned the validity of some data, such as whether test scores accurately reflect students’ knowledge, whether students take tests seriously, whether tests are aligned with curriculum, or whether satisfaction data derived from surveys with low response rates accurately measure opinions. These doubts greatly affected some educators’ buy-in, or acceptance of and support for the data. (p. 8)
2.2. School Context
2.2.1. Calibration
2.2.2. Principal Leadership
2.2.3. Teacher Involvement and Collaboration
2.3. Institutional Support
2.4. Theoretical Framework for Facilitating DBDM
3. Research Methods
3.1. Archival Research
3.2. Qualitative Data Collection and Analysis
4. Research Findings: Problems and Sustainable Development Policies for Promoting DBDM in the KEIDS
4.1. Data Quality
The information disclosed by the KEIDS is often not the most recent data. The Schoolinfo data system does not provide high-quality information on how school programs for gifted children and children with learning disabilities are implemented, so the KEIDS does not provide educators with data on how students’ learning, aptitude, and talent have been improved and developed. (Teacher A3)
If high-quality data were offered, data use would be promoted more than at present. The reason teachers do not utilize data is that the information is out of date and most teachers already know it. (Teacher B10)
4.2. School Contexts
I think the Schoolinfo system is being used as mere window dressing. Thus, we did not use school and student data to define teaching and learning. School teachers knew the data disclosed by the KEIDS; we already shared student as well as school data through the NEIS data system. (Teacher B2)
I heard the term DBDM for the first time during this interview. Actually, we (our school teachers), including our school principal, do not recognize the need to use data. In addition, the school principal does not feel that he/she should invest in a data system and urge us to use school and student data. Teacher leaders who manage school data only collaborate with other teacher leaders. (Teacher B4)
4.3. Institutional Support
There was no institutional support from the government or school district, such as professional training to improve data skills or incentives to help educators use data. (Teacher A4)
We did not know the purpose and content of the KEIDS, and there was no school-level or institutional support for work reduction to promote DBDM. (Teacher B9)
5. Contributions and Limitations
6. Conclusions
Funding
Conflicts of Interest
References
- Luo, M. Structural equation modeling for high school principals’ data-driven decision making: An analysis of information use environments. Educ. Adm. Q. 2008, 44, 603–634.
- Marsh, J.A.; Pane, J.F.; Hamilton, L.S. Making Sense of Data-Driven Decision Making in Education: Evidence from Recent RAND Research; OP-170-EDU; RAND: Santa Monica, CA, USA, 2006.
- Valli, L.; Buese, D. The changing roles of teachers in an era of high-stakes accountability. Am. Educ. Res. J. 2007, 44, 519–558.
- Wayman, J.; Cho, V.; Richards, M. Student data systems and their use for educational improvement. In International Encyclopedia of Education; Elsevier: London, UK, 2010; Volume 8, pp. 14–20.
- Young, V.M. Teachers’ use of data: Loose coupling, agenda setting, and team norms. Am. J. Educ. 2006, 112, 521–548.
- Lee, H.C. Achievements and prospects of Korea’s educational reform. Korean J. Comp. Educ. 2004, 14, 1–11. (In Korean)
- Joo, Y.H.; Reyes, P. A political analysis of the policy process of the Open Recruitment System of Principals in Korea. KEDI J. Educ. Policy 2010, 7, 233–255.
- Hong, S.C. The policy process of education information disclosure in Korea. J. Korean Educ. 2012, 39, 235–259. (In Korean)
- Joo, Y.H.; Halx, M.D. The power of institutional isomorphism: An analysis of the institutionalization of performance-based pay systems in Korean National Universities. Asia Pac. Educ. Rev. 2011, 13, 281–297.
- Krasner, S.D. Sovereignty: An institutional perspective. Comp. Polit. Stud. 1988, 21, 66–94.
- Ministry of Education. A Master Plan for the Educational Information Disclosure System in 2008; Ministry of Education: Seoul, Korea, 2008. (In Korean)
- Lim, H.N.; Lee, S.H.; Oh, B.H.; Kim, K.S.; Kim, S.H. Development Plan and Implementing Strategies for Educational Information Disclosure System; RR2008-27; KEDI: Seoul, Korea, 2008. (In Korean)
- Wayman, J.C.; Stringfield, S.; Yakimowski, M. Software Enabling School Improvement through Analysis of Student Data; CRESPAR Technical Report No. 67; Johns Hopkins University: Baltimore, MD, USA, 2004.
- Earl, L.; Katz, S. Leading schools in a data-rich world. In Second International Handbook of Educational Leadership and Administration; Kluwer Academic Publishers: Dordrecht, The Netherlands, 2002; pp. 1003–1022.
- Schildkamp, K.; Poortman, C.; Luyten, H.; Ebbeler, J. Factors promoting and hindering data-based decision making in schools. Sch. Eff. Sch. Improv. 2016, 28, 242–258.
- Datnow, A.; Hubbard, L. Teacher capacity for and beliefs about data-driven decision making: A literature review of international research. J. Educ. Chang. 2016, 17, 7–28.
- Honig, M.I.; Coburn, C. Evidence-based decision making in school district central offices: Toward a policy and research agenda. Educ. Policy 2008, 22, 578–608.
- Wayman, J.C.; Stringfield, S. Technology-supported involvement of entire faculties in examination of student data for instructional improvement. Am. J. Educ. 2006, 112, 549–571.
- Park, V.; Datnow, A. Co-constructing distributed leadership: District and school connections in data-driven decision-making. Sch. Leadersh. Manag. 2009, 29, 477–494.
- Heinrich, B.; Hristova, D.; Klier, M.; Schiller, A.; Szubartowicz, M. Requirements for data quality metrics. J. Data Inf. Qual. 2018, 9, 1–32.
- Park, E.S. The Effect of Information Quality on Reliability and Utilization of School Information Disclosure. Ph.D. Thesis, Hongik University, Seoul, Korea, 2017. (In Korean)
- Pipino, L.L.; Lee, Y.W.; Wang, R.Y. Data quality assessment. Commun. ACM 2002, 45, 211–218.
- Vetrò, A.; Canova, L.; Torchiano, M.; Minotas, C.O.; Iemma, R.; Morando, F. Open data quality measurement framework: Definition and application to Open Government Data. Gov. Inf. Q. 2016, 33, 325–337.
- Wand, Y.; Wang, R.Y. Anchoring data quality dimensions in ontological foundations. Commun. ACM 1996, 39, 86–95.
- Côrte-Real, N.; Ruivo, P.; Oliveira, T. Leveraging internet of things and big data analytics initiatives in European and American firms: Is data quality a way to extract business value? Inf. Manag. 2020, 57, 103141.
- Attard, J.; Orlandi, F.; Scerri, S.; Auer, S. A systematic review of open government data initiatives. Gov. Inf. Q. 2015, 32, 399–418.
- Fox, C.; Levitin, A.; Redman, T. The notion of data and its quality dimensions. Inf. Process. Manag. 1994, 30, 9–19.
- Fisher, C.W.; Kingma, B.R. Criticality of data quality as exemplified in two disasters. Inf. Manag. 2001, 39, 109–116.
- Geisler, S.; Quix, C.; Weber, S.; Jarke, M. Ontology-based data quality management for data streams. J. Data Inf. Qual. 2016, 7, 1–34.
- Juddoo, S.; George, C.; Duquenoy, P.; Windridge, D. Data governance in the health industry: Investigating data quality dimensions within a big data context. Appl. Syst. Innov. 2018, 1, 43.
- Kubler, S.; Robert, J.; Neumaier, S.; Umbrich, J.; Le Traon, Y. Comparison of metadata quality in open data portals using the Analytic Hierarchy Process. Gov. Inf. Q. 2018, 35, 13–29.
- Thomas, G. Introduction: Evidence and practice. In Evidence-Based Practice in Education; Thomas, G., Pring, R., Eds.; Open University Press: Maidenhead, UK, 2004; pp. 1–18.
- Pak, K.; DeSimone, L.M. Developing principals’ data-driven decision-making capacity: Lessons from one urban district. Phi Delta Kappan 2019, 100, 37–42.
- Stone, N.J. Exploring the relationship between calibration and self-regulated learning. Educ. Psychol. Rev. 2000, 12, 437–475.
- Keuning, T.; Geel, M.; Visscher, A. Why a data-based decision-making intervention works in some schools and not in others. Learn. Disabil. Res. Pract. 2017, 32, 32–45.
- Faber, J.M.; Glas, C.A.W.; Visscher, A.J. Differentiated instruction in a data-based decision-making context. Sch. Eff. Sch. Improv. 2017, 29, 43–63.
- Bryk, A.; Camburn, E.; Louis, K.S. Professional community in Chicago elementary schools: Facilitating factors and organizational consequences. Educ. Adm. Q. 1999, 35, 751–781.
- Copland, M.A. Leadership of inquiry: Building and sustaining capacity for school improvement. Educ. Eval. Policy Anal. 2003, 25, 375–395.
- Reynolds, D.; Teddlie, C. The International Handbook of School Effectiveness Research; Informa UK Limited: London, UK, 2002.
- Dembosky, J.W.; Pane, J.F.; Barney, H.; Christina, R. Data Driven Decision Making in Southwestern Pennsylvania School Districts; WR-326-HE/GF; RAND: Santa Monica, CA, USA, 2005.
- Mortimore, P.; Sammons, P.; Stoll, L.; Lewis, D.; Ecob, R. School Matters: The Junior Years; Open Books: London, UK, 1988.
- Jackson, D.; Temperley, J. From professional learning community to networked learning community. In Professional Learning Communities: Divergence, Depth and Dilemmas; Stoll, L., Louis, K.S., Eds.; McGraw-Hill: New York, NY, USA, 2007; pp. 45–62.
- Hoy, W.K.; Miskel, C.G. Educational Administration: Theory, Research, and Practice, 9th ed.; McGraw-Hill: New York, NY, USA, 2012.
- Resnick, L.B. Nested learning systems for the thinking curriculum. Educ. Res. 2010, 39, 183–197.
- Kippers, W.; Wolterinck, C.H.; Schildkamp, K.; Poortman, C.L.; Visscher, A.J. Teachers’ views on the use of assessment for learning and data-based decision making in classroom practice. Teach. Teach. Educ. 2018, 75, 199–213.
- Stecher, B.; Hamilton, L.; Gonzalez, G. Working Smarter to Leave No Child Behind; WP-138-EDU; RAND: Arlington, VA, USA, 2003.
- Walsham, G. Interpretive case studies in IS research: Nature and method. Eur. J. Inf. Syst. 1995, 4, 74–81.
- Walsham, G. Doing interpretive research. Eur. J. Inf. Syst. 2006, 15, 320–330.
- Yin, R.K. Case Study Research: Design and Methods, 3rd ed.; Sage: Thousand Oaks, CA, USA, 2003.
- Lijphart, A. Comparative politics and the comparative method. Am. Political Sci. Rev. 1971, 65, 682–693.
- Webster, J.; Watson, R.T. Analyzing the past to prepare for the future: Writing a literature review. MIS Q. 2002, 26, 13–23.
- Coburn, C.E.; Stein, M.K. Communities of practice theory and the role of teacher professional community in policy implementation. In Complexity and Policy Implementation; Honig, M.I., Ed.; SUNY Press: Albany, NY, USA, 2006; pp. 25–46.
- Fullan, M. The New Meaning of Educational Change, 4th ed.; Teachers College Press: New York, NY, USA, 2007.
- Park, S.J.; Hong, H.I. Analysis of the use of publicly disclosed school information in media. Korean J. Educ. Adm. 2015, 33, 369–391. (In Korean)
- Oh, S.H.; Choi, S.D. Searching for rational ways of finding an education information disclosure system in elementary and secondary schools. Korean J. Local Gov. Stud. 2008, 12, 229–248. (In Korean)
- Vanlommel, K.; Vanhoof, J.; Van Petegem, P. Data use by teachers: The impact of motivation, decision-making style, supportive relationships and reflective capacity. Educ. Stud. 2016, 42, 1–18.
- Lee, J.K. The design and evaluation of U.S. school accountability policy. In Proceedings of the Symposium Conducted at the Meeting of KEDI-KAERA, Seoul, Korea, 23 June 2010; pp. 75–101.
- Sutherland, D.; Joumard, I.; Nicq, C.; Price, R.W.R. Performance Indicators for Public Spending Efficiency in Primary and Secondary Education; Organisation for Economic Co-operation and Development (OECD): Paris, France, 2007.
- Park, J.J. A theoretical study on the measurement scale of education information disclosure system service. CNU J. Educ. Stud. 2013, 34, 43–63. (In Korean)
- Data Quality Campaign. Creating a Longitudinal Data System: Using Data to Improve Student Achievement. 2006. Available online: http://www.dataqualitycampaign.org/ (accessed on 18 March 2010).
- Amrein-Beardsley, A.; Berliner, D.C. Re-analysis of NAEP math and reading scores in states with and without high-stakes tests: Response to Rosenshine. Educ. Policy Anal. Arch. 2003, 11.
- Nichols, S.L.; Glass, G.V.; Berliner, D.C. High-stakes testing and student achievement: Does accountability pressure increase student learning? Educ. Policy Anal. Arch. 2006, 14.
- Vasquez Heilig, J.; Darling-Hammond, L. Accountability Texas-style: The progress and learning of urban minority students in a high-stakes testing context. Educ. Eval. Policy Anal. 2008, 30, 75–110.
- Choi, M.Y. “Can’t Trust the National-Level Standardized Test Result”: The Controversy Concerning the Reliability Was Amplified. The Kyounghyang Sinmun. Available online: http://news.khan.co.kr/ (accessed on 17 February 2009). (In Korean)
- Lee, K.H. A review on the school information disclosure system. J. Politics Educ. 2010, 17, 89–110.
- Wayman, J.C. Involving teachers in data-driven decision making: Using computer data systems to support teacher inquiry and reflection. J. Educ. Stud. Placed Risk (JESPAR) 2005, 10, 295–308.
- OECD. Creating Effective Teaching and Learning Environments; Organisation for Economic Co-operation and Development (OECD): Paris, France, 2009.
- Lee, S.Y. Study of the relationship between teachers’ team learning and learning organization of school. J. Educ. Adm. 2007, 25, 95–115. (In Korean)
- Bolman, L.G.; Deal, T.E. Reframing Organizations: Artistry, Choice, and Leadership, 4th ed.; Jossey-Bass: San Francisco, CA, USA, 2008.
- Spillane, J.P. Distributed Leadership; Jossey-Bass: San Francisco, CA, USA, 2006.
- Hord, S.M. Professional learning communities: Educators work together toward a shared purpose—Improved student learning. J. Staff Develop. 2009, 30, 40–43.
- Muijs, D.; Harris, A. Teacher leadership—Improvement through empowerment? An overview of the literature. Educ. Manag. Adm. 2003, 31, 437–448.
- OECD. TALIS 2013 Results: An International Perspective on Teaching and Learning. Available online: https://www.oecd-ilibrary.org/ (accessed on 24 February 2019).
- Ikemoto, G.S.; Marsh, J.A. Cutting through the “data-driven” mantra: Different conceptions of data-driven decision making. Yearb. Natl. Soc. Study Educ. 2007, 106, 105–131.
- Hargreaves, A.; Fink, D. Distributed leadership: Democracy or delivery? In Distributed Leadership; Harris, A., Ed.; Springer Science and Business Media B.V.: Dordrecht, The Netherlands, 2009; Volume 7, pp. 181–193.
- Opfer, V.D.; Henry, G.T.; Mashburn, A.J. The district effect: Systemic responses to high stakes accountability policies in six southern states. Am. J. Educ. 2008, 114, 299–332.
- Chung, M.K.; Joo, Y.H.; Chung, P. A study of policy strategies and directions for reducing the burden of teachers in administrative affairs. J. Korean Teach. Educ. 2013, 30, 377–404. (In Korean)
- The Korean Federation of Teachers’ Associations. Many Teachers Neglect Their Duties to Deal with Official Documents; The Korean Federation of Teachers’ Associations: Seoul, Korea, 2009. (In Korean)
- The Korean Teachers & Educational Workers’ Union. Analysis of Job Status and Opinion Survey for Teacher Tasks; KTU: Seoul, Korea, 2013. (In Korean)
- Jensen, B. Findings from TALIS and its implications for South Korea. In 2009 Korea-OECD International Seminar: New Millennium Learners and Teachers; KEDI: Seoul, Korea, 2009; pp. 231–248.
- Hevner, A.R.; March, S.T.; Park, J.; Ram, S. Design science in information systems research. MIS Q. 2004, 28, 75–105.
- Moody, L.; Dede, C. Models of data-based decision making: A case study of the Milwaukee public schools. In Data-Driven School Improvement: Linking Data and Learning; Mandinach, E.B., Honey, M., Eds.; Teachers College, Columbia University: New York, NY, USA, 2008; pp. 233–254.
- Loeb, H.; Knapp, M.S.; Elfers, A.M. Teachers’ response to standards-based reform: Probing reform assumptions in Washington State. Educ. Policy Anal. Arch. 2008, 16, 1–32.
- Scott, W.R. Institutions and Organizations; Sage Publications: Thousand Oaks, CA, USA, 1995.
| Data Classification | Paragraph for the KEIDS | Data Content | Frequency of Publication | Timing of Publication |
|---|---|---|---|---|
| Input data | Paragraph 1. Regulations concerning school operations, including school regulations | School regulations and school management regulations, excluding school regulations | Rolling basis | Rolling basis |
| | Paragraph 2. Matters with regard to the organization and operation of educational curricula | Status of organizing, operating, and evaluating school curricula; plan for subject and extracurricular activity and experience program outside school; status for the number of school days and for the number of classes and instructional hours | Once a year | May |
| | | Plan for open classes | Once a year | April |
| | Paragraph 3. Number of students per grade and class and status changes of students, including numbers of students moving in and out, as well as discontinuance of studies | Number of students by each grade and class and number of students for transfer and dropout | Once a year | May |
| | Paragraph 9. Matters with regard to school meals | Status of school lunch service | Once a year | May |
| | Paragraph 10. Matters with regard to health management, environmental sanitation, and safety management of schools | Status of healthcare, public hygiene at school, and safety management | Once a year | May |
| | Paragraph 11. Matters with regard to the status and treatment of school violence | Status of school violence and its handling | Once a year | May |
| | Paragraph 13. Matters with regard to the entrance status of students and careers of graduates | Status of newly enrolled students | Once a year | May |
| | Paragraph 15. Matters with regard to the educational conditions and school operation status | Status of club activities, plan for distinctive business in educational management, status of school library, status of after-school management and support, performance for counseling students and their parents, and status of enhancement of students’ physical strength | Once a year | May |
| | | Evaluation items and the result for teaching and guidance by school | Once a year | February |
| Output data | Paragraph 4. Status of studies by grades and subjects of the school | Fact for student achievement by each subject | Twice a year | February, September |
| | | Plan for operation of subject progress | Twice a year | April, September |
| | | Fact for evaluation plan by each grade and subject | Once a year | April |
| | Paragraph 12. Matters with regard to fundamental materials for academic research on the evaluation of educational achievements at the level of the nation, city, or Do | Status of national-level student achievement test; ratio for national-level student achievement test: above average, basic achievement, below average; degree of improvement of national-level student achievement test compared to last year | Once a year | November |
| | Paragraph 13. Matters with regard to the entrance status of students and careers of graduates | Status of the careers of graduates | Once a year | May |
| Author | Categories/Approaches | Dimensions |
|---|---|---|
| Fox et al. [27] (p. 17) | Objective category | Accuracy, completeness, consistency, and currentness |
| Wand and Wang [24] (p. 92) | Internal category | Data-related: accuracy, reliability, timeliness, completeness, currency, consistency, and precision; system-related: reliability |
| | External category | Data-related: timeliness, relevance, content, importance, sufficiency, usableness, usefulness, clarity, conciseness, freedom from bias, informativeness, level of detail, quantitativeness, scope, interpretability, and understandability; system-related: timeliness, flexibility, format, and efficiency |
| Fisher and Kingma [28] (p. 110) | Subjective category | Accuracy, timeliness, consistency, completeness, relevancy, and fitness for use |
| Pipino et al. [22] (p. 212) | Subjective and objective categories | Accessibility, appropriate amount of data, believability, completeness, concise representation, consistent representation, ease of manipulation, free-of-error, interpretability, objectivity, relevancy, reputation, security, timeliness, understandability, and value-added |
| Luo [1] (p. 612) | Subjective category | Believable, accurate, and reliable data and good data sources |
| Attard et al. [26] (pp. 410–411) | Subjective and objective categories | Usability, accuracy, completeness, consistency, timeliness, accessibility, and openness |
| Geisler et al. [29] (p. 10) | Metadata-driven approach | Completeness, data volume, timeliness, accuracy, consistency, and drop rate |
| Vetrò et al. [23] (p. 331) | Objective category | Understandability, currentness, expiration, completeness, traceability, and compliance |
| Heinrich et al. [20] (p. 17) | Economic approach | Timeliness, completeness, reliability, correctness, and consistency |
| Juddoo et al. [30] (p. 8) | Intrinsic category | Accuracy, objectivity, believability, and reputation |
| | Contextual category | Value-added, relevancy, timeliness, completeness, and appropriate amount of data |
| | Representational category | Interpretability, ease of understanding, representational consistency, and concise representation |
| | Accessibility category | Accessibility and access security |
| Kubler et al. [31] (pp. 15–16) | Data openness category | Complete, primary, timely, accessible, machine processable, non-discriminatory, non-proprietary, and license-free data |
| | Data transparency category | Reusability, understandability, and authenticity |
| Côrte-Real et al. [25] (p. 6) | Explicit knowledge category | Completeness, accuracy, format, and currency |
| Key Factors | | Interview Questions |
|---|---|---|
| Data quality | | Does the KEIDS keep high-quality data? Does the KEIDS keep accurate data? |
| School contexts | Calibration | How do educators define teaching and learning regarding data use? Do educators determine how to conduct teaching under these definitions? Do educators determine how they assess student learning? Do educators consider how they react to results? |
| | Principal leadership | Does a principal invest in and support the use of data systems? Does a principal encourage educators to directly access data? |
| | Faculty involvement and collaboration | Are teachers enthusiastic about the use of a data system? Do educators closely collaborate with each other for data use? |
| Institutional supports | | Do education authorities supply in-depth, continuous training for data use in the system? Is sufficient time ensured for educators to access and examine data? |
© 2020 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Joo, Y.H. Promoting Sustainable Data-Based Decision-Making in the Korean Educational Information Disclosure System. Sustainability 2020, 12, 6762. https://doi.org/10.3390/su12176762