Digital Developmental Advising Systems for Engineering Students Based on Accreditation Board of Engineering and Technology Student Outcome Evaluations
Abstract
1. Introduction
2. Purpose of Study
- To what extent should engineering programs shift from program- to student-centered models that incorporate learning outcomes for the evaluation of individual student performance in addition to program evaluations for accreditation requirements?
- To what extent can manual assessment processes collect, store, and utilize detailed outcome data for providing effective developmental academic advising to every student on an engineering campus where several hundred are enrolled?
- To what extent can the assessment process be automated using digital technology so that detailed outcome information for every student on campus can be effectively utilized for developmental advising?
- What specific benefits can digital automated advising systems provide to developmental advisors and their students?
3. Research Framework
3.1. Methodology
3.2. Participants
3.3. Developmental Advising and Its Assessment—A Qualitative Review
3.3.1. Developmental Advising
3.3.2. National Academic Advising Association and ABET Standards
3.3.3. Assessing Advising
4. Theoretical, Conceptual, and Practical Frameworks
4.1. Theoretical Framework
OBE Model
- Developing a clear set of learning outcomes, around which all of the educational system’s components can be focused;
- Establishing the conditions and opportunities within the educational system that enable and encourage all students to achieve those essential outcomes.
- Ensuring that all students are equipped with the knowledge, competence, and qualities needed to be successful after they exit the educational system;
- Structuring and operating schools so that those outcomes can be achieved and maximized for all students.
- All components of the education system, including academic advising, should be based on, achieve, and maximize a clear and detailed set of learning outcomes for each student;
- All students should be provided with detailed real-time and historical records of their performance based on learning outcomes to make informed decisions for improvement actions.
4.2. Conceptual Framework
FCAR + Specific/Generic Performance Indicator Assessment Model
4.3. Practical Framework—Digital Platform EvalTools®
4.4. Practical Framework—Summary of Digital Technology and Assessment Methodology
- Measurement of outcome information at all course levels of a program curriculum: introductory (100-/200-level courses), reinforced (300-level courses), and mastery (400-level courses). Engineering fundamentals and concepts are introduced in 100-/200-level courses, reinforced in 300-level courses through application and analysis problems, and finally mastered in 400-level courses through activities such as synthesis and evaluation [43,52,53,54] (a schematic sketch of this level classification follows this list);
- Well-defined performance criteria for course and program levels;
- Integration of direct, indirect, formative, and summative outcomes assessments for course and program evaluations;
- Program-level as well as student-level performance evaluations that use the respective measured ABET SOs and associated PIs as the relevant indicator scheme;
- Six comprehensive plan-do-check-act (PDCA) quality cycles to ensure quality standards, monitoring, and control of the education process, instruction, assessment, evaluation, CQI, and data collection and reporting [47];
- Customized web-based software EvalTools® facilitating all of the above (Information on EvalTools®) [33].
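The following is a minimal Python sketch, not EvalTools® internals, of how embedded assessment records could be tagged with the introductory/reinforced/mastery course-level classification described in the first item above, so that a student's SO data can be read as a progression across the curriculum; the record fields, function names, and example values are illustrative assumptions.

```python
from dataclasses import dataclass


def course_level(course_number: int) -> str:
    """Classify a course by its catalog number, per the scheme described above."""
    if course_number < 300:
        return "introductory"   # 100-/200-level: fundamentals and concepts
    if course_number < 400:
        return "reinforced"     # 300-level: application and analysis problems
    return "mastery"            # 400-level: synthesis and evaluation activities


@dataclass
class AssessmentRecord:
    """One embedded assessment result for one student (hypothetical fields)."""
    student_id: str
    course_number: int
    student_outcome: str        # e.g., an ABET SO label such as "SO_2"
    performance_indicator: str  # e.g., "PI_2_3"
    score_percent: float        # raw score on a 0-100% scale


def group_by_level(records: list[AssessmentRecord]) -> dict[str, list[AssessmentRecord]]:
    """Bucket one student's records by curriculum level for trend review."""
    grouped: dict[str, list[AssessmentRecord]] = {
        "introductory": [], "reinforced": [], "mastery": []
    }
    for record in records:
        grouped[course_level(record.course_number)].append(record)
    return grouped


# Usage: classify two hypothetical records for one student.
records = [
    AssessmentRecord("s1023", 201, "SO_1", "PI_1_2", 84.0),
    AssessmentRecord("s1023", 402, "SO_2", "PI_2_3", 91.0),
]
print({level: len(items) for level, items in group_by_level(records).items()})
# -> {'introductory': 1, 'reinforced': 0, 'mastery': 1}
```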
5. Results
5.1. Mixed Methods Approach for Student Evaluations Using Automated ABET SO Data
- Level 1: Quantitative review of single- or multi-term SO data, followed by a qualitative semantic analysis of the language of SO statements coupled with a qualitative review of curriculum and course delivery information;
- Level 2: Quantitative review of single- or multi-term SO and PI data, followed by a qualitative semantic analysis of the language of SO and PI statements, course titles, and assessment types coupled with a qualitative review of curriculum and course delivery information;
- Level 3: Quantitative review of single- or multi-term SO and PI data, followed by a qualitative semantic analysis of the language of SO and PI statements, course titles, and assessment types coupled with a qualitative review of curriculum, course delivery, and FCAR instructor reflection information (a schematic encoding of these three levels is sketched after this list).
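As a compact illustration only, and not the authors' implementation, the three review levels above can be encoded as a small lookup specifying which quantitative and qualitative evidence an advisor draws on at each level; the structure and labels below are assumptions paraphrased from the list.

```python
# Hypothetical encoding of the three mixed-methods review levels described above.
REVIEW_LEVELS: dict[int, dict[str, list[str]]] = {
    1: {
        "quantitative": ["single- or multi-term SO data"],
        "qualitative": ["semantics of SO statements",
                        "curriculum and course delivery information"],
    },
    2: {
        "quantitative": ["single- or multi-term SO data", "PI data"],
        "qualitative": ["semantics of SO and PI statements", "course titles",
                        "assessment types",
                        "curriculum and course delivery information"],
    },
    3: {
        "quantitative": ["single- or multi-term SO data", "PI data"],
        "qualitative": ["semantics of SO and PI statements", "course titles",
                        "assessment types",
                        "curriculum and course delivery information",
                        "FCAR instructor reflections"],
    },
}


def evidence_for(level: int) -> list[str]:
    """Flatten the quantitative and qualitative evidence used at a review level."""
    spec = REVIEW_LEVELS[level]
    return spec["quantitative"] + spec["qualitative"]


print(evidence_for(3))  # deepest review: adds FCAR instructor reflections
```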
5.2. Automated ABET SO Data for Every Enrolled Student
5.3. Quantitative and Qualitative Analyses of Each Student’s ABET SO Data
5.4. Quantitative and Qualitative Analyses of Each Student’s PI and Assessment Data
5.5. An Outcome-Based Advising Example
5.6. Students as Active Participants
- (1) PI_9_1: demonstrate self-managing ability to articulate the student’s own learning goals;
- (2) PI_9_2: demonstrate self-monitoring ability to assess the student’s own achievements;
- (3) PI_9_3: demonstrate self-modifying ability to make mid-course corrections (a minimal recording sketch for these three PIs follows this list).
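A minimal sketch, assuming a simple 0–100% rubric score per PI and hypothetical field names, of how a student's self-evaluation against these three metacognitive PIs might be recorded and prioritized during an advising session; this is illustrative only, not the article's actual instrument.

```python
from dataclasses import dataclass, field
from datetime import date

# The three self-evaluation PIs listed above (labels paraphrased).
SELF_EVAL_PIS = {
    "PI_9_1": "self-managing: articulate own learning goals",
    "PI_9_2": "self-monitoring: assess own achievements",
    "PI_9_3": "self-modifying: make mid-course corrections",
}


@dataclass
class SelfEvaluation:
    """One student's self-evaluation record for a term (hypothetical fields)."""
    student_id: str
    term: str
    scores: dict[str, float] = field(default_factory=dict)  # assumed 0-100% rubric scores
    advisor_notes: str = ""
    recorded_on: date = field(default_factory=date.today)

    def weakest_pi(self) -> str:
        """Return the PI the student and advisor should prioritize next."""
        return min(self.scores, key=self.scores.get)


# Usage: an advisor records a self-assessment and flags the weakest PI.
ev = SelfEvaluation("s1023", "Fall 2023",
                    scores={"PI_9_1": 85.0, "PI_9_2": 78.0, "PI_9_3": 55.0})
assert set(ev.scores) <= set(SELF_EVAL_PIS)  # only the three PIs above are scored
print(ev.weakest_pi())  # -> "PI_9_3"
```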
Process for Measuring Soft Skills in Student Advising Activities
5.7. Quantitative and Qualitative Analyses of Student Responses and Overall SO Results
5.8. Added Advantage for Evaluating Advising at the Program Level
6. Discussion
6.1. Quality Standards of Digital Developmental Advising Systems
6.2. Qualitative Comparison of Digital Developmental Advising with Prevalent Traditional Advising
6.3. Research Questions
6.3.1. Research Question 1: To What Extent Should Engineering Programs Shift from Program- to Student-Centered Models That Incorporate Learning Outcomes for the Evaluation of Individual Student Performance in Addition to Program Evaluations for Accreditation Requirements?
6.3.2. Research Question 2: To What Extent Can Manual Assessment Processes Collect, Store, and Utilize Detailed Outcome Data for Providing Effective Developmental Academic Advising to Every Student on an Engineering Campus Where Several Hundred Are Enrolled?
6.3.3. Research Question 3: To What Extent Can the Assessment Process Be Automated Using Digital Technology So That Detailed Outcomes Information for Every Student on Campus Can Be Effectively Utilized for Developmental Advising?
6.3.4. Research Question 4: What Specific Benefits Can Digital Automated Advising Systems Provide to Developmental Advisors and Their Students?
- Detailed and accurate digital advising records for advisors and advisees showing trends and summaries of performance results related to SOs, PIs, and their corresponding assessments;
- Advisors can employ mixed methods approaches to achieve a consistent and structured mechanism for assessing student performance in meeting outcomes and can focus more on specific, relevant, and constructive advice for quality improvement;
- Students’ metacognitive skills are boosted with accurate indications of their strong/weak skills and/or knowledge areas to support corrective actions in meeting student outcomes each semester;
- The student’s self-directed remedial actions can align with the developmental advisor’s recommendations to reinforce overall performance improvement.
6.4. Limitations
6.5. Future Work
7. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Killen, R. Teaching Strategies for Outcome Based Education, 2nd ed.; Juta & Co.: Cape Town, South Africa, 2007. [Google Scholar]
- Moon, J. Linking Levels, Learning Outcomes and Assessment Criteria. Bologna Process. European Higher Education Area. 2000. Available online: http://aic.lv/ace/ace_disk/Bologna/Bol_semin/Edinburgh/J_Moon_backgrP.pdf (accessed on 19 March 2022).
- Spady, W. Choosing outcomes of significance. Educ. Leadersh. 1994, 51, 18–23. [Google Scholar]
- Spady, W. Outcome-Based Education: Critical Issues and Answers; American Association of School Administrators: Arlington, VA, USA, 1994. [Google Scholar]
- Spady, W.; Hussain, W.; Largo, J.; Uy, F. Beyond Outcomes Accreditation; Rex Publishers: Manila, Philippines, 2018; Available online: https://www.rexestore.com/home/1880-beyond-outcomes-accredidationpaper-bound.html (accessed on 19 March 2022).
- Spady, W. Outcome-Based Education’s Empowering Essence; Mason Works Press: Boulder, CO, USA, 2020; Available online: http://williamspady.com/index.php/products/ (accessed on 21 March 2022).
- Harden, R.M. Developments in outcome-based education. Med. Teach. 2002, 24, 117–120. [Google Scholar] [CrossRef] [PubMed]
- Harden, R.M. Outcome-based education: The future is today. Med. Teach. 2007, 29, 625–629. [Google Scholar] [CrossRef] [PubMed]
- Adelman, C.; National Institute of Learning Outcomes Assessment (NILOA). To imagine a Verb: The Language and Syntax of Learning Outcomes Statements. 2015. Available online: http://learningoutcomesassessment.org/documents/Occasional_Paper_24.pdf (accessed on 21 March 2022).
- Provezis, S. Regional Accreditation and Student Learning Outcomes: Mapping the Territory; National Institute of Learning Outcomes Assessment (NILOA): Urbana, IL, USA, 2010; Available online: www.learningoutcomeassessment.org/documents/Provezis.pdf (accessed on 21 March 2022).
- Gannon-Slater, N.; Ikenberry, S.; Jankowski, N.; Kuh, G. Institutional Assessment Practices across Accreditation Regions; National Institute of Learning Outcomes Assessment (NILOA): Urbana, IL, USA, 2014; Available online: www.learningoutcomeassessment.org/documents/Accreditation%20report.pdf (accessed on 21 March 2022).
- Accreditation Board of Engineering & Technology (ABET) USA. Accreditation Criteria. 2023. Available online: http://www.abet.org/accreditation/accreditation-criteria/ (accessed on 11 December 2023).
- Dew, S.K.; Lavoie, M.; Snelgrove, A. An engineering accreditation management system. In Proceedings of the 2nd Conference of the Canadian Engineering Education Association, St. John’s, NL, Canada, 6–8 June 2011. [Google Scholar] [CrossRef]
- Essa, E.; Dittrich, A.; Dascalu, S.; Harris, F.C., Jr. ACAT: A Web-Based Software Tool to Facilitate Course Assessment for ABET Accreditation. Department of Computer Science and Engineering, University of Nevada. 2010. Available online: http://www.cse.unr.edu/~fredh/papers/conf/092-aawbsttfcafaa/paper.pdf (accessed on 12 April 2022).
- Kalaani, Y.; Haddad, R.J. Continuous improvement in the assessment process of engineering programs. In Proceedings of the 2014 ASEE South East Section Conference, Macon, GA, USA, 30 March–1 April 2014. [Google Scholar]
- International Engineering Alliance (IEA). Washington Accord Signatories. 2023. Available online: https://www.ieagreements.org/accords/washington/signatories/ (accessed on 11 December 2023).
- Middle States Commission on Higher Education. Standards for Accreditation, PA, USA. 2023. Available online: https://www.msche.org/ (accessed on 11 December 2023).
- Mohammad, A.W.; Zaharim, A. Programme outcomes assessment models in engineering faculties. Asian Soc. Sci. 2012, 8. [Google Scholar] [CrossRef]
- Wergin, J.F. Higher education: Waking up to the importance of accreditation. Change 2005, 37, 35–41. [Google Scholar]
- Aiken-Wisniewski, S.A.; Smith, J.S.; Troxel, W.G. Expanding Research in Academic Advising: Methodological Strategies to Engage Advisors in Research. NACADA J. 2010, 30, 4–13. [Google Scholar] [CrossRef]
- Appleby, D.C. The teaching-advising connection. In The Teaching of Psychology: Essays in Honor of Wilbert J. McKeachie and Charles L. Brewer; Davis, S.F., Buskist, W., Eds.; Lawrence Erlbaum Associates: Mahwah, NJ, USA, 2002. [Google Scholar]
- Appleby, D.C. Advising as teaching and learning. In Academic Advising: A Comprehensive Handbook, 2nd ed.; Gordon, V.N., Habley, W.R., Grites, T.J., Eds.; Jossey-Bass: San Francisco, CA, USA, 2008; pp. 85–102. [Google Scholar]
- Campbell, S. Why do assessment of academic advising? Part I. Acad. Advis. Today 2005, 28, 8. [Google Scholar]
- Campbell, S. Why do assessment of academic advising? Part II. Acad. Advis. Today 2005, 28, 13–14. [Google Scholar]
- Campbell, S.M.; Nutt, C.L. Academic advising in the new global century: Supporting student engagement and learning outcomes achievement. Peer Rev. 2008, 10, 4–7. [Google Scholar]
- Gordon, V.N. Developmental Advising: The Elusive Ideal. NACADA J. 2019, 39, 72–76. [Google Scholar] [CrossRef]
- Habley, W.R.; Morales, R.H. Advising Models: Goal Achievement and Program Effectiveness. NACADA J. 1998, 18, 35–41. [Google Scholar] [CrossRef]
- He, Y.; Hutson, B. Assessment for faculty advising: Beyond the service component. NACADA J. 2017, 37, 66–75. [Google Scholar] [CrossRef]
- Kraft-Terry, S.; Kau, C. Direct Measure Assessment of Learning Outcome–Driven Proactive Advising for Academically At-Risk Students. NACADA J. 2019, 39, 60–76. [Google Scholar] [CrossRef]
- Lynch, M. Assessing the effectiveness of the advising program. In Academic Advising: A Comprehensive Handbook; Gordon, V.N., Habley, W.R., Eds.; Jossey-Bass: San Francisco, CA, USA, 2000. [Google Scholar]
- Powers, K.L.; Carlstrom, A.H.; Hughey, K.F. Academic advising assessment practices: Results of a national study. NACADA J. 2014, 34, 64–77. [Google Scholar] [CrossRef]
- Swing, R.L. (Ed.) Proving and Improving: Strategies for Assessing the First Year of College; (Monograph Series No. 33); University of South Carolina, National Resource Center for the First-Year Experience and Students in Transition: Columbia, SC, USA, 2001. [Google Scholar]
- Information on EvalTools®. Available online: http://www.makteam.com (accessed on 11 December 2023).
- Jeschke, M.P.; Johnson, K.E.; Williams, J.R. A comparison of intrusive and prescriptive advising of psychology majors at an urban comprehensive university. NACADA J. 2001, 21, 46–58. [Google Scholar] [CrossRef]
- Kadar, R.S. A counseling liaison model of academic advising. J. Coll. Couns. 2001, 4, 174–178. [Google Scholar] [CrossRef]
- Banta, T.W.; Hansen, M.J.; Black, K.E.; Jackson, J.E. Assessing advising outcomes. NACADA J. 2002, 22, 5–14. [Google Scholar] [CrossRef]
- National Academic Advising Association (NACADA). Kansas State University, KS, USA. 2023. Available online: https://nacada.ksu.edu/ (accessed on 11 December 2023).
- Ibrahim, W.; Atif, Y.; Shuaib, K.; Sampson, D. A Web-Based Course Assessment Tool with Direct Mapping to Student Outcomes. Educ. Technol. Soc. 2015, 18, 46–59. [Google Scholar]
- Kumaran, V.S.; Lindquist, T.E. Web-based course information system supporting accreditation. In Proceedings of the 2007 Frontiers in Education Conference, San Diego, CA, USA, 10–13 October 2007; Available online: https://asu.elsevierpure.com/en/publications/web-based-course-information-system-supporting-accreditation (accessed on 11 March 2022).
- McGourty, J.; Sebastian, C.; Swart, W. Performance measurement and continuous improvement of undergraduate engineering education systems. In Proceedings of the 1997 Frontiers in Education Conference, Pittsburgh, PA, USA, 5–8 November 1997; IEEE Catalog no. 97CH36099. pp. 1294–1301. [Google Scholar]
- McGourty, J.; Sebastian, C.; Swart, W. Developing a comprehensive assessment program for engineering education. J. Eng. Educ. 1998, 87, 355–361. [Google Scholar] [CrossRef]
- Pallapu, S.K. Automating Outcomes Based Assessment. 2005. Available online: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.199.4160&rep=rep1&type=pdf (accessed on 12 April 2022).
- Hussain, W.; Mak, F.; Addas, M.F. Engineering Program Evaluations Based on Automated Measurement of Performance Indicators Data Classified into Cognitive, Affective, and Psychomotor Learning Domains of the Revised Bloom’s Taxonomy. In Proceedings of the ASEE 123rd Annual Conference and Exposition, New Orleans, LA, USA, 26–29 June 2016; Available online: https://peer.asee.org/engineering-program-evaluations-based-on-automated-measurement-of-performance-indicators-data-classified-into-cognitive-affective-and-psychomotor-learning-domains-of-the-revised-bloom-s-taxonomy (accessed on 21 May 2022).
- Hussain, W.; Spady, W. Specific, Generic Performance Indicators and Their Rubrics for the Comprehensive Measurement of ABET Student Outcomes. In Proceedings of the ASEE 124th Annual Conference and Exposition, Columbus, OH, USA, 25–28 June 2017. [Google Scholar]
- Mak, F.; Sundaram, R. Integrated FCAR Model with Traditional Rubric-Based Model to Enhance Automation of Student Outcomes Evaluation Process. In Proceedings of the ASEE 123rd Annual Conference and Exposition, New Orleans, LA, USA, 26–29 June 2016. [Google Scholar]
- Eltayeb, M.; Mak, F.; Soysal, O. Work in progress: Engaging faculty for program improvement via EvalTools®: A new software model. In Proceedings of the 2013 Frontiers in Education Conference (FIE), Oklahoma City, OK, USA, 23–26 October 2013; pp. 1–6. [Google Scholar] [CrossRef]
- Hussain, W.; Spady, W.G.; Naqash, M.T.; Khan, S.Z.; Khawaja, B.A.; Conner, L. ABET Accreditation During and After COVID19—Navigating the Digital Age. IEEE Access 2020, 8, 218997–219046. [Google Scholar] [CrossRef] [PubMed]
- Spady, W.; Marshall, K.J. Beyond traditional outcome-based education. Educ. Leadersh. 1991, 49, 71. [Google Scholar]
- Spady, W. Organizing for results: The basis of authentic restructuring and reform. Educ. Leadersh. 1988, 46, 7. [Google Scholar]
- Spady, W. It’s time to take a close look at outcome-based education. Outcomes 1992, 7, 6–13. [Google Scholar]
- Information on Blackboard®. Available online: https://www.blackboard.com/teaching-learning/learning-management/blackboard-learn (accessed on 19 May 2022).
- Hussain, W.; Addas, M.F.; Mak, F. Quality improvement with automated engineering program evaluations using performance indicators based on Bloom’s 3 domains. In Proceedings of the Frontiers in Education Conference (FIE), Erie, PA, USA, 12–15 October 2016; pp. 1–9. [Google Scholar]
- Hussain, W.; Addas, M.F. Digitally Automated Assessment of Outcomes Classified per Bloom’s Three Domains and Based on Frequency and Types of Assessments; University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA): Urbana, IL, USA, 2016; Available online: http://www.learningoutcomesassessment.org/documents/Hussain_Addas_Assessment_in_Practice.pdf (accessed on 21 May 2022).
- Hussain, W.; Addas, M.F. A Digital Integrated Quality Management System for Automated Assessment of QIYAS Standardized Learning Outcomes. In Proceedings of the 2nd International Conference on Outcomes Assessment (ICA), QIYAS, Riyadh, Saudi Arabia, 6–8 December 2015. [Google Scholar]
- Estell, J.K.; Yoder, J.-D.S.; Morrison, B.B.; Mak, F.K. Improving upon best practices: FCAR 2.0. In Proceedings of the ASEE 2012 Annual Conference, San Antonio, TX, USA, 10–13 June 2012. [Google Scholar]
- Liu, C.; Chen, L. Selective and objective assessment calculation and automation. In Proceedings of the ACMSE’12, Tuscaloosa, AL, USA, 29–31 March 2012. [Google Scholar]
- Mak, F.; Kelly, J. Systematic means for identifying and justifying key assignments for effective rules-based program evaluation. In Proceedings of the 40th ASEE/IEEE Frontiers in Education Conference, Washington, DC, USA, 27–30 October 2010. [Google Scholar]
- Miller, R.L.; Olds, B.M. Performance assessment of EC-2000 student outcomes in the unit operations laboratory. In Proceedings of the ASEE Annual Conference, Charlotte, NC, USA, 20–23 June 1999. [Google Scholar]
- Hussain, W. Engineering Programs Bloom’s Learning Domain Evaluations CQI. 2016. Available online: https://www.youtube.com/watch?v=VR4fsD97KD0 (accessed on 21 May 2022).
- Mead, P.F.; Bennet, M.M. Practical framework for Bloom’s based teaching and assessment of engineering outcomes. In Education and Training in Optics and Photonics; Optical Society of America: Washington, DC, USA, 2009; paper ETB3. [Google Scholar] [CrossRef]
- Mead, P.F.; Turnquest, T.T.; Wallace, S.D. Work in progress: Practical framework for engineering outcomes-based teaching assessment—A catalyst for the creation of faculty learning communities. In Proceedings of the 36th Annual Frontiers in Education Conference, San Diego, CA, USA, 28–31 October 2006; pp. 19–20. [Google Scholar] [CrossRef]
- Hussain, W. Specific Performance Indicators. 2017. Available online: https://www.youtube.com/watch?v=T9aKfJcJkNk (accessed on 21 May 2022).
- Gosselin, K.R.; Okamoto, N. Improving Instruction and Assessment via Bloom’s Taxonomy and Descriptive Rubrics. In Proceedings of the ASEE 125th Annual Conference and Exposition, Salt Lake City, UT, USA, 25–28 June 2018. [Google Scholar]
- Jonsson, A.; Svingby, G. The use of scoring rubrics: Reliability, validity and educational consequences. Educ. Res. Rev. 2007, 2, 130–144. [Google Scholar] [CrossRef]
- Hussain, W.; Spady, W.; Khan, S.Z.; Khawaja, B.; Naqash, T.; Conner, L. Impact evaluations of engineering programs using abet student outcomes. IEEE Access 2021, 9, 46166–46190. [Google Scholar] [CrossRef]
Specification of EAMU Performance Indicator Levels

| Category (Scale %) | Description |
|---|---|
| Excellent (E) (90–100) | Apply knowledge with virtually no conceptual or procedural errors |
| Adequate (A) (75–90) | Apply knowledge without significant conceptual and only minor procedural errors |
| Minimal (M) (60–75) | Apply knowledge with occasional conceptual and only minor procedural errors |
| Unsatisfactory (U) (0–60) | Significant conceptual and/or procedural errors when applying knowledge |
Heuristic rules for Performance Vector Tables (PVT)

| Category | General Description |
|---|---|
| Red Flag | Any performance vector with an average below 3.3 and a level of unsatisfactory performance (U) exceeding 10% |
| Yellow Flag | Any performance vector with an average below 3.3 or a level of unsatisfactory performance (U) exceeding 10%, but not both |
| Green Flag | Any performance vector with an average of at least 4.6 and no indication of unsatisfactory performance (U) |
| No Flag | Any performance vector that does not fall into one of the above categories |
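A minimal Python sketch of the EAMU categorization and PVT flagging heuristics summarized in the two tables above. The category cut-offs (90/75/60%) and the flag thresholds (average 3.3, average 4.6, 10% unsatisfactory) come from the tables; the per-category weights used to compute the vector average on a 0–5 scale are illustrative assumptions and may differ from the values used in EvalTools®.

```python
from collections import Counter
from typing import Iterable

# Assumed weights for averaging an EAMU vector on a 0-5 scale (not from the article).
WEIGHTS = {"E": 5.0, "A": 4.0, "M": 3.0, "U": 1.0}


def eamu_category(score_percent: float) -> str:
    """Map a raw assessment score (0-100%) to an EAMU category per the table above."""
    if score_percent >= 90:
        return "E"   # Excellent: virtually no conceptual or procedural errors
    if score_percent >= 75:
        return "A"   # Adequate: no significant conceptual errors
    if score_percent >= 60:
        return "M"   # Minimal: occasional conceptual errors
    return "U"       # Unsatisfactory: significant errors


def pvt_flag(scores_percent: Iterable[float]) -> str:
    """Apply the PVT heuristic rules to a set of PI scores for one student outcome."""
    categories = [eamu_category(s) for s in scores_percent]
    if not categories:
        return "No Flag"
    counts = Counter(categories)
    average = sum(WEIGHTS[c] for c in categories) / len(categories)
    unsatisfactory_fraction = counts["U"] / len(categories)

    low_average = average < 3.3
    too_many_u = unsatisfactory_fraction > 0.10
    if low_average and too_many_u:
        return "Red Flag"
    if low_average or too_many_u:
        return "Yellow Flag"
    if average >= 4.6 and counts["U"] == 0:
        return "Green Flag"
    return "No Flag"


# Usage with hypothetical PI scores (%) for one student outcome.
print(pvt_flag([95, 88, 72, 58]))  # low average and >10% U under assumed weights -> Red Flag
print(pvt_flag([96, 94, 92]))      # consistently excellent, no U -> Green Flag
```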
| Area | Pedagogical Aspects | Digital Developmental Advising | Prevalent Traditional Advising | Sectional/Research References |
|---|---|---|---|---|
| Authentic OBE and Conceptual Frameworks | Based on Authentic OBE Frameworks | Maximum fulfillment of authentic OBE frameworks | Partial or minimal fulfillment of authentic OBE frameworks | Section 1, Section 3.3 and Section 4.1 [21,22,23,24,25,34,35] |
| | Standards of Language of Outcomes | Maximum fulfillment of consistent OBE frameworks | Partial or minimal fulfillment and lack of any consistent frameworks | Section 1, Section 3.3, Section 4.1 and Section 4.2 [21,22,23,24,25,34,35] |
| | Assessment of Students | All students assessed | Random or select sampling | Section 1, Section 3.3, Section 4.1 and Section 4.2 [12,13,17,18,38,39,40,41,42] |
| | Specificity of Outcomes | Mostly specific, resulting in valid and reliable outcome data | Mostly generic, resulting in vague and inaccurate results | Section 1, Section 3.3, Section 4.1 and Section 4.2 [5,12,43,44,45] |
| | Coverage of Bloom’s 3 Learning Domains and Learning Levels | Specific PIs classified according to Bloom’s 3 learning domains and their learning levels | Generic PIs with no classification | Section 1 and Section 4.2 [5,12,43,44,45] |
| | ‘Design Down’ Implementation | The OBE power principle ‘design down’ is fully implemented, with specific PIs used to assess the course outcomes | The OBE power principle ‘design down’ is partially implemented, with generic PIs used to assess the program outcomes | Section 4.2 [5,12,43,44,45] |
| Assessment Practices | Description of Rubrics | Hybrid rubrics that combine analytic and holistic elements, are topic-specific, and provide detailed steps, scoring information, and descriptors | Mostly holistic generic rubrics; some may be analytic, rarely topic-specific or providing detailed steps, and without scoring information or detailed descriptors | Section 4.2 [5,12,43,44,45,63,64] |
| | Application of Rubrics | Applied to most course learning activities with tight alignment | Applied only to major learning activities at the program level with minimal alignment | Section 4.2 [12,13,17,18,38,39,40,41,42,63,64] |
| | Embedded Assessments | The course outcomes and PIs follow consistent frameworks and are designed to enable embedded assessment methodology | The course outcomes and PIs do not follow consistent frameworks and are not designed to enable embedded assessment methodology | Section 4.2 and Section 4.3 [12,13,17,18,38,39,40,41,42] |
| Quality of Outcomes Data | Validity and Reliability of Outcomes Data | Specific outcomes and PIs, consistent frameworks, and hybrid rubrics produce comprehensive and accurate assessment data for all students; outcome data can therefore be used for advising purposes | Generic outcomes and PIs, a lack of consistent frameworks, and generic rubrics produce vague and inaccurate assessment data for small samples of students; outcome data therefore cannot be used for advising purposes | Section 1, Section 3.3, Section 4.2, Section 4.3, Section 5.1, Section 5.2, Section 5.3, Section 5.4, Section 5.5, Section 5.6, Section 5.7 and Section 5.8 [5,12,43,44,45] |
| | Statistical Power | Heterogeneous and accurate data; all students, all courses, and all major assessments sampled | Random or selective sampling of students, courses, and assessments | Section 3.3, Section 4.1, Section 4.2 and Section 4.3 [12,13,17,18,38,39,40,41,42] |
| | Quality of Multi-term SOs Data | Valid and reliable data; all data are collected from direct assessments by implementing the following: | Usually not available and unreliable, due to a lack of the following: | Section 1, Section 3.3, Section 4.1, Section 4.2 and Section 4.3 [12,13,17,18,38,39,40,41,42] |
| Staff | Access to Students’ Skills and Knowledge Information | Advisors can easily access student outcomes, assessments, and objective evidence besides academic transcript information | Advisors cannot access student outcomes, assessments, or objective evidence; advising is based entirely on academic transcript information | Section 1, Section 3.3, Section 4.2, Section 4.3, Section 5.1, Section 5.2, Section 5.3, Section 5.4, Section 5.5, Section 5.6, Section 5.7 and Section 5.8 [12,13,17,18,38,39,40,41,42] |
| | Advisor Interactions | Advisors have full access to detailed student past and present course performance, thereby providing accurate informational resources to facilitate productive advisor–course instructor dialogue | Advisors do not have access to any detail related to students’ past or present course performance, thereby lacking information resources for productive advisor–course instructor dialogue | Section 1, Section 3.3, Section 5, Section 5.2 and Section 5.3 [12,13,17,18,38,39,40,41,42] |
| | Performance Criteria | Advisors apply detailed performance criteria and heuristic rules based on a scientific color-coded flagging scheme to evaluate the attainment of student outcomes | Advisors do not refer to or apply any such performance criteria or heuristic rules due to a lack of detailed direct assessment data and associated digital reporting technology | Section 4.3, Section 5.1, Section 5.2, Section 5.3, Section 5.4, Section 5.5, Section 5.6 and Section 5.7 [12,13,17,18,38,39,40,41,42] |
| | Access to Multi-term SOs Data | Advisors can easily access and use multi-term SO data reports and identify performance trends and patterns for accurate developmental feedback | Advisors cannot access any type of multi-term SO data report or identify performance trends and patterns for accurate developmental feedback | Section 5.1, Section 5.2, Section 5.3, Section 5.4, Section 5.5, Section 5.6 and Section 5.7 [40,42,43,47,50,51,52,55] |
| | Mixed Methods Approaches to Investigation | Advisors can easily apply mixed methods approaches to investigation and feedback for effective developmental advising owing to the availability of accurate outcome data, specific PIs, and assessment information presented in organized formats using state-of-the-art digital diagnostic reports | Advisors cannot apply mixed methods approaches to investigation and feedback for effective developmental advising owing to the lack of accurate outcome data, specific PIs, and assessment information presented in organized formats | Section 5.1, Section 5.2, Section 5.3, Section 5.4, Section 5.5, Section 5.6 and Section 5.7 [40,42,43,47,50,51,54,62] |
| Students | Student Accessibility of Outcomes Data | All students can review their detailed outcome-based performance and assessment information for multiple terms and examine trends in improvement or any failures | Students cannot review any form of outcome-based performance or assessment information for multiple terms and cannot examine trends in improvement or any failures | Section 5.1, Section 5.2, Section 5.3, Section 5.4, Section 5.5, Section 5.6 and Section 5.7 [40,42,43,47,50,51,54,62] |
| | Student Follow-Up Actions for Improvement | Both students and advisors can track outcome-based performance and systematically follow up on recommended remedial actions using digital reporting features | Neither students nor advisors can track outcome-based performance and, therefore, cannot systematically follow up on any recommended remedial actions | Section 5.1, Section 5.2, Section 5.3, Section 5.4, Section 5.5, Section 5.6 and Section 5.7 [40,42,43,47,50,51,54,62] |
| | Student Attainment of Lifelong Learning Skills | Students can use self-evaluation forms and reinforce their remediation efforts with guidance from advisors to enhance metacognition capabilities and eventually attain lifelong learning skills | Students have no access to outcome data, cannot conduct any form of self-evaluation of outcome performance, and therefore cannot act on advisor guidance regarding outcomes | Section 5.5, Section 5.6 and Section 5.7 [40,42,43,47,50,51,54,62] |
| Process | Integration with Digital Technology | Pedagogy and assessment methodology fully support integration with digital technology that employs embedded assessments | The language of outcomes, alignment issues, and a lack of rubrics make it difficult to integrate with digital technology employing embedded assessments | Section 1, Section 4.3, Section 5.1, Section 5.2, Section 5.3, Section 5.4, Section 5.5, Section 5.6 and Section 5.7 [40,42,43,47,50,51,54,62] |
| | PDCA Quality Processes | Six comprehensive PDCA quality cycles for stringent quality standards, monitoring, and control of the education process | Lack of well-organized and stringent QA cycles, measures, and technology for implementing the education process | Section 1, Section 3.3 and Section 4.3 [40,42,43,47] |
| | Impact Evaluation of Advising | Credible impact evaluations of developmental advising can be conducted by applying qualifying rubrics to multi-year SO direct assessment trend information; there is no need for control or focus groups, and credibility issues related to student survey feedback are avoided | Impact evaluations are usually based on indirect assessments collected using student surveys; several issues related to the use of control or focus groups and the credibility of feedback must be dealt with accordingly | Section 5.8 [47,65] |