3.1. Case Study: Innovation Management and Product Development Course at PoliTo
The current situation in education demands immediate attention and a thorough exploration of emerging perspectives. According to the directives issued by the Board of Directors at Politecnico, an academic institution must retain control over its data analytics infrastructure and ensure that it remains accessible for academic purposes. Transparent governance mechanisms must be established to foster the well-being and functionality of the academic community.
While the traditional scholarly publishing infrastructure is deeply entrenched and resistant to swift change, the integration of data analytics and artificial intelligence into academia remains in its infancy and subject to ongoing evolution. Consequently, it is crucial not to relinquish control over these activities entirely to profit-driven commercial entities, which, understandably, prioritize maximizing returns for their stakeholders.
The proposed concept suggests a solution capable of generating statistical insights from historical data about students and educators, spanning multiple academic periods. This approach seeks to collect information that can aid directors, educators, and even students in making informed decisions regarding their organizational and career trajectories. The data and process model architecture thus shaped was validated using data collected from students and educators over recent years. The resulting insights empowered educators and academic leaders to make informed systemic decisions concerning the use of data available in the information system repositories of educational institutions.
Upon completion of the requisite political and technical validation processes at Politecnico di Torino, it was established that students have the right to access reports on their academic performance, facilitating their self-assessment. Similarly, educators can leverage these results to evaluate their learning processes and adapt their course content to emerging classroom dynamics. Furthermore, study program administrators, committed to supporting the broader teaching community, can base their decisions on the insights derived from the same data processing procedures.
Therefore, applying analytical techniques enables educational institutions to gather, assess, and process data concerning each student in connection with their specific context, revealing how this connection influences that student’s learning experience. It also facilitates the description of students’ strengths and weaknesses, thereby shaping the quality and effectiveness of their output and performance and building individual and group profiles. Given that students and educators increasingly need innovative solutions to enhance education, adopting this methodology enables the collection of data generated by both learners and educators. These data, coupled with analytical models, help uncover valuable insights to craft optimal responses to the educational community’s demands.
The ongoing project, Data2Learn@Edu, introduces an “adaptive learning system”, as illustrated in Figure 2. That system, which by construction works as a closed-loop controller/regulator, has as its primary objective a better comprehension of learning processes within the field of teaching and education through the use of data-driven methods. By integrating data mining and machine learning techniques, the platform becomes an adaptive and intelligent tool capable of significantly influencing learning processes by reinforcing and personalizing educational experiences.
Notably, the automatic generation and allocation of learner profiles provide deeper insights into the dynamics underlying current educational and teaching practices. The adaptation of content and support for teaching activities relies heavily on data-driven methodologies and AI algorithms. This approach boosts the personalization and contextualization of learning materials and learner profiles, as depicted in Figure 4. Consequently, learning outcomes can be aligned with the learner’s profile in terms of competencies, skills, and acquired knowledge; for the academic testbed, readers can also refer to the work presented in [15].
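To make the profile-generation step concrete, the following minimal sketch shows how learner profiles could be grouped with a standard clustering algorithm. It is only an illustration under assumed inputs: the synthetic learner-by-outcome matrix, the four-outcome feature space, and the choice of k-means are not taken from the Data2Learn@Edu implementation.

```python
# Minimal sketch: grouping learners into profile clusters from outcome indicators.
# The learner-by-outcome matrix below is synthetic and purely illustrative.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(seed=42)
# Rows: learners; columns: hypothetical scores on four learning outcome indicators.
learner_features = rng.uniform(0.0, 30.0, size=(120, 4))

# Standardize so that no single outcome dominates the distance metric.
scaler = StandardScaler()
scaled = scaler.fit_transform(learner_features)

# Group learners into a small number of profile clusters.
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0)
labels = kmeans.fit_predict(scaled)

# Each cluster centroid, mapped back to the original scale, can be read as an "offered profile".
offered_profiles = scaler.inverse_transform(kmeans.cluster_centers_)
for i, profile in enumerate(offered_profiles):
    print(f"cluster {i}: size={np.sum(labels == i)}, profile={np.round(profile, 1)}")
```

In this reading, each centroid summarizes the competencies, skills, and knowledge acquired by the learners in that cluster, which is the kind of aggregate the sensor is expected to produce.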
The extracted knowledge is crucial for enhancing tools and platforms that are centered around data and focused on individuals. The primary objective is to impact the educational community’s processes, which are typically formalized within the organization but are also informally represented in its perceived image.
Particular attention is dedicated to the learners’ profile difference detector, as depicted in Figure 4. This detector, or comparator, operates on the learning outcome profiles offered by the students, identified as the specific profiles of the clusters output by the sensor; the sensor can also access personal data repositories together with historical and current student performance data (the PoliTo Data Lake). These offered profiles are then compared to the reference job profile, or expected profile, which is built around learning outcomes recommended by industry standards and market assessments (e.g., ISO 15288: Technical Standard in Systems Engineering [18]). The difference computed by the detector, or error, determines the regulatory/control actions, as outlined in Figure 4, where an intelligent decision-support system (IDSS) [19] can provide appropriate recovery proposals for teachers, deans, or other responsible agents. These actions include, for example, the establishment and execution of flipped classes in physical or virtual living labs, further access to appropriate online learning materials, or further discussions with specific tutors on critical subjects.
Problem-based learning, lectures, and various other specific learning methods or environments are under consideration. The primary focus of the regulatory approach is a recovery action plan, which is devised based on the data collected by the sensor. Within this context, the components relevant to this investigation are marked by the bold lines in the figure.
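As an illustration of the comparator logic described above, the sketch below computes the gap between each offered cluster profile and a reference profile and maps large shortfalls to candidate recovery actions. The outcome names, tolerance threshold, and action table are hypothetical placeholders, not the actual IDSS rules of the project.

```python
# Minimal sketch of the profile difference detector (comparator): it subtracts the
# reference (expected) profile from each offered cluster profile and flags outcomes
# whose shortfall exceeds a tolerance. Values and actions are illustrative only.
import numpy as np

OUTCOMES = ["requirements_analysis", "system_modeling", "prototyping", "teamwork"]

# Expected profile, e.g., derived from industry standards and market assessments.
expected_profile = np.array([0.80, 0.70, 0.75, 0.80])

# Offered profiles output by the sensor (one per learner cluster), hypothetical values.
offered_profiles = {
    "cluster_0": np.array([0.85, 0.55, 0.70, 0.80]),
    "cluster_1": np.array([0.60, 0.72, 0.40, 0.78]),
}

RECOVERY_ACTIONS = {  # illustrative mapping from a weak outcome to a recovery proposal
    "requirements_analysis": "flipped class in the living lab on problem posing",
    "system_modeling": "additional online UML learning material",
    "prototyping": "tutored session on MVP development",
    "teamwork": "discussion with a tutor on team coordination",
}

TOLERANCE = 0.1  # maximum acceptable shortfall per outcome

for cluster, offered in offered_profiles.items():
    error = offered - expected_profile  # the "error" driving the regulator
    weak = [OUTCOMES[i] for i, e in enumerate(error) if e < -TOLERANCE]
    proposals = [RECOVERY_ACTIONS[w] for w in weak]
    print(cluster, "gaps:", weak, "->", proposals or ["no action needed"])
```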
Consequently, the main research inquiries revolve around two aspects: first, understanding the data-driven adaptive learning model reported in Figure 4, its components and operations, and the tools it employs, as stated by RQ1; second, exploring the processing to be conceived within the sensor to identify, represent, and classify the various offered profiles, which are then compared with the given expected profile so as to rectify relevant deviations from the chosen reference, as traced by RQ2.
In the course under investigation, the learning process adheres to a six-stage life cycle [20], as depicted in Figure 5. In the initial stage, the problem stems from the context of a company or organization that participates in the course. This often entails analyzing images perceived through the lens of emerging innovation trends, prompting companies to reassess their value chain in light of evolving flows shaped by technological advancements and competitive practices. The problem is submitted to a student team, which has to study and develop a sustainable solution to be proposed to the company.
The second stage involves problem-posing, where a selection of methods and tools specific to the domain is employed to comprehensively describe the issue at hand. This phase aims to fully grasp the problem and assess its impact on the systems and the environment.
The third stage focuses on developing effective strategies to address the problem, leveraging algorithmic approaches based on insights and perspectives gathered in the previous stage, which are then suitably formalized using established practices and standards.
Moving on to the fourth stage, a minimum viable product (MVP) begins to take form. Here, hardware platforms, software components, and programming languages are combined. Rapid development principles are applied, emphasizing the “reuse” of previously developed components to reach a sustainable performance level, ultimately enhancing cost-efficiency and reducing the time to market.
The subsequent stage, encompassing deployment and dissemination, explores how marketing and communication strategies are employed through the appropriate channels to engage various stakeholders and secure funding sources.
In conclusion, from an educational perspective, the assessment is based on predefined learning objectives and outcomes for both the specific course and the Master of Science program. The perspective of the enterprise or organization plays a pivotal role in the assessment process, and self-assessment is also encouraged to compel team members to account for their contributions and estimate the costs incurred throughout the entire life cycle development process.
3.1.1. Context and Data Framework
This case study is conducted within the framework of the Innovation Management and Product Development course (GISP: PoliTo Data Lake), which currently attracts over 100 new students, making it a popular course among all offerings in Politecnico’s Master of Science programs. Students simultaneously engage in various other subjects alongside the GISP course, including project management, object-oriented programming, business planning, quality management, and data-driven application development.
3.1.2. Classroom
The constructivist classroom in the 2023 academic year accommodates a diverse population of over 100 students and is organized weekly into two theory lessons and two teamwork-based lab sessions. In this classroom, instructors, teachers, and trainers have the role of creating a collaborative environment where learners are actively involved in their own learning. Each group, consisting of five individuals, autonomously manages its working process, relying on internal collaboration to tackle the intricate challenges associated with a specific project. These projects are often shaped through brainstorming sessions, sometimes with the assistance of external enterprise actors. Within each group, cooperation is pursued through various collaborative tools, such as Dropbox and Google Drive for data storage; Skype, Teams, or Zoom for synchronous communication; and appropriate application and system development specification tools, such as Visio or StarUML.
The classroom layout includes multiple zones where students can convene and sit in circles, in stark contrast to the conventional teacher-centered arrangement where students sit in rows while receiving a continuous stream of lectures.
3.1.3. Course Delivery
The active learning methodology presented in [20] represents a unique fusion of traditional and constructivist approaches within a dynamic learning framework, as described in Figure 4. In this approach, the course structure functions as a living laboratory, mirroring the project’s life cycle in accordance with the project work syllabus. The weekly schedule is split evenly, with 50% of the time dedicated to project development and the remaining 50% devoted to conventional lecture-style teaching. This amalgamation addresses a dual challenge.
On the one hand, it aligns with a university’s corporate-style organization, where time is systematically regulated based on labor coordination and passive interactions. On the other hand, it accommodates the demands of creativity-driven processes, primarily rooted in stimulating student engagement. An intriguing aspect of this methodology is that students actively participate in the course’s organization. They kick off the course by engaging in a meeting with the Joint Steering Committee, which was formed specifically for this purpose. This meeting serves as a platform for addressing fundamental questions that unveil various dimensions of the proposed problem.
The course spans 13 calendar weeks. The initial week focuses on introductory activities, including an overview of the course schedule and organization. The second week delves into kickoff discussions regarding the challenges that companies, or other organizations, aim to tackle. Students can also build teams during this period, bringing together complementary skills, knowledge, and experiences. The team composition is finalized after considering the introductory insights regarding the issues raised by the companies and the problem-specific needs.
Students immerse themselves in the problem-posing phase between the second and fourth weeks. Here, projects begin to take shape through a top-down deductive approach. At this stage of project life cycle management, the existing framework is recognized and serves as a foundation for developing new proposals. Questions play a pivotal role within the “problem-posing” domain, enabling a comprehensive exploration of the problem’s context. This exploration is facilitated through the Lean Model Canvas (LMC), logical framework analysis (LFA), and quality function deployment (QFD).
Moving from the fifth to the seventh week, the focus shifts to problem solving, emphasizing formal and informal specification development, often involving algorithmic techniques. Building upon the earlier problem analysis and process planning, students become proficient in using integrated computer-aided manufacturing definition for function modeling (IDEF0) and unified modeling language (UML) notation for specification processing. Their goal is to create a “to be” model, which can be compared to existing benchmarks—the “as is” state of the art.
Weeks 8 through 10 are dedicated to building a sustainable prototype that aligns with the goals and constraints established by the Joint Steering Committee.
Students engage in deployment and dissemination activities during the final three weeks (weeks 11–13). They test the prototype on an appropriate testbed and plan comprehensive communication strategies for the closing exposition, which is presented to the Joint Steering Committee. This presentation includes videos, reports, and a complete technical demonstration for the final discussion. Intermediate release dates are strategically placed to ensure the timely delivery of the LFA, QFD, and UML specifications, as well as a preliminary prototype implementation. Additionally, a well-structured timetable is established to align individual skill development.
3.1.4. Assessment
In the conducted study, the assessment process for students was structured into four distinct steps, as outlined in Table 1. The most impactful component of that assessment was the project work, which received collective feedback from the Joint Steering Committee. This feedback was generated following group discussions on project development. The concluding discussion was documented comprehensively and included a summary slide sequence. This documentation conformed to the project work syllabus framework: it was accompanied by various elements, including a concise technical video illustrating the prototype’s functional behavior, a brief (three-minute) emotion-based Kickstarter-like video, the software code of the prototype, its testing, and the toolkit for its management and development.
The project work syllabus serves as the foundational reference point for planning and is a crucial aspect of the regulation unit (as depicted in Figure 6). This syllabus provides a detailed job profile interface and corresponding descriptions, specifying primary activities within the enterprise/organization, and establishes a link to the learning outcome profile. The learning outcome profile encompasses the skills, attitudes, competencies, and knowledge elements.
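One possible way to represent the link between the job profile and the learning outcome profile is sketched below; the field names and the example entry are assumptions made for illustration, not the actual syllabus schema used in the course.

```python
# Minimal sketch of a data structure linking a job profile to a learning outcome
# profile. Field names and the example values are illustrative assumptions.
from dataclasses import dataclass, field


@dataclass
class LearningOutcomeProfile:
    knowledge: list[str] = field(default_factory=list)
    skills: list[str] = field(default_factory=list)
    attitudes: list[str] = field(default_factory=list)
    competencies: list[str] = field(default_factory=list)


@dataclass
class JobProfile:
    title: str
    primary_activities: list[str]          # main activities within the enterprise/organization
    expected_outcomes: LearningOutcomeProfile


# Hypothetical entry, not taken from the project work syllabus.
innovation_engineer = JobProfile(
    title="Innovation engineer",
    primary_activities=["problem posing", "system specification", "MVP development"],
    expected_outcomes=LearningOutcomeProfile(
        knowledge=["UML notation", "QFD"],
        skills=["requirements elicitation", "rapid prototyping"],
        attitudes=["team collaboration"],
        competencies=["managing a product life cycle"],
    ),
)
print(innovation_engineer.title, "->", innovation_engineer.expected_outcomes.skills)
```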
The second component listed in the assessment table pertains to a test bank. This test bank includes a course reference text, “UML 2.0”, an Open Educational Resource (OER) that comprehensively covers software engineering. Students are required to complete a test based on this topic. For this purpose, a reverse-engineering section is introduced; here, students process and interpret provided Python code segments in order to create functional and system diagrams. They also work on IoT devices based on Arduino platforms, collecting field data and sending it to the cloud through specific Wi-Fi devices to accomplish the task.
Figure 6 also illustrates the interconnected relationship between the project syllabus, UML, the reverse-engineering process, and the corresponding assessment tools.
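To give a flavor of the reverse-engineering exercise, the short segment below is a hypothetical example of the kind of Python code students might be asked to interpret and translate into UML class and sequence diagrams; all class and method names are invented for illustration.

```python
# Hypothetical example of a short Python segment used for reverse engineering:
# students read code like this and derive the corresponding UML diagrams.
from dataclasses import dataclass


@dataclass
class Reading:
    sensor_id: str
    value: float


class FieldSensor:
    """Represents an IoT node that samples a physical quantity."""

    def __init__(self, sensor_id: str):
        self.sensor_id = sensor_id

    def sample(self) -> Reading:
        # A real node would query the hardware; here a constant stands in.
        return Reading(self.sensor_id, 21.5)


class CloudGateway:
    """Collects readings from sensors and forwards them upstream."""

    def __init__(self):
        self.buffer: list[Reading] = []

    def collect(self, sensor: FieldSensor) -> None:
        self.buffer.append(sensor.sample())

    def flush(self) -> int:
        sent = len(self.buffer)  # a real gateway would transmit over Wi-Fi here
        self.buffer.clear()
        return sent


if __name__ == "__main__":
    gateway = CloudGateway()
    gateway.collect(FieldSensor("greenhouse-01"))
    print(gateway.flush(), "reading(s) sent")
```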
Furthermore, an individualized self-assessment mechanism is also implemented. This mechanism allows for the differentiation of project work assessments based on each group member’s abilities and level of participation. In practice, it involves allocating a specific number of credits to each team member, who then distributes these credits among their peers based on their assessment of each colleague’s practical contributions to the prototype development.
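A minimal sketch of how the peer-allocated credits could be turned into individual weighting factors is given below; the credit budget, the example allocations, and the weighting formula are illustrative assumptions rather than the course’s official rule.

```python
# Minimal sketch of converting peer-allocated credits into individual weighting
# factors for the group mark. Budget, allocations, and formula are illustrative.
from collections import defaultdict

CREDIT_BUDGET = 100  # credits each member distributes among the other members

# Hypothetical allocations: giver -> {receiver: credits}, with no self-allocation.
allocations = {
    "Ada":   {"Bruno": 30, "Carla": 40, "Dario": 30},
    "Bruno": {"Ada": 35, "Carla": 35, "Dario": 30},
    "Carla": {"Ada": 40, "Bruno": 30, "Dario": 30},
    "Dario": {"Ada": 30, "Bruno": 30, "Carla": 40},
}

received = defaultdict(float)
for giver, shares in allocations.items():
    assert sum(shares.values()) == CREDIT_BUDGET, f"{giver} must allocate the full budget"
    for receiver, credits in shares.items():
        received[receiver] += credits

members = list(allocations)
equal_share = sum(received.values()) / len(members)  # credits received under equal effort

group_mark = 27  # hypothetical group result for the project work (30-point scale assumed)
for member in members:
    weight = received[member] / equal_share  # >1: above-average contribution; <1: below
    print(f"{member}: weight={weight:.2f}, adjusted mark={min(30.0, group_mark * weight):.1f}")
```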
3.1.5. Assessment Management
Managing a course while simultaneously enhancing the learning process can be a complex and non-intuitive task for educators, professionals, and their support staff. The challenges related to improving teaching and learning can be identified through questionnaires designed to gather sustainable evidence from practice.
Table 1 illustrates the assessment schema adopted in the GISP course.
As stated, Figure 6 depicts an assessment scenario in the background, including the project work syllabus, course material, specific reverse-engineering activities, and a knowledge-, skill-, attitude-, and competence-based map. Corresponding assessment tools and targets for both teams and individuals complement this tool.
To gain insights into the challenges and improvements needed for course delivery, both teachers and students can express their perceptions and assess various aspects of the process using semi-quantitative scales. Educational data mining and learning analytics (EDM/LA) play a vital role in extracting hidden knowledge from educational data. These datasets often comprise data collected during course delivery periods from the university’s information system and digital learning platforms.
Educators can use tools to evaluate the course content’s structure and its effectiveness in facilitating the learning process. These tools can classify students based on their feedback and monitoring perspectives. In some instances, they can even identify regular and atypical patterns in students’ behavior, helping to pinpoint their most common mistakes and develop more effective teaching activities.
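As a sketch of how such pattern detection might look, the example below flags atypical activity patterns from platform usage data; the feature set (weekly logins, submissions, forum posts), the synthetic data, and the choice of an isolation forest are illustrative assumptions, not the course’s actual analytics pipeline.

```python
# Minimal sketch of flagging atypical learning-behavior patterns from platform
# activity data. Features, data, and model choice are illustrative only.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(seed=7)

# Hypothetical weekly activity matrix: rows = students,
# columns = [logins, submissions, forum posts].
typical = rng.normal(loc=[10, 3, 2], scale=[2, 1, 1], size=(110, 3))
atypical = rng.normal(loc=[1, 0.2, 0], scale=[0.5, 0.2, 0.1], size=(5, 3))  # disengaged pattern
activity = np.clip(np.vstack([typical, atypical]), 0, None)

detector = IsolationForest(contamination=0.05, random_state=0)
flags = detector.fit_predict(activity)  # -1 marks students with atypical patterns

for idx in np.where(flags == -1)[0]:
    print(f"student {idx}: atypical pattern, activity={np.round(activity[idx], 1)}")
```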
Beyond the broader domain of course management, it is essential to consider the individual student’s perspective. Both perspectives benefit from the knowledge generated through the methods described above, as teaching improvements also contribute to students’ success. In the realm of EDM/LA applications, which primarily focus on modeling behavior and evaluating students’ learning performance, various documents in the literature discuss theoretical concepts and practical implementations. These systems generate valuable feedback for both educators and students; in fact, they can detect learning behaviors and proactively flag potential issues. They follow a student-oriented approach, recommending relevant activities, resources, curriculum adjustments, or links to help foster and enhance the learning experience.