A Scalable Architecture for the Dynamic Deployment of Multimodal Learning Analytics Applications in Smart Classrooms
Abstract
1. Introduction
- Use the MMLA literature to present a simulated but realistic scenario that surfaces the limitations of current technical approaches to orchestrating complex technical ecosystems in educational practice.
- Propose an MMLA architecture implementing SDN/NFV principles and exemplify how it can solve some of the detected challenges by deploying, dismantling, and reconfiguring MMLA applications in a scalable way.
- Perform several experiments to demonstrate the feasibility and performance of the proposed architecture in terms of the time required to deploy and reconfigure these applications.
2. Related Work
2.1. Smart Learning Environments and Classrooms
2.2. Architectures for Smart Learning Environments and Classrooms
2.3. Remote Classrooms and Labs
2.4. SDN and NFV Applied to Different Scenarios
3. Description of Simulated Case Study
3.1. Intelligent Tutoring System in the Classroom
- Context: In this scenario, students are practicing a specific topic through an intelligent tutoring system. Each student interacts individually with the environment on a computer. In order to provide just-in-time help, instructors need to know how students are advancing in this practice and what their mistakes or misconceptions are. A typical class has around 20 to 40 students.
- Application: When students interact with the intelligent tutoring environment, they generate events and clickstream data that can be processed to make inferences about their learning process. Based on these data, the analytics engine generates a number of indicators of students’ current skill and behavioral states. For example, it can show if a student is confused, needs help, has been idle for several minutes, or what their areas of struggle are, among other information. Additionally, each computer has a webcam continuously capturing the student’s face and expressions, and the analytics engine applies an affect-detection Machine Learning (ML) model to infer the student’s affective state (a minimal sketch of this windowed inference is given after this list). Instructors receive all this information through a dashboard in real time and can easily move within the classroom attending to students’ needs.
- Sensors and devices:
- Individual students’ devices: Students interact with the ITS by connecting to it as a web application. The ITS provides a series of scaffolded exercises adapted to each student’s current skill level. Students use the desktop PCs available in the classroom.
- Individual students’ webcam: Each student’s computer has a front camera that continuously captures a video feed of their facial expressions. This feed is used by the analytics engine to infer the emotional state in time windows.
- Instructor device: The instructor consumes the analytics via a dashboard by connecting from their device (tablet or laptop) to the visualizer provided by the architecture.
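To make the data flow in this scenario more concrete, the following is a minimal sketch (not taken from the paper) of how the analytics engine could periodically infer an affect label from the webcam feed in fixed time windows. The infer_affect function is a hypothetical placeholder for the trained affect-detection model, and the window length and sampling rate are illustrative values only.

```python
# Minimal sketch: periodic affect inference over webcam time windows,
# as described for the ITS scenario. The model itself is a placeholder.
import time
from collections import Counter

import cv2  # OpenCV, assumed available on the student PC


def infer_affect(frame) -> str:
    """Hypothetical placeholder for the affect-detection ML model.
    A real deployment would run a trained classifier on the face crop."""
    return "neutral"


def affect_over_window(camera_index: int = 0, window_s: int = 60, fps: float = 1.0) -> str:
    """Sample the webcam at roughly `fps` frames per second for `window_s`
    seconds and return the majority affect label for that time window."""
    cap = cv2.VideoCapture(camera_index)
    labels = []
    end = time.time() + window_s
    while time.time() < end:
        ok, frame = cap.read()
        if ok:
            labels.append(infer_affect(frame))
        time.sleep(1.0 / fps)
    cap.release()
    return Counter(labels).most_common(1)[0][0] if labels else "unknown"


# Example: one label per window could be pushed to the analytics engine,
# which forwards it to the instructor dashboard.
# print(affect_over_window(window_s=10))
```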
3.2. Tabletop Task Collaboration
- Context: In this scenario, students interact with a shared device known as an interactive multi-touch tabletop, which can easily support face-to-face collaboration with multiple students interacting at the same time. Students carry out a collaborative concept-mapping activity, a technique in which learners represent their understanding of a topic graphically by linking concepts and propositions [52]. At the same time, students converse with each other and discuss their decisions, and this voice stream is also captured through a microphone. The class is organized in groups of three students, and a typical class has around 7 to 14 groups.
- Application: The objective is to design an application that helps teachers become more aware of the collaborative process by making visible interactions that would otherwise be hard to quantify or notice. The application studies collaboration by considering both the verbal interactions when students talk to each other and the physical touches on the tabletop [53]. More specifically, it can use metrics to identify learners who are not contributing enough to the activity or are dominating it (in both physical and verbal interaction), groups that can work independently, or those that do not understand the task (a minimal sketch of such a participation indicator is given after this list). The instructor accesses all this information through a visualization dashboard on a hand-held device.
- Sensors and devices:
- Group multi-touch tabletop: Tabletop learning environments are large touch screens that allow multiple users to collaborate at the same time.
- Group overhead depth sensor: A Kinect sensor is used to track the position of each user, automatically detecting which student performed each touch.
- Group microphone array: It is located above the tabletop and captures the voices of all the group members, distinguishing who is speaking.
- Instructor device: The instructor consumes the analytics via a dashboard by connecting from their device (tablet or laptop) to the visualizer provided by the architecture.
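As an illustration of the participation metrics mentioned above, the following sketch (an assumption about one possible implementation, not the authors’ code) combines per-student touch counts from the tabletop and speaking time from the microphone array into a participation share, and flags learners who contribute too little or dominate. The thresholds are arbitrary example values.

```python
# Minimal sketch: a participation-symmetry indicator built from
# tabletop touches and detected speaking time per student.
from dataclasses import dataclass


@dataclass
class GroupActivity:
    touches: dict      # student_id -> number of tabletop touches
    speaking_s: dict   # student_id -> seconds of detected speech


def participation_shares(activity: GroupActivity) -> dict:
    """Return each student's share of the group's combined activity,
    weighting physical and verbal interaction equally."""
    total_touch = sum(activity.touches.values()) or 1
    total_speech = sum(activity.speaking_s.values()) or 1
    return {
        s: 0.5 * activity.touches.get(s, 0) / total_touch
           + 0.5 * activity.speaking_s.get(s, 0) / total_speech
        for s in set(activity.touches) | set(activity.speaking_s)
    }


def flag_imbalance(shares: dict, low: float = 0.15, high: float = 0.60) -> dict:
    """Flag learners contributing too little or dominating the activity."""
    return {s: ("low" if v < low else "dominant" if v > high else "ok")
            for s, v in shares.items()}


# Example for a group of three students:
g = GroupActivity(touches={"s1": 40, "s2": 5, "s3": 15},
                  speaking_s={"s1": 300, "s2": 20, "s3": 120})
print(flag_imbalance(participation_shares(g)))
```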
3.3. Programming Project-Based Learning and Instructor Indoor Positioning
- Context: Numerous programming courses have capstone projects where students need to implement an application that shows evidence of the different concepts acquired throughout the course. These courses usually allocate some sessions for students to start developing these projects in groups while instructors move from one group to another resolving doubts. Each group interacts with a shared programming environment (e.g., [61]) to develop the project collaboratively. The class is organized in groups of three students, and a typical class has around 7 to 14 groups.
- Application: In this scenario, there are two main applications. The first provides analytics on how the collaboration is working and how the project is advancing. This can include information on areas of struggle based on the code written and code compilations [59], but also on each member’s level of contribution to the project, analysis of the conversation, and engagement levels obtained from the analysis of physiological signals measuring activation and engagement. The second automatically tracks, through indoor positioning, how much time the instructor has spent helping each group, so that the instructor can balance the help each group receives (a minimal sketch of this time accounting is given after this list). The instructor can consult all this information through a dashboard in order to provide just-in-time and personalized support to each group.
- Sensors and devices:
- Individual students’ devices: Students interact with the collaborative programming environment by connecting to it through a web application.
- Individual Empatica E4 wristband: Each student wears an Empatica E4 wristband that captures heart rate, three-axis movement through an accelerometer, and the electrodermal activity of their skin.
- Group microphone array: It is located above each group’s table, distinguishing who is speaking.
- Group positioning sensor: It is located on each group’s table to detect the group’s center position.
- Instructor positioning badge: It is carried by the instructor when moving around the class. It implements Pozyx (https://www.pozyx.io/) technology, an ultra-wideband solution that provides accurate positioning and motion information with sub-meter accuracy (around 10 cm).
- Instructor device: The instructor consumes the analytics via a dashboard by connecting from their device (tablet or laptop) to the visualizer provided by the architecture.
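The time-accounting application mentioned above can be illustrated with a small sketch (hypothetical, not the paper’s implementation) that attributes each instructor position sample from the Pozyx badge to the nearest group table, taken from the per-table positioning sensors, and accumulates the time spent next to each group. The proximity radius and sampling period are assumed values.

```python
# Minimal sketch: attributing instructor time to groups from positioning samples.
import math
from collections import defaultdict


def nearest_group(pos, group_centres, max_dist_m=1.5):
    """Return the id of the group whose table centre is closest to the
    instructor position, or None if no table is within max_dist_m."""
    best, best_d = None, max_dist_m
    for gid, (gx, gy) in group_centres.items():
        d = math.hypot(pos[0] - gx, pos[1] - gy)
        if d <= best_d:
            best, best_d = gid, d
    return best


def time_per_group(samples, group_centres, sample_period_s=1.0):
    """samples: iterable of (x, y) instructor positions, one per sampling period.
    Returns the accumulated seconds spent next to each group."""
    totals = defaultdict(float)
    for pos in samples:
        gid = nearest_group(pos, group_centres)
        if gid is not None:
            totals[gid] += sample_period_s
    return dict(totals)


# Example with two group tables and a short position trace:
centres = {"g1": (1.0, 1.0), "g2": (4.0, 1.0)}
trace = [(1.1, 0.9), (1.2, 1.0), (3.9, 1.1), (2.5, 3.0)]
print(time_per_group(trace, centres))  # {'g1': 2.0, 'g2': 1.0}
```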
3.4. Requirements of the Previous Scenarios
4. Architecture
- External Data Sources. This level contains different external databases and tools, such as data from academic records, the Learning Management System (LMS), or Massive Open Online Courses (MOOCs), that can feed our architecture with relevant student data.
- Learning Analytics Platform. It hosts the components focused on analysing data provided by external sources and generated during the realization of learning activities.
- MEC System Level Management. This level is focused on (1) processing requests from instructors to reconfigure heterogeneous classroom devices in real time and on demand, (2) making decisions and orchestrating them to configure learning applications running on top of classroom devices, and (3) sensing classroom devices to detect misconfigurations or problems (an illustrative deployment request is sketched after this list).
- MEC Host. Heterogeneous classroom devices, also known as MEC Hosts, such as electronic blackboards, tablets, personal computers, servers, or Raspberry Pi devices, that need to be reconfigured according to the current learning course or subject.
- MEC Host Level Management. Level hosting the different managers able to control the life-cycle of the Virtualization infrastructure, MEC Platform, and MEC Apps running on the MEC Hosts.
- Network Level. This level contains the network infrastructure enabling the communication of MEC Hosts and the rest of the levels making up the architecture.
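To give a flavour of the on-demand reconfiguration handled by the MEC System Level Management, the sketch below shows what a request to deploy an MEC App packaged as a Docker container on a classroom MEC Host might look like. The orchestrator endpoint, descriptor fields, image name, and response format are illustrative assumptions rather than the architecture’s actual API.

```python
# Illustrative sketch only: submitting a deployment request for an MEC App
# (a Docker container) to a hypothetical orchestrator endpoint.
import requests  # pip install requests

ORCHESTRATOR_URL = "http://mec-orchestrator.example/api/deployments"  # hypothetical

deployment_request = {
    "app_name": "affect-detection",               # MEC App to deploy
    "image": "registry.example/mmla/affect:1.0",  # Docker image (hypothetical)
    "target_host": "classroom-b1-server",         # MEC Host selected for the course
    "resources": {"cpus": 2, "memory_mb": 2048},
    "lifetime_min": 55,                           # dismantle after the session
}


def deploy(request: dict) -> str:
    """Submit the request and return the deployment identifier assigned by
    the orchestrator (field name assumed)."""
    resp = requests.post(ORCHESTRATOR_URL, json=request, timeout=10)
    resp.raise_for_status()
    return resp.json()["deployment_id"]


# if __name__ == "__main__":
#     print(deploy(deployment_request))
```

An analogous request referencing the returned identifier could later dismantle or reconfigure the application when the session ends, which is the kind of operation timed in the experiments of Section 5.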
4.1. Learning Analytics Platform
4.2. MEC System Level Management
4.3. MEC Host Level
4.4. Network Level
4.5. Solutions Provided by our Architecture to the Previous Requirements
5. Experimentation Results
5.1. Testing Environment
5.2. Docker Container with High-Intensive Computing Application
5.3. Docker Container with Medium Computing Application
5.4. Docker Container with a High-Data Consuming Application
6. Discussion
7. Conclusions and Future Directions
Author Contributions
Acknowledgments
Conflicts of Interest
References
- Martín-Gutiérrez, J.; Mora, C.E.; Añorbe-Díaz, B.; González-Marrero, A. Virtual technologies trends in education. EURASIA J. Math. Sci. Technol. Educ. 2017, 13, 469–486.
- Timms, M.J. Letting artificial intelligence in education out of the box: Educational cobots and smart classrooms. Int. J. Artif. Intell. Educ. 2016, 26, 701–712.
- Borthwick, A.C.; Anderson, C.L.; Finsness, E.S.; Foulger, T.S. Special article personal wearable technologies in education: Value or villain? J. Digit. Learn. Teach. Educ. 2015, 31, 85–92.
- Ochoa, X.; Worsley, M. Augmenting Learning Analytics with Multimodal Sensory Data. J. Learn. Anal. 2016, 3, 213–219.
- Blikstein, P.; Worsley, M. Multimodal Learning Analytics and Education Data Mining: Using computational technologies to measure complex learning tasks. J. Learn. Anal. 2016, 3, 220–238.
- Romano, G.; Schneider, J.; Drachsler, H. Dancing Salsa with Machines—Filling the Gap of Dancing Learning Solutions. Sensors 2019, 19, 3661.
- Roque, F.; Cechinel, C.; Weber, T.O.; Lemos, R.; Villarroel, R.; Miranda, D.; Munoz, R. Using Depth Cameras to Detect Patterns in Oral Presentations: A Case Study Comparing Two Generations of Computer Engineering Students. Sensors 2019, 19, 3493.
- Shankar, S.K.; Prieto, L.P.; Rodríguez-Triana, M.J.; Ruiz-Calleja, A. A review of multimodal learning analytics architectures. In Proceedings of the 2018 IEEE 18th International Conference on Advanced Learning Technologies (ICALT), Mumbai, India, 9–13 July 2018; pp. 212–214.
- Hernández-García, Á.; Conde, M.Á. Dealing with complexity: Educational data and tools for learning analytics. In Proceedings of the Second International Conference on Technological Ecosystems for Enhancing Multiculturality, Salamanca, Spain, 1–3 October 2014; pp. 263–268.
- Di Mitri, D.; Schneider, J.; Specht, M.; Drachsler, H. The Big Five: Addressing Recurrent Multimodal Learning Data Challenges. In Proceedings of the 8th International Conference on Learning Analytics and Knowledge, Sydney, Australia, 5–9 March 2018.
- ETSI NFV ISG. Network Functions Virtualisation (NFV); Network Operator Perspectives on NFV Priorities for 5G; Technical Report; ETSI White Paper; ETSI: Nice, France, 2017.
- Singh, S.; Jha, R.K. A survey on software defined networking: Architecture for next generation network. J. Netw. Syst. Manag. 2017, 25, 321–374.
- Hwang, G.J. Definition, framework and research issues of smart learning environments-a context-aware ubiquitous learning perspective. Smart Learn. Environ. 2014, 1, 4.
- Bautista, G.; Borges, F. Smart classrooms: Innovation in formal learning spaces to transform learning experiences. Bull. IEEE Tech. Committee Learn. Technol. 2013, 15, 18–21.
- Muhamad, W.; Kurniawan, N.B.; Yazid, S. Smart campus features, technologies, and applications: A systematic literature review. In Proceedings of the 2017 International Conference on Information Technology Systems and Innovation (ICITSI), Bandung, Indonesia, 23–24 October 2017; pp. 384–391.
- Xie, W.; Shi, Y.; Xu, G.; Xie, D. Smart classroom-an intelligent environment for tele-education. In Proceedings of the Pacific-Rim Conference on Multimedia, Beijing, China, 24–26 October 2001; Springer: Berlin/Heidelberg, Germany, 2001; pp. 662–668.
- Snow, C.; Pullen, J.M.; McAndrews, P. Network EducationWare: An open-source web-based system for synchronous distance education. IEEE Trans. Educ. 2005, 48, 705–712.
- Qin, W.; Suo, Y.; Shi, Y. Camps: A middleware for providing context-aware services for smart space. In Proceedings of the International Conference on Grid and Pervasive Computing, Taichung, Taiwan, 3–5 May 2006; Springer: Berlin/Heidelberg, Germany, 2006; pp. 644–653.
- Suo, Y.; Miyata, N.; Morikawa, H.; Ishida, T.; Shi, Y. Open smart classroom: Extensible and scalable learning system in smart space using web service technology. IEEE Trans. Knowl. Data Eng. 2008, 21, 814–828.
- Muñoz-Cristóbal, J.A.; Rodríguez-Triana, M.J.; Gallego-Lema, V.; Arribas-Cubero, H.F.; Asensio-Pérez, J.I.; Martínez-Monés, A. Monitoring for awareness and reflection in ubiquitous learning environments. Int. J. Hum.–Comput. Interact. 2018, 34, 146–165.
- Serrano-Iglesias, S.; Bote-Lorenzo, M.L.; Gómez-Sánchez, E.; Asensio-Pérez, J.I.; Vega-Gorgojo, G. Towards the enactment of learning situations connecting formal and non-formal learning in SLEs. In Foundations and Trends in Smart Learning; Springer: Singapore, 2019; pp. 187–190.
- Huang, L.S.; Su, J.Y.; Pao, T.L. A context aware smart classroom architecture for smart campuses. Appl. Sci. 2019, 9, 1837.
- Lu, Y.; Zhang, S.; Zhang, Z.; Xiao, W.; Yu, S. A Framework for Learning Analytics Using Commodity Wearable Devices. Sensors 2017, 17, 1382.
- Miller, H.G.; Mork, P. From Data to Decisions: A Value Chain for Big Data. IT Profess. 2013, 15, 57–59.
- Perales, M.; Pedraza, L.; Moreno-Ger, P. Work-In-Progress: Improving Online Higher Education with Virtual and Remote Labs. In Proceedings of the 2019 IEEE Global Engineering Education Conference (EDUCON), Dubai, UAE, 8–11 April 2019; pp. 1136–1139.
- Dziabenko, O.; Orduña, P.; García-Zubia, J.; Angulo, I. Remote Laboratory in Education: WebLab-Deusto Practice. In Proceedings of the E-Learn: World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education, Montréal, QC, Canada, 9–12 October 2012; pp. 1445–1454.
- University of Deusto and DeustoTech. WebLab-Deusto. 2018. Available online: http://weblab.deusto.es/website (accessed on 1 May 2020).
- Huertas Celdrán, A.; Garcia, F.; Saenz, J.; De La Torre, L.; Salzmann, C.; Gillet, D. Self-Organized Laboratories for Smart Campus. IEEE Trans. Learn. Technol. 2019.
- De La Torre, L.; Neustock, L.T.; Herring, G.; Chacon, J.; Garcia, F.; Hesselink, L. Automatic Generation and Easy Deployment of Digitized Laboratories. IEEE Trans. Ind. Inform. 2020.
- Salzmann, C.; Govaerts, S.; Halimi, W.; Gillet, D. The Smart Device specification for remote labs. In Proceedings of the 2015 12th International Conference on Remote Engineering and Virtual Instrumentation (REV), Bangkok, Thailand, 25–27 February 2015; pp. 199–208.
- Salzmann, C.; Gillet, D. Smart device paradigm, Standardization for online labs. In Proceedings of the 2013 IEEE Global Engineering Education Conference (EDUCON), Berlin, Germany, 13–15 March 2013; pp. 1217–1221.
- Halimi, W.; Salzmann, C.; Jamkojian, H.; Gillet, D. Enabling the Automatic Generation of User Interfaces for Remote Laboratories. In Online Engineering & Internet of Things; Springer: Cham, Switzerland, 2018; pp. 778–793.
- Huertas Celdrán, A.; Gil Pérez, M.; García Clemente, F.J.; Martínez Pérez, G. Automatic monitoring management for 5G mobile networks. Procedia Comput. Sci. 2017, 110, 328–335.
- Salahuddin, M.A.; Al-Fuqaha, A.; Guizani, M.; Shuaib, K.; Sallabi, F. Softwarization of Internet of Things Infrastructure for Secure and Smart Healthcare. Computer 2017, 50, 74–79.
- Muñoz, R.; Nadal, L.; Casellas, R.; Moreolo, M.S.; Vilalta, R.; Fabrega, J.M.; Martinez, R.; Mayoral, A.; Vilchez, F.J. The ADRENALINE testbed: An SDN/NFV packet/optical transport network and edge/core cloud platform for end-to-end 5G and IoT services. In Proceedings of the 2017 European Conference on Networks and Communications (EuCNC), Oulu, Finland, 12–15 June 2017; pp. 1–5.
- Nguyen, V.G.; Brunstrom, A.; Grinnemo, K.J.; Taheri, J. SDN/NFV-Based Mobile Packet Core Network Architectures: A Survey. IEEE Commun. Surv. Tutor. 2017, 19, 1567–1602.
- Ge, X.; Zhou, R.; Li, Q. 5G NFV-Based Tactile Internet for Mission-Critical IoT Services. IEEE Internet Things J. 2019.
- Qu, K.; Zhuang, W.; Ye, Q.; Shen, X.; Li, X.; Rao, J. Dynamic Flow Migration for Embedded Services in SDN/NFV-Enabled 5G Core Networks. IEEE Trans. Commun. 2020, 68, 2394–2408.
- Huertas Celdrán, A.; Gil Pérez, M.; García Clemente, F.J.; Martínez Pérez, G. Sustainable securing of Medical Cyber-Physical Systems for the healthcare of the future. Sustain. Comput. Inform. Syst. 2018, 19, 138–146.
- Molina Zarca, A.; Bernabe, J.B.; Trapero, R.; Rivera, D.; Villalobos, J.; Skarmeta, A.; Bianchi, S.; Zafeiropoulos, A.; Gouvas, P. Security Management Architecture for NFV/SDN-Aware IoT Systems. IEEE Internet Things J. 2019, 6, 8005–8020.
- Long, Y.; Aleven, V. Educational game and intelligent tutoring system: A classroom study and comparative design analysis. ACM Trans. Comput.-Hum. Interact. (TOCHI) 2017, 24, 1–27.
- Kangas, M.; Koskinen, A.; Krokfors, L. A qualitative literature review of educational games in the classroom: The teacher’s pedagogical activities. Teach. Teach. 2017, 23, 451–470.
- Tissenbaum, M.; Slotta, J. Supporting classroom orchestration with real-time feedback: A role for teacher dashboards and real-time agents. Int. J. Comput.-Support. Collab. Learn. 2019, 14, 325–351.
- Holstein, K.; McLaren, B.M.; Aleven, V. Intelligent tutors as teachers’ aides: Exploring teacher needs for real-time analytics in blended classrooms. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference, Vancouver, BC, Canada, 13–17 March 2017; pp. 257–266.
- Holstein, K.; Hong, G.; Tegene, M.; McLaren, B.M.; Aleven, V. The classroom as a dashboard: Co-designing wearable cognitive augmentation for K-12 teachers. In Proceedings of the 8th International Conference on Learning Analytics and Knowledge, Sydney, Australia, 7–9 March 2018; pp. 79–88.
- UNESCO Bangkok Office. School and Teaching Practices for Twenty-First Century Challenges: Lessons from the Asia-Pacific Region—Regional Synthesis Report; Technical Report; UNESCO: Bangkok, Thailand, 2016; Available online: https://unesdoc.unesco.org/ark:/48223/pf0000244022 (accessed on 1 May 2020).
- Laal, M.; Laal, M.; Kermanshahi, Z.K. 21st century learning; learning in collaboration. Procedia-Soc. Behav. Sci. 2012, 47, 1696–1701.
- Martinez-Maldonado, R.; Kay, J.; Buckingham Shum, S.; Yacef, K. Collocated collaboration analytics: Principles and dilemmas for mining multimodal interaction data. Hum.–Comput. Interact. 2019, 34, 1–50.
- Praharaj, S.; Scheffel, M.; Drachsler, H.; Specht, M. Multimodal analytics for real-time feedback in co-located collaboration. In Proceedings of the European Conference on Technology Enhanced Learning, Leeds, UK, 3–6 September 2018; pp. 187–201.
- Schneider, B.; Wallace, J.; Blikstein, P.; Pea, R. Preparing for future learning with a tangible user interface: The case of neuroscience. IEEE Trans. Learn. Technol. 2013, 6, 117–129.
- Maldonado, R.M.; Kay, J.; Yacef, K.; Schwendimann, B. An interactive teacher’s dashboard for monitoring groups in a multi-tabletop learning environment. In Proceedings of the International Conference on Intelligent Tutoring Systems, Chania, Greece, 14–18 June 2012; pp. 482–492.
- Novak, J.D.; Cañas, A.J. The Theory Underlying Concept Maps and How to Construct and Use Them; Technical Report; Florida Institute for Human and Machine Cognition: Pensacola, FL, USA, 2008; Available online: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.100.8995&rep=rep1&type=pdf (accessed on 1 May 2020).
- Fleck, R.; Rogers, Y.; Yuill, N.; Marshall, P.; Carr, A.; Rick, J.; Bonnett, V. Actions speak loudly with words: Unpacking collaboration around the table. In Proceedings of the ACM International Conference on Interactive Tabletops and Surfaces, Banff, Canada, 23–25 November 2009; pp. 189–196.
- Kokotsaki, D.; Menzies, V.; Wiggins, A. Project-based learning: A review of the literature. Improv. Schools 2016, 19, 267–277.
- Topalli, D.; Cagiltay, N.E. Improving programming skills in engineering education through problem-based game projects with Scratch. Comput. Educ. 2018, 120, 64–74.
- Marques, M.; Ochoa, S.F.; Bastarrica, M.C.; Gutierrez, F.J. Enhancing the student learning experience in software engineering project courses. IEEE Trans. Educ. 2017, 61, 63–73.
- Martinez-Maldonado, R. “I Spent More Time with that Team”: Making Spatial Pedagogy Visible Using Positioning Sensors. In Proceedings of the 9th International Conference on Learning Analytics & Knowledge, Tempe, AZ, USA, 4–8 March 2019; pp. 21–25.
- Spikol, D.; Ruffaldi, E.; Cukurova, M. Using multimodal learning analytics to identify aspects of collaboration in project-based learning. In Proceedings of the CSCL’17: The 12th International Conference on Computer Supported Collaborative Learning, Philadelphia, PA, USA, 18–22 June 2017.
- Blikstein, P. Using learning analytics to assess students’ behavior in open-ended programming tasks. In Proceedings of the 1st International Conference on Learning Analytics and Knowledge, Banff, AB, Canada, 27 February–1 March 2011; pp. 110–116.
- Ahonen, L.; Cowley, B.U.; Hellas, A.; Puolamäki, K. Biosignals reflect pair-dynamics in collaborative work: EDA and ECG study of pair-programming in a classroom environment. Sci. Rep. 2018, 8, 1–16.
- Goldman, M.; Little, G.; Miller, R.C. Collabode: Collaborative coding in the browser. In Proceedings of the 4th International Workshop on Cooperative and Human Aspects of Software Engineering, Waikiki, HI, USA, 21 May 2011; pp. 65–68.
- Prinsloo, P.; Slade, S. An elephant in the learning analytics room: The obligation to act. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference, Vancouver, BC, Canada, 13–17 March 2017; pp. 46–55.
- Shankar, S.K.; Rodríguez-Triana, M.J.; Ruiz-Calleja, A.; Prieto, L.P.; Chejara, P.; Martínez-Monés, A. Multimodal Data Value Chain (M-DVC): A Conceptual Tool to Support the Development of Multimodal Learning Analytics Solutions. IEEE Rev. Iberoam. Tecnol. Aprendiz. 2020, 15, 113–122.