Article

Inven!RA Architecture for Sustainable Deployment of Immersive Learning Environments

1 Regional Delegation of Coimbra, Universidade Aberta, R. Alexandre Herculano 52, 3000-019 Coimbra, Portugal
2 Institute for Systems and Computing Engineering, Technology and Science (INESC TEC), 4200-465 Porto, Portugal
3 Faculty of Engineering, University of Porto, 4200-465 Porto, Portugal
4 Department of Curriculum and Instruction, University of Arkansas, Fayetteville, AR 72701, USA
5 Institute of Interactive Systems and Data Science, Graz University of Technology, 8010 Graz, Austria
6 Regional Delegation of Porto, Universidade Aberta, 4200-055 Porto, Portugal
7 Department of Education and Psychology, University of Aveiro, 3810-193 Aveiro, Portugal
8 CIDTFF—Research Centre on Didactics and Technology in the Education of Trainers, 3810-193 Aveiro, Portugal
9 Department of Sciences and Technology, Universidade Aberta, 1250-100 Lisboa, Portugal
10 School of Sciences and Technology, Universidade de Trás-os-Montes e Alto Douro (UTAD), 5000-801 Vila Real, Portugal
11 Grupo Internacional de Pesquisa Educação Digital—GPe-dU, University of Vale do Rio dos Sinos (UNISINOS), São Leopoldo 93022-750, Brazil
* Author to whom correspondence should be addressed.
Sustainability 2023, 15(1), 857; https://doi.org/10.3390/su15010857
Submission received: 2 December 2022 / Revised: 15 December 2022 / Accepted: 27 December 2022 / Published: 3 January 2023
(This article belongs to the Special Issue Sustainability in Educational Gamification)

Abstract

The objective of this work was to support the sustainable deployment of immersive learning environments, which face varied obstacles, including the lack of support infrastructures for active learning pedagogies. Sustainability from the perspective of the integration of these environments in educational practice entails situational awareness, workload, and the informed assessment ability of participants, which must be supported for such activities to be employed in a widespread manner. We have approached this wicked problem using the Design Science Research paradigm and produced the Inven!RA software architecture. This novel result constitutes a solution for developing software platforms to enable the sustainable deployment of immersive learning environments. The Inven!RA architecture is presented alongside four demonstration scenarios employed in its evaluation, providing a means for the situational awareness of immersive learning activities in support of pedagogic decision making.

1. Introduction

Immersive learning environments hold promise but are still relatively uncommon in educational and training settings [1]. Occasional use is one thing, but achieving the sustained use of immersive learning, for different learning goals and throughout the entire learning process, is quite another. In this paper, we discuss this status and its causes in Section 2 and then focus on a contribution to solving one of its obstacles: the lack of support infrastructures for active learning pedagogies [2].
Initial preparations for adopting immersive learning, such as dealing with technical and cost constraints, may draw the attention of educators and learners. But sustainability requires moving beyond initiating the use of immersive learning environments: it involves the perspective of their integration in educational practice. Educational agents (teachers, students, managers, etc.) require conditions to participate and make informed decisions in their practice, regardless of the environments, strategies, and tools. These conditions include situational awareness, manageable workloads, and the ability to conduct an informed assessment, among others [1].
As we put forward in this paper, even though immersive learning environments are pedagogy agnostic, the theoretical lens of immersion [3] leans toward the development of active learning pedagogies. Thus, the conditions for informed decision making and participation within immersive learning environments are entangled with the issues and obstacles of adopting an active learning pedagogy. Acknowledged in the literature, in this regard, is the need for support infrastructures [2].
In this paper, we approached this wicked problem using the Design Science Research paradigm [4], describing how we conducted four iterations of demonstration and evaluation scenarios. The resulting contribution, i.e., the design science artifact, is the Inven!RA software architecture, guiding software development efforts for such support infrastructures. The Inven!RA architecture is presented as a means for the awareness of immersive learning activities in support of pedagogic decision making in active learning approaches.

2. Background

2.1. Immersive Learning Environments

The phenomenon of immersion is addressed in the literature from several distinct and complementary perspectives. Presented by Janet Murray in 1997 as “the sensation of being surrounded by a completely other reality” [5], immersion has since been shown to arise not only from sensory perception provided by the physical, technological, or societal system but also from one’s attentional absorption. This attentional absorption was measured along two dimensions: the meaning/narrative and one’s agency/challenge [3,6]. “Within the immersive environment, the technical system acts and its properties emerge, the narrative content reaches, and the challenges are met” [7]. Immersive learning environments are thus the contexts where learning occurs in association with the phenomenon of immersion, along the dimensions of the system, narrative, and agency/challenge.
As an example, an individual may consider the classical classroom lecture under this immersion lens. It has some system immersion because both students and lecturer are perceptually present within the physical room. It also has some narrative immersion, depending on the amount of attention drawn by the lecturer’s performance (more immersive to the lecturer than to the students). Finally, the classical classroom possesses some agency immersion, most definitely for the lecturer, overcoming the challenge of providing the lecture, and possibly for students, should they be engaged in the challenge of taking notes, summarizing, and highlighting. However, while the system immersion dimension will be similar for everyone, the narrative and agency dimensions vary tremendously, depending on the participant. Over the long progress of educational science and pedagogy, proposals have emerged to evolve, transform, or abandon the classical classroom learning environment.
When one considers immersion as the theoretical lens to interpret learning environments, it is expected that pedagogical attention will be drawn to the three dimensions: How to enhance the perception of being present within a system (physical, technological, societal, etc.)? How to enhance absorption with meaning (i.e., a narrative)? How to enhance absorption with agency (i.e., a challenge)? The latter is obviously a driver of active participation, but we assert that the system and narrative dimensions are also drivers. The enhanced perception of being present within a system leads to greater personal involvement with it. This is true in the sense that Slater described as the fusion of place illusion with plausibility illusion, or the awareness that if something is real, one is part of it [8]. Also, greater narrative immersion arises not only from the consideration of spatial and temporal aspects in the narrative but also (once again) from involvement of the self: emotional elements [9]. Consequently, these drivers most obviously encourage “learning by doing” approaches, such as experiential learning [10], and participatory social approaches, such as Communities of Practice [11] or, more recently, Connectivism [12]. They also encourage approaches that engage the self on the narrative dimension through reinterpretation and the finding of personal meaning: the “invention of problems” or “invention of the world” in inventive learning theory [2].
Thus, immersive learning environments may lead to the promotion of pedagogical approaches which seek to leverage the active role of learners, or indeed of all participants. While their learning effectiveness has been demonstrated [10], significant obstacles remain to their sustained use, as addressed in the next section.

2.2. Deployment Obstacles of Immersive Learning Environments

As we put forward in the previous section, an approach based on active learning is an eventual pedagogical choice when considering immersive learning environments. This requires addressing its sustainability. The predominant instructional approach remains quite traditional and teacher centered, in spite of more than a century of calls for the adoption of various active learning pedagogies, as well as thousands of research studies supporting their effectiveness and proposing various implementation approaches. The traditional teacher-centered approach still dominates in higher education [13] and in primary and secondary education [14], with preschool and kindergarten being the notable exceptions [15]. That efforts have been taking place but not entirely taking hold points toward sustainability problems for active learning approaches and, consequently, for immersive learning environments. Barriers to active learning deployment are diverse, at the level of the administrative system, the students, the content, and the teachers themselves [14]. Some of these barriers may be addressed via professional development, learning designs, and the alignment of research with practice [13]. However, a critical aspect is the required supporting infrastructure [13]. Typical educational sciences recommendations to overcome this issue are that educators adopt a personal stance of experimentation, to the point of requiring their “tenacity” [14].
The issue of the sustainable adoption of active learning, or indeed of immersive learning environments, is what Rittel and Webber defined as a “wicked problem”, i.e., one for which no definitive formulation exists: “One cannot understand the problem without knowing about its context; one cannot meaningfully search for information without the orientation of a solution concept; one cannot first understand, then solve. The systems-approach (…) is inadequate for dealing with wicked-problems” [16]. Consider the multiple aspects an instructor needs to keep track of to be aware of their educational context and act accordingly. Many of the aspects an instructor relies upon while teaching are things that they become aware of while teaching. For example, an instructor who is lecturing can observe the facial expressions of their students and alter the direction of their lecture based on that information. However, when the instructor adopts an active learning approach, students may not be facing the instructor, and thus this aspect is lost [17], impairing instructor awareness and restricting their ability to orchestrate learning. Consequently, basic tasks such as planning time, providing feedback, and conducting a grounded assessment become increasingly complex [1]. Enhanced learning analytics, such as cluster analysis [18], and other informational tools are required, enabling a level of situational awareness and insight that one starts to find in business-oriented scenarios [19] but not yet in educational scenarios, although some early proposals are starting to emerge [20,21].
These problems, summarized in Table 1, are mostly from the instructor perspective, but students and other participants in education similarly face adoption problems [14].
Considering the wicked nature of this problem, this critical adoption barrier points toward the need for infrastructure that supports transformation itself rather than one that encourages maintaining current practices. Transformation in education can only be sustained if the enabling technological platforms are designed to empower and encourage the adoption of active learning pedagogies and immersive learning environments [2]. Left on its own, tenacity will not overcome this barrier to sustainability and will result in an unintended reinforcement of the current state of affairs.

3. The Inven!RA Architecture

3.1. Overview

The design presented in this paper aims to tackle the wicked problem of the lack of support infrastructure for the active learning pedagogy presented above. It is a broker pattern [22] software architecture approach based on providing educational actors (instructors, students, collaborators, and managers) with decision-support dashboards for awareness within immersive learning environments. The core concept of Inven!RA (pronounced [ɪn-vehn-i-rɑ]) is that learning designers can delineate a plan with activities and overall goals/indicators and associate these with individual analytics emerging from the various learning activities [23,24]. The name stands for “a means for Inventive agency amidst Reticular ecosystems of Atopic habitats within which knowledge (!) emerges” [23]. When a platform implements the Inven!RA architecture, the learning activities are not considered part of that platform. Rather, they are provided independently by third-party Activity Providers, which coordinate with the platform via Inven!RA-mandated Web services. Also, Inven!RA does not compete with Learning Management Systems (LMS) regarding overall management concerns, such as course plans and instructions. Instead, Inven!RA inter-operates with the LMS. Figure 1 provides an overview of Inven!RA and its relationship with the LMS and Activity Providers.
In Figure 1, we represent the possible roles, which may be played by different people or indeed the same individual. A Creator designs learning plans, which Inven!RA calls Inventive Activity Plans (IAP). A Deployer deploys them, i.e., instantiates an IAP. A typical scenario involves an instructor providing an IAP to a specific class, but the Deployer could also be an autonomous learner initiating a personal IAP or a managerial/technical individual. The instantiated IAPs can be included in an LMS course by the Deployer. An Active Agent accessing the course in the LMS can then access the activities in the IAP, brokered by the platform implementing Inven!RA, which provides personal data protection between the LMS and the third-party Activity Provider, as well as, critically, ensuring that analytics tracked at the Activity Provider are associated with the activity’s specific instance deployed for an IAP. The Active Agent is typically a student but could also be an instructor involved in cooperative learning scenarios or some other participant. Finally, an Awareness Agent accesses Inven!RA to check the status of the IAP goals and indicators. This can be an instructor seeking to orchestrate the class but could also be a student seeking to self-regulate or co-regulate learning, a manager/chair seeking to support the course, or some other relevant role.
We based the design of the Inven!RA software architecture on two contributions. The first inspiration was the early effort of the BEACONING architecture [25], which put forward that a videogame could provide an overall narrative to a learning activity, with minigames embedded in it. BEACONING included an overall gamified lesson plan providing transversal analytics on plan goals collected from the various minigames. Inven!RA moves beyond the restrictions of having a videogame narrative controlling the activity path and does not require screen-based activities. Rather, in Inven!RA, any activity can be incorporated in an IAP as long as its Activity Provider collects individualized analytics in association with an instantiated IAP, enabling the use of immersive learning environments which are not screen based, such as pervasive Internet of Things environments or mixed-reality scenarios. The second inspiration was Baptista’s Triadic Certification Approach for game-based learning [26], which foresees that training activities are mapped to different contributions to competence levels, enabling awareness of competence development from fulfilling game or simulator challenges. Inven!RA takes this concept beyond certification and into overall awareness of learning aspects. The path from these two inspirations to the form we present here is described in the Methodology (Section 4).

3.2. Operation

The operation of Inven!RA is presented via the UML sequence diagrams for each use case associated with the various actors in Figure 1. The detailed specifications of the request formats and data formats are available as a technical report at a public repository [27].
The Inven!RA architecture assumes prior knowledge of the available activities, which provides the platform with information on the various Web services that enable their operation with Inven!RA, as per Figure 1: the list of configuration parameters, the activity configuration interface, the list of available analytics (the “Analytics contract”), the Deploy request endpoint for instantiating the activity, and the list of current analytics for an activity instance. Only the “Provide activity” service is not part of the registration, being communicated at run-time (see the Deployer use-case operation below). The actual form of accomplishing this prior knowledge is not part of this specification. For instance, it may be implemented via the traditional prior registration of activities (as we have done for our prototypes), via shared directory services, via blockchains of activities, or via other methods.
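To make this registration concrete, the following is a minimal sketch, in TypeScript, of the kind of record an Activity Provider might register with an Inven!RA platform, mirroring the Web services listed above. The interface, field names, and URLs are illustrative assumptions on our part; the actual request and data formats are those specified in the technical report [27].

```typescript
// A hypothetical registration record for an Activity Provider, mirroring the
// Web services listed above. Field names and URLs are illustrative; the real
// formats are specified in the technical report [27].
interface ActivityRegistration {
  activityId: string;
  name: string;
  configParamsUrl: string;      // list of configuration parameters
  configInterfaceUrl: string;   // HTML configuration interface to embed
  analyticsContractUrl: string; // list of available analytics ("Analytics contract")
  deployUrl: string;            // Deploy request endpoint for instantiating the activity
  analyticsUrl: string;         // current analytics for an activity instance
  // Note: the "Provide activity" URL is not registered; it is returned at
  // deployment time, as described in the Deployer use cases below.
}

// Illustrative registration of a hypothetical Arduino programming activity.
const arduinoActivity: ActivityRegistration = {
  activityId: "arduino-env-sensing",
  name: "Arduino environment sensing",
  configParamsUrl: "https://provider.example/arduino/config-params",
  configInterfaceUrl: "https://provider.example/arduino/config-ui",
  analyticsContractUrl: "https://provider.example/arduino/analytics-contract",
  deployUrl: "https://provider.example/arduino/deploy",
  analyticsUrl: "https://provider.example/arduino/analytics",
};
```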
The Creator case is presented in Figure 2 and its operation in Figure 3.
When the Creator sees a list of available activities and chooses one, Inven!RA requests the HTML code provided by the Activity Provider and integrates it into its interface. This enables the provider to empower the configuration interface with responsive behaviors rather than it being a mere static form, as shown in Figure 3. For instance, a mechanical maintenance trainer could use this configuration interface to specify in 3D which tasks should be accomplished during the activity. It could also serve as an entry point to configuration activities taking place outside the Web interface, such as geotagging locations or interacting with physical items. Upon completion, the configuration interface is responsible for storing the resulting parameters within the Web interface as hidden form input values, for Inven!RA to harvest and associate as that activity’s configuration parameters within the IAP where the activity is being inserted.
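As a minimal sketch of this parameter-harvesting step, the following TypeScript fragment assumes a browser context in which the Activity Provider’s configuration interface has been embedded into the Inven!RA page; the function name and usage are illustrative, not part of the specification.

```typescript
// Hypothetical sketch of the harvesting step: after the embedded configuration
// interface stores its resulting parameters as hidden <input> elements, Inven!RA
// reads them back. Assumes a browser context; names are illustrative.
function harvestConfigParams(configRoot: HTMLElement): Record<string, string> {
  const params: Record<string, string> = {};
  configRoot
    .querySelectorAll<HTMLInputElement>('input[type="hidden"]')
    .forEach((input) => {
      params[input.name] = input.value;
    });
  return params;
}

// Usage sketch: store the harvested parameters with the activity inside the IAP.
// iap.activities.push({ activityId, configParams: harvestConfigParams(rootElement) });
```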
In parallel, the Creator can define the list of goals and indicators for the IAP as a whole and, as activities are included in the IAP, map their analytics to those goals and indicators.
The Deployer case is presented in Figure 4 and its operation in Figure 5.
When the Deployer selects an IAP for deployment, there is an opportunity to adjust the configuration of each activity. For instance, a teacher performing the deployment may want to reflect class sizes, linguistic preferences, etc. This could also be the moment to associate this deployment with some budget or service acquisition, or with access keys. The process is identical to the Creator’s in Figure 3. When the Deployer eventually requests actual deployment, Inven!RA instructs all Activity Providers behind the individual activities to instantiate them and return deploy URLs. These URLs will contain any keys or information that the Activity Provider requires in order to associate requests with these instances of activities. However, these URLs cannot be exposed outside Inven!RA, because calling them directly would bypass Inven!RA and prevent it from brokering the interactions. Therefore, as shown in Figure 5, Inven!RA generates a matching Inven!RA deploy URL for each activity instance. The Deployer can then access their LMS normally to design a course and include these Inven!RA deploy URLs, configuring them to attach the LMS UserIDs when those URLs are clicked/requested. This configuration will later enable an analytics association with individual LMS users, as shown in the Active Agent case below.
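A minimal sketch of this deployment brokering step is shown below, assuming a hypothetical Activity Provider deploy endpoint that returns a JSON body with a deployUrl field; the endpoint paths, field names, and URL scheme are our illustrative assumptions, not the published specification [27].

```typescript
// Hypothetical sketch of the deployment brokering step: for each activity in the
// IAP instance, Inven!RA asks the Activity Provider's deploy endpoint for a
// provider-side deploy URL, keeps it private, and exposes its own deploy URL
// instead. Endpoint paths and field names are illustrative assumptions.
interface DeployedActivity {
  activityInstanceId: string;
  providerDeployUrl: string; // never exposed outside Inven!RA
  invenraDeployUrl: string;  // handed to the Deployer for inclusion in the LMS course
}

async function deployActivity(
  invenraBaseUrl: string,
  providerDeployEndpoint: string,
  iapInstanceId: string,
  activityId: string,
): Promise<DeployedActivity> {
  const response = await fetch(providerDeployEndpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ iapInstanceId, activityId }),
  });
  const { deployUrl } = await response.json(); // provider URL carrying its own keys
  const activityInstanceId = `${iapInstanceId}:${activityId}`;
  return {
    activityInstanceId,
    providerDeployUrl: deployUrl,
    // The LMS is configured to append the LMS UserID when this URL is requested.
    invenraDeployUrl: `${invenraBaseUrl}/deploy/${encodeURIComponent(activityInstanceId)}`,
  };
}
```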
The Active Agent case is presented in Figure 6 and its operation in Figure 7.
The Active Agent accesses a course on the LMS and eventually may click on one of the activity URLs. The aforementioned configuration performed by the Deployer will cause the LMS to attach the LMS UserID to this request, identifying the agent and thus enabling analytics to be associated with it. Upon receiving this request, Inven!RA replaces this LMS UserID with an internal Inven!RA UserID to prevent the third-party Activity Provider from receiving this personal data identifier. Inven!RA also attaches the configuration parameters for this instance of the activity and forwards the request to the Activity Provider. The Activity Provider responds with the Web interface for the activity, which Inven!RA relays to the Active Agent, thus enabling the activity to be performed from then on. While the activity is taking place, the Activity Provider collects analytics, associating them with the Inven!RA UserID and the particular activity instance.
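The following is a minimal sketch, under our own assumptions about names and payloads, of the brokering step just described: replacing the LMS UserID with an internal Inven!RA UserID, attaching the configuration parameters, and forwarding the request to the Activity Provider.

```typescript
import { randomUUID } from "node:crypto";

// Hypothetical sketch of the Active Agent brokering step: swap the LMS UserID
// for an internal Inven!RA UserID, attach the activity instance's configuration
// parameters, and forward the request to the Activity Provider. Names and the
// payload shape are illustrative assumptions.
const lmsToInvenraUserId = new Map<string, string>();

function internalUserIdFor(lmsUserId: string): string {
  let internal = lmsToInvenraUserId.get(lmsUserId);
  if (!internal) {
    internal = randomUUID(); // opaque Inven!RA UserID, kept internal
    lmsToInvenraUserId.set(lmsUserId, internal);
  }
  return internal;
}

async function brokerActivityRequest(
  lmsUserId: string,
  providerActivityUrl: string,
  configParams: Record<string, string>,
): Promise<string> {
  const invenraUserId = internalUserIdFor(lmsUserId);
  const response = await fetch(providerActivityUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ userId: invenraUserId, configParams }),
  });
  return response.text(); // the activity's Web interface, relayed to the Active Agent
}
```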
The “Activity” frame shown in Figure 7 can take place outside the Web. For instance, the Web interface that the Active Agent receives can launch a different application, such as a videoconferencing room, a shared online document, a metaverse space, or simply provide instructions and keys/authentication for pursuing the activity elsewhere. An example of this is provided in Section 5.2.
The Awareness Agent case is presented in Figure 8 and its operation in Figure 9.
As put forward in Section 3.1, the Awareness Agent can be anyone who requires better awareness to perform: an instructor, a student, a manager, a program chair, etc. When the Awareness Agent accesses Inven!RA to check a dashboard, Inven!RA requests, from each Activity Provider with instances involved in the IAP, the current analytics for their activity instances. The various analytics are then combined into information for the goals and indicators specified by the Creator (see Figure 2 and Figure 3). Any user-specific restrictions are also applied here (for instance, if the Awareness Agent is a student, the pedagogical option may be to restrict visibility to only their personal information).
Some of the analytics may be qualitative and require access to custom Activity Provider analytics pages. For instance, suppose a georeferenced activity includes a map trajectory of all locations visited by the Active Agents as a qualitative analytic. In this case, the Inven!RA dashboard might elect to provide a link to the Activity Provider’s custom trajectories analytics page. Examples are provided in Section 5.2 and Section 5.3.
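As a minimal sketch of how a dashboard request might combine the analytics collected from Activity Providers into the Creator-defined goals and indicators, consider the following TypeScript fragment; the data shapes, the optional link to a qualitative analytics page, and the grouping logic are illustrative assumptions, not the specified formats [27].

```typescript
// Hypothetical sketch of dashboard assembly: gather current analytics from each
// activity instance's Activity Provider and group them under the goals and
// indicators defined by the Creator. Data shapes are illustrative assumptions.
interface AnalyticsItem {
  userId: string;
  name: string;            // e.g. "firmware uploaded into Arduino"
  value: number | boolean;
  detailsUrl?: string;     // optional link to a qualitative analytics page
}

interface GoalMapping {
  goal: string;            // goal or indicator defined by the Creator
  analyticNames: string[]; // analytics mapped onto this goal
}

async function buildDashboard(
  analyticsEndpoints: string[], // one endpoint per activity instance in the IAP
  mappings: GoalMapping[],
): Promise<Map<string, AnalyticsItem[]>> {
  const collected: AnalyticsItem[] = [];
  for (const endpoint of analyticsEndpoints) {
    const response = await fetch(endpoint);
    collected.push(...((await response.json()) as AnalyticsItem[]));
  }
  // Group the collected analytics under the goals/indicators they map to.
  const dashboard = new Map<string, AnalyticsItem[]>();
  for (const mapping of mappings) {
    dashboard.set(
      mapping.goal,
      collected.filter((item) => mapping.analyticNames.includes(item.name)),
    );
  }
  return dashboard;
}
```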

4. Methodology

Design Science Research Iterations

The class of wicked problems of Rittel and Webber, for which no definitive formulation exists, is often addressed via design thinking [16]. Thus, we chose to employ the Design Science Research (DSR) paradigm [4], following the methodology of Peffers et al. for applying DSR to information systems research [28], based on the phases (a) Problem Identification and Motivation, (b) Defining the Objectives for a Solution, (c) Design and Development, (d) Demonstration, (e) Evaluation, and (f) Communication. The two final phases, (e) and (f), cycle into either (b) or (c) for refining the knowledge developed in this process.
Our wicked problem is the sustainable deployment of immersive learning environments, with the motivation of supporting their application in the context of active learning pedagogies, as argued in Section 2.2. The objective of our solution was to develop a software architecture that could guide efforts to overcome the lack of support infrastructure for active learning pedagogies with immersive learning environments.
Our design started from early approaches by Bourazeri et al. [25] and Baptista et al. [26], as described in Section 3.1, and we developed a prototype platform implementing the first design inception of the Inven!RA architecture, presented in Section 5.1. In that section, we show how we demonstrated it by attempting to solve a scenario of microelectronics training with two activities, comprising (1) drafting circuit specifications after reading documentation and (2) programming an Arduino micro-controller. We also describe the evaluation of this first prototype, conducted via functional and integration testing. We further evaluated this first prototype in a second scenario, attempting to solve the authoring and execution of location-based activities for tourists, which we describe in Section 5.2.
The results from these two scenarios iterated into a new design and development phase, resulting in an improved prototype, functionally matching the architecture presented in Section 3. We demonstrated it by using it to solve a scenario of computer networking education using a remote laboratory, described in Section 5.3. We evaluated it by first conducting functional and integration testing and then by creating user accounts and simulating usage of the platform. We then interviewed eight computer networking lecturers regarding the adequacy of the indicators provided by Inven!RA to support the pedagogical use of the remote laboratory.
The results from this third scenario iterated into design-level improvements of the prototype, resulting in clarifications of the actors’ roles and in specification diagrams rendering explicit some aspects that were hitherto only expressed in the developed implementation. These improvements are reflected in the diagrams and descriptions of Section 3. We have begun the demonstration and evaluation of this version by attempting to solve a scenario of awareness for e-learning trainers when security forces trainees perform activities in interactive SCORM learning objects [29]. This scenario and its early results are presented in Section 5.4, for the pathways they open, but so far they imply no changes to the specification diagrams presented in Section 3.

5. Design Science Iterations

5.1. Scenario 1—First Prototype and Microelectronics Education

As explained at the end of Section 3.1, the first design of the Inven!RA software architecture was based on two contributions: the BEACONING architecture [25] and the Triadic Certification Approach [26]. Our efforts sought to overcome BEACONING’s restriction of having a game narrative and engine driving its gamified learning plan, which included analytics only for minigames triggered in parts of that narrative. And to overcome the Triadic Certification Approach’s focus on the final certification of learning, we attempted to apply it to the continual gathering and presentation of learning analytics in support of teacher awareness and decision making.
To develop the design, we elicited requirements through interviews with educational technology researchers from Portugal, Brazil, and the USA. These resulted in the identification of three user profiles, the Learning Designer, Teacher, and Student, and of a series of user stories, as detailed by Cruzeiro [23]. Serendipitously, it also generated the name for the new learning plan that was no longer restricted to game-based learning, the Inventive Activities Plan (IAP), exposing the ambition to support inventive learning theory (as mentioned in Section 2).
The structural design decision made in this scenario was the option to consider the Learning Management System and Activity Providers as third parties, integrated with platforms that implement the Inven!RA architecture via Web services. The core analytics design decision made in this scenario was the option to map the analytics from individual activities into global IAP goals. The sequence diagrams have since been improved as a result of the subsequent iterations, as described in Section 4.
A unit test of the basic operation of the Inven!RA platform (front end and back end) was performed for each atomic Web service use, as detailed by Cruzeiro ([23], Table 6.1). The early functional tests included the IAP creation (Figure 10), goal definition, analytics storage, and dashboard display.
The scenario for the demonstration and evaluation was supporting the teaching of microelectronics in vocational education and training. Two activities were designed, with distinct levels of complexity. The plain activity consisted of providing a set of readings. The more complex activity consisted of having students program their physical Arduino micro-controllers for data harvesting from environment sensors. Both are detailed by Cota et al. [24]. The configuration screens were similar, with a summary of the activity instructions and links or buttons to download resources, as shown in Figure 11.
The activity providing a set of readings enabled the integration testing of the Activity Provider implementation and the Inven!RA platform implementation but was otherwise trivial. The Arduino programming activity was more interesting as it required validating that Students (in the current terminology, Active Agents) could indeed conduct activities, some of which had no Web interface, and still generate analytics for the awareness of the teacher via Inven!RA. Figure 12 shows the specific process taking place after the (simulated) trainee accessed the activity URL in the LMS.
In the deploy screen (“Activity instance Web interface” in the current terminology used in Figure 7), the trainee downloads a learning resource: a piece of computer programming code with a template (or “skeleton”) firmware program for the Arduino micro-controller. The Activity Provider automatically generates a slightly different template for each trainee, embedding in it the Inven!RA UserID that was received in the activity request. That firmware code then “calls home”, i.e., upon being uploaded to the (simulated) trainee’s Arduino it communicates with the Activity Provider, acknowledging that an analytics item (“firmware uploaded into Arduino”) was achieved. Subsequently, if the trainee programs the Arduino micro-controller to gather environment data, the same firmware code uploads those data (e.g., temperature and humidity), enabling the trainer to check both the milestones and qualitative details on the operation (Figure 13).
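For illustration only, the following sketch shows what the Activity Provider’s side of this “call home” step could look like, assuming an Express-based Web service; the route, payload shape, and in-memory store are hypothetical and not the implementation described by Cota et al. [24].

```typescript
import express from "express";

// Hypothetical sketch of the Activity Provider side of the "call home" step:
// an HTTP handler records the "firmware uploaded into Arduino" milestone and any
// environment readings sent by the trainee's firmware. Route, payload shape, and
// the in-memory store are illustrative, not the implementation described in [24].
interface CallHomePayload {
  invenraUserId: string; // embedded in the firmware template generated per trainee
  activityInstanceId: string;
  readings?: { temperature: number; humidity: number }[];
}

const analyticsStore = new Map<string, CallHomePayload[]>();

const app = express();
app.use(express.json());

app.post("/arduino/call-home", (req, res) => {
  const payload = req.body as CallHomePayload;
  const key = `${payload.activityInstanceId}:${payload.invenraUserId}`;
  const history = analyticsStore.get(key) ?? [];
  history.push(payload); // the first call marks the "firmware uploaded" milestone
  analyticsStore.set(key, history);
  res.sendStatus(204);
});

app.listen(3000);
```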
The evaluation of this scenario, as it developed, detected deficiencies in the data and request/response formats, as well as a lack of clarity on some aspects of the operational workflow, detailed in two reports [23,24]. These results were combined with those of Scenario 2, described in the following section, and informed the subsequent design iteration, described in Section 5.3.

5.2. Scenario 2—Second Prototype (Front-End Only): “CHIC’s Apps”

In parallel with Scenario 1, we employed Inven!RA to try to solve the problem of providing an authoring tool for georeferenced tourist experiences and enabling tourists to enjoy those experiences ([23], pp. 66–67). This was the context of an activity produced in the CHIC project (Cooperative Holistic View on Internet and Content), a consortium between academia and private industry partners. The functional and integration tests offered similar insights to those of Scenario 1, but two significant requirements emerged which impacted Inven!RA. Both are visible in the developed front end (Figure 14). The first is that the consortium required the authoring front end to have a look and feel consistent with the users’ other experiences (e.g., their mobile app and Web site), quite distinct from Inven!RA’s early authoring front end shown in Figure 10. The second is that while georeferenced activities could be configured with georeferenced data using Inven!RA, that information had to be displayed in the authoring front end. That is, the activities could not simply be organized ad hoc; they had to be situated on a map, based on their georeferenced configurations.
The evaluation of this demonstration led us to redesign Inven!RA for the subsequent iteration. We now consider the front-office authoring interface to be a bespoke component, not an Inven!RA architecture core component. This is reflected in its absence from the overview diagram (Figure 1).

5.3. Scenario 3—Third Prototype and Remote Networking Laboratory

The results of the two previous scenarios were combined into a new design and development phase of the Inven!RA architecture. It was demonstrated by attempting to solve the research problem in a new scenario (Figure 15): learning activities on a remote computer networking laboratory [30].
The learning activities were designed by interviewing computer networking instructors: two in higher education and two in vocational education and training. The videoconferencing interviews included two questions, on the expected advantages of being able to track ongoing activities and on which data would be relevant for that, and were recorded, transcribed, and subjected to a thematic content analysis, as described by Grilo [30].
The stated expected advantages of being able to track ongoing activities in the remote networking laboratory were:
  • The ability to track the student learning process;
  • The ability to quickly help out students in activities they are struggling with;
  • The ability to encourage the initiation of activities soon after presenting them;
  • To enable the instructor to intervene before the submission phase of the activities.
The data stated as relevant for tracking activities were:
  • The success or failure status of each task;
  • The total progress percentage within an activity;
  • Whether a student has accessed the instructions or not;
  • Whether a student performed a network IP configuration or not;
  • Tracking both the operational tasks and the management/coordination tasks;
  • Whether a student has scheduled a remote laboratory use session or not;
  • The list of tasks performed during a laboratory session;
  • The total time spent to perform an activity;
  • The time spent performing each task within an activity;
  • Whether a student initiated an activity or not;
  • Being able to watch a video recording of the laboratory session.
The original Triadic Certification Model [26], and its redesign in Scenario 1, described in Section 5.1, both assumed that the analytics from the activities would be mapped onto the overall learning goals. However, these sets of answers revealed that the learning goals were not the entirety of the awareness required of instructors during the process:
  • Advantage 3 is about the initiation of activities; no learning has yet taken place.
  • Data items 3, 6, and 10 indicate if a student took steps to eventually perform an activity rather than whether learning has occurred.
This exposed that for the situational awareness of the learning activities, and the subsequent decisions on how to intervene pedagogically, the instructors wish not only to provide learning support but also to regulate student participation and encourage the self-regulation of learning [31,32,33]. Consequently, we redesigned Inven!RA’s employment of the Triadic Model to include not only a mapping between the activity analytics and learning goals but also self-regulation indicators, as shown in Figure 16. This is reflected in the current version of the architecture, presented in Section 3, which always refers to the goals/indicators in combination.
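A minimal sketch of this revised mapping, under illustrative names of our own, is shown below: each IAP indicator is marked as either a learning goal or a self-regulation indicator and lists the activity analytics that contribute to it.

```typescript
// Hypothetical sketch of the revised mapping: each IAP indicator is marked as a
// learning goal or a self-regulation indicator and lists the activity analytics
// contributing to it. Type and analytic names are illustrative assumptions.
type IndicatorKind = "learning-goal" | "self-regulation";

interface IapIndicator {
  kind: IndicatorKind;
  label: string;           // shown on the Awareness Agent dashboard
  analyticNames: string[]; // activity analytics contributing to this indicator
}

const exampleIndicators: IapIndicator[] = [
  { kind: "learning-goal",   label: "Performed the network IP configuration", analyticNames: ["ip_config_done"] },
  { kind: "self-regulation", label: "Accessed the activity instructions",     analyticNames: ["instructions_opened"] },
  { kind: "self-regulation", label: "Scheduled a remote laboratory session",  analyticNames: ["session_scheduled"] },
];
```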
At the structural design level, we reflected the results of Scenario 2 (Section 5.2) by refactoring the implementation and decoupling the front-end internal Inven!RA component (IAP creation and configuration) from the back-end internal Inven!RA component. However, we elected not to include this internal distinction in Section 3 because this decoupling was only applied in this scenario and not yet evaluated in a new one. There were some refinements to the analytics dashboard, mostly at the visual level. We also detected unspecified aspects of the architectural messages, such as which messages used HTTP GET and which used POST methods, and some details of the JSON format for the content of the messages. We also detected various quality issues with the original implementation, such as the methods for harvesting the configuration parameters from the Activity Provider Web interface. The major changes to the workflow of the Inven!RA architecture were (1) specifying that the configuration parameters for an activity would be provided by a Web service, decoupling Activity Providers’ ability to deploy updates from the registering of activities with the platform, and (2) specifying that Inven!RA would request a deploy URL from Activity Providers for each deployed instance, giving Activity Providers more flexibility to employ distinct URLs for distinct configurations. These aspects were incorporated in Section 3 and are detailed by Grilo [30]. They are also available as a technical report at a public repository [27].
This scenario’s prototype was evaluated via unit and integration tests, simulation runs, and a live test with the instructors involved in the specification interviews, plus four more. All tried out the platform and responded to a questionnaire, as detailed by Grilo [30].
Regarding a Likert-scale question on whether the analytics provided by Inven!RA were relevant for tracking the activities, 62.5% fully agreed and 37.5% agreed; there were no neutral, disagree, or fully disagree answers.
Regarding an open question on which analytics should be added to track the activities, the requests were about “richer” dashboards highlighting which tasks were more complex for the class and what was the overall progress of the class. These reflect the functional requirements at the output dashboard level and are not dependent on the underlying Inven!RA architecture.

5.4. Scenario 4—B-PREPARED

A new DSR iteration is underway as Scenario 4. It comprises a new demonstration and evaluation of Inven!RA, in its current status as described in Section 3. The new scenario for the demonstration and evaluation involves providing awareness to e-learning trainers in a security forces training course provided by a corporation. Within a current e-learning course, the trainees must perform some activities within an interactive SCORM learning object. SCORM (Sharable Content Object Reference Model) is a standard for e-learning content aimed at re-usability and portability within LMS platforms [29]. Common learning objects include slideshows or videos combined with short quizzes, which report their final score, time spent, progress through the contents, and overall outcome (e.g., pass/fail, complete or not).
In this scenario, training managers at the corporation required analytics that would combine the transversal activity across several learning objects, and we sought to employ Inven!RA for this purpose. This demonstration and evaluation are still underway, but we report on them here because they exposed a new, unforeseen situation for Inven!RA: pre-existing interactions between the Activity Provider and the LMS.
A SCORM learning object embedded in a course within the LMS is already providing its Web deployment interface within the LMS. This raised two issues for Inven!RA:
  • The direct interaction between the LMS and a SCORM learning object bypasses Inven!RA, so the analytics collected within the SCORM learning object are not associated with an IAP;
  • The SCORM learning object is already within the LMS, so it is pointless for Inven!RA to provide its deployment URL to the Deployer (e.g., the teacher assembling the course).
We have approached these issues from the conceptual design perspective of Inven!RA: an Activity Provider is the entity responsible for providing the Web deployment interface and collecting analytics, associating them with an activity instance within an IAP.
  • As shown in Section 3, Figure 5, on IAP deployment, the Deployer can customize the configuration of any activity. So, we conceived a “SCORM Activity Provider” requiring a configuration parameter: the URL or alternative identification of the SCORM object placed by the Deployer in the LMS course (while we represented this activity at the bottom of Figure 5, there is no dependence, and the course design can be initiated prior to the Inven!RA deployment, as shown in the use-cases diagram, Figure 4);
  • This enables the SCORM Activity Provider to collect the analytics from the LMS where that learning object is installed, provided that system’s analytics Web services are accessible;
  • It also allows the SCORM Activity Provider to associate those analytics with the activity instance in the IAP, because it was provided in the deployment of the IAP;
  • It does not allow the SCORM Activity Provider to associate those analytics with internal Inven!RA UserIDs, because the SCORM learning object is receiving direct interactions from LMS users, i.e., LMS UserIDs;
  • In addition, the SCORM Activity Provider must not access Inven!RA UserIDs, as that would expose internal associations (LMS UserID/Inven!RA UserID) to external third parties.
The last point was the critical realization: it is Inven!RA that must account for the possibility of having to collect data from the LMS, via an Activity Provider, and map the analytics accordingly. That is, Inven!RA needs to know whether the analytics from a specific Activity Provider will be provided using Inven!RA UserIDs, as hitherto assumed, or using the same LMS UserIDs that Inven!RA avoids exposing. Consequently, this scenario only impacts the Awareness Agent case. All it takes is for Inven!RA to be aware of this circumstance for an activity in order to perform the action “Combine analytics from the activity instances into goals and indicators”, with no redesign of the architecture being required (Figure 9).
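A minimal sketch of this adjustment, with illustrative names, is shown below: when combining analytics, Inven!RA checks which identifier scheme a given Activity Provider reports under and translates LMS UserIDs using the association it already holds internally.

```typescript
// Hypothetical sketch of the adjustment: when combining analytics, Inven!RA
// checks which identifier scheme a given Activity Provider reports under and,
// for LMS UserIDs, translates them using the association it holds internally.
// Names are illustrative assumptions.
type UserIdScheme = "invenra" | "lms";

function normalizeUserId(
  reportedId: string,
  scheme: UserIdScheme,
  lmsToInvenra: Map<string, string>, // LMS UserID -> Inven!RA UserID, kept internal
): string | undefined {
  // The translation happens inside Inven!RA, so the LMS/Inven!RA UserID
  // association is never exposed to third-party Activity Providers.
  return scheme === "invenra" ? reportedId : lmsToInvenra.get(reportedId);
}
```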
The implication is that Inven!RA must account for the existence of two different categories of activities: those that take place outside the destination LMS and those taking place already within the very LMS employed by the Deployer and the Active Agent.
A front-end implication is that when Inven!RA shows its deploy URLs for IAP activities (Figure 5), the activity instances of this new category will not have a URL but rather simply be listed as “Already deployed within the LMS” or some other equivalent clarification. This does not require changes to the architectural diagrams shown in Section 3.
This early result is currently being implemented in a new demonstration and evaluation phase and is presented here for its potential to clarify the operation of Inven!RA.

6. Conclusions

The Inven!RA software architecture was designed to overcome the problem of the lack of support infrastructure for active learning pedagogies with immersive learning environments. It is part of the larger, wicked problem of the sustainable deployment of immersive learning environments in the context of active learning pedagogies (see Section 2.2).
Four demonstration and evaluation scenarios were developed and presented, refining this approach. The demonstrated applicability of the design across the scenarios, with incremental refinements, supports the feasibility of this approach toward the stated problem: collecting the analytics from ongoing independent learning activities, provided by third parties, associated with transversal goals and indicators, while enabling those activities to be coordinated from within the LMS the various actors employ. Consequently, this approach constitutes a significant contribution toward the resolution of the stated problem.

7. Limitations and Future Work

The Inven!RA scenarios 1–4 were largely academic prototypes (albeit Scenario 2 was in the context of an academia–industry consortium). This is a limitation of the conclusions, and the promising results recommend demonstrating and evaluating the feasibility and effectiveness of this approach in higher technology readiness level contexts. Also, the scenarios address a small subset of immersive learning environments, which is a limitation of the validation of this approach. Further research should seek to demonstrate and evaluate its feasibility for scenarios including the Internet of Things, mixed and augmented reality, virtual reality, digital assistants, etc. We also find a limitation in that all the scenarios resorted to Activity Providers that were developed as such. It is necessary to demonstrate and evaluate the feasibility of this approach as a façade pattern for legacy immersive environments, and thus of the decoupling of the architecture. Finally, the current scenarios are all based on free and open Activity Providers. The architecture foresees that individual Deployers (e.g., teachers, trainers, team leaders, game masters, etc.) can introduce keys or identifiers, or indeed authenticate, in the Deployment phase (see the discussion of Figure 5). However, the scenarios did not demonstrate or evaluate this, which is a limitation and a recommendation for future work.
Among other relevant pathways of research to develop a more robust solution to the stated problem, we point out the need to evaluate the trust relationships among the participants (Inven!RA, the LMS, and Activity Providers) and their potential impacts on the architecture or its operation. The front-office and back-office decoupling also requires further exploration, as does identifying, from the body of literature on learning dashboards, solutions that enable non-technical actors to specify and create their own custom front ends for Inven!RA. In particular, we anticipate the need to exploit novel emerging semantic concepts in the field of learning analytics, such as virtual choreographies, to enable the interpretation of temporal patterns across activities rather than static status indications. Ultimately, we encourage the research community to implement and evaluate their own demonstrations of Inven!RA and to assess not only its feasibility but also the impact on the awareness level of the learning process actors (teachers, students, program chairs, managers, etc.) and the outcomes on the quality and efficiency of their decision-making processes.

Author Contributions

Conceptualization, E.S., L.M., D.P. and C.G.; methodology, L.M., A.C. and R.B.; software and validation, T.C., D.C., R.G. and F.C.; Scenario 2 activities concept, M.v.Z.; writing—original draft preparation, L.M.; writing—review and editing, D.B.; supervision, L.M. and A.C.; funding acquisition, A.C., L.M., F.C. and E.S. All authors have read and agreed to the published version of the manuscript.

Funding

This work is financed by National Funds through the Portuguese funding agency, FCT—Fundação para a Ciência e a Tecnologia, within the projects UIDP/50014/2020 and UIDB/50014/2020. It is also financed by the CAPES-PRINT Project “Transformação Digital e Humanidades”. Daniela Pedrosa wishes to thank Fundação para a Ciência e Tecnologia (FCT) and CIDTFF (UID/CED/00194/2020)—Universidade de Aveiro, Portugal, for the Stimulus of Scientific Employment—CEECIND/00986/2017 Individual Support 2017.

Institutional Review Board Statement

The scenarios involving human participants were conducted within the scope of a Master’s thesis in Engineering, reviewed and approved by the respective institutions: Universidade do Porto and Universidade Aberta.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Not applicable.

Acknowledgments

We would like to thank Rodrigo Medeiros Lehnemann and Antônio Augusto Borges Coelho for their efforts to explore the pedagogical challenges that identified new scenarios for the further evaluation and development of Inven!RA in the future.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

Abbreviations

The following abbreviations are used in this manuscript:
3D: Three dimensional
BEACONING: Breaking Educational Barriers with Contextualised, Pervasive and Gameful Learning
CHIC: Cooperative Holistic View on Internet and Content
DSR: Design Science Research
HTML: HyperText Markup Language
ID: Identifier
IAP: Inventive Activities Plan
ILE: Immersive Learning Environments
Inven!RA: Inventive agency amidst Reticular ecosystems of Atopic habitats within which knowledge (!) emerges
IP: Internet Protocol
JSON: JavaScript Object Notation
LMS: Learning Management System
SCORM: Sharable Content Object Reference Model
UML: Unified Modeling Language
URL: Uniform Resource Locator
UserID: User Identifier

References

  1. Marklund, B.B.; Taylor, A.S.A. Educational Games in Practice: The challenges involved in conducting a game-based curriculum. Electron. J. e-Learn. 2016, 14, 122–135. [Google Scholar]
  2. Schlemmer, E.; Felice, M.D.; Serra, I.M.R.d.S. OnLIFE Education: The ecological dimension of digital learning architectures. Educ. Rev. 2020, 36, e76120. [Google Scholar] [CrossRef]
  3. Nilsson, N.C.; Nordahl, R.; Serafin, S. Immersion Revisited: A review of existing definitions of immersion and their relation to different theories of presence. Hum. Technol. 2016, 12, 108–134. [Google Scholar] [CrossRef] [Green Version]
  4. Hevner, A.R. A Three Cycle View of Design Science Research. Scand. J. Inf. Syst. 2007, 19, 87–92. [Google Scholar]
  5. Murray, J.H. Hamlet on the Holodeck: The Future of Narrative in Cyberspace; Free Press: New York, NY, USA, 1997. [Google Scholar]
  6. Agrawal, S.; Simon, A.; Bech, S. Defining Immersion: Literature Review and Implications for Research on Immersive Audiovisual Experiences. In Proceedings of the 147th AES Pro Audio International Convention, New York, NY, USA, 16–19 October 2019; Audio Engineering Society: New York, NY, USA, 2019; p. 14. [Google Scholar]
  7. Beck, D.; Morgado, L.; O’Shea, P. Finding the Gaps about Uses of Immersive Learning Environments: A Survey of Surveys. J. Univers. Comput. Sci. 2020, 26, 1043–1073. [Google Scholar] [CrossRef]
  8. Slater, M. Place illusion and plausibility can lead to realistic behaviour in immersive virtual environments. Philos. Trans. R. Soc. B Biol. Sci. 2009, 364, 3549–3557. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  9. Ryan, M.L. Narrative as Virtual Reality 2: Revisiting Immersion and Interactivity in Literature and Electronic Media, 2nd ed.; Johns Hopkins University Press: Baltimore, MD, USA, 2015. [Google Scholar]
  10. McCarthy, M. Experiential Learning Theory: From Theory To Practice. J. Bus. Econ. Res. (JBER) 2016, 14, 91–100. [Google Scholar] [CrossRef]
  11. Lave, J.; Wenger, E. Situated Learning: Legitimate Peripheral Participation; Cambridge University Press: Cambridge, UK, 1991. [Google Scholar]
  12. Downes, S. Connectivism. Asian J. Distance Educ. 2022, 17, 58–87. [Google Scholar]
  13. Børte, K.; Nesje, K.; Lillejord, S. Barriers to student active learning in higher education. Teach. High. Educ. 2020, 1–19. [Google Scholar] [CrossRef]
  14. Edwards, S. Active Learning in the Middle Grades Classroom: Overcoming the Barriers to Implementation. Middle Grades Res. J. 2015, 10, 65–81. [Google Scholar]
  15. Resnick, M. Lifelong Kindergarten: Cultivating Creativity through Projects, Passion, Peers, and Play; MIT Press: Cambridge, MA, USA, 2017. [Google Scholar]
  16. Rittel, H.W.; Webber, M.M. Dilemmas in a general theory of planning. Policy Sci. 1973, 4, 155–169. [Google Scholar] [CrossRef]
  17. de Lima, C.C.; Morgado, L.C.; Schlemmer, E. Relevant Aspects To Promote Teacher Awareness in the Pedagogical Orchestration of Learning Activities Where Students Move. Res. Sq. 2021. [CrossRef]
  18. Tang, Y.; Pan, Z.; Pedrycz, W.; Ren, F.; Song, X. Viewpoint-Based Kernel Fuzzy Clustering With Weight Information Granules. IEEE Trans. Emerg. Top. Comput. Intell. 2022, 1–15. [Google Scholar] [CrossRef]
  19. Kovacova, M.; Horak, J.; Higgins, M. Behavioral Analytics, Immersive Technologies, and Machine Vision Algorithms in the Web3-powered Metaverse World. Linguist. Philos. Investig. 2022, 21, 57. [Google Scholar] [CrossRef]
  20. Reis, R.; Marques, B.P. Learning Analytics in the Monitoring of Learning Processes: 3D Educational Collaborative Virtual Environments. In Advances in Educational Technologies and Instructional Design; Azevedo, A., Azevedo, J.M., Onohuome Uhomoibhi, J., Ossiannilsson, E., Eds.; IGI Global: Hershey, PA, USA, 2021; pp. 142–169. [Google Scholar] [CrossRef]
  21. Maderer, J.; Gütl, C. Antares: A Flexible Assessment Framework for Exploratory Immersive Environments. In Workgroups eAssessment: Planning, Implementing and Analysing Frameworks; Babo, R., Dey, N., Ashour, A.S., Eds.; Intelligent Systems Reference Library; Springer: Singapore, 2021; Volume 199, pp. 181–207. [Google Scholar] [CrossRef]
  22. Stal, M. The broker architectural framework. In Proceedings of the OOPSLA95: Conference on Object Oriented Programming Systems Languages and Applications, Austin, TX, USA, 15–19 October 1995; Volume 95. [Google Scholar]
  23. Cruzeiro, T.J.L. Inven!RA-Platform for Authoring and Tracking of Inventive Activity Plans. Ph.D. Thesis, Universidade do Porto, Porto, Portugal, 2020. [Google Scholar]
  24. Cota, D.; Cruzeiro, T.; Beck, D.; Coelho, A.; Morgado, L. InventiveTr@ining—Inven!RA architecture Activity Provider modules for online tracking of microelectronics student projects. Rev. Ciênc. Comput. 2021, 16, 113–136. [Google Scholar] [CrossRef]
  25. Bourazeri, A.; Arnab, S.; Heidmann, O.; Coelho, A.; Morini, L. Taxonomy of a gamified lesson path for STEM education: The beaconing approach. In Proceedings of the 11th European Conference on Games Based Learning, ECGBL 2017, Graz, Austria, 5–6 October 2017; pp. 29–37. [Google Scholar]
  26. Baptista, R.; Coelho, A.; Vaz de Carvalho, C. Relation between game genres and competences for in-game certification. In Proceedings of the International Conference on Serious Games, Interaction, and Simulation, Novedrate, Italy, 16–18 September 2015; Springer: Berlin/Heidelberg, Germany, 2015; pp. 28–35. [Google Scholar]
  27. Morgado, L.; Cassola, F. Activity Providers for Inven!RA; Technical Report; Universidade Aberta: Lisbon, Portugal, 2022. [Google Scholar]
  28. Peffers, K.; Tuunanen, T.; Rothenberger, M.A.; Chatterjee, S. A Design Science Research Methodology for Information Systems Research. J. Manag. Inf. Syst. 2007, 24, 45–77. [Google Scholar] [CrossRef]
  29. Bohl, O.; Scheuhase, J.; Sengler, R.; Winand, U. The sharable content object reference model (SCORM)—A critical review. In Proceedings of the International Conference on Computers in Education, Auckland, New Zealand, 3–6 December 2002; Volume 1, pp. 950–951. [Google Scholar] [CrossRef]
  30. Grilo, R. Laboratórios Remotos—Interação Remota com Braços Robóticos e Integração com Plataformas de e-Learning. Master Dissertation, Universidade Aberta & Universidade de Trás-os-Montes e Alto Douro, Lisbon, Portugal, 2022. [Google Scholar]
  31. Schunk, D.H.; Zimmerman, B.J. Self-regulation and learning. In Handbook of Psychology: Educational Psychology, 2nd ed.; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 2013; Volume 7, pp. 45–68. [Google Scholar]
  32. Zimmerman, B.J. From Cognitive Modeling to Self-Regulation: A Social Cognitive Career Path. Educ. Psychol. 2013, 48, 135–147. [Google Scholar] [CrossRef]
  33. Morais, C.; Pedrosa, D.; Rocio, V.; Cravino, J.; Morgado, L. Using BPMN to Identify Indicators for Teacher Intervention in Support of Self-regulation and Co-regulation of Learning in Asynchronous e-learning. In Technology and Innovation in Learning, Teaching and Education; Reis, A., Barroso, J., Lopes, J.B., Mikropoulos, T., Fan, C.W., Eds.; Communications in Computer and Information Science; Springer International Publishing: Cham, Switzerland, 2021; Volume 1384, pp. 210–222. [Google Scholar] [CrossRef]
Figure 1. Inven!RA software architecture overview.
Figure 2. Creator use cases in Inven!RA.
Figure 3. Creator sequence diagram for Inven!RA.
Figure 4. Deployer use cases in Inven!RA.
Figure 5. Deployer sequence diagram for Inven!RA.
Figure 6. Active Agent use cases in Inven!RA.
Figure 7. Active Agent sequence diagram for Inven!RA.
Figure 8. Awareness Agent use cases in Inven!RA.
Figure 9. Awareness Agent sequence diagram for Inven!RA.
Figure 10. Inven!RA front end—IAP being created with three activities.
Figure 11. Activity Provider configuration screens.
Figure 12. Arduino programming activity—process detail.
Figure 13. Class analytics (left) and Individual Qualitative Analytics (right).
Figure 14. Inven!RA authoring front end for the “CHIC’s apps” scenario.
Figure 15. Remote computer networking lab employed in Scenario 3—deployed activity interface.
Figure 16. Analytics mapping model from Scenario 3, updating the original Triadic Certification Model.
Table 1. Summary of deployment obstacles faced by immersive learning environments related to the support infrastructure, from the instructor’s perspective.

ID  Obstacle
1   Situational awareness
2   Complexity of planning
3   Complexity of providing feedback
4   Complexity of grounded assessment

