1. Introduction
Industrial design is a creative discipline that aims to attribute aesthetic and functional qualities to products that are to be mass-produced. This discipline has a significant impact at the application level since the properties of a product are always the result of the integration of several factors, including technology [1,2].
Nowadays, emerging technologies are opening new spaces for the new generation of designers. New devices and services emerge, new practices arise around them, and new problems originate. Among them, Mixed Reality (MR) technology, as a continuum of Virtual Reality and Augmented Reality (VR, AR) [3], contributes to the development of the Metaverse, a novel persistent virtual space that combines features of social media, online games, VR, AR, and cryptocurrencies [4]. Additionally, the COVID-19 pandemic has accelerated this process [5,6]. As a result, MR devices are available at low cost, increasing the immersion of experiences and services.
Moreover, the Metaverse is expected to become a space where many users will spend their work and leisure time, demanding a new form of interaction between real and digital products and services [7,8]. For instance, users will be able not only to view digital content but also to use or control services and products in the real world [9]. Conversely, real controls on a physical device will trigger an action in the Metaverse.
Generation Z will be the main player in this ongoing change, as the Metaverse will be a way for them to live their digital lives more fully. In the Metaverse, users will be able to learn, socialize, participate in events, play games, and even earn money from games such as Fortnite and Roblox. Therefore, the Metaverse will profoundly impact the meaning, function, appearance, and form of products or services. For example, buttons, controls, status lights, and displays could be eliminated because they will be accessed through virtual or augmented interfaces. In this way, product aesthetics could be simplified, becoming cheaper to produce, with less waste, less labor time, and easier disposal. The Metaverse could also provide some of the services that the real world already has but in a virtual version [7,8].
Therefore, we aim to promote this imminent change by analyzing its impact on the education system in the context of design. We present a novel interdisciplinary laboratory that originates from the interaction design (or design for interaction) discipline. By definition, interaction design focuses on the interaction between users and products to support the way people communicate and interact in their daily and working lives [10]. In fact, the development of design methodologies to unlock technology for design is necessary, with the idea that perhaps we will not design for these technologies but with them [11].
Our laboratory aims to extend the discipline of interaction design to current debates, providing students with the knowledge and tools needed to design new services for the Metaverse. However, a Metaverse-oriented design course in industrial design programs is complex for several reasons. One of the most critical aspects is the need to provide multidisciplinary and deeply integrated subjects such as computer science, graphics, computer-aided design, computer graphics, human cognition, user-centered design, human–computer interaction, and product design [12]. Most interaction design programs in art, design, or engineering schools usually fragment these disciplines, and students do not integrate them into practical product redesign and innovation, with profound implications for use, aesthetics, and functionality [12].
Another, more cultural, aspect is limited to some industrial design schools that see no interest in the new interaction challenges posed by emerging MR technologies and therefore do not include these technologies in their programs. In addition, established methods for MR education in the context of design are lacking in the literature, and there is no common agreement on format and content [13]. Therefore, this paper presents a multidisciplinary laboratory in an industrial design program that aims to integrate interaction design concepts and knowledge of MR technologies.
The research questions are as follows:
RQ1—Is it feasible to integrate MR teaching into an interaction design course within the Industrial Design program?
RQ2—What feedback do industrial design students give on the Laboratory in terms of acceptance, effectiveness, usefulness, efficiency, and satisfaction?
2. Background
MR is increasingly deployed in educational environments as it contributes to increasing motivation, engagement, knowledge transfer, and critical thinking [14]. Nevertheless, the literature lacks studies in the field of design that integrate education with MR technologies. Yet MR can be a tool for designers, who are expected to design innovative products and interfaces that can improve the “real” world.
In fact, the ongoing digital transformation has generated a progressive dematerialization of products, which must now be designed to be used no longer “physically” in the real world but on a display (e.g., smartphones, laptops, and HMDs). In this context, interaction designers could play an increasingly important role, as they must acquire methods and tools for designing such products. We are surrounded by multiple visual stimuli, an overcrowding of products, and new technologies and devices, and we need to interact with each of them.
Additionally, the emerging Metaverse requires the design of interfaces and experiences that could “extend” the real world. However, how people interact in their daily lives is fundamental [15]. By connecting people and technology, design has the opportunity and responsibility to design experiences and services for the Metaverse.
Therefore, designers are asked to design new interactions with products (real or virtual) within the Metaverse. Giaccardi and Redström stated that “the human-centered design perspective implies that the interaction between a person (or persons) and a technology forms the basis of how the designed artifact should be presented” [11]. Hence, it is crucial to train and educate the designers of the future with knowledge that responds to current needs and changes.
3. Design for Interaction Lab
We, therefore, present a novel laboratory named “Design for Interaction” that takes place in the first semester of the first year of the master’s degree program in Industrial Design at the Polytechnic University of Bari. The students are heterogeneous in terms of both nationality and education, having graduated in industrial design, architecture, or engineering. The Lab is multidisciplinary and is organized by three professors from different subject areas (i.e., computer science, design, and engineering). It provides the foundations, methods, and tools for developing interactive experiences for innovative products and services. The Lab, which has a value of 18 credits in the European Credit Transfer and Accumulation System (ECTS), is organized into three 6-ECTS courses: Information Design, Information Systems, and Virtual Design and Simulation.
3.1. Information Design
This course provides methodological approaches and theoretical foundations for visual communication activities [16]. The objective is to enable students to design effective and memorable experiences for users at the level of representation and interaction. To do this, students must learn how to organize and present data and must master interaction and representation techniques. These processes engage the students in problem-solving, audience response, and communication issues. In addition, students must learn to empathize with users, research them, and analyze their profiles, behaviors, and habits. In this way, it is possible to trace existing user problems and learn how to handle them in the ideation phase. By the end of the course, students will be able to translate complex information into engaging visual and narrative formats that are understandable to different audiences and can support comprehensive communication strategies.
3.2. Information Systems
The course provides students with methods for analyzing technological evolution in the field of information technology. The course aims to help students experience enabling technologies by promoting participation and design skills with the use of information technology [17]. It provides the theoretical basis for the use of information technology and information systems through specific exercises. The course aims to integrate the Internet of Things and information systems into the industrial design program to make the student aware of the feasibility of the analyzed case studies. It also focuses on how people and products interact, what a user perceives about a product, how a user understands the use and experience of a product, and how designers can conceptualize products relevant to the user. Lastly, the course teaches the student to use information systems, define what they can and cannot accomplish, and assess cost and complexity. During the project phase, students apply and integrate what they have learned.
3.3. Virtual Design and Simulation
This course provides the theoretical foundation and practical skills to design and prototype innovative interfaces for the Metaverse. The theory describes the evolutionary history of Graphical User Interfaces (GUIs) from the command line to next-generation interfaces, including MR technologies. The course uses the Unity engine and the Vuforia SDK for prototyping MR experiences. The basics of programming are provided within this course, and students are introduced to programming in C#. Specifically, students are taught the very basics of coding, such as variables, functions, and classes, and how to use them. For the interface development phase, the students are provided with a series of scripts written in C# that are not to be modified at the code level but are to be used directly within Unity 3D. Each game object can be associated with one of these ready-made scripts and, through public variables, adapted to the needs of the students using it.
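To make this concrete, the minimal sketch below illustrates the kind of ready-made script described above. It is a hypothetical example (the component and field names are ours, not the actual course material): the public variables appear in the Unity Inspector, so students can adapt the behavior of a game object without editing any code.

```csharp
using UnityEngine;

// Hypothetical example of a "ready-made" script: students attach it to a
// game object and tune its public fields in the Unity Inspector.
public class ToggleOnTap : MonoBehaviour
{
    // Public variables are exposed in the Inspector and adapted per object.
    public GameObject targetPanel;      // the GUI panel to show or hide
    public bool startHidden = true;     // initial visibility of the panel
    public AudioSource feedbackSound;   // optional sound played on toggle

    void Start()
    {
        if (targetPanel != null)
            targetPanel.SetActive(!startHidden);
    }

    // Wired to a UI Button's OnClick event (or called from other scripts).
    public void Toggle()
    {
        if (targetPanel == null) return;
        targetPanel.SetActive(!targetPanel.activeSelf);
        if (feedbackSound != null)
            feedbackSound.Play();
    }
}
```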
In this way, we provide students with the tools and methods useful for developing MR-based knowledge and designing future products and GUIs for the Metaverse [18]. A GUI is a type of user interface through which users interact with electronic devices via visual representations [19]. Designing for the Metaverse will be the next-generation challenge that can only be addressed by studying the design methods and processes for making a GUI. The program consists of a theory part (3 ECTS) and a laboratory part (3 ECTS).
4. Methods
The three courses of the lab balance theoretical and practical activities equally. The theoretical foundations refer to the underlying principles and concepts that inform the design of the lab activities and are complementary in all the courses. For example, Information Design provides students with approaches to collect, organize, and present data obtained from the user research conducted on a sample. Then, Information Systems provides students with methods for integrating information technology into the design process, considering the user sample and the identified need. Finally, Virtual Design and Simulation provides students with the skills to design and prototype MR interfaces for the Metaverse to address that need using the Unity engine.
The three courses also require mandatory weekly assignments to put the individual teaching units into practice. A final group project is required at the end of the semester. This project aims to develop innovative interfaces based on a specific product by exploiting MR technologies to hypothesize a possible interaction in the Metaverse. Each team must design an MR user interface.
In addition, students must use a specific design methodology consisting of several steps, as follows:
Investigate the historical evolution of the chosen product;
Perform user research to uncover key problems, discovering solutions and innovation points considering MR technologies;
Hypothesize personas and scenarios concerning the interface and design a storyboard;
Build an MR application using the Unity engine (See Figure 1);
Test the prototype with a sample of real users.
Due to COVID-19 restrictions, students were unable to test their interfaces with a sample of participants. In the end, students write a design report to explain the whole design process and prepare a presentation to show during the final exam. Each student’s final grade is calculated as the average of the individual assignments in each unit and the grade for the teamwork.
4.1. Product Historical Evolution
Students start with the history and state of the art of the chosen object, analyzing the history of leading companies, technologies, tools, and interfaces over the years (See Figure 2). When starting work on a project, the first objective is to learn about the product and its domain and then about target users. A search of databases dedicated to academic publications, such as Google Scholar, followed by a review of historical industrial case studies, can yield insights that kick-start product development.
4.2. User Research, Needs, and Solution
Students apply the user-centered design (or human-centered design) approach [20]. This approach consists of a sequence of iterative steps, starting from the identification of the context of use and research on users. In this specific case, students prepare a semi-structured questionnaire using Google Forms and administer it to users. User research provides a consistent, rapid, controlled, and thorough method of examining the users’ perspectives (See Figure 3) [21].
Next, students define user needs from their answers and formulate user and system requirements. Then, students develop a design solution that meets the previously identified needs and requirements by exploiting MR technologies and evaluate the design solution against the requirements.
4.3. Personas, Scenarios, and Storyboards
Following the provided process, students use personas to identify target users and simulate behaviors and goals to design the interface. Personas are fictional individuals created to describe the typical user based on the user profile, and their purpose is to represent a group of end users during design discussions and keep everyone focused on the same target. Personas are usually defined by identity and photo, status, goals and tasks, skill set, requirements and expectations, and relationships (See Figure 4) [22]. A scenario is a story that describes how a particular persona completes a task or behaves in a given situation. The purpose is to bring users to life and see whether the hypothesized GUI meets their needs.
A scenario is usually defined by setting, actors, objectives or goals, sequence of events, and result (See Figure 5) [22]. This phase also includes the creation of storyboards and drawn panels that elaborate the scenarios related to the target user. A design storyboard is a powerful means for the designer because, by telling a story about (parts of) the interaction(s), it gives the reader access to the expressed ideas on two levels: communication and experience [23]. Storyboards support industrial designers in getting a grip on context and time, forcing them to pay attention to different aspects, integrate these aspects, and address implications that might be put off with abstract considerations. Storyboards can be sketched (i.e., by pencil; See Figure 6) or digital (e.g., made with vector-graphics editing programs such as Adobe Illustrator), either evoking comments and reactions or, when detailed and closed, conveying facts and convincing.
4.4. User Interface Design with Mixed Reality Technologies
Students move on to the design phase, creating an interface combined with an appliance that incorporates what the users want, meeting their needs. Students produce 3D models based on their design requirements using Rhinoceros software, which they export in FBX format to create scenes in Unity 3D. In some cases, they use 3D models available for free from the Unity Asset Store [24]. As a result, students realize GUI prototypes that are designed to work with the upcoming generation of smart glasses. Students prefer to use VR to simulate AR visualization of the GUIs to avoid tracking robustness issues. Only one project is implemented exploiting Vuforia image targets (See Figure 7) [25].
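As a sketch of this development step, the hypothetical Unity script below shows one way a 3D GUI could be anchored to an image target and displayed only while the target is tracked. The public methods are intended to be wired in the Inspector to the target-found and target-lost events exposed by the Vuforia event handler on the image target (the exact component name depends on the Vuforia version), so the script itself contains no Vuforia-specific calls; all names here are illustrative.

```csharp
using UnityEngine;

// Hypothetical sketch: shows a 3D GUI near an image target while the
// target is tracked and hides it otherwise. Wire OnTargetFound/OnTargetLost
// to the corresponding events of the Vuforia target event handler.
public class AnchoredGuiController : MonoBehaviour
{
    public GameObject guiRoot;                                   // root of the 3D GUI
    public Vector3 offsetFromTarget = new Vector3(0f, 0.15f, 0f); // placement above the target

    void Start()
    {
        if (guiRoot != null)
            guiRoot.SetActive(false);   // hidden until the target is found
    }

    public void OnTargetFound()
    {
        if (guiRoot == null) return;
        guiRoot.transform.localPosition = offsetFromTarget;
        guiRoot.SetActive(true);
    }

    public void OnTargetLost()
    {
        if (guiRoot != null)
            guiRoot.SetActive(false);
    }
}
```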
4.5. Test the Prototype
The exam procedure involves testing on a sample of at least n = 20 participants, who are administered the User Experience Questionnaire (UEQ) [26] to measure users’ subjective impression of the product’s user experience and the System Usability Scale (SUS) [27] to measure the perceived usability of computer interfaces. Based on the results, students can then refine the interface.
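For reference, the SUS yields a 0–100 score per respondent: on its ten 1–5 items, odd items contribute (rating − 1), even items contribute (5 − rating), and the sum is multiplied by 2.5. The sketch below, in the same C# used elsewhere in the course, shows how students might compute this score from collected responses; the class and method names are illustrative and not part of the course material.

```csharp
// Minimal sketch of standard SUS scoring from one participant's responses.
public static class SusScoring
{
    public static double Score(int[] ratings)   // ratings[0..9], values 1..5
    {
        if (ratings == null || ratings.Length != 10)
            throw new System.ArgumentException("SUS requires exactly 10 ratings.");

        int sum = 0;
        for (int i = 0; i < 10; i++)
        {
            // i is zero-based, so even indices correspond to odd-numbered items.
            sum += (i % 2 == 0) ? ratings[i] - 1 : 5 - ratings[i];
        }
        return sum * 2.5;   // 0-100 score for this participant
    }
}
```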
5. User Study
The 2020–2021 class consisted of 27 students (12 males and 15 females). Due to COVID-19 restrictions, teaching and laboratories were conducted completely online through Microsoft Teams. The topic of the final exam was the development of MR interfaces in the household appliances context. The choice of topic was in line with government restrictions that prevented outdoor experiments and mobility. In addition, the home environment was the most experienced during the lockdown period. Students were grouped into ten teams, each selecting a different product: a coffee machine, domotic lighting control, a Hi-Fi system, a kitchen machine, a hob, a microwave, an oven, a fridge, an alarm clock, and a vacuum cleaner.
5.1. Espresso Coffee Machine
The coffee machine is a widespread product in some countries for cultural reasons related to its production and consumption. Current interaction with coffee machines occurs through physical buttons or a touch screen in the latest models, although in many coffee machines, the lack of a display and the presence of buttons make some simple operations difficult to interpret.
The user research contains personal and behavioral data from a sample of 81 participants who volunteered to help students investigate the relationship between the user and the coffee machine. The questionnaire consists of 15 multiple-choice questions. User research shows that current coffee machines present some important limitations. First, they do not give feedback to users in the case of errors during normal functions (e.g., filling the water container). Additionally, it is not possible to program the machine or to reorder capsules through the machine itself or a companion application on devices such as smartphones, tablets, or laptops.
The main idea (See Figure 8a) is to simulate the typical Italian coffee shop experience, consisting of a barista avatar (i.e., Ambrobot) and a virtual television that can be used in various types of scenarios. The barista avatar facilitates the users’ approach to MR and has the task of guiding them in all their choices. Students assumed a relaxed environment in which users can relax with the sound of sea waves and a business environment in which users can always stay up-to-date on the economic and social developments of their interest. The GUI allows users to interact in a bar-like environment near their home in a shared or private scenario in the Metaverse. This solution revisits the concept of “bar” since the simulated AR allows users to have an experience of living inside a real bar, either in a group or alone, and with a friendly avatar.
5.2. Domotic Lighting Control
Domotics is a combination of information and telematics technologies that supports domestic activities through consolidated technical systems and protocols.
Students formulate a questionnaire on home automation in general to understand how well it is known and what expectations users have. The questionnaire is administered to a sample of 150 participants and is made up of a total of 17 questions, of which 14 are multiple-choice and 3 are open-ended. The questionnaire is divided into three main topic areas. The first part concerns the user, the second concerns the degree of knowledge of the user towards home automation, and the third concerns the feelings and opinions that users have about this new technology. User research shows that users would like to save money on their energy bills but are resistant to using home automation systems because they are too expensive to install. User expectations concern a smart home of the future that provides energy and functional savings.
The project (See Figure 8b) presents, through simulated AR, an interactive control panel that suggests to the user the most suitable light mode for certain times of the day. This solution allows for a customizable, easy-to-use home automation lighting system that facilitates energy savings. In addition, the system can also be considered valid and usable in the Metaverse scenario by controlling the light settings virtually to trigger feedback from the physical system.
5.3. Hi-Fi System
The term Hi-Fi comes from the concept of high fidelity, which indicates a system’s ability to reproduce music as if it were played live (e.g., orchestras or concerts) in the most realistic way. All Hi-Fi equipment must perform a complex task related to the high quality of music reproduction.
The user research begins with an in-depth analysis and study of users who use Hi-Fi systems. The study is followed by the creation of a questionnaire to investigate user behaviors, which is distributed through social networks and audiophile blogs, as well as to people outside the world of Hi-Fi. The questionnaire reaches a sample of 80 participants, and as a result, personas and consequent scenarios are developed, leading to the final idea of the project. Many people do not know what a Hi-Fi system is or how it works, and there are several reasons for this, such as misinformation on the subject, the classification of Hi-Fi as an elite product, or difficulty of use. Although high fidelity has become increasingly user-friendly over the years, there are still some problems to be addressed. In addition, Hi-Fi systems cannot directly adapt to the user’s mood but require a precise musical choice from the user.
The project (See Figure 9a) concerns a simulated AR interface for home Hi-Fi systems. The system is controlled through three-dimensional icons, matching colors, and special effects recreated for each musical track. The project also leads the user to a particular experience in which music and emotions merge uniquely, spreading the message that Hi-Fi is not an elitist object for a few but a more universal one.
5.4. Kitchen Machine
Kitchen machines are excellent allies in the kitchen that allow users to speed up numerous preparations.
The user research starts with a questionnaire made up of 18 multiple-choice questions administered through Google Forms to a sample of 463 users. The questions are aimed at understanding the needs of the users, learning about their behaviors, and outlining personas and scenarios. The questionnaire shows that not all kitchen machine functions are intuitively understandable to users, and this problem leads to dissatisfaction among users, who see the preparation of a dish with their kitchen machine as much more complex than it is. Often, this complexity leads users to abandon their kitchen machines and prefer traditional cooking methods, manually performing the various preparation processes. In recent years, the guided recipe function of many kitchen machines has attempted to break down this difficulty. In this way, the novice user can start using the machine from scratch, and it is the machine itself that suggests, step by step, all the operations to be performed.
By projecting a scenario in which the user wears AR glasses, it is possible to apply the concept of a guided recipe in AR to solve this issue. Users are presented with a display of the kitchen machine in front of them and a three-dimensional interface with which they can interact by wearing AR glasses (See Figure 9b). The GUI assists the user through an avatar, “Chef Gino”, with whom it is possible to communicate vocally [28]. The presence of an avatar makes the experience enjoyable, interesting, and engaging for the user. Throughout the preparation of the recipe, Chef Gino congratulates users, gives suggestions, and reinforces the positive aspects of the preparation, as if the avatar and the user were good friends, with phrases such as “Fantastic, good job!”. The GUI allows for the sharing of the cooking experience with groups of people who love food and cooking together, even in the Metaverse scenario. This solution also offers an immersive and fun experience that helps users avoid mistakes in the preparation of their dish, together with a friendly avatar with whom to dialogue throughout the recipe, celebrating its completion.
5.5. Hob
Hobs currently on the market feature a mechanical or electrical interface directly on the surface of the hob itself.
The user research starts with a questionnaire administered to a sample of 125 people of different cultures—Italian and Iranian—aimed at discovering their typical behaviors towards the hob they use. The questionnaire is structured into topic areas (i.e., demographic data, behavior in the kitchen, kitchen hob knowledge, main interaction problems, AR knowledge and satisfaction, and their dreams) that provide valuable information to develop the concept. The user research identifies several problems related to the hob’s different features. For example, if users face some unforeseen situation and need to move away from the hob, it is impossible to manage the cooking time or to turn off/on/adjust the flame. Moreover, users often forget an ingredient while cooking (e.g., salt for pasta), and unless users remember it themselves, it is impossible to solve this problem. Additionally, when users must follow a recipe, they are forced to follow either video recipes or physical cookbooks, which are difficult to handle on the hob for several reasons (e.g., stove danger, hygiene, busy hands, etc.).
The idea (See Figure 10a) developed by students is to assist the user through an intuitive and interactive three-dimensional interface using simulated AR. The GUI is based on the use of three-dimensional icons that can be easily interpreted by any kind of user so as not to burden and hinder the view of the user, who is already busy preparing food. For this reason, students opted for only three icons positioned outside the worktop, while all the other icons can be temporarily hidden and activated only when the user wants.
The GUI allows for sharing the experience of cooking on the stove with groups of people who love food and want to cook together—even users of different ethnicities—within the Metaverse. Through the creation of new scenarios, simulated AR allows users to experience a new world of culinary cultures, so users can have an international experience and try to cook a recipe from another country.
5.6. Microwave
A microwave is a highly useful appliance, as it can cook and defrost a meal in a short time.
User research is conducted through the administration of a questionnaire consisting of 20 questions—15 multiple-choice and 5 open-ended—to a sample of 59 users. The user research identifies some weaknesses of microwaves based on users’ experience. Most of the time, the user needs to check whether the food inside is ready, to know which foods cause problems during cooking, and to know which materials are not recommended for use at all. Microwave cleaning is also a very important aspect. This is a very tedious and nerve-wracking task, which is mostly done either by hand or through the self-cleaning option. A possible improvement concerns the opportunity to know and visualize the state of dirtiness of the microwave and the bacteria present before self-cleaning.
The GUI (See Figure 10b) helps users operate the microwave correctly, presenting a simple and intuitive graphic canvas that guides the user step by step in the insertion of food and its cooking, paying attention to the safety of these steps. This solution allows the user to enhance the interaction with the microwave, facilitating some operations that are difficult in real life without simulated AR. It also allows for remote interaction with friends and those who want to participate in the cooking experience with users in the Metaverse.
5.7. Oven
An oven is an appliance that has existed since the primitive era, and its evolution has reflected the technological changes both in functionality and interfaces. Initially, the oven presented traditional interfaces with simple knobs; later, interfaces became hybrid and characterized by both physical and digital parts.
To accomplish user research, students administer a questionnaire made up of 19 questions (17 multiple-choice questions and 2 open-ended questions) to a sample of 150 users. The user research shows that the main issues related to this kitchen appliance concern the impossibility of remotely controlling the oven (e.g., turning it off/on) and of monitoring the food inside during cooking to assess its status (e.g., through a webcam). The oven could also provide advice to users on the recipe settings as if it were their friend.
The project idea (See Figure 11a) is to extend the functionality and control of ovens currently on the market by using AR contact lenses for remote control. The GUI allows users to choose the food to be cooked in the oven and set the cooking mode through three-dimensional and intuitive icons. The project enhances human–machine interaction through a GUI that uses not only simulated AR but also artificial intelligence to make human–oven interaction easier and more successful. Additionally, in the Metaverse, it could be possible to share the experience of cooking in the oven and, while waiting for the dish, to talk with friends.
5.8. Fridge
A fridge is a household appliance whose function is to store food in a thermoregulated low-temperature condition. It is a relatively new appliance, but it has quickly become an essential and indispensable product in the home.
Students formulate a questionnaire consisting of 18 questions (17 multiple-choice and 1 open-ended question) that is administered to 500 users. The user survey contains personal and behavioral data from individuals who voluntarily wished to participate. For this research, students use social media (i.e., Instagram and Facebook) as channels to disseminate the questionnaire. The user research shows that although the fridge is one of the most “popular” household appliances and is used with a certain regularity, users are not always sure of what it contains. The availability of certain foods or their expiration date may escape daily control, and users must resort to unexpected shopping or, even worse, to the dustbin. The possibility of checking the quantity, temperature, and expiry date of foods even when users are not at home allows for greater control and less waste of money, as well as an optimization of resources.
The simulated AR interface (See Figure 11b) allows users wearing AR smart glasses to see inside the fridge without opening its door. Through this interface, the user will be able to see all the products, which are divided into compartments. The products highlighted in red by the interface are the first to expire, while the products highlighted in green are the freshest. The GUI allows users to view from a distance what the fridge contains and how to manage all the food it contains, avoiding waste. It is also possible to have this experience in the Metaverse by discussing with friends which foods to buy from the supermarket and which dishes to cook.
5.9. Alarm Clock
Today, the traditional alarm clock usually placed on the bedside table in the bedroom is being replaced by the alarm clock application on smartphones.
Students administer a questionnaire consisting of 14 multiple-choice questions to a sample of 215 users. User research identifies several user-related issues, such as difficulty waking up and falling asleep. Sleep disturbances can be the result of several factors, such as disturbance from one’s partner, the presence of disturbing sounds, and the presence of annoying lights. On the other hand, difficulty in waking up is mainly due to the annoyance of having to turn off one’s phone alarm clock. In some cases, users cannot hear the alarm clock ringing or turn it off involuntarily.
In this project (See Figure 12a), the interface presents different three-dimensional icons and related functionalities in simulated AR. The GUI allows users to wake up with another person and/or share sleep and fall asleep with other people with the same problem, even in the Metaverse scenario. Since many users have difficulty falling asleep and waking up, this solution presents an immersive experience—a reality amplified through the enrichment of human sensory perception—by superimposing digital content on reality.
5.10. Vacuum Cleaner
Robot vacuum cleaners are equipped with new and sophisticated smart technology by which they can perform cleaning completely autonomously.
The user research starts with a questionnaire on Google Forms that is distributed on various social channels, including personal profiles (i.e., Facebook and Instagram) and instant messaging platforms (i.e., WhatsApp and Telegram). The questionnaire consists of 17 multiple-choice questions and received 187 responses. The user research shows that users are students and full-time workers, both categories with little time to dedicate to cleaning the house. A possible way to solve this problem is to allow users to set the path the robot must follow, even remotely.
Thus, the project (See Figure 12b) stems from the needs of a specific category of users (i.e., students and full-time workers) who have little time to devote to cleaning the house. The interaction between the user and the robot vacuum cleaner occurs through gestures, as the user indicates, through simulated AR, the environment and/or the path that the robot vacuum cleaner must take. The user can also control the robot’s path in the Metaverse.
6. Evaluation
The group projects demonstrate that the students have assimilated the theory and laboratory parts of the courses and can synthesize MR adoption, GUI design, and product innovation. We also want to evaluate other aspects of the students’ experience in order to obtain preliminary results related to the novel multidisciplinary laboratory. Therefore, at the end of the final presentation of each team, we ask the students the following questions using a 7-point Likert scale [29] to evaluate acceptance, effectiveness, utility, efficiency, and satisfaction [30]:
Do you think exploiting MR technologies for designing GUIs can improve traditional interfaces? (to evaluate students’ acceptance);
Do you believe MR technologies can improve product innovation processes? (to evaluate the method’s effectiveness);
How do you evaluate these digital technologies concerning your professional work as a future designer? (to evaluate the course utility);
Does the Design for Interaction Laboratory give you tools and methods to prototype your design ideas? (to evaluate the course efficiency);
How satisfied are you with this laboratory? (to evaluate student satisfaction).
Additionally, we include an open-ended question to understand what students would like to improve about the Laboratory. We used self-reported data, deemed the most appropriate method to assess students’ perceptions of and experiences with the teaching method in this preliminary study.
We recognize that the use of the 7-point Likert scale and open-ended questions in the evaluation may not provide a comprehensive or objective assessment of the teaching method. However, these methods have been commonly used in educational research to gather subjective data and provide valuable insights into students’ perceptions and experiences [31]. Additionally, we took steps to minimize potential biases by incorporating measures to reduce response bias, such as ensuring anonymity.
Results
A sample of n = 27 students attended the Laboratory and took part in the final exam. Only n = 21 students agreed to complete the questionnaire. The course evaluation questionnaire is implemented in Microsoft Forms and delivered to students through Microsoft Teams. The average completion time is 03′ 08′′.
Students’ evaluation of the possibility that MR technologies could improve traditional interfaces presents an average score of 4.95 (See Figure 13a). Their satisfaction with the course reports a positive average score of 5.24 (See Figure 13a). The question about the possibility of MR technologies improving product innovation processes has an average of 5.62 (See Figure 13b). Student ratings of these digital technologies with respect to their future professional work as designers are positive, with an average score of 5.57 (See Figure 13b). Finally, students agreed that the Laboratory provided them with tools and methods to prototype their design ideas, with an average score of 5.48 (See Figure 13b).
Out of the sample of n = 27, only n = 8 responded to the open question. Responses were heterogeneous and are reported as follows:
Use HMDs such as HoloLens 2. During the testing phase, the user could interact with the GUI to validate it or not;
Delve deeper into user experience (UX) design, which is a key part when building an interface and subsequently an app in MR;
Get more practice after COVID-19. Students during the 2020/2021 academic year were not able to experience live demos on HMDs as students had in past years. As a result, they delved more into the theory than the lab work;
Integrate more information technology (IT) knowledge, as it was conceptually possible to work on MR but not in-depth with programming;
Add a course on 3D modeling at the beginning of the semester. The course could facilitate the modeling of scenarios useful for GUI implementation.
The questionnaire helps to evaluate several parameters related to the teaching method used in this novel Design for Interaction Laboratory: acceptance, effectiveness, utility, efficiency, and satisfaction. The obtained results are comparable to those found in the literature related to student adoption of MR [32,33,34,35,36], confirming positive feedback [37].
7. Discussion and Conclusions
The digital transformation we are currently experiencing, marked by the development and use of technologies such as MR and the spread of low-cost HMDs, is causing a shift in design education toward the Metaverse. Thus, the future of industrial design education is shaped by new and exciting challenges due to the fluidity of real products and services designed for the upcoming Metaverse space.
However, the existing literature lacks studies in the field of design that integrate education with MR technologies. We, therefore, present a novel lab with an integrated multidisciplinary approach that starts from the fundamentals of interaction design and aims to teach students how to design next-generation MR interfaces for the Metaverse. The lab combines theory and practice within three courses: Information Design, Information Systems, and Virtual Design and Simulation. Industrial design students are asked to adopt a precise multidisciplinary method in five steps, from analyzing the state of the art to proposing a final group design of an MR user interface.
We conducted a classroom case study in the field of household appliances, showing the results of the lab, which demonstrate the validity of the proposed multidisciplinary approach (RQ1) with the integration of MR technologies to develop GUIs for the Metaverse. All projects provided interesting aspects of design innovation, showing the potential of MR in solving real problems when supported by a combination of creative and technological skills. The students’ group work succeeded in giving new functionality to products such as a coffee machine, domotic lighting control, a Hi-Fi system, a kitchen machine, a hob, a microwave, an oven, a fridge, an alarm clock, and a vacuum cleaner. Additionally, the feedback of industrial design students in terms of acceptance, effectiveness, usefulness, efficiency, and satisfaction showed results that, although preliminary, are extremely positive and encouraging (RQ2).
In conclusion, this lab represents an initial attempt to integrate the concepts of interaction design and the use of MR technologies within the industrial design program to develop novel interfaces for the Metaverse. We plan to conduct follow-up studies to explore the long-term impact of this approach and provide a more extensive understanding of the teaching method, as well as of topics other than household appliances.
One of the main limitations of this study is that the results are limited to the specific discipline of industrial design and may not apply to other design disciplines or educational contexts. We will therefore conduct further research to establish the generalizability of these results. Another limitation concerns the small sample size, which affects the robustness of the results. However, our study was designed as a preliminary evaluation of the new teaching method, and we believe that the results provide valuable insights for future research with larger sample sizes. Indeed, we understand the importance of having a larger sample to confirm the robustness of our results. Therefore, future research will focus on extending the methodology applied in the multidisciplinary lab to other research topics and on comparing it with other master’s degree programs in the industrial design field. Moreover, we aim to extend the study by administering questionnaires to expert users, such as designers or university professors, so that a broader sample of responses can be obtained.
We will also further evaluate the impact of these MR interfaces by expanding the study of the following metrics: user experience through a UEQ and system usability through the SUS. In this way, it will be possible to better comprehend whether the laboratory projects find adequate and wider validation in the design phase.