Design of a Gaze-Controlled Interactive Art System for the Elderly to Enjoy Life
Abstract
1. Introduction
1.1. Background
1.2. Research Motivation
1.3. Research Problems and Objectives
- (1)
- How can the elderly be provided with a smooth and burden-free experience through the design of interactive art experiences with gaze estimation interfaces?
- (2)
- How can the use of interactive art and gaze estimation be developed to promote active aging in the elderly, allowing them to enjoy meaningful activities in later life?
- (1)
- The analysis of physical and mental conditions and needs of elderly individuals to develop an effective gaze estimation system using highly portable sensing equipment.
- (2)
- The exploration of performance forms of interactive art applications for elderly experiences and the summarization of system design principles.
- (3)
- The development of an interactive art system that integrates gaze estimation with the aim of promoting active aging among the elderly.
- (4)
- The organization of a public exhibition and the evaluation of the system’s usability, enjoyment, and experiential quality through expert interviews and questionnaire surveys.
- (1)
- User friendliness: Assessing how effectively a system allows users, particularly older adults, to achieve goals with ease of learning, quick mastery, and thoughtful design elements such as appearance, process simplification, and prompts [19].
- (2)
- (3)
- Interactive experience: Measuring users’ satisfaction during interactions, encompassing sensory experiences, emotional responses, and social interactions. A high score indicates positive impacts on physical and mental health, stimulating the body, enriching the mind, and slowing cognitive aging [22,23].
1.4. Research Scope and Process
1.5. Research Limitations
- (1)
- Participants are elderly individuals aged between 50 and 80 years old (inclusive).
- (2)
- Participants must exhibit basic self-awareness and have no cognitive impairments.
- (3)
- Participants must possess basic visual abilities, free from specific eye diseases, and achieve a visual acuity of 0.8 with corrective measures.
- (4)
- The research content and system design do not involve any medical practices.
2. Literature Review
2.1. The Needs of Elderly Users
2.1.1. Introduction of Technology and Elderly Physical and Mental Aging
2.1.2. Introduction of Technology and Elderly Life Satisfaction
2.1.3. Elderly Health and Well-Being
2.1.4. A Summary
2.2. Orange Technology
2.2.1. Background and Development of Orange Technology
2.2.2. Concepts and Applications of Orange Technology
- (A)
- Types of orange computing
- (1)
- Affective computing: involving the capture of emotional cues such as tone of voice, facial expressions, and body movements, as well as humanizing robots so that they can express emotions.
- (2)
- Biosignal processing technology: focusing on gathering vital signs such as blood pressure and heart rate through sensors, aiding healthcare professionals in providing accurate medical advice.
- (3)
- Advanced companions or assistive technology: offering advanced companions or assistive technologies that support elderly independence at home, leveraging human-computer interaction, robotics, and sensor devices.
- (B)
- Types of orange technology
- (1)
- Health technology: This includes health and safety care for the elderly and children, disease prevention, and medical diagnosis systems. It also integrates computer communication with medical systems and establishes cloud services for remote medical care.
- (2)
- Happiness technology: This involves caring for individuals with physical and mental disabilities, promoting social and humanistic literacy, and measuring happiness indices. Common applications include using biological signals such as blood pressure, heart rate, and exercise frequency to assess happiness, or measuring emotional responses like smiles and laughter.
- (3)
- Warm technology: This type focuses on rescue and care for victims of natural disasters and supports socially disadvantaged groups like low-income families. It emphasizes innovative technological applications to enhance human and social interaction, fostering mutual care and community support.
2.2.3. Relevant Cases of Orange Technology Applications for the Elderly
- (1)
- Technology advancements enable innovative applications in elderly care, embodying the compassion and warmth of orange technology.
- (2)
- Robots provide diverse services including assistance and companionship, enhancing smart home care for a comfortable and secure living environment.
- (3)
- Interactive devices designed for leisure and entertainment purposes improve hand–eye coordination and mitigate cognitive aging through physical engagement and sensory stimulation.
2.3. Interactive Art
2.3.1. Characteristics and Expression Modes of Interactive Art
- (1)
- Distance interaction: Involving cameras or sensors to detect viewers’ behaviors like body movements, eye movements, or sound, triggering feedback events.
- (2)
- Contact interaction: Enabling direct physical contact between the system and viewers for tangible interaction and feedback.
- (3)
- Symbiotic interaction: Utilizing viewers’ biological signals, such as electro-encephalograms and heart rates, for interaction.
- (4)
- Chance-based distance and contact interaction: Incorporating unpredictable viewer behavior into interactive feedback designs, often combining illusions and digital presentations for unexpected effects.
2.3.2. Interactive Art and Promoting the Health of the Elderly
2.3.3. Relevant Cases of Interactive Art Applications for the Elderly
2.4. Gaze Estimation
2.4.1. Visual Estimation and Eye Movement in the Elderly
2.4.2. Gaze Estimation and Human–Computer Interface
- (1)
- Three-dimensional gaze prediction: Predicting eye gaze direction vectors in three-dimensional space based on the relative positions of users and sensing devices.
- (2)
- Gaze point prediction: Predicting the focus point of vision, represented in two-dimensional plane coordinates.
- (3)
- Gaze target prediction: Predicting specific objects in the input image that a person is looking at.
2.4.3. Relevant Case Studies on Gaze Estimation in Elderly Applications
- (1)
- Compared to traditional human–machine interfaces, gaze estimation enables users to overcome physical limitations with minimal bodily movement, promoting independent living and enhancing quality of life.
- (2)
- Remote sensors are preferred over wearable sensors in device design, maintaining an effective distance to reduce user discomfort during interaction.
- (3)
- Gaze estimation interfaces typically employ computer vision technology, utilizing various types of cameras (depth, infrared, RGB) coupled with deep learning or machine learning to track eye positions.
3. Research Methods
- (1)
- Experimental site: Elderly learning centers in Changhua, Yunlin, and Chiayi counties and cities.
- (2)
- Subjects: Seniors aged 50 and above, with basic self-awareness and no serious visual impairments.
- (3)
- System experiencing time: Approximately 30 min in total, including 5 min for explanations and operation instructions for the system interaction, 15 min for experiencing the system interaction, and 10 min for questionnaire completion.
- (4)
- Experiencing experiment content: Firstly, a researcher of this study explains the interactive process and basic operation instructions to the participants. After participants have a basic understanding of the overall process, each participating user proceeds with the formal experiencing process while the researcher uses a camera to record the user’s interactions with the system. Afterwards, the participant is invited to complete a questionnaire, with assistance if needed.
3.1. Prototyping
- (1)
- Requirements analysis: Through literature review and case analysis, the forms of interactive art expression, potential interaction modes, and usage requirements of the elderly are summarized, while the initial concept of the system is outlined.
- (2)
- Prototype development: Based on the initial concept, a system prototype is implemented, with the system’s interactive framework being constructed and the content design conducted, followed by testing the preliminary effectiveness of the prototype.
- (3)
- Revision and improvement: Interviews are conducted with some target users or experts using the preliminary system prototype, their suggestions to revise the system are referenced, and the details are adjusted until the prototype is perfected.
- (4)
- System experience: After the system development is completed, a public demonstration of the system experience is conducted for users to participate in, and user feedback is collected during the process for the final evaluation of system effectiveness.
3.2. Expert Interview
3.3. Questionnaire Survey
- (1)
- Survey timing: After the elderly participants had experienced the system, they were asked to fill out the questionnaire, which took approximately 5 min. We estimated that 50 participants would be surveyed.
- (2)
- Survey method: The questionnaire was distributed anonymously and administered to participants in sequence after the system experience, taking about 10 min. The questionnaires were distributed to the elderly participants before the explanation of the interactive process and operational instructions began.
- (3)
- Steps for conducting the survey: (i) the questionnaire was explained to the participants; (ii) the questionnaire was distributed among the participants; and (iii) the participants were asked to fill out the questionnaire.
3.3.1. Technology Acceptance Model
3.3.2. Aesthetic Emotions Scale
- (1)
- Prototypical aesthetic emotions: Primarily covering evaluative aesthetic responses to the style or design of the work, focusing on aesthetic aspects. Expressions used for assessment include “finding it beautiful”, “I like it very much”, “it impresses me”, “deeply moved”, and “very admirable”.
- (2)
- Epistemic emotions: Involving emotional experiences of seeking meaning or being touched by the work, delving into deeper levels of observation, covering the novelty or complexity of the work. Expressions used for assessment include “feeling curious”, “sparking my interest”, “challenging”, and “sensing deeper meanings”.
- (3)
- Pleasing emotions: Accompanying the experiential feelings during the interaction with the work, including the level of sensory stimulation or finding other meaningful aspects in the work, resulting in pleasant and comfortable emotions. Expressions used for assessment include “making me happy”, “feeling interesting”, “energetic”, “uplifting”, and “feeling relaxed”.
- (4)
- Negative emotions: Representing negative feelings with no further implications; although such emotions are not prototypically aesthetic, they do occur during aesthetic experiences. Expressions used for assessment include “feeling ugly”, “feeling boring”, “feeling confused”, “feeling repelled”, “feeling anxious”, and “feeling sad”.
3.3.3. Strategic Experiential Models
- (1)
- Sense experience: Focusing on system-generated sensory stimuli (visual, auditory, gustatory, olfactory, and tactile) to evoke positive user responses.
- (2)
- Think experience: Stimulating active problem-solving and creativity, encouraging users to seek information and engage deeply.
- (3)
- Feel experience: Aiming to evoke emotional responses and enrich user interaction with positive feelings.
- (4)
- Relate experience: Fostering user identification and connection with the system, triggering associations with personal and cultural values.
- (5)
- Act experience: Transforming passive engagement into active involvement, prompting users to share experiences, discuss, and take meaningful actions.
3.3.4. Design of Questionnaires for Questionnaire Survey
- (A)
- Questionnaire design for the scale of user friendliness
- (B)
- Questionnaire design for the scale of user satisfaction
- (C)
- Questionnaire design for the scale of user experience
4. Design of Proposed System
4.1. Design Concept
4.2. System Design
- (1)
- Gaze estimation system design: focusing on developing an interactive interface based on gaze estimation.
- (2)
- Interactive art system design: focusing on presenting interactive content based on the identified gaze position.
4.3. System Process
- (1)
- Stage 1: Visual design set in the scene of Deep sea, where the feedback camera captures the screen continuously until the user is near the center of the camera. This stage primarily adjusts and confirms the initial position of the user’s head.
- (2)
- Stage 2: Visual design set in the scene of Shallow sea, featuring a regular animation of fish movements to attract user attention. It guides the user’s gaze to track the fish’s position, while the system captures gaze data to achieve gaze calibration.
- (3)
- Stage 3: Visual design set in the scene of Desert, with randomly appearing dots of light to stimulate user responses. Users are encouraged to fix their gaze on the light dots, triggering their disappearance. The number of triggers confirms the effectiveness of the calibration and the user’s intuitive control (a dwell-trigger sketch is given after this list of stages).
- (4)
- Stage 4: Visual design set in the scene of Starry sky, where three themed preview images and one random preview image are evenly distributed on the screen. Users choose a theme to experience by gazing at their preferred option, and all choices can be repeated.
- (5)
- Stage 5: Entry into the selected theme experience. After experiencing the chosen theme, users obtain a theme experience result based on their interactive operations. The four theme choices are described as follows:
- (i)
- Star Garden: The scene Garden, themed with flower species, interacting with flowers through scanning saccades and gaze eye movements.
- (ii)
- Animal Forest: The scene Forest, themed with animal species, interacting with animals through gaze eye movements.
- (iii)
- Forest Trees: The scene Rainforest, themed with tree species, interacting with trees through scanning saccades and gaze eye movements.
- (iv)
- Random Theme: Choosing this option randomly enters Star Garden, Animal Forest, or Forest Trees for the experience.
- (6)
- Stage 6: Visual design set in the Cloud layer scene. After the user completes experiences in three themes, the system generates a correlated natural landscape drawing based on the user’s interactions with the three themes, providing the final experience result.
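The light-dot interaction in Stage 3 is essentially a dwell-time trigger: a dot disappears once the user’s gaze has rested on it long enough. The following sketch (plain C++; the hit radius and dwell threshold are illustrative assumptions rather than the system’s actual parameters) shows one way such a trigger could be written; the same logic generalizes to the gaze-based theme selection in Stage 4.

```cpp
// Dwell-time trigger sketch for gaze-based selection (assumed thresholds).
struct Point2 {
    float x, y;   // screen coordinates in pixels
};

class DwellTarget {
public:
    DwellTarget(Point2 center, float radiusPx, float dwellSeconds)
        : center_(center), radius_(radiusPx), dwellNeeded_(dwellSeconds) {}

    // Call once per frame with the current gaze point and elapsed time.
    // Returns true exactly once, on the frame the dwell threshold is reached.
    bool update(Point2 gaze, float dtSeconds) {
        if (triggered_) return false;
        float dx = gaze.x - center_.x;
        float dy = gaze.y - center_.y;
        bool onTarget = (dx * dx + dy * dy) <= radius_ * radius_;
        dwell_ = onTarget ? dwell_ + dtSeconds : 0.0f;   // reset when gaze leaves
        if (dwell_ >= dwellNeeded_) {
            triggered_ = true;      // e.g., make the light dot disappear
            return true;
        }
        return false;
    }

    bool triggered() const { return triggered_; }

private:
    Point2 center_;
    float radius_;
    float dwellNeeded_;
    float dwell_ = 0.0f;
    bool triggered_ = false;
};
```

Counting how many targets reach the triggered state yields the trigger count used at the end of Stage 3 to confirm the effectiveness of the calibration.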
- (1)
- System input: Users interact with the system through eye movements (Eye Movement). The system captures user facial images using a webcam (WebCam) as the input source for the interactive system. These inputs are integrated by the interactive art system and further processed to interpret user interaction behaviors.
- (2)
- Gaze estimation system: Built using C++, integrating TensorFlow Lite (TFLite) and the OpenCV open-source library to construct deep learning and machine learning models. It analyzes user facial images received from the interactive art system, extracts image features, and estimates the user’s gaze coordinates on the display screen.
- (3)
- Interactive art system: Constructed using Unity for creating interactive art experience environments. Front-end control of the interactive system was programmed in C#, and visual effects were developed using CG scripting. This system also integrates the gaze estimation system, continuously receiving estimated gaze coordinates. These coordinates are used for real-time front-end interaction control and feedback for visual performance.
- (4)
- System output: Visual images and audio effects from the interactive art system are outputted through the monitor and speakers, providing users with sensory feedback (Sensory Feedback) in terms of visual and auditory experiences.
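As a rough, simplified illustration of this input-to-output flow, the C++ sketch below reads webcam frames with OpenCV’s VideoCapture and passes them to a placeholder estimateGaze function standing in for the TFLite/OpenCV gaze model; the structure and function names are assumptions made for illustration, not the authors’ code.

```cpp
// Minimal sketch of the capture-to-gaze loop (assumed structure, not the authors' code).
#include <opencv2/opencv.hpp>
#include <iostream>

struct GazePoint {
    float x = 0.5f;   // normalized horizontal position on the display (0..1)
    float y = 0.5f;   // normalized vertical position on the display (0..1)
    bool  valid = false;
};

// Placeholder for the deep-learning gaze model (TFLite + OpenCV in the paper).
GazePoint estimateGaze(const cv::Mat& faceFrame) {
    GazePoint g;
    if (!faceFrame.empty()) {
        // A real implementation would run face/iris landmark detection,
        // head-pose estimation, and regression here.
        g.valid = true;
    }
    return g;
}

int main() {
    cv::VideoCapture cam(0);              // the webcam is the only input device
    if (!cam.isOpened()) {
        std::cerr << "Cannot open webcam\n";
        return 1;
    }
    cv::Mat frame;
    while (cam.read(frame)) {             // continuous capture, one frame per loop
        GazePoint g = estimateGaze(frame);
        if (g.valid) {
            // The interactive art system would consume these coordinates
            // to drive visual and auditory feedback on the monitor and speakers.
            std::cout << "gaze: " << g.x << ", " << g.y << "\n";
        }
        if (cv::waitKey(1) == 27) break;  // ESC to stop
    }
    return 0;
}
```

In the actual system, the estimated coordinates would be handed to the Unity front end rather than printed, driving the visual and auditory feedback described above.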
4.4. System Development
- (A)
- In the artistic aspect
- (B)
- In the system development aspect
4.4.1. Gaze Estimation System Process
- (A)
- Face feature extraction stage
- (B)
- Space transform stage
- (C)
- Eye feature extraction stage
- (D)
- Regression stage
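To make the regression stage concrete, the sketch below fits a ridge-regression mapping from an eye-feature vector to two-dimensional screen coordinates using calibration samples such as those gathered while the user follows the fish in Stage 2. It is a minimal illustration in plain C++: the use of batch gradient descent, the feature layout, and the hyperparameters are assumptions, not the parameters of the actual system.

```cpp
// Illustrative ridge regression from eye features to screen coordinates
// (assumed dimensions and hyperparameters; not the authors' implementation).
#include <vector>
#include <array>
#include <cstddef>

struct CalibSample {
    std::vector<double> features;     // eye features extracted for one frame
    std::array<double, 2> target;     // known screen coordinates (x, y) the user looked at
};

class RidgeGazeRegressor {
public:
    RidgeGazeRegressor(std::size_t dim, double lambda = 1e-2)
        : w_(2, std::vector<double>(dim, 0.0)), b_{{0.0, 0.0}}, lambda_(lambda) {}

    // One batch gradient-descent step on the ridge objective
    //   sum_i ||W f_i + b - y_i||^2 + lambda * ||W||^2
    void trainStep(const std::vector<CalibSample>& samples, double lr) {
        if (samples.empty()) return;
        const std::size_t dim = w_[0].size();
        std::vector<std::vector<double>> gw(2, std::vector<double>(dim, 0.0));
        std::array<double, 2> gb{{0.0, 0.0}};
        for (const auto& s : samples) {
            std::array<double, 2> pred = predict(s.features);
            for (int k = 0; k < 2; ++k) {
                double err = pred[k] - s.target[k];
                gb[k] += err;
                for (std::size_t j = 0; j < dim; ++j)
                    gw[k][j] += err * s.features[j];
            }
        }
        for (int k = 0; k < 2; ++k) {
            b_[k] -= lr * gb[k] / samples.size();
            for (std::size_t j = 0; j < dim; ++j)
                w_[k][j] -= lr * (gw[k][j] / samples.size() + lambda_ * w_[k][j]);
        }
    }

    // Map one eye-feature vector to a predicted (x, y) screen coordinate.
    std::array<double, 2> predict(const std::vector<double>& f) const {
        std::array<double, 2> out{{b_[0], b_[1]}};
        for (int k = 0; k < 2; ++k)
            for (std::size_t j = 0; j < f.size(); ++j)
                out[k] += w_[k][j] * f[j];
        return out;
    }

private:
    std::vector<std::vector<double>> w_;  // 2 x dim weight matrix
    std::array<double, 2> b_;             // bias for x and y
    double lambda_;                       // ridge regularization weight
};
```

Each calibration frame would contribute one CalibSample; after repeated trainStep calls the regressor maps live eye features to gaze coordinates, which can then be smoothed (for example, by the Kalman filtering mentioned in Section 4.4.3) before being passed to the interactive art system.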
4.4.2. Interactive Art System Process
- (1)
- Gazer manager: developed in C++ to manage gaze estimation, compiled into a DLL, and wrapped with an interface written in C# within Unity3D, allowing it to be called as an API and executed according to Unity’s operational standards (a minimal sketch of such an exported interface follows this list).
- (2)
- Game manager: a central management tool written in Unity’s scripting language that controls the overall system state and processes, including scene initialization and interactions between modules.
- (3)
- Scene manager: initializing scenes based on their themes; managing object generation, scene effects, user interface (UI) display, and transitions/loading between scenes; and handling of relatively static scene object controls.
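A minimal sketch of the kind of C-style surface such a DLL might expose is given below; the function names and signatures (GazerInit, GazerProcessFrame, GazerGetGaze, GazerShutdown) are hypothetical placeholders chosen for illustration, not the system’s actual API. A C# wrapper inside Unity3D could bind to these entry points with [DllImport] and call them once per frame.

```cpp
// gazer_api.cpp -- hypothetical C-style surface of the gaze-estimation DLL
// (function names and signatures are illustrative assumptions, not the authors' API).
#ifdef _WIN32
  #define GAZER_API extern "C" __declspec(dllexport)
#else
  #define GAZER_API extern "C"
#endif

namespace {
    // Latest gaze estimate held by the DLL between calls.
    float g_x = 0.0f, g_y = 0.0f;
    bool  g_valid = false;
    int   g_screenW = 0, g_screenH = 0;
}

// Initialize the camera and load the gaze-estimation models.
GAZER_API int GazerInit(int /*cameraIndex*/, int screenWidth, int screenHeight) {
    g_screenW = screenWidth;
    g_screenH = screenHeight;
    // A real implementation would open the webcam and load TFLite/OpenCV models here.
    return 0;
}

// Grab the latest webcam frame and update the internal gaze estimate.
// Intended to be called once per rendered frame by the front end.
GAZER_API int GazerProcessFrame() {
    // A real implementation would run the face/iris landmark, head-pose,
    // and regression pipeline here and update g_x, g_y.
    g_x = g_screenW * 0.5f;
    g_y = g_screenH * 0.5f;
    g_valid = true;
    return 0;
}

// Fetch the most recent gaze point in screen pixels; returns 1 if valid.
GAZER_API int GazerGetGaze(float* screenX, float* screenY) {
    if (!g_valid || screenX == nullptr || screenY == nullptr) return 0;
    *screenX = g_x;
    *screenY = g_y;
    return 1;
}

// Release camera and model resources.
GAZER_API void GazerShutdown() {
    g_valid = false;
}
```

Keeping the exported surface this small confines all OpenCV/TFLite dependencies to the native side, which is a common reason for wrapping a C++ estimator as a DLL behind a C interface.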
- (1)
- Static sensory feedback scene module: Controlled by the scene manager, this module handles the initialization of themed scenes. It includes setting background music and sound effects specific to the theme, initializing scene objects, adjusting parameters, and displaying dynamic background visuals. This module focuses on static visual and auditory feedback.
- (2)
- Dynamic interactive feedback module: This module utilizes the output of gaze estimation, particularly the gaze point, to interact with objects within the interactive art system. It manages effects generation, object manipulation, model animations, display of gaze-point heatmaps, and other dynamic interactions within the system.
4.4.3. Major Steps in Implementing the Gaze Estimation and Interactive Art Processes
- (A)
- Camera calibration
- (B)
- Face and iris landmark detection
- (C)
- Head pose estimation
- (D)
- Space transformation
- (E)
- Extraction of eye features
- (F)
- Use of ridge regression and Kalman filtering to enhance gaze accuracy
- (G)
- Interactive interfacing triggered by collision of the gaze vector and the object
- (1)
- Transform the screen coordinates into the world coordinate system of the virtual space.
- (2)
- Determine whether the center point Pc of an object is contained within an expanded volume around the gaze ray. There are two ways to make this decision, as described in the following (a geometric sketch is given after this list):
- (a)
- Under the assumption of using a perspective camera: As illustrated in Figure 22a, in this case, the goal becomes to decide whether the point Pc appears within the cone shape formed by the gaze vector as the central axis.
- (b)
- Under the assumption of using an orthographic camera: As illustrated in Figure 22b, in this case, the goal becomes to decide whether the point Pc appears within the cylinder shape formed by the gaze vector as the central axis.
- (H)
- Design of system scenes in the process of the interactive art system
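The containment decision in step (G) reduces to ordinary vector geometry, so both variants can be shown compactly. The sketch below is plain, engine-agnostic C++ (the half-angle and radius thresholds are illustrative assumptions): it tests whether an object’s center Pc lies inside a cone around the gaze ray (perspective camera) or inside a cylinder around it (orthographic camera).

```cpp
// Gaze-ray "collision" test sketch: cone (perspective camera) vs. cylinder
// (orthographic camera). Thresholds are illustrative assumptions.
#include <cmath>

struct Vec3 {
    float x, y, z;
};

static Vec3 sub(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static float length(const Vec3& v) { return std::sqrt(dot(v, v)); }

// Perspective case: is Pc inside the cone whose apex is the ray origin,
// whose axis is the (unit-length) gaze direction, and whose half-angle is given?
bool insideGazeCone(const Vec3& origin, const Vec3& gazeDir,
                    const Vec3& pc, float halfAngleRad) {
    Vec3 toPc = sub(pc, origin);
    float dist = length(toPc);
    if (dist <= 1e-6f) return true;                 // object at the ray origin
    float cosAngle = dot(toPc, gazeDir) / dist;     // gazeDir assumed normalized
    return cosAngle >= std::cos(halfAngleRad);      // within the angular spread
}

// Orthographic case: is Pc inside the cylinder of the given radius around the ray?
bool insideGazeCylinder(const Vec3& origin, const Vec3& gazeDir,
                        const Vec3& pc, float radius) {
    Vec3 toPc = sub(pc, origin);
    float along = dot(toPc, gazeDir);               // projection onto the ray axis
    if (along < 0.0f) return false;                 // behind the ray origin
    // Squared perpendicular distance from Pc to the ray axis.
    float perpSq = dot(toPc, toPc) - along * along;
    return perpSq <= radius * radius;
}
```

When either test reports containment, the corresponding scene object can be treated as gazed at, and the dynamic interactive feedback module can trigger its effect or animation.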
5. Research Results
- (1)
- Experimental sites:
- (a)
- Zhuzi Township Farmers’ Association in Chiayi County, Taiwan;
- (b)
- Fenyuan Citou Community in Changhua County, Taiwan;
- (c)
- Zhuzi Township Library in Chiayi County, Taiwan;
- (d)
- Lelin Township Community Service Center in Yunlin County, Taiwan.
- (2)
- Participants: Seniors aged 50 (inclusive) to 80 (inclusive) with basic cognitive and visual abilities.
- (3)
- Number of Participants: A total of 52 senior participants.
5.1. Public Demonstrations of the System
5.2. Analysis of Expert Interview Results
5.2.1. Comments Collected from Expert Interviews
5.2.2. Conclusions from Expert Interviews
- (1)
- The interactive interface of the system introduces a new digital experience for older adults, not only reducing physical burdens but also providing attention training.
- (2)
- Experiencing the system can contribute to the physical and mental development of older adults, including benefits like attention training and delaying cognitive decline.
- (3)
- The proposed system uses gaze estimation for interaction interfacing, ensuring that the system is easy to use, with real-time responses to the elderly’s actions.
- (4)
- The proposed system can incorporate interactive elements that better align with older adults’ past experiences and feature a simple and bright interface design.
- (5)
- The system can introduce timely interactive processes akin to overcoming challenges, providing enjoyment from immediate feedback to enhance older adults’ sense of participation and satisfaction.
5.3. Analysis of Questionnaire Survey Results
5.3.1. Sample Structure Analysis
5.3.2. Analysis of Reliability and Validity of Questionnaire Survey Results
- (1)
- Step 1: Verification of the adequacy of the questionnaire dataset
- (2)
- Step 2: Finding the latent dimensions of the questions from the collected data
- (3)
- Step 3: Verifying the reliability of the collected questionnaire data (a Cronbach’s alpha sketch follows this list)
- (4)
- Step 4: Verification of applicability of the structural model established with the dimensions
- (5)
- Step 5: Verification of the validity of the collected questionnaire data
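As an illustration of the reliability check in Step 3, Cronbach’s alpha can be computed directly from the item responses using the standard formula α = k/(k − 1) · (1 − Σ item variances / variance of total scores). The short C++ sketch below assumes a respondents-by-items matrix of Likert scores; the data layout is an assumption made only for illustration.

```cpp
// Cronbach's alpha for a respondents-by-items score matrix (illustrative sketch).
#include <vector>
#include <cstddef>

// Sample variance (n - 1 denominator) of a series of scores.
static double sampleVariance(const std::vector<double>& v) {
    const std::size_t n = v.size();
    if (n < 2) return 0.0;
    double mean = 0.0;
    for (double x : v) mean += x;
    mean /= n;
    double ss = 0.0;
    for (double x : v) ss += (x - mean) * (x - mean);
    return ss / (n - 1);
}

// scores[r][i] = answer of respondent r to item i (e.g., 1..5 Likert values).
double cronbachAlpha(const std::vector<std::vector<double>>& scores) {
    const std::size_t nResp = scores.size();
    if (nResp < 2 || scores[0].empty()) return 0.0;
    const std::size_t k = scores[0].size();          // number of items in the scale

    double sumItemVar = 0.0;
    std::vector<double> totals(nResp, 0.0);
    for (std::size_t i = 0; i < k; ++i) {
        std::vector<double> item(nResp);
        for (std::size_t r = 0; r < nResp; ++r) {
            item[r] = scores[r][i];
            totals[r] += scores[r][i];
        }
        sumItemVar += sampleVariance(item);
    }
    double totalVar = sampleVariance(totals);
    if (totalVar <= 0.0 || k < 2) return 0.0;
    return (static_cast<double>(k) / (k - 1)) * (1.0 - sumItemVar / totalVar);
}
```

Values of roughly 0.7 or above are conventionally read as acceptable internal consistency, which is the kind of criterion applied when interpreting the questionnaire data in Section 5.3.2.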
5.3.3. Analysis of Questionnaire Data about the Scale of User Friendliness
- (A)
- Data analysis for the latent dimension “intention to act”
- (1)
- The average values of the dimension “intention to act” are all greater than 3.5, with agreement rates exceeding 60%, indicating that a majority of the older adults expressed a certain level of recognition and acceptance towards interacting with and using the estimated gaze interface.
- (2)
- Questions R1 and R6 have average values above 4, indicating that the older adults were satisfied with the accuracy of the gaze estimation system. Through this interactive interface, older adults were more willing to engage with technology. However, a small percentage of older adults disagreed, believing that the interactive interface was not easy to operate smoothly.
- (3)
- The average values for items R3 and R2 are both below 4, but the agreement rates exceed 60%. This indicates that some older adults showed no significant interest in the system and did not intend to incorporate its use into their daily leisure activities. A few individuals even expressed disagreement with these statements.
- (B)
- Data analysis for the latent dimension “usefulness”
- (1)
- The average values of this dimension are all greater than 4, with agreement rates exceeding 70%. This indicates that a majority of the older adults find gaze estimation helpful for their use of technology.
- (2)
- Question R4 has a standard deviation close to 1, and both R4 and R5 show some disagreement options, suggesting divergent views among the older adults regarding the convenience of gaze estimation and its effectiveness in reducing physical burden when operating technology. However, with agreement rates exceeding 70% and average scores above 4, it shows that most of the older adults perceived the system interface as beneficial for their use of technology.
- (3)
- Questions R7 and R8 have zero percentages for disagreement and strong disagreement, with agreement rates exceeding 80%, indicating that the older adults found the gaze estimation operations simple and felt confident during the experiencing process.
5.3.4. Analysis of Questionnaire Data about the Scale of User Satisfaction
- (A)
- Data analysis for the latent dimension “pleasure and relaxation”
- (1)
- The average values of this latent dimension are all greater than 4, with agreement rates exceeding 80%, indicating that the majority of the older adults exhibited positive emotions during the system experience process.
- (2)
- Questions S8 and S1 have standard deviations below 0.7 and agreement rates exceeding 90%, indicating that the older adults were satisfied with the system’s visual design and interactive content.
- (3)
- Question S7 has a standard deviation above 0.8 and includes disagreement options, suggesting that some older adults did not experience a sense of relaxation. However, overall, the relaxed pace contributed to calming their emotions.
- (4)
- Questions S2 and S1 have zero percentages for disagreement and strong disagreement, indicating that the older adults perceived the visual presentation as aesthetically pleasing and mood-enhancing.
- (B)
- Data analysis for the latent dimension “cognitive stimulation”
- (1)
- Questions S4 and S3 have standard deviations above 0.8, with disagreement options present. Among these, S4 has an average score below 4, indicating that for some older adults, the interactive content may have seemed somewhat bland, failing to create an immersive or memorable experience.
- (2)
- Question S5 shows a higher percentage of agreement, suggesting that the exploratory interactive content piqued curiosity among the older adults.
- (3)
- Question S7 has no disagreement or strong disagreement, indicating that overall, the older adults exhibited emotionally driven responses to action following their system experience.
5.3.5. Analysis of Questionnaire Data about the Scale of User Experience
- (A)
- Data analysis for the latent dimension “experience satisfaction”
- (1)
- The average scores for the “experience satisfaction” dimension are all above 4, with no responses indicating disagreement or strong disagreement. This indicates that the majority of older adults, after experiencing the system, were satisfied with the benefits it brought to them.
- (2)
- Question T8 shows a proportion of agreement exceeding 90%, with a standard deviation below 0.7. This suggests that most of the older adults perceived the system as beneficial to enhancing their social participation.
- (3)
- Question T3 has a relatively high proportion of responses in the neutral category, indicating that for some older adults, there was less noticeable relief from stress after experiencing the system.
- (B)
- Data analysis for the latent dimension “enrichment in life”
- (1)
- The average scores for this latent dimension are all above 4.0, indicating that the majority of the older adults believed that the activities experienced had a positive impact on their lives.
- (2)
- Question T7 has a higher proportion of neutral responses, indicating that for some of the older adults, the system did not stimulate an active mindset towards social interaction.
5.3.6. A Summary of Questionnaire Survey Data Analysis
- (1)
- The integration of gaze estimation with interactive art activities has a positive impact on the quality of life factors for older adults, including their physical and mental development and social interactions.
- (2)
- Abstract visual designs may not effectively stimulate active responses in older adults due to their diverse life backgrounds.
- (3)
- Diverse themes and interactive elements provide more opportunities for older adults to engage with systems, sparking curiosity and enhancing enjoyment.
- (4)
- Older adults highly accept visual designs themed around nature, creating a relaxing emotional state.
- (5)
- Lack of challenging interactive elements in system design leads to varying perceptions among older adults regarding immersion and engagement.
- (6)
- Older adults find eye-interaction interfaces convenient and beneficial for physically aging populations.
- (7)
- Stringent pre-calibration procedures increase learning burden for older adults, impacting system usability and user acceptance.
6. Conclusions and Suggestions
6.1. Discussion and Contributions
- (1)
- Older adults perceive the proposed system “Natural Rhythm through Eyes” positively for its interaction methods and social inclusivity, indirectly supporting attention training.
- (2)
- Usability of the proposed system is noted to be relatively low, highlighting areas for improvement in gaze estimation calibration and operational feedback.
- (3)
- The natural ecological visual design of the proposed system relaxes older adults emotionally, while its interactive format stimulates enjoyable engagement.
- (4)
- Older adults reported positive emotional experiences with the proposed system, enhancing their quality of life.
- (5)
- The proposed system lacks challenging interactive elements, which could be enhanced to increase older adults’ participation and satisfaction.
- (6)
- Integration of interactive elements in the proposed system aligns with older adults’ past experiences, strengthening life connections.
6.2. Conclusions
- (1)
- Enhanced Usability of Gaze Estimation for Elderly Users’ Convenience in Life
- (2)
- Increased Enjoyment and Engagement Through Nature-Themed Interactive Art
- (3)
- Positive Influence on Active Aging Through the Integration of Gaze Estimation and Interactive Art
6.3. Suggestions for Future Studies
- (1)
- Simplifying the calibration process of the gaze estimation interactive interface and enhancing user guidance through visual feedback using animations and sound.
- (2)
- Enhancing the system’s relevance to older adults’ lives to increase satisfaction, such as incorporating nostalgic music and bright, simple interfaces.
- (3)
- Introducing challenging interactive content within feasible limits to promote active engagement and immersion.
- (4)
- Enriching the system interface with animations, images, and interactive feedback to diversify themes and attract attention, thereby improving usability.
- (5)
- Expanding the application of the developed interactive system to various settings for a broader range of older adults, modularizing the system based on different experiential contexts.
- (6)
- Introducing multiplayer interaction modes to enhance social engagement among older adults, allowing for interaction with friends and family to foster a sense of belonging and satisfaction.
- (7)
- Regarding the evaluation of the effectiveness of the proposed system, it may be beneficial to include a sample of older adults as a control group. This would allow for analysis and comparison of the differences between the experimental group and the control group to reach a more precise conclusion.
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- World Health Organization. United Nation’s Decade of Healthy Ageing (2021–2030); World Health Organization: Geneva, Switzerland, 2020. [Google Scholar]
- Park, D.C.; Lautenschlager, G.; Hedden, T.; Davidson, N.S.; Smith, A.D.; Smith, P.K. Models of visuospatial and verbal memory across the adult life span. Psychol. Aging 2002, 17, 299–320. [Google Scholar] [CrossRef]
- Mazzone, D.M. Digital or Death: Digital Transformation: The Only Choice for Business to Survive Smash and Conquer; Smashbox Consulting Inc.: New York, NY, USA, 2014. [Google Scholar]
- Barea, R.; Boquete, L.; Mazo, M.; Lopez, E. System for Assisted Mobility Using Eye Movements Based on Electrooculography. IEEE Trans. Neural Syst. Rehabil. Eng. 2002, 10, 209–218. [Google Scholar] [CrossRef] [PubMed]
- Paing, M.P.; Juhong, A.; Pintavirooj, C. Design and Development of an Assistive System Based on Eye Tracking. Electronics 2022, 11, 535. [Google Scholar] [CrossRef]
- Eisapour, M.; Cao, S.; Boger, J. Participatory design and evaluation of virtual reality games to promote engagement in physical activity for people living with dementia. J. Rehabil. Assist. Technol. Eng. 2020, 7, 2055668320913770. [Google Scholar] [CrossRef] [PubMed]
- Fozard, J.L.; Rietsema, J.; Bouma, H.; Graafmans, J.A.M. Gerontechnology: Creating Enabling Environments for the Challenges and Opportunities of Aging. Educ. Gerontol. 2000, 26, 331–344. [Google Scholar] [CrossRef]
- Rama, M.D.; de Ridder, H.; Bouma, H. Technology generation and Age in using layered user interfaces. Gerontechnology 2001, 1, 25–40. [Google Scholar] [CrossRef]
- Giuli, C.; Papa, R.; Lattanzio, F.; Postacchini, D. The Effects of Cognitive Training for Elderly: Results from My Mind Project. Rejuvenation Res. 2016, 19, 485–494. [Google Scholar] [CrossRef] [PubMed]
- Chen, X.; Huang, F.; Wang, Y. The integration and development of piano art and media education and its influence on the long-term care and happiness of elderly people. Front. Psychol. 2021, 12, 593835. [Google Scholar] [CrossRef]
- Tao, T.; Sato, R.; Matsuda, Y.; Takata, J.; Kim, F.; Daikubara, Y.; Fujita, K.; Hanamoto, K.; Kinoshita, F.; Colman, R.; et al. Elderly Body Movement Alteration at 2nd Experience of Digital Art Installation with Cognitive and Motivation Scores. J. Multidiscip. Sci. J. 2020, 3, 138–150. [Google Scholar] [CrossRef]
- Luyten, T.; Braun, S.; Jamin, G.; van Hooren, S.; de Witte, L. How nursing home residents with dementia respond to the interactive art installation ‘VENSTER’: A pilot study. Disabil. Rehabil. Assist. Technol. 2018, 13, 87–94. [Google Scholar] [CrossRef] [PubMed]
- Melyani, M.; Prabowo, H.; Hidayanto, A.N.; Gaol, F.L. Smart home component using orange technology for elderly people: A systematic literature. In Proceedings of the 2018 Indonesian Association for Pattern Recognition International Conference (INAPR), Jakarta, Indonesia, 7–8 September 2018; pp. 166–171. [Google Scholar] [CrossRef]
- Salichs, M.A.; Castro-González, Á.; Salichs, E.; Fernández-Rodicio, E.; Maroto-Gómez, M.; Gamboa-Montero, J.J.; Malfaz, M. Mini: A New Social Robot for the Elderly. Int. J. Soc. Robot. 2020, 12, 1231–1249. [Google Scholar] [CrossRef]
- Nishio, T.; Yoshikawa, Y.; Sakai, K.; Iio, T.; Chiba, M.; Asami, T.; Isoda, Y.; Ishiguro, H. The Effects of Physically Embodied Multiple Conversation Robots on the Elderly. Front. Robot. AI 2021, 8, 633045. [Google Scholar] [CrossRef] [PubMed]
- Klaib, A.F.; Alsrehin, N.O.; Melhem, W.Y.; Bashtawi, H.O. IoT Smart Home Using Eye Tracking and Voice Interfaces for Elderly and Special Needs People. J. Commun. 2019, 14, 614–621. [Google Scholar] [CrossRef]
- Pinheiro, C.G.; Naves, E.L.; Pino, P.; Losson, E.; Andrade, A.O.; Bourhis, G. Alternative communication systems for people with severe motor disabilities: A survey. Biomed. Eng. Online 2011, 10, 31. [Google Scholar] [CrossRef] [PubMed]
- Park, J.H. The effects of eyeball exercise on balance ability and falls efficacy of the elderly who have experienced a fall: A single-blind, randomized controlled trial. Arch. Gerontol. Geriatr. 2017, 68, 181–185. [Google Scholar] [CrossRef]
- Bissoli, A.; Lavino-Junior, D.; Sime, M.; Encarnação, L.; Bastos-Filho, T. A Human–Machine Interface Based on Eye Tracking for Controlling and Monitoring a Smart Home Using the Internet of Things. Sensors 2019, 19, 859. [Google Scholar] [CrossRef] [PubMed]
- Wikström, B.M. Social Interaction Associated with Visual Art Discussions: A Controlled Intervention Study. Aging Ment. Health 2002, 6, 82–87. [Google Scholar] [CrossRef] [PubMed]
- Relf, P.D. The therapeutic values of plants. Pediatr. Rehabil. 2005, 8, 235–237. [Google Scholar] [CrossRef]
- Watson, D.; Clark, L.A.; Tellegen, A. Development and validation of brief measures of positive and negative affect: The PANAS scales. J. Personal. Soc. Psychol. 1988, 54, 1063–1070. [Google Scholar] [CrossRef] [PubMed]
- Morrison, A.J.; Mitchell, P.; Brereton, M. The lens of ludic engagement: Evaluating participation in interactive art installations. In Proceedings of the 15th ACM International Conference on Multimedia (MM’07), Augsburg, Germany, 24–29 September 2007; pp. 509–512. [Google Scholar] [CrossRef]
- Nunes, F.; Silva, P.A.; Abrantes, F. Human-computer interaction and the older adult. In Proceedings of the 3rd International Conference on PErvasive Technologies Related to Assistive Environments (PETRA’10), Samos, Greece, 23–25 June 2010; Volume 49, pp. 1–8. [Google Scholar] [CrossRef]
- Hawthorn, D. Possible Implications of Aging for Interface Designers. Interact. Comput. 2000, 12, 507–528. [Google Scholar] [CrossRef]
- Hunter, A.; Sayers, H.; McDaid, L. An Evolvable Computer Interface for Elderly Users. In Proceedings of the Supporting Human Memory with Interactive Systems Workshop at the 2007 British HCI Conference, Lancaster, UK, 4 September 2007; pp. 29–32. [Google Scholar]
- Williams, D.; Alam, M.A.U.; Ahamed, S.I.; Chu, W. Considerations in designing human-computer interfaces for elderly people. In Proceedings of the 2013 13th International Conference on Quality Software, Nanjing, China, 29–30 July 2013; pp. 372–377. [Google Scholar] [CrossRef]
- Kantner, L.; Rosenbaum, S. Usable Computers for the Elderly: Applying Coaching Experiences. In Proceedings of the IEEE International Professional Communication Conference (IPCC), Orlando, FL, USA, 21–24 September 2003; pp. 10–15. [Google Scholar] [CrossRef]
- Şahin, D.S.; Özer, Ö.; Yanardağ, M.Z. Perceived social support, quality of life and satisfaction with life in elderly people. Educ. Gerontol. 2019, 45, 69–77. [Google Scholar] [CrossRef]
- Diener, E. Guidelines for national indicators of subjective well-being and ill-being. Appl. Res. Qual. Life 2006, 1, 151–157. [Google Scholar] [CrossRef]
- Veenhoven, R. The Study of Life-Satisfaction; Eötvös University Press: Budapest, Hungary, 1996; Available online: http://hdl.handle.net/1765/16311 (accessed on 20 October 2022).
- Pavot, W.; Diener, E. Review of the Satisfaction with Life Scale. Psychol. Assess. 1993, 5, 164–172. [Google Scholar] [CrossRef]
- Kasprzak, E. Perceived Social Support and Life-Satisfaction. Pol. Psychol. Bull. 2010, 41, 144–154. [Google Scholar] [CrossRef]
- Cuadra-Peralta, A.; Veloso-Besio, C.; Puddu-Gallardo, G.; Salgado-García, P.; Peralta-Montecinos, J. Effects of a positive psychology program in depressive symptoms and life satisfaction in the elderly. Psicol. Reflexão e Crítica 2012, 25, 644–652. [Google Scholar] [CrossRef]
- Cuadra, L.H.; Florenzano, U.R. Subjective well-being: Towards a positive psychology. J. Psychol. Univ. Chile 2003, 12, 83–96. [Google Scholar]
- Ghimire, S.; Baral, B.K.; Karmacharya, I.; Callahan, K.; Mishra, S.R. Life Satisfaction among Elderly Patients in Nepal: Associations with Nutritional and Mental Well-Being. Health Qual. Life Outcomes 2018, 16, 118. [Google Scholar] [CrossRef] [PubMed]
- Cha, Y.J. Correlation between Leisure Activity Time and Life Satisfaction: Based on KOSTAT Time Use Survey Data. Occup. Ther. Int. 2018, 2018, 5154819. [Google Scholar] [CrossRef]
- Blit-Cohen, E.; Litwin, H. Computer Utilization in Later-Life: Characteristics and Relationship to Personal Well-Being. Gerontechnology 2005, 3, 76–86. [Google Scholar] [CrossRef]
- Wang, C.M.; Tseng, S.M.; Huang, C.S. Design of an Interactive Nostalgic Amusement Device with User-Friendly Tangible Interfaces for Improving the Health of Older Adults. Healthcare 2020, 8, 179. [Google Scholar] [CrossRef]
- World Health Organization. Active Ageing: A Policy Framework; World Health Organization: Geneva, Switzerland, 2002. [Google Scholar]
- Lu, L. Creating wellbeing among older people: An Eastern perspective. In The Handbook of Stress and Health: A Guide to Research and Practice; Cooper, C.L., Quick, J.C., Eds.; John Wiley & Sons: Hoboken, NJ, USA, 2017. [Google Scholar]
- Kaeberlein, M.; Rabinovitch, P.S.; Martin, G.M. Healthy Aging: The Ultimate Preventative Medicine. Science 2015, 350, 1191–1193. [Google Scholar] [CrossRef]
- Rowe, J.W.; Kahn, R.L. Human aging: Usual and successful. Science 1987, 237, 143–149. [Google Scholar] [CrossRef]
- Rowe, J.W.; Kahn, R.L. Successful aging. Gerontologist 1997, 37, 433–440. [Google Scholar] [CrossRef] [PubMed]
- Baltes, P.B.; Baltes, M.M. Psychological Perspectives on Successful Aging: The Model of Selective Optimization with Compensation. In Successful Aging: Perspectives from the Behavioral Sciences; Cambridge University Press: Cambridge, UK, 1990; pp. 1–34. [Google Scholar] [CrossRef]
- Garfein, A.J.; Herzog, A.R. Robust Aging among the Young-Old, Old-Old, and Oldest-Old. J. Gerontol. Ser. B Psychol. Sci. Soc. Sci. 1995, 50, S77–S87. [Google Scholar] [CrossRef] [PubMed]
- Tornstam, L. Gerotranscendence: The contemplative dimension of aging. J. Aging Stud. 1997, 11, 143–154. [Google Scholar] [CrossRef]
- Ziyae, B. Presenting an Innovation Model in Orange Technology. J. Serv. Sci. Manag. 2016, 9, 433–442. [Google Scholar] [CrossRef]
- Wang, J.F.; Chen, B.W. Orange computing: Challenges and opportunities for awareness science and technology. In Proceedings of the 2011 3rd International Conference on Awareness Science and Technology (iCAST), Dalian, China, 27–30 September 2011; pp. 533–535. [Google Scholar] [CrossRef]
- Malan, D.J.; Fulford-Jones, T.; Welsh, M.; Moulton, S. Codeblue: An ad hoc sensor network infrastructure for emergency medical care. In Proceedings of the International Workshop on Wearable and Implantable Body Sensor Networks, London, UK, 6–7 April 2004. [Google Scholar]
- PARO Robots. PARO: PARO Therapeutic Robot. Available online: http://www.parorobots.com/ (accessed on 20 October 2022).
- ASUS. Asus ZenBo Is a Companion Robot Aimed at the Elderly. 30 May 2016. Available online: https://www.gsmarena.com/asus_zenbo_is_a_household_companion_robot_aimed_at_the_elderly-blog-18499.php (accessed on 2 October 2022).
- Industrial Technology Research Institute (ITRI). Personal Companion Robot for Older People Living Alone (PECOLA). Available online: https://www.itri.org.tw/english/ListStyle.aspx?DisplayStyle=01_content&SiteID=1&MmmID=1037333532423704272&MGID=1073044725611652176 (accessed on 20 October 2022).
- SHARP. RoBoHoN. Available online: https://robohon.com/ (accessed on 20 October 2022).
- Intuition Robotics. Introducing ElliQ: The Sidekick for Healthier Happier Aging. Available online: https://elliq.com/ (accessed on 20 October 2022).
- Martinez-Martin, E.; Cazorla, M. A socially assistive robot for elderly exercise promotion. IEEE Access 2019, 7, 75515–75529. [Google Scholar] [CrossRef]
- Swan, M. Sensor Mania! The Internet of Things, Wearable Computing, Objective Metrics, and the Quantified Self 2.0. J. Sens. Actuator Netw. 2012, 1, 217–253. [Google Scholar] [CrossRef]
- Herpich, M.; Rist, T.; Seiderer, A.; André, E. Towards a Gamified Recommender System for the Elderly. In Proceedings of the 2017 International Conference on Digital Health (DH’17), London, UK, 2–5 July 2017; pp. 211–215. [Google Scholar] [CrossRef]
- Manovich, L. The Language of New Media; MIT Press: Cambridge, MA, USA, 2002. [Google Scholar]
- Krueger, M.W.; Gionfriddo, T.; Hinrichsen, K. VIDEOPLACE—An Artificial Reality. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, San Francisco, CA, USA, 14–18 April 1985; Volume 16, pp. 35–40. [Google Scholar] [CrossRef]
- Paul, C. Digital Art; Thames & Hudson: London, UK, 2003; Volume 14. [Google Scholar]
- Edmonds, E. The art of interaction. Digit. Creat. 2010, 21, 257–264. [Google Scholar] [CrossRef]
- Wang, J.; Wang, Y.; Zhang, N.; Lee, E.-J.; Yabin, L.; Liao, G. The Key to The Future Development of Interactive Art—Virtual Reality Technology. J. Multimed. Inf. Syst. 2018, 5, 277–282. [Google Scholar] [CrossRef]
- Kelomees, R. Reversing the Spectator Paradigm: Symbiotic Interaction and the “Gaze” of the Artwork. Digit. Creat. 2019, 30, 143–160. [Google Scholar] [CrossRef]
- Siegler, E.L.; Lama, S.D.; Knight, M.G.; Laureano, E.; Reid, M.C. Community-Based Supports and Services for Older Adults: A Primer for Clinicians. J. Geriatr. 2015, 2015, 678625. [Google Scholar] [CrossRef]
- Roswiyani, R.; Kwakkenbos, L.; Spijker, J.; Witteman, C.L.M. The Effectiveness of Combining Visual Art Activities and Physical Exercise for Older Adults on Well-Being or Quality of Life and Mood: A Scoping Review. J. South. Gerontol. Soc. 2017, 38, 1784–1804. [Google Scholar] [CrossRef] [PubMed]
- Stephenson, R.C. Promoting well-being and gerotranscendence in an art therapy program for older adults. Art Ther. 2013, 30, 151–158. [Google Scholar] [CrossRef]
- Vi, C.T.; Ablart, D.; Gatti, E.; Velasco, C.; Obrist, M. Not just seeing, but also feeling art: Mid-air haptic experiences integrated in a multisensory art exhibition. Int. J. Hum. Comput. Stud. 2017, 108, 1–14. [Google Scholar] [CrossRef]
- Charitos, D.; Bourdakis, V.; Gavrilou, E. Embedding an audiovisual interactive installation environment in urban space for enhancing social interaction. In Proceedings of the 2006 2nd IET International Conference on Intelligent Environments, Athens, Greece, 5–6 July 2006; pp. 93–99. [Google Scholar] [CrossRef]
- Grinde, B.; Patil, G. Biophilia: Does Visual Contact with Nature Impact on Health and Well-Being? Int. J. Environ. Res. Public Health 2009, 6, 2332–2343. [Google Scholar] [CrossRef] [PubMed]
- Valk, C.; Lin, X.; Fijes, L.; Rauterberg, G.; Hu, J. Closer to Nature—Interactive Installation Design for Elderly with Dementia. In Proceedings of the 2017 3rd International Conference on Information and Communication Technologies for Ageing Well and e-Health (ICT4AWE), Porto, Portugal, 28–29 April 2017; pp. 228–235. [Google Scholar] [CrossRef]
- Bruil, L.; Adriaansen, M.J.M.; Groothuis, J.W.M.; Bossema, E.R. Kwaliteit van Leven van Verpleeghuisbewoners met Dementie Voor, Tijdens en Na het Spelen met de Tovertafel. Tijdschr. Gerontol. Geriatr. 2017, 49, 72–80. [Google Scholar] [CrossRef] [PubMed]
- Vertigo System. Interactive Aquarium as a Place for Social Gathering in the Nursing Home. Available online: https://www.vertigo-systems.de/en/business-sectors/detail/interactive-aquarium-as-a-place-for-social-gathering-in-the-nursing-home (accessed on 25 October 2022).
- OM Interactive. Motion Activated Projections in Aged Care. Available online: https://sensorywizard.com.au/aged-care/ (accessed on 25 October 2022).
- Planinc, R.; Nake, I.; Kampel, M. Exergame design guidelines for enhancing elderly’s physical and social activities. In Proceedings of the Third International Conference on Ambient Computing (AMBIENT 2013), Porto, Portugal, 29 September–4 October 2013; pp. 58–63. [Google Scholar]
- Zheng, Z.K.; Zhu, J.; Fan, J.; Sarkar, N. Design and System Validation of Rassle: A Novel Active Socially Assistive Robot for Elderly with Dementia. In Proceedings of the 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Nanjing, China, 27–31 August 2018; pp. 1–4. [Google Scholar] [CrossRef]
- Miller, C.A. The connection between drugs and falls in elders. Geriatr. Nurs. 2002, 23, 109–110. [Google Scholar] [CrossRef] [PubMed]
- Pinsault, N.; Vuillerme, N. The Effects of Scale Display of Visual Feedback on Postural Control During Quiet Standing in Healthy Elderly Subjects. Arch. Phys. Med. Rehabil. 2008, 89, 1772–1774. [Google Scholar] [CrossRef]
- Aartolahti, E.; Häkkinen, A.; Lönnroos, E.; Kautiainen, H.; Sulkava, R.; Hartikainen, S. Relationship between Functional Vision and Balance and Mobility Performance in Community-Dwelling Older Adults. Aging Clin. Exp. Res. 2013, 25, 545–552. [Google Scholar] [CrossRef]
- Bae, Y. Saccadic Eye Movement Improves Plantar Sensation and Postural Balance in Elderly Women. Tohoku J. Exp. Med. 2016, 239, 159–164. [Google Scholar] [CrossRef]
- Leigh, R.J.; Zee, D.S. The neurology of eye movements. In Contemporary Neurology; Oxford University Press: Oxford, UK, 2015. [Google Scholar]
- Marandi, R.Z.; Gazerani, P. Aging and eye tracking: In the quest for objective biomarkers. Future Neurol. 2019, 14, 4. [Google Scholar] [CrossRef]
- Morimoto, H.; Asai, Y.; Johnson, E.G.; Lohman, E.B.; Khoo, K.; Mizutani, Y.; Mizutani, T. Effect of oculo-motor and gaze stability exercises on postural stability and dynamic visual acuity in healthy young adults. Gait Posture 2011, 33, 600–603. [Google Scholar] [CrossRef] [PubMed]
- Ogudo, K.A.; Muwawa Jean Nestor, D.; Ibrahim Khalaf, O.; Daei Kasmaei, H. A Device Performance and Data Analytics Concept for Smartphones’ IoT Services and Machine-Type Communication in Cellular Networks. Symmetry 2019, 11, 593. [Google Scholar] [CrossRef]
- Liu, Y.; Sivaparthipan, C.B.; Shankar, A. Human–computer interaction based visual feedback system for augmentative and alternative communication. Int. J. Speech Technol. 2022, 25, 305–314. [Google Scholar] [CrossRef]
- Huang, L.; Li, Y.; Wang, X.; Wang, H.; Bouridane, A.; Chaddad, A. Gaze Estimation Approach Using Deep Differential Residual Network. Sensors 2022, 22, 5462. [Google Scholar] [CrossRef]
- Kim, H.; Suh, K.H.; Lee, E.C. Multi-Modal User Interface Combining Eye Tracking and Hand Gesture Recognition. J. Multimodal User Interfaces 2017, 11, 241–250. [Google Scholar] [CrossRef]
- Ezekiel, S. Look to Speak. Available online: https://experiments.withgoogle.com/looktospeak (accessed on 2 November 2022).
- Sidorakis, N.; Koulieris, G.A.; Mania, K. Binocular eye-tracking for the control of a 3D immersive multimedia user interface. In Proceedings of the 2015 IEEE 1st Workshop on Everyday Virtual Reality (WEVR), Arles, France, 23 March 2015; pp. 15–18. [Google Scholar] [CrossRef]
- Luo, W.; Cao, J.; Ishikawa, K.; Ju, D. A human-computer control system based on intelligent recognition of eye movements and its application in wheelchair driving. Multimodal Technol. Interact. 2021, 5, 50. [Google Scholar] [CrossRef]
- Eyeware. How Eye Tracking and Head Tracking Help Disabled Gamers Level Up. Available online: https://beam.eyeware.tech/disabled-gamers-level-up-head-eye-tracker/ (accessed on 2 November 2022).
- Special Effect. EyeMine 2 Play Minecraft with Your Eyes! Available online: https://www.specialeffect.org.uk/how-we-can-help/eyemine (accessed on 2 November 2022).
- Special Effect. StarGaze. Available online: https://www.specialeffect.org.uk/how-we-can-help/use-your-eyes-for-independence (accessed on 2 November 2022).
- Scalera, L.; Seriani, S.; Gallina, P.; Lentini, M.; Gasparetto, A. Human–Robot Interaction through Eye Tracking for Artistic Drawing. Robotics 2021, 10, 54. [Google Scholar] [CrossRef]
- Kubacki, A. Use of Force Feedback Device in a Hybrid Brain-Computer Interface Based on SSVEP, EOG and Eye Tracking for Sorting Items. Sensors 2021, 21, 7244. [Google Scholar] [CrossRef]
- Naumann, J.D.; Jenkins, A.M. Prototyping: The new paradigm for systems development. MIS Q. 1982, 6, 29–44. [Google Scholar] [CrossRef]
- Döringer, S. “The problem-centred expert interview”: Combining qualitative interviewing approaches for investigating implicit expert knowledge. Int. J. Soc. Res. Methodol. 2020, 24, 265–278. [Google Scholar] [CrossRef]
- Taherdoost, H. Validity and Reliability of the Research Instrument; How to Test the Validation of a Questionnaire/Survey in a Research. Int. J. Acad. Res. Manag. 2016, 5, 28–36. [Google Scholar] [CrossRef]
- Roopa, S.; Rani, M. Questionnaire Designing for a Survey. J. Indian Orthod. Soc. 2012, 46, 273–277. [Google Scholar] [CrossRef]
- Guner, H.; Acarturk, C. The Use and Acceptance of ICT by Senior Citizens: A Comparison of Technology Acceptance Model (TAM) for Elderly and Young Adults. Univers. Access Inf. Soc. 2018, 19, 311–330. [Google Scholar] [CrossRef]
- Davis, F.D. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q. 1989, 13, 319–340. [Google Scholar] [CrossRef]
- Schindler, I.; Hosoya, G.; Menninghaus, W.; Beermann, U.; Wagner, V.; Eid, M.; Scherer, K.R. Measuring aesthetic emotions: A review of the literature and a new assessment tool. PLoS ONE 2017, 12, e0178899. [Google Scholar] [CrossRef]
- Schmitt, B.H. Experiential Marketing: How to Get Customer to Sense, Feel, Think, Act, and Relate to Your Company and Brands; The Free Press: New York, NY, USA, 1999. [Google Scholar]
- Kartynnik, Y.; Ablavatski, A.; Grishchenko, I.; Grundmann, M. Real-Time Facial Surface Geometry from Monocular Video on Mobile GPUs. arXiv 2019, arXiv:1907.06724. [Google Scholar] [CrossRef]
- IBM. KMO and Bartlett’s Test. Available online: https://www.ibm.com/docs/en/spss-statistics/28.0.0?topic=detection-kmo-bartletts-test (accessed on 10 May 2023).
- Trochim, W.M.K.; Hosted by Conjointly. Research Methods Knowledge Base. Available online: https://conjointly.com/kb/?_ga=2.202908566.666445287.1649411337-790067422.1649411337 (accessed on 10 May 2023).
- Taber, K.S. The use of Cronbach’s alpha when developing and reporting research instruments in science education. Res. Sci. Educ. 2018, 48, 1273–1296. [Google Scholar] [CrossRef]
- Guilford, J.P. Psychometric Methods, 2nd ed.; McGraw-Hill: New York, NY, USA, 1954. [Google Scholar]
- Hu, L.T.; Bentler, P.M. Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Struct. Equ. Model. 1999, 6, 1–55. [Google Scholar] [CrossRef]
Theory | Year | Scholar/Organization | Main Content |
---|---|---|---|
Healthy Aging | 1990 | WHO | Maintaining healthy physical and mental functions, including complete normalcy in physiological, psychological, and social aspects. |
Successful Aging | 1987 and 1997 | Rowe and Kahn | Avoiding the occurrence of diseases and disabilities, maintaining good cognitive abilities and physical functions, and actively participating in life. |
Successful Aging | 1990 | Baltes and Baltes | The process of good psychological adaptation in older adults, covering three elements: selection, optimization, and compensation, abbreviated as the SOC model. |
Successful Aging | 1999 | Torres | Perspectives considering cultural factors, including five value tendencies: human nature, human–nature relationships, interpersonal relationships, time, and activities. |
Robust Aging | 1995 | Garfein and Herzog | The robustness of body functions, mental health, cognitive functions, and participation in productive activities. |
Gerotranscendence | 1997 | Tornstam | Promoting a positive attitude towards aging, advocating for an optimistic, natural attitude to re-evaluating everything in life. |
Active Aging | 2002 | WHO | Enjoying life in a healthy, participatory, and safe manner during the aging process, thereby enhancing the quality of life for older adults. |
2020–2030 Decade of Healthy Aging Action | 2020 | WHO | Aiming to achieve the noble goal of “Leaving No One Behind”, based on four action areas: eliminating age discrimination, building age-friendly environments, comprehensive health care, and long-term care. |
Case | Orange Technology Type | Orange Computing Type | Presentation Form | Explanation |
---|---|---|---|---|
PARO (2022) [51] | Happiness technology | Affective computing | Robot | Generating autonomous responses to stimuli, conducting learning, and interacting with users. |
Zenbo (2016) [52] | Happiness technology | Affective computing | Robot | Providing education, personal assistance, and entertainment through voice commands. |
PECOLA (2022) [53] | Health technology, warm technology | Biological signal processing and health information | Robot | Applying facial recognition to assist the elderly with diet analysis, sleep monitoring, and issuing fall alerts. |
RoBoHoN (2022) [54] | Happiness technology, warm technology | Advanced companion or assistive technology | Robot | Supporting facial recognition with projectors and cameras for emergency alerts and displaying emotional expressions. |
ELLI.Q (2022) [55] | Health technology, happiness technology | Advanced companion or assistive technology | Robot | Monitoring health, scheduling activities, and serving as an assistive companion for the elderly. |
A Socially Assistive Robot for Elderly Exercise (2019) [56] | Health technology | Biological signal processing and health information | Robot | Implementing a deep learning-based exercise system for senior sports training. |
Biostamp (2012) [57] | Health technology | Biological signal processing and health information | Wearable device | Featuring a stretchable electronic patch design for remotely monitoring users’ vital signs. |
CARE System (2017) [58] | Happiness technology | Affective computing | Interactive device | Retrieving useful information from family photos to recommend activities for elderly users. |
Case | Presentation Form | Interactive Form | Sensory Experience | Description |
---|---|---|---|---|
Tovertafel (2015) [72] | Interactive table | Contact interaction | Vision, hearing, touch | Interactive projection devices tailored for elderly dementia patients, featuring multiplayer games with unlimited user participation. |
Living Aquarium (2022) [73] | Interactive wall | Contact interaction | Touch, vision, hearing | Dual-projector virtual aquarium simulating underwater coral life, interacted with via hand gestures by the elderly. |
Digital Art Installation with Cognitive and Motivation Scores (2020) [11] | Interactive space | Distance interaction | Vision, hearing | Home-based digital art installation with seven scenes, employing depth cameras and microphones to capture elderly movements and sounds, presenting imaginative visual feedback. |
VENSTER (2017) [12] | Interactive table | Contact interaction | Vision, hearing, touch | Window-shaped art installation with standby, trigger, and interactive modes, allowing for touch-based interaction. |
Interactive Sensory Projection Systems (2022) [74] | Interactive table, wall, and floor | Distance interaction and contact interaction | Vision, hearing, touch | Three distinct interactive systems offering varied interaction options, primarily through physical activity. |
FishCatcher (2013) [75] | Monitor | Distance interaction | Vision, hearing | Color-catching fish game using hand gestures for interaction, promoting multiplayer cooperation. |
Rassle (2018) [76] | Robot monitor | Contact interaction | Vision, hearing, touch | Teddy bear-shaped robot integrating tactile sensing for exercise and offering auditory and visual experiences to engage elderly users. |
Case | Sensing Device | Technical Application | Application Content | Description |
---|---|---|---|---|
Eye Tracking-Based Assistance System (2019) [19] | Camera | IoT, image processing, micro-control device | Smart furniture | Integrating gaze estimation, speech recognition, and IoT for interacting with smart furniture in the living space. |
Look to Speak (2020) [88] | Mobile device | Deep learning on Android | Instant messaging | Developing an Android app based on gaze estimation to help users communicate with others without barriers.
Eye-Tracking 3D MUI (2015) [89] | VR device, camera | Image processing, VR | Multimedia functionality and interactive gaming | Creating a multimedia user interface in a 3D environment, utilizing VR with cameras to track eye movements. |
HCI Control System of Wheelchairs (2021) [90] | Infrared light filter camera | Deep learning, image processing, micro-control device | Smart wheelchair | Allowing for users to control the direction of the wheelchair through the position of the head and the direction of gaze. |
Beam (2022) [91] | Mobile device | Deep learning on iOS | Interactive gaming | Utilizing Apple phones to predict gaze positions, allowing for users to utilize phone features and experience gaming fun. |
EyeMine (2018) [92] | Eye tracker | Image processing | Interactive gaming | Developing gaze estimation for the Minecraft game, supporting various eye-tracking devices. |
StarGaze (2018) [93] | Eye tracker | Image processing | Multimedia and interactive gaming | Designing a gaze estimation interface that allows patients confined to bed by trauma or severe conditions to operate a computer.
Robotic System for Painting with Eyes (2021) [94] | Eye tracker | Signal processing, image processing, robot arm | Artistic creation | Using gaze estimation with robotic arms to achieve artistic painting and creation goals by controlling the robotic arm through gaze trajectories. |
Hybrid Brain–Computer Interface (2021) [95] | EEG, headset retinal camera | Brainwave analysis, image processing, force feedback robotic arm | Assistive technology | Using a hybrid brain–computer interface, eye electrical signals, gaze estimation, and force feedback to operate a robotic arm for object manipulation. |
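Most of the cases above rely on a camera plus software-based gaze estimation rather than a dedicated eye tracker. As a rough illustration only (not taken from any of the cited systems, and not the authors' implementation), the sketch below shows how such a webcam-based pipeline can be assembled in Python with MediaPipe Face Mesh: iris landmarks yield a gaze feature, and a short per-user calibration fits a linear map from that feature to screen coordinates. The function names, file layout, and design choices are assumptions made for this sketch.

```python
# Illustrative sketch only: webcam gaze estimation from MediaPipe Face Mesh
# iris landmarks plus a per-user linear calibration.
import cv2
import mediapipe as mp
import numpy as np

face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1, refine_landmarks=True)

# With refine_landmarks=True, indices 468-472 and 473-477 are the two iris rings;
# 33/133 and 362/263 are the corresponding eye corners.
IRIS_A, IRIS_B = range(468, 473), range(473, 478)
CORNERS_A, CORNERS_B = (33, 133), (362, 263)

def gaze_feature(frame_bgr):
    """Return a 4-D feature (iris center relative to eye corners, both eyes) or None."""
    result = face_mesh.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    if not result.multi_face_landmarks:
        return None
    lm = result.multi_face_landmarks[0].landmark
    pt = lambda i: np.array([lm[i].x, lm[i].y])
    center = lambda idx: np.mean([pt(i) for i in idx], axis=0)
    return np.concatenate([center(IRIS_A) - center(CORNERS_A),
                           center(IRIS_B) - center(CORNERS_B)])

class LinearGazeMapper:
    """Least-squares map from gaze features to screen coordinates,
    fitted from calibration pairs (feature, known on-screen target)."""
    def fit(self, features, targets):
        X = np.hstack([np.asarray(features), np.ones((len(features), 1))])
        self.W, *_ = np.linalg.lstsq(X, np.asarray(targets), rcond=None)
    def predict(self, feature):
        return np.append(feature, 1.0) @ self.W   # estimated (x, y) in pixels
```

In a calibration step like the one described later in the system design, the user fixates on a few known on-screen points; the collected (feature, target) pairs are passed to `fit`, after which `predict` can be called on every frame.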
Expert No. | Position | Title | Expertise |
---|---|---|---|
Q1 | National University | Professor | Cross-domain integrated design, value-added design for picture books, digital imaging, photography |
Q2 | National University | Professor | Image processing, game design, human–computer interaction development |
Q3 | Elderly Learning Center | Lecturer | Elderly care and well-being, activity planning and management, digital learning education design |
Aspect | Interview Questions |
---|---|
Technology integration in elderly activities experience | Is integrating interactive systems suitable for elderly activity experiences? |
How can interest be generated among the elderly for the interactive system? | |
Interactive art and active aging for the elderly | Can interactive art experiences positively contribute to the physical and mental development of the elderly? |
What are your thoughts and suggestions on designing interactive art experience themes and scenarios for the elderly? | |
Application of gaze estimation in interactive art experience | What are your views on using gaze estimation as an interactive interface for elderly interactive art experiences? |
What details should be considered when developing systems using gaze estimation technology? |
Question No. | Content |
---|---|
R1 | I am more willing to try technology through eye interaction. |
R2 | I believe that eye interaction can capture my interest. |
R3 | In my leisure time, I think I will use this system frequently. |
R4 | I feel that interacting through eyes helps alleviate physical burden. |
R5 | I think using eye interaction makes the system more convenient. |
R6 | I believe that system interactions via eyes can meet functional requirements. |
R7 | I find eye interaction very easy. |
R8 | I am very confident in using the system. |
Question No. | Content |
---|---|
S1 | I really like the natural scenery presented in the visuals. |
S2 | I find the natural scenery presented in the visuals very beautiful. |
S3 | The experience process leaves a deep impression on me. |
S4 | I can easily immerse myself in the experience process. |
S5 | The interactive feedback makes me curious. |
S6 | After the experience, I feel more energetic. |
S7 | I feel very relaxed during the experience process. |
S8 | I find the experience content interesting. |
Question No. | Content |
---|---|
T1 | This experience activity is very interesting to me. |
T2 | I feel that this experience activity helps me relax. |
T3 | I feel that this experience activity helps me relieve stress. |
T4 | I feel that this is a meaningful experience activity. |
T5 | This experience activity enriches my daily life. |
T6 | I have a better understanding of the use of technology after the system experience.
T7 | After the experience ends, I feel more active in interacting with others. |
T8 | After the experience ends, I think it helps to increase my willingness to participate in activities.
Deep sea | Shallow sea | Desert | Starry sky |
Garden | Forest | Rainforest | Cloud layer |
Theme | Interactive Species | Species Type | Symbolic Meaning | Illustration |
---|---|---|---|---|
Star Garden | Flowers | Sunflower | Hope, courage, vitality | |
Platycodon | Nobility, loyalty, beauty | |||
Carnation | Purity, friendship, elegance | |||
Rose | Passion, beauty, renewal | |||
Animal Forest | Animals | Brown bear | Strength, protection, calm | |
Deer | Gentleness, sensitivity, beauty | |||
Stag | Wisdom, leadership, grace | |||
Caribou | Resilience, courage, sturdiness | |||
Forest Trees | Trees | Cedar | Longevity, resilience, protection | |
Birch | Vigor, youth, vitality | |||
Pine | Tranquility, steadfastness, evergreen | |||
Cypress | Resilience, stability, nobility |
Stage | Process | Illustration | Eye Move | Explanation |
---|---|---|---|---|
Stage 1 | Head Adjustment | No | In the effective range, when there are no users detected, the mask in the camera view will display in red. | |
When a user approaches the system and places his/her face within the mask, the mask will display in green. If it remains green, a five-second countdown will begin, after which the formal experience will commence. | ||||
Stage 2 | Gaze Calibration | No | When the system cannot detect the user’s face within the mask position, the mask in the camera view will display in red and appear in the top-left corner. At this point, the system will pause operation until the user adjusts their head to a valid position. | |
The user is required to follow system prompts and keep their gaze fixed on the fish without moving their heads. After a 3 s countdown, the system initiates calibration. | ||||
The fish moves rhythmically across the screen, pausing for several seconds in each corner over time. As the user tracks the fish with his/her gaze, the system captures the user’s eye movement data during the fish’s pauses. | ||||
Smooth Pursuit | The fish moves rhythmically across the screen, pausing for several seconds in each corner over time. When the user tracks the fish with his/her gaze, the system captures the user’s eye movement data during these pauses. | | |
Stage 3 | Gaze Testing | No | When the system cannot detect the user’s face within the mask, the mask will display in red and appear in the top-left corner. At this point, the system pauses operation until the user adjusts his/her head to a valid position. | |
Operational prompts are displayed, and the user is required to follow these instructions. A white heatmap indicates the estimated gaze position of the user during the gaze estimation process. | ||||
Reactive Saccades | The user moves his/her gaze to randomly appearing light spots on the screen. When his/her gaze touches a spot, it will disappear, and the next spot will appear immediately. | |||
No | If not all spots are successfully triggered within the limited time in this stage, the system will return to the second stage for gaze estimation recalibration. | |||
Stage 4 | Topic Selection | No | The screen is evenly divided into four quadrants: previews of the three themes plus a random option. The top-left is the Star Garden theme, the top-right is the Animal Forest theme, the bottom-left is the Forest Trees theme, and the bottom-right is a question mark for random selection. |
Fixations | The user gazes at the area of the theme he/she wishes to select. After a one-second fixation on the chosen theme, the corresponding theme experience begins. Selecting the question mark enters one of the three main themes at random. | | |
Stage 5 | Theme Experience—Star Garden | Fixations | While gazing at the screen, a random flower will bloom after one second of fixation, spreading outwards from the fixation point. | |
Scanning Saccades | When the user starts moving his/her gaze, the flowers of that type will scatter. | |||
If the user drags his/her gaze extensively, all four types of flowers will elegantly bloom along the trajectory. | ||||
No | The system analyzes the total fixation duration for each type of flower to determine the most focused flower type and its symbolic meaning for the theme. If the user has experienced all themes, go to Stage 6; else, return to Stage 3. | |||
Theme Experience—Animal Forest | No | In the scene, various animals will randomly appear, each exhibiting their own behaviors. The camera perspective will move back and forth from right to left. | ||
Fixations | When the user gazes at an animal, the animal will perceive this and react accordingly. | |||
No | Comparing the durations of gaze fixation on various animal types, the theme will attribute significance to the animal type receiving the most attention. If all themes have been experienced, go to Stage 6; else, return to stage 3. | |||
Theme Experience—Forest Trees | Scanning Saccades | The user’s gaze focus area can erase the scenery outside the mist observation window. The camera perspective will move back and forth from near to far. | ||
Fixations | When the user gazes at specific types of trees in the scene, those trees will be marked and sway. | |||
No | Comparing the durations of gaze fixation on various types of trees, the theme will attribute significance to the tree type receiving the most attention. If all themes have been experienced, the system proceeds to Stage 6; else, it returns to Stage 3. | |||
Stage 6 | Experience Results | No | Based on the user’s identified focus species from the experiences in the three themes, the names of these three species are used as keywords to generate a natural landscape image containing them on the screen. |
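To make the gaze interactions in the table concrete, the following minimal sketch (not the authors' implementation; all names are assumptions for illustration) shows the dwell-based selection logic used in Stages 4 and 5, where a region is triggered only after the estimated gaze rests inside it for a continuous one-second fixation.

```python
# Minimal sketch of dwell-based (fixation) selection: a region is triggered once
# the estimated gaze stays inside it for a continuous dwell time (one second here).
import time

class DwellSelector:
    def __init__(self, regions, dwell_s=1.0):
        # regions: dict name -> (x0, y0, x1, y1) in screen coordinates
        self.regions = regions
        self.dwell_s = dwell_s
        self._current = None
        self._entered_at = None

    def update(self, gaze_xy, now=None):
        """Feed one gaze sample; returns a region name when its dwell completes."""
        now = time.monotonic() if now is None else now
        hit = next((name for name, (x0, y0, x1, y1) in self.regions.items()
                    if x0 <= gaze_xy[0] <= x1 and y0 <= gaze_xy[1] <= y1), None)
        if hit != self._current:                  # gaze entered a different region
            self._current, self._entered_at = hit, now
            return None
        if hit is not None and now - self._entered_at >= self.dwell_s:
            self._entered_at = now                # reset so it does not re-trigger every frame
            return hit
        return None

# Example: the four quadrants of Stage 4 (theme selection) on a 1920x1080 screen.
quadrants = {"star_garden": (0, 0, 960, 540), "animal_forest": (960, 0, 1920, 540),
             "forest_trees": (0, 540, 960, 1080), "random": (960, 540, 1920, 1080)}
selector = DwellSelector(quadrants, dwell_s=1.0)
```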
Interview Aspect | Interview Question | Expert’s Comment
---|---|---
Integration of technology into elderly activities | Is integrating interactive systems suitable for elderly activity experiences? |
 | How can interest be generated among the elderly for interactive systems? |
Interactive art and active aging for the elderly | Can interactive art experiences positively contribute to the physical and mental development of the elderly? |
 | What are your thoughts and suggestions on designing interactive art experience themes and scenarios for the elderly? |
Application of gaze estimation in interactive art experience | What are your views on using gaze estimation as an interactive interface for elderly interactive art experiences? |
 | What details should be considered when developing systems using gaze estimation technology? |
Basic Data | Categories | Number of Samples | Percentage |
---|---|---|---|
Sex | Male | 8 | 15% |
Female | 44 | 85% | |
Age | 50–60 | 13 | 25% |
61–70 | 18 | 35% | |
71–80 | 15 | 29% | |
over 80 | 6 | 11% | |
Ever used similar technological products? | Frequently | 4 | 8%
Sometimes | 15 | 29% |
Almost never | 33 | 63%
Item No. | Min. | Max. | Mean | S. D. | Strongly Agree | Agree | Neutral | Disagree | Strongly Disagree | Agree or Above |
---|---|---|---|---|---|---|---|---|---|---|
R1 | 3 | 5 | 4.12 | 0.732 | 32.7% | 46.2% | 21.2% | 0.0% | 0.0% | 78.9% |
R2 | 3 | 5 | 3.98 | 0.700 | 23.1% | 51.9% | 25.0% | 0.0% | 0.0% | 75.0% |
R3 | 2 | 5 | 3.77 | 0.757 | 17.3% | 44.2% | 36.5% | 1.9% | 0.0% | 61.5%
R4 | 2 | 5 | 4.04 | 0.928 | 38.5% | 32.7% | 23.1% | 5.8% | 0.0% | 71.2% |
R5 | 2 | 5 | 4.21 | 0.720 | 38.5% | 46.2% | 13.5% | 1.9% | 0.0% | 84.7% |
R6 | 2 | 5 | 4.10 | 0.721 | 28.8% | 53.8% | 15.4% | 1.9% | 0.0% | 82.6% |
R7 | 3 | 5 | 4.27 | 0.660 | 38.5% | 50.0% | 11.5% | 0.0% | 0.0% | 88.5% |
R8 | 3 | 5 | 4.21 | 0.723 | 38.5% | 44.2% | 17.3% | 0.0% | 0.0% | 82.7% |
Item No. | Min. | Max. | Mean | S. D. | Strongly Agree | Agree | Neutral | Disagree | Strongly Disagree | Agree or Above |
---|---|---|---|---|---|---|---|---|---|---|
S1 | 3 | 5 | 4.37 | 0.658 | 46.2% | 44.2% | 9.6% | 0.0% | 0.0% | 90.4% |
S2 | 3 | 5 | 4.38 | 0.718 | 51.9% | 34.6% | 13.5% | 0.0% | 0.0% | 86.5% |
S3 | 2 | 5 | 4.08 | 0.813 | 34.6% | 40.4% | 23.1% | 1.9% | 0.0% | 75.0% |
S4 | 2 | 5 | 3.75 | 0.813 | 21.2% | 34.6% | 42.3% | 1.9% | 0.0% | 55.8% |
S5 | 3 | 5 | 4.31 | 0.701 | 44.2% | 42.3% | 13.5% | 0.0% | 0.0% | 86.5% |
S6 | 3 | 5 | 4.06 | 0.777 | 32.7% | 40.4% | 26.9% | 0.0% | 0.0% | 73.1% |
S7 | 2 | 5 | 4.31 | 0.805 | 50.0% | 32.7% | 15.4% | 1.9% | 0.0% | 82.7% |
S8 | 3 | 5 | 4.52 | 0.641 | 59.6% | 32.7% | 7.7% | 0.0% | 0.0% | 92.3% |
Item No. | Min. | Max. | Mean | S. D. | Strongly Agree | Agree | Neutral | Disagree | Strongly Disagree | Agree or Above |
---|---|---|---|---|---|---|---|---|---|---|
T1 | 3 | 5 | 4.37 | 0.715 | 50.0% | 36.5% | 13.5% | 0.0% | 0.0% | 86.5% |
T2 | 3 | 5 | 4.29 | 0.776 | 48.1% | 32.7% | 19.2% | 0.0% | 0.0% | 80.8% |
T3 | 3 | 5 | 4.06 | 0.802 | 34.6% | 36.5% | 28.8% | 0.0% | 0.0% | 71.1% |
T4 | 3 | 5 | 4.35 | 0.711 | 48.1% | 38.5% | 13.5% | 0.0% | 0.0% | 86.6% |
T5 | 3 | 5 | 4.21 | 0.800 | 44.2% | 32.7% | 23.1% | 0.0% | 0.0% | 76.9% |
T6 | 3 | 5 | 4.19 | 0.841 | 46.2% | 26.9% | 26.9% | 0.0% | 0.0% | 73.1% |
T7 | 3 | 5 | 4.00 | 0.792 | 30.8% | 38.5% | 30.8% | 0.0% | 0.0% | 69.3% |
T8 | 3 | 5 | 4.37 | 0.658 | 46.2% | 44.2% | 9.6% | 0.0% | 0.0% | 90.4% |
Scale | Name of Measure or Test | Statistic | Value
---|---|---|---
User friendliness | KMO measure of sampling adequacy | | 0.792
 | Bartlett test of sphericity | Approx. chi-square | 171.686
 | | Degrees of freedom | 28
 | | Significance | 0.000
User satisfaction | KMO measure of sampling adequacy | | 0.869
 | Bartlett test of sphericity | Approx. chi-square | 223.881
 | | Degrees of freedom | 28
 | | Significance | 0.000
User experience | KMO measure of sampling adequacy | | 0.905
 | Bartlett test of sphericity | Approx. chi-square | 286.210
 | | Degrees of freedom | 28
 | | Significance | 0.000
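The sampling-adequacy figures in this table can, in principle, be reproduced with standard factor-analysis tooling. The sketch below uses the Python factor_analyzer package; the DataFrame name and CSV file are hypothetical stand-ins for the 52 participants' scores on the eight items of one scale (the paper does not state which software was actually used).

```python
# Sketch: KMO and Bartlett's test of sphericity with the factor_analyzer package.
import pandas as pd
from factor_analyzer.factor_analyzer import calculate_kmo, calculate_bartlett_sphericity

responses = pd.read_csv("user_friendliness_items.csv")   # hypothetical file, columns R1..R8

chi_square, p_value = calculate_bartlett_sphericity(responses)
kmo_per_item, kmo_total = calculate_kmo(responses)

print(f"KMO measure of sampling adequacy: {kmo_total:.3f}")
print(f"Bartlett approx. chi-square: {chi_square:.3f}, p = {p_value:.3f}")
# Degrees of freedom for Bartlett's test with p items: p * (p - 1) / 2 = 28 for 8 items.
```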
Variables | Component 1 | Component 2
---|---|---
R1 | 0.840 | 0.246 |
R3 | 0.814 | 0.099 |
R6 | 0.749 | 0.325 |
R2 | 0.707 | 0.207 |
R7 | 0.108 | 0.807 |
R8 | 0.312 | 0.775 |
R5 | 0.423 | 0.722 |
R4 | 0.130 | 0.691 |
Extraction method: principal component analysis. Rotation method: varimax with Kaiser normalization. | ||
a. Rotation converged in 4 iterations. |
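The grouping of items into two components in this and the following rotated matrices follows from a principal-component extraction with varimax rotation. A minimal sketch with the factor_analyzer package (the paper's actual software is unstated; `responses` is the same hypothetical DataFrame as in the previous snippet):

```python
# Sketch: two-component extraction with varimax rotation.
import pandas as pd
from factor_analyzer import FactorAnalyzer

responses = pd.read_csv("user_friendliness_items.csv")   # hypothetical file, columns R1..R8

fa = FactorAnalyzer(n_factors=2, rotation="varimax", method="principal")
fa.fit(responses)

loadings = pd.DataFrame(fa.loadings_, index=responses.columns,
                        columns=["Component 1", "Component 2"])
print(loadings.round(3))   # each item is grouped with the component it loads on most
```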
Variables | Component 1 | Component 2
---|---|---
S1 | 0.891 | 0.181 |
S2 | 0.822 | 0.292 |
S7 | 0.741 | 0.201 |
S8 | 0.625 | 0.298 |
S4 | 0.197 | 0.813 |
S5 | 0.521 | 0.798 |
S6 | 0.475 | 0.717 |
S3 | 0.517 | 0.590 |
Extraction method: principal component analysis. Rotation method: varimax with Kaiser normalization. | ||
a. Rotation converged in 3 iterations. |
Variables | Component 1 | Component 2
---|---|---
T2 | 0.888 | 0.273 |
T8 | 0.743 | 0.379 |
T1 | 0.794 | 0.424 |
T3 | 0.636 | 0.560 |
T4 | 0.616 | 0.538 |
T6 | 0.294 | 0.746 |
T5 | 0.322 | 0.797 |
T7 | 0.373 | 0.527 |
Extraction method: principal axis factoring. Rotation method: varimax with Kaiser normalization. | ||
a. Rotation converged in 3 iterations. |
Scale | Title of Latent Dimension | No. of Questions | Labels of the Questions of the Dimension |
---|---|---|---|
User friendliness | Intention to act (Group RC1) | 4 | (R1, R3, R6, R2) |
Usefulness (Group RC2) | 4 | (R7, R8, R5, R4) | |
User satisfaction | Pleasure and relaxation (Group SC1) | 4 | (S7, S8, S2, S1)
Cognitive stimulation (Group SC2) | 4 | (S4, S6, S3, S5) |
User experience | Experience satisfaction (Group TC1) | 5 | (T2, T1, T8, T3, T4)
Enrichment in life (Group TC2) | 3 | (T5, T6, T7)
Scale | Latent Dimension | Cronbach’s α Coefficient (Latent Dimension) | Cronbach’s α Coefficient (Overall Scale)
---|---|---|---|
User friendliness | Intention to act (Group RC1) | 0.826 | 0.850 |
Usefulness (Group RC2) | 0.786 | ||
User satisfaction | Pleasure and relaxation (Group SC1) | 0.873 | 0.897 |
Cognitive stimulation (Group SC2) | 0.820 | ||
User experience | Experience satisfaction (Group TC1) | 0.926 | 0.924 |
Enrichment in life (Group TC2) | 0.802 |
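For reference, Cronbach's α for each dimension can be computed directly from the item scores. A minimal sketch (the column names are the hypothetical ones used in the snippets above):

```python
# Sketch: Cronbach's alpha from raw item scores (rows = respondents, columns = items).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()     # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of the summed score
    return k / (k - 1) * (1 - item_vars / total_var)

# e.g. alpha for the "Intention to act" dimension (hypothetical DataFrame `responses`):
# cronbach_alpha(responses[["R1", "R3", "R6", "R2"]].to_numpy())
```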
Scale | df | χ² | χ²/df | CFI | RMSEA | RMSEA 90% CI (LO) | RMSEA 90% CI (HI)
---|---|---|---|---|---|---|---
User friendliness | 19 | 26.176 | 1.378 | 0.952 | 0.083 | 0.000 | 0.154 |
User satisfaction | 19 | 30.779 | 1.620 | 0.942 | 0.106 | 0.020 | 0.172 |
User experience | 19 | 22.018 | 1.159 | 0.989 | 0.054 | 0.000 | 0.135 |
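The normed chi-square and RMSEA reported here follow their standard definitions; with sample size N (here N = 52), they are commonly computed as shown below. Values obtained from these formulas closely approximate, though do not exactly match, the table entries, since SEM software packages differ in estimation details.

```latex
\chi^2/df = \frac{\chi^2}{df}, \qquad
\mathrm{RMSEA} = \sqrt{\frac{\max(\chi^2 - df,\ 0)}{df\,(N-1)}}
```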
Scale | Latent Dimension | Group of Related Questions | Construct Validity |
---|---|---|---|
User friendliness | Intention to act (Group RA) | RA = (R1, R3, R6, R2) | 0.829 |
Usefulness (Group RB) | RB = (R7, R8, R5, R4) | 0.801 | |
User satisfaction | Pleasure and relaxation (Group SA) | SA = (S7, S8, S2, S1) | 0.872 |
Cognitive stimulation (Group SB) | SB = (S4, S6, S3, S5) | 0.819 | |
User experience | Experience satisfaction (Group TA) | TA = (T2, T1, T8, T3, T4) | 0.929 |
Enrichment in life (Group TB) | TB = (T5, T6, T7) | 0.817 |
Question No. | Content | Min. Value | Max. Value | Mean | S. D. | Strongly Agree (5) | Agree (4) | Average (3) | Disagree (2) | Strongly Disagree (1) | Agree or Above
---|---|---|---|---|---|---|---|---|---|---|---
R1 | I am more willing to try technology through eye interaction. | 3 | 5 | 4.12 | 0.732 | 32.7% | 46.2% | 21.2% | 0.0% | 0.0% | 78.9% |
R3 | In my leisure time, I think I will use this system frequently. | 2 | 5 | 3.77 | 0.757 | 17.3% | 44.2% | 36.5% | 1.9% | 0.0% | 61.5% |
R6 | I believe that the system interacting through eyes can effectively meet functional requirements. | 2 | 5 | 4.10 | 0.721 | 28.8% | 53.8% | 15.4% | 1.9% | 0.0% | 82.6% |
R2 | I believe that eye interaction can capture my interest. | 3 | 5 | 3.98 | 0.700 | 23.1% | 51.9% | 25.0% | 0.0% | 0.0% | 75.0% |
Question No. | Content | Min. Value | Max. Value | Mean | S. D. | Strongly Agree (5) | Agree (4) | Average (3) | Disagree (2) | Strongly Disagree (1) | Agree or Above
---|---|---|---|---|---|---|---|---|---|---|---
R7 | I find eye interaction very easy. | 3 | 5 | 4.27 | 0.660 | 38.5% | 50.0% | 11.5% | 0.0% | 0.0% | 88.5% |
R8 | I am very confident in using the system. | 3 | 5 | 4.21 | 0.723 | 38.5% | 44.2% | 17.3% | 0.0% | 0.0% | 82.7% |
R5 | I think using eye interaction makes the system more convenient. | 2 | 5 | 4.21 | 0.720 | 38.5% | 46.2% | 13.5% | 1.9% | 0.0% | 84.7% |
R4 | I feel that interacting through eyes helps alleviate physical burden. | 2 | 5 | 4.04 | 0.928 | 38.5% | 32.7% | 23.1% | 5.8% | 0.0% | 71.2% |
Question No. | Content | Min. Value | Max. Value | Mean | S. D. | Strongly Agree (5) | Agree (4) | Average (3) | Disagree (2) | Strongly Disagree (1) | Agree or Above
---|---|---|---|---|---|---|---|---|---|---|---
S7 | I feel very relaxed during the experience process. | 2 | 5 | 4.31 | 0.805 | 50.0% | 32.7% | 15.4% | 1.9% | 0.0% | 82.7% |
S8 | I find the experience content interesting. | 3 | 5 | 4.52 | 0.641 | 59.6% | 32.7% | 7.7% | 0.0% | 0.0% | 92.3% |
S2 | I find the natural scenery presented in the visuals very beautiful. | 3 | 5 | 4.38 | 0.718 | 51.9% | 34.6% | 13.5% | 0.0% | 0.0% | 86.5% |
S1 | I really like the natural scenery presented in the visuals. | 3 | 5 | 4.37 | 0.658 | 46.2% | 44.2% | 9.6% | 0.0% | 0.0% | 90.4% |
Question No. | Content | Min. Value | Max. Value | Mean | S. D. | Strongly Agree (5) | Agree (4) | Average (3) | Disagree (2) | Strongly Disagree (1) | Agree or Above
---|---|---|---|---|---|---|---|---|---|---|---
S4 | I can easily immerse myself in the experience process. | 2 | 5 | 3.75 | 0.813 | 21.2% | 34.6% | 42.3% | 1.9% | 0.0% | 55.8% |
S6 | After the experience, I feel more energetic. | 3 | 5 | 4.06 | 0.777 | 32.7% | 40.4% | 26.9% | 0.0% | 0.0% | 73.1% |
S3 | The experience process leaves a deep impression on me. | 2 | 5 | 4.08 | 0.813 | 34.6% | 40.4% | 23.1% | 1.9% | 0.0% | 75.0% |
S5 | The interactive feedback makes me curious. | 3 | 5 | 4.31 | 0.701 | 44.2% | 42.3% | 13.5% | 0.0% | 0.0% | 86.5% |
Question No. | Content | Min. Value | Max. Value | Mean | S. D. | Strongly Agree (5) | Agree (4) | Average (3) | Disagree (2) | Strongly Disagree (1) | Agree or Above
---|---|---|---|---|---|---|---|---|---|---|---
T2 | I feel that this experience activity helps me relax. | 3 | 5 | 4.29 | 0.776 | 48.1% | 32.7% | 19.2% | 0.0% | 0.0% | 80.8% |
T1 | This experience activity is very interesting to me. | 3 | 5 | 4.37 | 0.715 | 50.0% | 36.5% | 13.5% | 0.0% | 0.0% | 86.5% |
T8 | After the experience ends, I think it helps to increase willingness to participate in activities. | 3 | 5 | 4.37 | 0.658 | 46.2% | 44.2% | 9.6% | 0.0% | 0.0% | 90.4% |
T3 | I feel that this experience activity helps me relieve stress. | 3 | 5 | 4.06 | 0.802 | 34.6% | 36.5% | 28.8% | 0.0% | 0.0% | 71.1% |
T4 | I feel that this is a meaningful experience activity. | 3 | 5 | 4.35 | 0.711 | 48.1% | 38.5% | 13.5% | 0.0% | 0.0% | 86.6% |
Question No. | Content | Min. Value | Max. Value | Mean | S. D. | Strongly Agree (5) | Agree (4) | Average (3) | Disagree (2) | Strongly Disagree (1) | Agree or Above
---|---|---|---|---|---|---|---|---|---|---|---
T5 | This experience activity enriches my daily life. | 3 | 5 | 4.21 | 0.800 | 44.2% | 32.7% | 23.1% | 0.0% | 0.0% | 76.9% |
T6 | I have a better understanding of the use of technology after the system experience. | 3 | 5 | 4.19 | 0.841 | 46.2% | 26.9% | 26.9% | 0.0% | 0.0% | 73.1% |
T7 | After the experience ends, I feel more active in interacting with others. | 3 | 5 | 4.00 | 0.792 | 30.8% | 38.5% | 30.8% | 0.0% | 0.0% | 69.3% |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).