Article

Evaluation of Haptic Textures for Tangible Interfaces for the Tactile Internet

by Nikolaos Tzimos 1, George Voutsakelis 1, Sotirios Kontogiannis 2 and Georgios Kokkonis 3,*
1
Department of Business Administration, University of Western Macedonia, 51100 Grevena, Greece
2
Department of Mathematics, University of Ioannina, 45110 Ioannina, Greece
3
Department of Information and Electronic Engineering, International Hellenic University, 57400 Thessaloniki, Greece
*
Author to whom correspondence should be addressed.
Electronics 2024, 13(18), 3775; https://doi.org/10.3390/electronics13183775
Submission received: 12 August 2024 / Revised: 15 September 2024 / Accepted: 18 September 2024 / Published: 23 September 2024
(This article belongs to the Section Computer Science & Engineering)

Abstract

Every texture in the real world provides essential information for identifying the physical characteristics of objects. In addition to sight, humans use the sense of touch to explore their environment. Through haptic interaction we obtain unique and distinct information about the texture and shape of objects. In this paper, we enhance X3D 3D graphics files with haptic features to create 3D objects with haptic feedback. We propose haptic attributes such as static and dynamic friction, stiffness, and maximum depth that provide an optimal user experience in a virtual haptic environment. After numerous optimization attempts on the haptic textures, we propose various geometrical haptic textures for creating a virtual 3D haptic environment for the tactile Internet. These tangible geometrical textures can be attached to any geometric shape, enhancing the haptic sense. We conducted a study of user interaction with a virtual environment consisting of 3D objects enhanced with haptic textures to evaluate performance and user experience. The goal is to evaluate the realism and recognition accuracy of each generated texture. The findings of the study can help visually impaired individuals better understand their physical environment, using haptic devices in conjunction with the enhanced haptic textures.

1. Introduction

Knowledge creation is achieved through social and physical interaction. The perception and recognition of objects in the physical environment require tactile interaction. The sense of touch helps us obtain information from the object’s texture [1]. The natural world consists of numerous textures; therefore, to realistically represent its objects in a virtual environment, the concept of texture is essential.
The perception of the world also occurs through traditional visual and auditory interaction channels. Touch is the least explored sense from a technological point of view, due to its complexity. It is considered an intermediate state concerning the other senses, in terms of its temporal and spatial resolution [1]. The characteristics of physical textures determine the parameters of haptic texture. Texture and shape information can be converted into haptic stimuli and transmitted to a remote user via haptic devices. Such information refers to tactile sensations such as pressure, texture, softness, moisture, thermal properties, and sensations induced by friction such as slip and grip, as well as by object characteristics such as shape, edges, and relief [2]. Texture perception is a bimodal phenomenon, perceived through visual and haptic senses. It is degraded if observed through only one of these senses [3].
Due to the complexity of the receptive mechanisms embedded in the human skin, research on digital haptic perception is still far from achieving physical perception. Human haptic perception is divided into dermal, achieved through the skin and its mechanisms, and kinesthetic, through end organs located in muscles, tendons, and joints, which are stimulated by bodily movements and forces [4]. Haptic perception is further divided into active, where the user controls their actions, and passive, where a device imposes forces on the user [4]. Under the skin, there are sensors called mechanoreceptors, which provide information to the brain. They can be classified based on their distance from the skin surface and their degree of adaptation to external stimuli [4]. Kinesthetic perception, or proprioception, refers to the perception of limb movement, position, and force achieved by muscle and joint receptors [2,5,6].
A haptic feedback application for the blind, designed for interaction and understanding of the virtual environment, can have many positive effects on how blind individuals interact with physical world objects. Applications that use haptic vibrations to provide information about the distance, size, and direction of objects in a virtual environment help users gain a better perception of the space around them and how they navigate within it. Blind users can detect or avoid objects and practice recognizing and adapting to environmental changes. For example, if an application provides different vibrations for textures such as smooth, rough, hard, or their variations, users can practice recognizing these textures and learn to adjust their interactions with different surfaces, which can be useful when encountering new or changing physical environments. Similar applications contribute to improving users’ movement coordination, which can then be transferred to interactions with physical objects. This enhances their ability and confidence in devising exploration and navigation strategies in the physical world. Overall, a haptic feedback application for the blind is a valuable tool for training and improving the skills needed for daily interaction with the physical environment, creating a more comprehensive and satisfying user experience.
The general purpose of this study is to enhance users’ tactile interaction in virtual environments. The study aims, through a series of laboratory measurements and experiments, to determine the most suitable tactile texture parameters for three-dimensional objects in order to achieve optimal perceptual performance during user interaction. Subsequently, the accuracy of recognition and identification for each produced tactile texture is studied and evaluated with the help of a force feedback tactile device. The goal of the experiment is not to study the texture parameters themselves, but rather to assess users’ ability to distinguish and perceive distinct textures in a virtual scene.
This research paper stands out by providing a comprehensive comparative evaluation of nine different patterns. This analysis is crucial for understanding the practical implications and the user experience of interacting with each pattern, an area that has not been extensively covered in previous research. The innovation of this research lies in its user-centered evaluation of nine different texture patterns for Virtual Reality applications, the innovative use of blindness simulation tests, and the detailed analysis of user recognition and identification rates. By providing both qualitative and quantitative data, the paper offers a rich, multidimensional understanding of the usability and effectiveness of the nine patterns, making a significant contribution to the field of accessibility technology. The difference from previous research is that we add tactile characteristics to three-dimensional objects in a virtual scene so that they can be perceived by users through haptic feedback. Future research could build on the results of this study, with the aim of using these specific patterns to represent objects in more complex scenes.

2. Related Work

Research has studied the effect of vibratory signals during haptic interaction. Vibrations can be enforced on the user’s fingertips during virtual interaction, conveying textural information without direct contact with the object surface [7,8]. Many haptic applications through touch screens use vibratory feedback, which is very effective for distinguishing different texture patterns [9,10,11]. Haptic technology also includes electrostatic and ultrasonic haptic displays. The first case focuses on the friction modulation between the human finger and the touch screen [12]. Using such a haptic display, Afonso Castico and Paulo Cardoso [9] study the possibility of perceiving haptic textures as a representation of real-world objects. After a series of experiments, they found that uniform and regular textures can be better represented in a haptic environment compared to textures with a synthetic or irregular shape. Thamizhisai Periyaswamy and Md Rashedul addressed current techniques for haptic rendering of textile materials [2]. Their research focused on ways to measure the physical characteristics of fabrics and how to simulate them virtually on a haptic display. Youcan Yan et al. propose a novel texture recognition method by designing a soft arc-shaped touch sensor based on magnetic technology, effectively recognizing both Braille characters and textiles with high accuracy [13]. Many studies have focused on creating haptic textures with varying degrees of roughness and slipperiness. They use the technique of data capture (modeling) of real textures and subsequent reproduction using force, displacement, or acceleration data [7,8,14,15], to create virtual haptic interactions that appear realistic. Heather Culbertson and Katherine J. 
Kuchenbecker exploit the modeling technique to evaluate the similarity between real and virtual surfaces and the importance of perceptual properties, such as slipperiness, hardness, and roughness, in the realism of virtual haptic surfaces [14]. Yu-cheng Li et al. study the modeling of haptic perception by proposing a multimodal haptic rendering algorithm based on neural networks [15]. The study demonstrated that the hardness and roughness features obtained by the algorithm capture the physical object sensation, determining and affecting the realism performance of virtual samples generated from the physical surfaces. Recent studies are concerned with delineating a perceptual space of tactile features [3,11,16]. Waseem Hassan et al. arrive at a four-dimensional space, where feature pairs such as rough–smooth, flat–bump, sticky–slippery, and hard–soft are formed. They also use a neural network model to classify each new texture based on the relationship between its tactile features and its image features [3]. Similarly, Sunung Mun et al. create a three-dimensional perceptual space using data from an electromagnetic vibration experiment [14]. Zhiyu Shao et al., after a series of experiments, arrived at a multi-level model for perceptual compliance, expressing the relationship between interaction features as input, and its effects (object deformation etc.) perceived by the user as output [17]. In their research, Roberta L. Klatzky et al. study friction detection and matching a friction pattern with the visual image that produced it [5]. The data show a clear distinction between friction-pattern detection and the ability to match a visual source. Texture through friction appears sufficient to trigger pattern detection but the pattern information appears insufficient to allow for pattern identification. Therefore, changes in the value of friction are a crucial parameter in the user’s perception. 
Changes in friction on a surface are a way of transmitting information about patterns and texture to the user [5]. Two important texture properties that depend on friction are roughness and slipperiness. Papadopoulos K. et al. investigated the perceptual sensitivity to alterations in haptic properties, such as friction and surface hardness, through the haptic device Phantom Omni [18]. Their research also focused on the contrasting perceptions between visually impaired individuals and those with sight. The results revealed that surfaces with different friction and hardness values demanded shorter integration time and had relatively better perception scores. The same group also investigated the perceptual ability of users interacting with different pattern textures through a haptic force feedback device, and concluded that visually impaired participants differed from sighted participants in terms of interaction time on surfaces with different friction and hardness values.
The development of Virtual Reality allows users to interact with 3D objects in virtual environments. The integration of haptic feedback enables the development of new applications in user interface design. Preference is an important usability factor that affects user experience and performance. Research has concluded that the sense of touch improves user experience and determines the preference of a tactile object [8,19,20,21].

3. Materials and Methods

In this study, we investigate whether blind users and users with their eyes closed can identify different haptic surfaces. Moreover, we investigate whether the user can identify geometric patterns based solely on their haptic sense. As an evaluation score, we measure interaction time and identification success rate. Interaction time is defined as the time needed for the user to identify a specific geometric pattern.
Furthermore, we propose geometric patterns, emphasizing specific haptic parameters, so that the 3D object’s textures become distinct during this interaction. Texture characteristics can vary depending on the application and environment. Some key texture characteristics commonly used in 3D models and haptic applications are roughness, stiffness, reflectivity, friction, microstructure, temperature, moisture, and surface patterning. These features can be combined and adapted to create an algorithm resulting in realistic textures that enhance the experience of interacting in digital and physical environments. The proposed algorithm can be integrated into 3D models and used by haptic devices. The algorithm is executed only at the point of contact between the haptic device and the virtual 3D object.
A digital 3D model is reproduced in a virtual environment. The chosen model is a rectangular shape with dimensions 0.5 cm × 0.5 cm × 0.2 cm. We chose the rectangle to achieve a fairly large surface in the main scene of the app, with angles to aid perception. Adding different geometric patterns gives the 3D object haptic properties, enhancing the haptic experience during interaction with the user. These patterns are grayscale PNG image files. There were nine different patterns in total, representing nine test categories. In several tactile applications for the blind, various geometric patterns or their variations are used to facilitate the recognition of and interaction with objects and information through touch. The choice of geometric pattern depends on the application’s goal and the users’ needs. Patterns must be clear and readable through touch to provide the required information in the most effective way [22]. We attempted to select geometric shapes that are easily readable through touch. Some of the most common basic shapes are the following:
  • The square, which is easily recognizable due to its clear straight lines and angles. Squares and rectangles are useful for creating clear and understandable patterns. In tactile diagrams and maps, squares and rectangles can help in recognizing different areas or categories.
  • The circle, recognized by the absence of angles and its continuous curve. Circles are often used in applications to mark points and nodes.
  • The triangle, characterized by its three angles and straight lines. Triangles are used to indicate directions or to show relationships between elements.
  • Parallel and horizontal lines, which we encounter in the physical world, are easily recognized by parallel and perpendicular finger movements. These lines are used to create textures or to provide guidance regarding direction and movement.
  • Points and small shapes are used to mark specific points or positions. Small shapes, such as dots or asterisks, can be used to provide information or to highlight important points. The chosen geometric patterns are depicted in Figure 1.
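Since the depth-map patterns are simply grayscale images, the sketch below illustrates how such a pattern could be generated programmatically. The image dimensions, stripe period, and plain-text PGM output are illustrative assumptions for a self-contained example; the study itself used PNG files.

```python
def stripe_pattern(width=256, height=256, period=32, duty=0.5):
    """Generate a horizontal-stripe grayscale depth map as a 2D list.

    White (255) and black (0) bands alternate with the given period;
    which level reads as raised depends on the surface's whiteIsOut flag.
    """
    img = []
    for y in range(height):
        value = 255 if (y % period) < period * duty else 0
        img.append([value] * width)
    return img


def save_pgm(img, path):
    """Write a 2D list of 0-255 values as a plain-text PGM image."""
    with open(path, "w") as f:
        f.write(f"P2\n{len(img[0])} {len(img)}\n255\n")
        for row in img:
            f.write(" ".join(map(str, row)) + "\n")


if __name__ == "__main__":
    # Produce a parallel-lines pattern analogous to patterns 2 and 3.
    save_pgm(stripe_pattern(), "pattern_lines.pgm")
```

A circle, square, or triangle pattern can be produced the same way by testing each (x, y) pixel against the shape’s boundary instead of the row index.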
The system’s hardware consists of a main computer and haptic hardware. The computer is responsible for building, controlling, and coordinating the virtual environment. The computer used in this study was a Lenovo PC running Windows 10, with an Intel(R) Core(TM) i5-3570 CPU @ 3.40 GHz and 8 GB of memory.
The haptic device used in the experimental procedure is the 3D Systems Touch Haptic Device (Phantom Omni). It is a professional impedance-type haptic force feedback device. Its small size, compact footprint, portability, and removable stylus with two integrated momentary stylus switches are some of its important features [18,23,24]. It provides 3-degrees-of-freedom (3-DOF) force feedback, while its 6-DOF position sensor provides a larger workspace [18,23,24,25]. During installation, the necessary calibration was performed to ensure that the device reports accurate 3D position data, and the haptic workspace, i.e., the accessible physical space comprising the entire working range of the device, was set.
To haptically explore the created images, the open-source H3DAPI environment, which supports the above haptic device, was used. H3D uses X3D for designing graphical scenes [26]. X3D is based on the XML markup language, which H3D uses to load the graphical scene. H3D uses X3D extension files following XML syntax rules to pass haptic properties to the objects being displayed. Each 3D object in H3D is referred to as a node; nodes are the key element in H3D. Haptic attributes, such as stiffness and static and dynamic friction, can be assigned to each node using XML tags [27].
To interact with a 3D object, a surface node with haptic properties must be added. In our case, the DepthMapSurface node is applied to a surface with friction, where a texture decides how deep the surface feels. The haptic rendering algorithm used was the same in all pattern tests. Among the texture characteristics mentioned, the combination of roughness, stiffness, friction, and patterning determines the final parameters of the pattern to be used. In these tests, appropriate values were selected that defined the degree of roughness, friction, and height modulation of elements on a surface. The parameters chosen for optimal perceptual ability during haptic interaction were stiffness = “0.5”, maxDepth = “0.01”, staticFriction = “0.5”, dynamicFriction = “0.4”, and whiteIsOut = “true”. These values were derived from a series of tests in the laboratory. For optimal haptic feedback, they largely depend on the application, the type of object, and the desired user experience. The best approach for adjusting stiffness and friction involves continuous testing with real users and ongoing improvement based on feedback, which led us to the final result.
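As an illustration of how these properties are declared, the following X3D fragment sketches a textured box whose DepthMapSurface carries the parameter values listed above. The file name, the containerField routing of the depth map, and the box dimensions (the stated 0.5 cm × 0.5 cm × 0.2 cm, assuming units of meters) are illustrative assumptions, not the exact scene file used in the study:

```xml
<Shape>
  <Appearance>
    <!-- Visual texture of the pattern (grayscale PNG; name is illustrative) -->
    <ImageTexture url="pattern1.png" />
    <!-- Haptic surface: the depth map decides how deep the surface feels -->
    <DepthMapSurface stiffness="0.5" maxDepth="0.01"
                     staticFriction="0.5" dynamicFriction="0.4"
                     whiteIsOut="true">
      <ImageTexture containerField="depthMap" url="pattern1.png" />
    </DepthMapSurface>
  </Appearance>
  <!-- 0.5 cm x 0.5 cm x 0.2 cm, assuming meters -->
  <Box size="0.005 0.005 0.002" />
</Shape>
```

With whiteIsOut="true", white regions of the grayscale image are rendered as raised relative to black ones, so the same PNG serves as both the visual and the haptic texture.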
Virtual objects acquire realistic properties such as mass, hardness, texture, and elasticity. Haptic user interaction is achieved through various haptic devices, to which the user gives commands to manipulate the object in the 3D environment. Through the haptic device, the user receives force and tactile feedback from the virtual environment. In a similar project, George Kokkonis et al. used RGB-depth cameras to transform real-world objects into 3D virtual objects [28]. By assigning predefined patterns to them and creating haptic textures, they focused on the ease of distinguishing objects in a virtual scene without any cues for the rendering of each pattern individually during interaction.

4. Experimental Design

The experimental procedure was first explained in detail to two men with total blindness. After the details and procedures of the experiment were explained, they started by touching the device and the stylus to perceive its shape and mass. By starting to interact with the application, they gained knowledge about the boundaries of the virtual scene. They interacted on the screen with 3D geometric objects such as a cube, a parallelogram, and a sphere. After becoming familiar with the device and the dimensions of the virtual environment, they perceived the object and its geometry quite well. Then, the cube was used to apply the studied textures to its surface. The users had difficulty perceiving the material and shape that the textures represented, so they were given information about the pattern of each texture they interacted with. The observation results showed little difference in the perception capabilities of the two subjects. They performed well on the uniformity recognition test but showed difficulty in the identification test, and classified the textures according to the difficulty they faced. They rated patterns 1, 4, 5, 8, and 9 as particularly difficult and patterns 2, 3, 6, and 7 as less difficult. Additionally, they could not distinguish the difference between patterns 2 and 3, which were the parallel lines. The total interaction time for each user did not exceed 45 min. Participants stated at the end that more interaction time would have improved their performance and familiarity with the device. Encouragingly, they quickly developed a satisfactory understanding of the virtual space and the object’s dimensions. Difficulty and some dissatisfaction were expressed with pattern identification, leading them to abandon the effort after a few minutes of unsuccessful attempts.
The users pointed out that the structure and the device did not help activate the skin contact of the fingers (palpation), which is the main way they perceive the physical world. Instead, it helps activate the elbow and shoulder muscles through force feedback. This is to be expected, as the Touch device belongs to the kinesthetic rather than the cutaneous haptic technology. It is worth noting that this fact benefited one of the two users, who described himself as a musician with experience in using musical instruments, which helped him adapt certain muscle groups of the hand more quickly and efficiently and to better utilize his perception. Finally, they described the application as very useful for future development, fun, and original. Based on these facts, we proceeded with the experimental procedure with sighted users.
The use of sighted users in pilot tests of an application for blind people may seem contradictory, but there are several benefits to it. This method can impact the arrangement of graphical elements. Sighted users can check if the images are correctly placed and set the virtual scene. Additionally, our application includes different vibrations as a key element of tactile interaction (various texture patterns), allowing sighted users to verify if the vibrations are strong and distinguishable or if adjustments are needed to ensure they are clear and understandable. At the same time, they can verify if the visual feedback matches the tactile experience and if it enhances the understanding of the feedback. For example, in any type of interaction (e.g., vibrations for different types of taps or movements), sighted users can assess if the vibrations appropriately correspond to the actions and if they are realistic [29]. Studies have shown that while blind users have more experience of haptic interaction, the difference between the tactile acuity of sighted and blind people is simply too small to matter [30].
The experimental study involved a total of 35 participants, 27 males and 8 females, all of them students from a technological sciences department. The experiment was conducted in a quiet room, where the participant was comfortably seated in front of a table equipped with the Touch device and a computer screen (Figure 2a). First, after the participants had given their consent and were informed about the objectives of the study, they were presented with the virtual 3D object without any pattern on it. After being informed about the use of the haptic device and the method of interaction with it, they were invited to explore the 3D object, initially with a smooth surface and subsequently with the above 9 patterns loaded onto the surface in turn, every 45 s. While users interacted with the patterns and observed how the device responded to them, they were given additional information and updates to make it easier for them to perceive these haptic textures. During this phase, each pattern was presented at a different scale (Figure 2b) and the user was asked to freely explore the virtual surface of the object, while judging whether it matched the realistic image of the texture pattern represented each time.
After familiarizing themselves in the first stage with the haptic device and the textures of the 9 patterns, each participant was presented with the same object without being able to see the graphic texture, and having to identify it through the haptic sense. For the haptic only task, the setup shown in Figure 3a was used, where the virtual surface was divided into 4 imaginary, isometric sections (Figure 3b).
Each time, one of the 36 different combinations of patterns was loaded. Each combination contained 3 different patterns, one of which was presented twice. Altogether, 36 combinations of patterns were created, 4 for each geometric pattern (4 × 9) (Figure 4). The user was asked to identify the duplicated geometric pattern among the other two geometric patterns. After recognizing the matching patterns, the user was asked to identify the geometric patterns using only tactile cues. The combinations of patterns were presented in random order and each combination was presented twice. During the experiment, the identification score and the interaction time were measured. Finally, each user who experienced the full range of the 9 textures was emailed an evaluation in the form of a questionnaire.
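The text does not specify exactly how the pattern triads were assembled, so the following Python sketch merely illustrates one plausible construction that satisfies the stated counts: 36 combinations, each holding 3 distinct patterns across 4 surface sections with one pattern duplicated, and 4 combinations per pattern. The seeding and shuffling choices are assumptions for reproducibility.

```python
import random

PATTERNS = list(range(1, 10))  # the nine texture patterns


def make_combinations(reps_per_pattern=4, seed=0):
    """Build 36 combinations: each duplicates one target pattern and
    adds two distinct distractors, shuffled into four surface sections."""
    rng = random.Random(seed)
    combos = []
    for target in PATTERNS:
        for _ in range(reps_per_pattern):
            distractors = rng.sample(
                [p for p in PATTERNS if p != target], 2)
            combo = [target, target] + distractors
            rng.shuffle(combo)  # assign patterns to the 4 sections
            combos.append(combo)
    rng.shuffle(combos)  # present the combinations in random order
    return combos
```

Each combination is then presented twice, yielding the 72 interactions analyzed in the results.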
The seven phases of the experimental procedures are presented in the diagram below (Figure 5). Phase A was a learning session with blind users. Phase B was a testing session with blind users. Phase C was the process of constructing the final experiment based on observations from phases A and B. Phase D was a learning session with non-blind users. Phase E was a testing session with both visual and tactile feedback from users. Phase F involved interaction solely with tactile feedback from users and the recognition and identification test. Phase G was the evaluation process with the help of a questionnaire from the participants.

5. Results

The aim of the study was to better understand how users perceive haptic technology by interacting with objects in 3D environments. The results of users’ interaction with the 36 different combinations of patterns and the questionnaire evaluation are analyzed below. For each pattern combination, two interactions were performed; in total, 72 interactions (36 × 2) constituted the experiment. Some of the 35 users interacted with more than one combination. The findings of the experiment determine users’ ability to distinguish similar texture patterns within a 3D scene, as well as their ability to recognize these patterns.
User responses and scores on the evaluation form are analyzed separately. According to the structure of the experiment, users interact with different scales of nine texture patterns. After the end of the experiment, users answered a series of questions about (a) user experience with the haptic interaction device, (b) interaction time, (c) perception of realism in the graphic design for each pattern, (d) degree of ease/difficulty of recognition for each pattern, and (e) their preference for a texture pattern.

5.1. Analysis of the Experimental Procedure Results

Users were asked to identify the two identical patterns from a set of four patterns (Figure 3b) and then match each of the nine patterns from the list (Figure 1). Through the experimental procedure, we evaluated participants’ performance in identifying similar and dissimilar patterns in a haptic 3D scene and their ability to identify these patterns from the nine we studied. Each correct answer was scored out of 100. The mean score of identifying similar patterns among others and the mean score of recognition accuracy for each pattern was calculated.
Figure 6 and Table 1 present the percentage of correct answers versus the number of trials for the uniformity detection score and the recognition score.

5.2. Analysis of the Results of the Evaluation Questionnaire

Twenty-five (25) out of thirty-five (35) participants responded to the evaluation form that was sent to them by email after the end of the experiment. The users were asked to complete a questionnaire on five usability questions rated on a Likert scale. The questions given to the users are depicted in Appendix A.
Seventeen (17) of the respondents were male and eight (8) were female. Most of them (80%) were aged 18–25 and students of the Department of Information and Electronic Engineering. These 25 participants showed high positivity rates regarding the user-friendliness of the haptic device. Specifically, they were asked to rate the user-friendliness of the haptic interface based on the Likert scale from 5 (very much) to 1 (not at all). The results are presented in Figure 7a. Similarly, a high percentage felt that more interaction time would help improve the perception and recognition of the textures (Figure 7b).
By analyzing the evaluation responses of the 25 participants, an average score percentage was computed for each question. Since responses were given on a Likert scale from 5 (very much) to 1 (not at all), the total score for a question is the sum (SUM) of its scale responses, and the maximum possible score (MAXSUM) is
MAXSUM = 25 (participants) × 5 (maximum scale score) = 125
Thus, the average score percentage is
RATE = (SUM × 100)/MAXSUM
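The scoring above can be expressed compactly in code; the response values in the example are hypothetical, chosen only to show the calculation:

```python
def rate(responses, max_score=5):
    """Average score percentage for one question: RATE = SUM*100/MAXSUM."""
    maxsum = len(responses) * max_score  # e.g. 25 participants * 5 = 125
    return sum(responses) * 100 / maxsum


# Hypothetical example: 25 Likert responses (5..1) for one question.
example = [5] * 10 + [4] * 8 + [3] * 5 + [2] * 2
print(round(rate(example), 1))  # → 80.8
```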
The overall results of the evaluation form are depicted in Figure 8 and Table 2. Rate1 refers to the realistic perception rendered by the graphical pattern design during interaction. Rate2 measures the ease of identifying the texture among other textures, and Rate3 measures which pattern provides a better user experience.

6. Discussion

The experimental procedure showed an average pattern uniformity detection rate of 54.2% and an average pattern recognition rate of 29.2% for the nine texture patterns studied. We therefore conclude that, during the interaction, participants perceived the diversity of textures but had difficulty accurately identifying the patterns behind them. The patterns that showed a high rate of uniformity detection (>50%) were patterns 2, 3, and 9, in contrast to patterns 4, 7, and 8, which showed a low rate (<50%). Patterns 1, 5, and 6 had a detection rate of exactly 50% (Figure 6, Table 1).
In terms of identification rate, pattern 2 had the highest identification rate, followed by pattern 3. The rest of the patterns showed a low identification rate, especially pattern 8, which was not identified at all. Patterns with parallel lines were easier to identify since most participants applied the haptic stylus to parallel horizontal movements without encountering resistance, which led them to distinguish these patterns. Pattern 8 (triangle pattern) was not identified at all. Several users found it difficult to identify it, stating that they did not perceive the triangle shape compared to other patterns which involved a circle shape (pattern 6) or a square shape (pattern 7) (Figure 6, Table 1).
Looking at Table 1, we see a difference between the Average Pattern Uniformity Detection score and the Average Pattern Recognition score for all patterns. A small deviation (<50%) is observed for most patterns, while a large deviation (>50%) is observed only for pattern 9. Several users often mistook pattern 9 for pattern 5 or vice versa.
Interaction times were broadly similar across patterns. Users interacted longest with pattern 7 (190 s on average), followed by patterns 1, 8, and 9 (176 s on average each) and patterns 2, 3, and 4 (about 141 s on average). Pattern 6 required the least time, just 136 s on average. Each user applied their own technique and rhythm to each pattern. Factors such as psychological state, fatigue, lack of interest, and frustration with difficulty influenced an individual's willingness to devote quality time to the interaction (Table 1).
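The averages quoted above can be reproduced directly from the per-pattern scores in Table 1; a minimal sketch:

```python
# Per-pattern scores from Table 1 (percent), for patterns 1-9.
detection   = [50, 87.5, 62.5, 37.5, 50, 50, 37.5, 25, 87.5]  # uniformity detection
recognition = [12.5, 75, 50, 25, 25, 37.5, 25, 0, 12.5]       # pattern recognition

avg_detection = sum(detection) / len(detection)
avg_recognition = sum(recognition) / len(recognition)

print(f"{avg_detection:.1f}% {avg_recognition:.1f}%")  # 54.2% 29.2%
```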

6.1. Specific Analysis

As mentioned, 36 combinations of the nine geometric patterns were created and presented to the participants in random order, with each combination presented twice. Analyzing the 72 interactions separately, we found the following:
Pattern 1 was more easily recognized when used in combination with patterns 2 and 5 rather than with patterns 8 and 9.
Pattern 2 was more easily recognized when used in combination with any other pattern.
Pattern 3 was more easily recognized when used in combination with any pattern other than patterns 8 and 9.
Pattern 4 was not easily recognized when used in combination with patterns 5 and 7.
For patterns 5 and 6, pairing with other patterns did not affect participants’ decisions.
Pattern 7 was not easily recognized when used in combination with patterns 2 and 6.
Pattern 8 was not easily recognized when used in combination with any pattern, especially patterns 1, 9, 3, and 6.
Pattern 9 was not easily recognized when used in combination with any pattern, especially patterns 1, 3, and 6.
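The 36 combinations are consistent with taking every unordered pair of the nine patterns; a sketch of how such a trial list could be generated (the pairing scheme is our assumption, inferred from the counts given in the text):

```python
import itertools
import random

# ASSUMPTION: the 36 combinations are all unordered pairs of the
# nine patterns, i.e. C(9, 2) = 36.
pairs = list(itertools.combinations(range(1, 10), 2))

# Each combination presented twice, in random order -> 72 interactions.
trials = pairs * 2
random.shuffle(trials)

print(len(pairs), len(trials))  # 36 72
```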
The experiment reveals that when more than one tactile texture pattern is present in a virtual 3D scene, users can easily separate the patterns but cannot easily recognize them. The presence of patterns 2, 3, and 9 facilitated the perception of pattern differentiation in the scene, whereas patterns 4, 7, and 8 disrupted it. Patterns 2 and 3 were identified at a higher rate, while the others confused users, especially patterns 8 and 9. Pattern 9 helped with the perception of differentiation but was rarely recognized by participants and was often confused with pattern 5. The presence of patterns 8 and 9 in a scene disrupted the identification of all the patterns likely to be present. For the remaining patterns (1, 4, 5, 6, 7), no clear conclusions could be drawn from the experimental procedure.

6.2. Questionnaire Analysis

Of the 25 participants, 82.4% found the Touch haptic device quite easy to use. Additionally, 88% believed they would have achieved better perceptual results with more interaction time. Users who interacted with more than one pattern combination improved their performance over the course of the experiment; some interacted with up to four different combinations. Others, however, expressed fatigue and reluctance toward additional interaction, which would probably have affected their performance negatively.
Regarding the realistic perception attributed to the graphic pattern design during interaction, all test patterns scored highly (>60%), with patterns 2 and 3 achieving the highest match rates and patterns 8 and 9 the lowest. In the recognition survey the percentages were lower across the board, with patterns 2 and 3 again highest and patterns 8, 9, and 4 lowest (<60%). Finally, in the survey on the most preferred pattern in a virtual scene, participants strongly preferred patterns 2, 3, and 6, followed by patterns 1, 5, and 7, while patterns 4, 8, and 9 ranked last in the order of preference (Figure 8, Table 2).
Considering the questionnaire outcomes and participants’ statements after the experiment, we conclude that users found the interaction an interesting, novel, and pleasant experience. They were excited by the haptic contact with the 3D objects and described the force feedback they felt as accurate and very responsive. Most stated that they needed more training and practice time to improve their perception. The Touch device was described as fairly easy to use and technologically advanced, although participants were not pleased with its high purchase price. Some users grew negative as interaction time increased. The description of the patterns and visual contact with them helped during the training phase. Participants recognized the parallel line patterns (2 and 3) easily, while the geometric patterns (8 and 9) were placed last in the order of preference. Performance also improved with interaction time.

7. Conclusions

Research on haptic texture patterns is still in its infancy. In the future, the results and conclusions of such studies will contribute to improving many areas of daily life, especially for visually impaired people. This study examined the ability to recognize and identify haptic geometric patterns. A series of experiments and interactions highlighted the main perceptual features of users. A commercial 3D force feedback haptic device was used for the interaction between users and the virtual 3D objects. Nine different haptic textures were evaluated through the perceptual experiment, making it clear that much remains to be done to improve these textures in terms of design and haptic characteristics.
Based on the feedback received during the tests, participants showed particular enthusiasm for haptic technology and mentioned its potential applications in various areas of daily life. There was unanimity about the enormous potential and prospects of haptic technology, although some limitations are discernible in both the device’s use and human perceptual sensitivity. The complex nature of human skin makes it very difficult to capture and transfer haptic information correctly. Future research could build on this study’s results with a view to using these patterns as representations of objects in more complex scenes. For better performance in identifying virtual objects, more haptic features and actuators should be tested, which is the direction of our future work. Individual differences among users must also be considered when developing new haptic technologies. It would be interesting to study the relationship between haptic textures and the emotions users experience during interaction. Developing virtual 3D objects and models that are easily recognizable in detail through haptics will improve accessibility in areas such as culture, tourism, telemedicine, education, and e-commerce.

Author Contributions

Conception and design of experiments: N.T.; performance of experiments: N.T. and G.K.; analysis of data: N.T. and G.V.; writing of the manuscript: N.T., G.V., S.K. and G.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to privacy restrictions.

Acknowledgments

The authors would like to express their sincere gratitude to all the individuals who participated in this study.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

  • How easy was it to use the Touch Haptic Device during the interaction?
  • Would more interaction time help in better perception and recognition of textures?
  • Is there a relationship between the haptic texture and the realistic design it represents?
  • Did you easily recognize the texture among others?
  • Would the presence of such a texture on a 3D object in a virtual environment make the object easier to recognize?

Figure 1. The nine texture patterns we study, representing nine categories of tests.
Figure 2. (a) The Touch device and a computer monitor. (b) A pattern for the training phase with different scales.
Figure 3. Experimental setup for haptic pattern texture evaluation without visual graphical textures of the haptic texture (a) and with visual graphical textures of the haptic texture (b).
Figure 4. Example 4 of the 36 different pattern combinations.
Figure 5. The seven phases of the experiment.
Figure 6. Average score for uniformity detection and pattern recognition.
Figure 7. (a) User-friendliness of haptic device. (b) More interaction time.
Figure 8. Rate1, Rate2, and Rate3.
Table 1. Average score for uniformity detection and pattern recognition.
| Patterns | Average Detection Score: Pattern Uniformity among Others (%) | Average Recognition Score: Pattern (%) | Average Interaction Time (s) |
|---|---|---|---|
| Pattern 1 * | 50 | 12.5 | 176 |
| Pattern 2 | 87.5 | 75 | 142 |
| Pattern 3 | 62.5 | 50 | 141 |
| Pattern 4 | 37.5 | 25 | 140 |
| Pattern 5 | 50 | 25 | 151 |
| Pattern 6 | 50 | 37.5 | 136 |
| Pattern 7 | 37.5 | 25 | 190 |
| Pattern 8 | 25 | 0 | 176 |
| Pattern 9 | 87.5 | 12.5 | 176 |

* Match to the patterns according to Figure 1.
Table 2. Results of the evaluation questionnaire.
| Patterns | Graphic Design Realism (Rate1) | Identification (Rate2) | Preference (Rate3) |
|---|---|---|---|
| Pattern 1 * | 73.6% | 67.2% | 72% |
| Pattern 2 | 88.8% | 86.4% | 84.8% |
| Pattern 3 | 87.2% | 80% | 82.4% |
| Pattern 4 | 70.4% | 59.2% | 63.2% |
| Pattern 5 | 77.6% | 65.6% | 79.2% |
| Pattern 6 | 78.4% | 71.2% | 84% |
| Pattern 7 | 76.8% | 68% | 77.6% |
| Pattern 8 | 68.8% | 52.8% | 67.2% |
| Pattern 9 | 62.4% | 55.2% | 64% |

* Match to the patterns according to Figure 1.